Microsoft developer Saqib Shaikh lost his sight when he was only 7 years old. Flash forward to now, and Shaikh is building Seeing AI, a cognitive app that aims to help those who are visually impaired or blind have a better understanding of the world around them.
The app was demonstrated in a video shown at the Microsoft Build 2016 Developer Conference in San Francisco Wednesday. In it, Shaikh shows how the app could serve as something of a digital seeing eye dog. For instance, in a restaurant he can take a photo of the menu with his phone -- a voice in the app guides him until he has the image centered -- and the app's artificial intelligence reads the menu's contents aloud to him.
The app is designed to run on smartphones and also works with special smart glasses that have a tiny camera built in. The camera can "see" people or things in their path, the app recognizes who or what they are, and a digital voice relays the information to the user in real time.
In the video demo, the system helped Shaikh know what was going on around him as he walked down the street, and even described who was sitting around the table at a business meeting.
The project is part of Microsoft's larger push to advance artificial intelligence and incorporate it into more aspects of life in the near future. The software used to develop Seeing AI is part of the larger Cortana Intelligence Suite, which makes "big data, machine learning, perception, analytics, and intelligent bots" available to developers, according to a Microsoft press release.