MIT’s Media Lab is known for creating some of the most advanced technology for accessibility. One of those innovations is the EyeRing built by MIT’s Fluid Interfaces Group.
The EyeRing is innovative in that it allows a person who is blind to point the ring at an object and ask a question aloud, such as “What color is this t-shirt?”; the mobile app responds almost instantly with the answer. The ring could also help a person shop by identifying the currency they pull out of their wallet, among other tasks.
The EyeRing works by using a small camera that connects wirelessly over Bluetooth to a mobile app, currently available for Android. The app processes the image to find an answer within seconds, then reads it aloud using text-to-speech. While the EyeRing is still about two years away from being commercially available, MIT is already working on an iOS app for the device.
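To make the flow concrete, here is a minimal sketch of that capture-and-answer loop in Python. Everything in it is hypothetical: the EyeRing's actual software is not public, so the `Query` structure and the `recognize` and `answer` functions are illustrative stand-ins for the real Bluetooth, vision, and text-to-speech components.

```python
from dataclasses import dataclass

@dataclass
class Query:
    image: bytes   # camera frame sent from the ring over Bluetooth (stubbed)
    question: str  # the user's spoken prompt, e.g. "What color is this t-shirt?"

def recognize(query: Query) -> str:
    """Hypothetical vision step: a real app would run color, object, or
    currency recognition on the image. Stubbed here for illustration."""
    prompt = query.question.lower()
    if "color" in prompt:
        return "red"
    if "currency" in prompt or "bill" in prompt:
        return "a 20-dollar bill"
    return "an unrecognized object"

def answer(query: Query) -> str:
    # 1. Analyze the image to find the answer to the spoken question.
    result = recognize(query)
    # 2. A real app would now speak `result` aloud via text-to-speech;
    #    here we simply return the string.
    return result

# Example: the t-shirt question from the article.
print(answer(Query(image=b"", question="What color is this t-shirt?")))  # → red
```

The key design point is that the ring itself stays simple (camera plus radio) while the phone does the heavy lifting, which is what keeps the response time down to seconds.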
View a demo of the MIT EyeRing: