AT&T has partnered with Aira, a La Jolla, California company, to develop technology that helps people with low vision read important text, such as the labels on medication bottles. Their system relies on Aira’s augmented reality smart glasses, which contain a camera and allow people with low vision to talk with a remote agent over a cellular connection. The agent can view the user’s environment through the camera mounted on the glasses and speak to them through an earpiece. Interacting with an agent can help users make sense of their surroundings or perform specific tasks, such as crossing the street or helping their children with homework.
In this new collaboration, AT&T has created an artificial intelligence (AI) platform to reduce the reliance on human agents for text recognition tasks. The system, called Chloe, can automatically read text and relay it to the user through their earpiece. Users simply say “Hey Chloe, read this” while holding the text in view of the camera, and the AI system does the rest, using text recognition algorithms to analyze and interpret the text.
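Neither company has published the internals of Chloe’s reading feature, but the capture-recognize-speak loop described above can be approximated with common open-source tools. The sketch below is purely illustrative: OpenCV grabs a camera frame, the pytesseract OCR wrapper recognizes the text, and pyttsx3 speaks it aloud. The read_text_aloud function and the choice of libraries are our own stand-ins, not AT&T’s actual platform.

# Illustrative sketch only: stands in for Chloe's capture-recognize-speak
# loop using open-source parts (OpenCV, pytesseract, pyttsx3).
import cv2
import pytesseract
import pyttsx3

def read_text_aloud(camera_index: int = 0) -> str:
    """Grab one frame from the camera, OCR it, and speak the result."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Could not read a frame from the camera")

    # Text recognition generally works best on a grayscale image.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray).strip()

    # Relay the recognized text back to the user, as Chloe does through
    # the Explorer's earpiece.
    if text:
        engine = pyttsx3.init()
        engine.say(text)
        engine.runAndWait()
    return text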
Here’s a quick video introducing the technology:
Medgadget had the opportunity to ask Tad Reynes, Regional Vice President, Internet of Things Healthcare Solutions at AT&T, and Greg Stilson, Director of Product Development at Aira, some questions about the product and the concept.
Conn Hastings, Medgadget: How did you get involved in this area? How did this collaboration arise?
Greg Stilson, Aira: I have been a product manager in the assistive technology industry for the past 12 years. I am also blind myself, and I had been using the Aira service for about a year before joining Aira as the director of product. We have always had a fantastic partnership with AT&T, and have really worked together since Aira started rolling out to customers.
Because the Aira experience requires high-quality data coverage to provide our Aira agents with high-quality video from the Explorer’s glasses or phone camera, we have been working with AT&T to ensure our Explorers have the best signal coverage and quality possible.
Tad Reynes, AT&T: We have been working with Aira for about two years now. Aira recognized that they needed an IoT carrier to provide connectivity to their smart glasses and initially started working with AT&T to fill that need.
Soon after, we realized that AT&T had capabilities in our Foundry that could assist Aira with the development of their glasses, and we began a project to enhance them using AT&T assets and technology.
Medgadget: Can you explain the Aira smart glasses concept?
Greg Stilson: Absolutely. Currently we are rolling out our new Aira Horizon smart glasses, which consist of a pair of stylish sunglasses with a center-mounted camera. These glasses are connected to a dedicated Horizon phone that drives the entire experience. The Horizon phone manages the AT&T data connection, allows an Explorer to call an agent just by double-pressing the home button, and also supports Chloe, Aira’s AI agent.
The camera on the glasses provides an Aira agent with a 120-degree field of view, compared to the 65-degree field of view of our previous glasses option. It is like our agents all of a sudden got peripheral vision.
The glasses connect to the dedicated Horizon phone for two reasons. First, the phone’s battery powers the glasses, which allows an Explorer to use the Aira service for up to 7 hours of video streaming. Second, it is now possible to use Aira even if you are not a smartphone user. In the past, a user had to download the Aira app to use the service. With Horizon, this is no longer a requirement. Someone can literally purchase the service, put on the glasses, double-press the call button on the Horizon phone, and be connected to an Aira agent instantly. This drastically reduces the learning curve.
Medgadget: What advantages does the new AI platform offer users? How does the system work, and how accurate is it at interpreting text?
Greg Stilson: Currently Chloe, our AI agent, is in the early stages of learning. She can help Explorers with voice-based tasks like checking the time and date, identifying the phone’s signal strength or battery level, and, of course, calling an agent for the Explorer for a hands-free experience.
Text is present virtually everywhere in our lives, and it appears on different colored backgrounds and in varying styles and contrasts. As a general rule, the higher the contrast of the text, the better the AI will be able to read it. We are currently working on image optimization methods for even more accurate reading.
Tad Reynes: We take for granted all of the things that vision allows us to do every day. Those who are blind or have low vision cannot perform simple tasks that the rest of us take for granted. Our AI project focused on recognizing pill bottles and providing feedback on what medications patients were taking and when they should take them. Aira users (Explorers) asked Chloe, who is Aira’s version of Alexa, for help, and Chloe provided feedback on the pill bottle that the Explorer was holding. The project proved to be very successful.
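Aira has not detailed the “image optimization methods” Stilson mentions above, but a standard way to handle low-contrast text is to boost local contrast before recognition. The sketch below is only an illustration of that general idea, pairing CLAHE (contrast-limited adaptive histogram equalization) from OpenCV with pytesseract; it is an assumption on our part, not Aira’s actual pipeline.

# Illustration only: local contrast boosting (CLAHE) before OCR, one
# common way to make low-contrast text more readable to a recognizer.
import cv2
import pytesseract

def ocr_with_contrast_boost(image_path: str) -> str:
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)

    # CLAHE equalizes the histogram over small tiles, which helps text
    # sitting on unevenly lit or low-contrast backgrounds.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    boosted = clahe.apply(gray)

    return pytesseract.image_to_string(boosted).strip()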
Medgadget: Is the AI text recognition function specifically conceived for interpreting prescriptions or labels on medicines, or is a greater range of text recognition tasks possible?
Tad Reynes: There is a far greater range of possibilities for AI beyond prescription recognition. In fact, one of the Explorers who used AI for prescription recognition also used it to sort through junk mail. This seems like a minor task to the rest of us, but he had been paying someone an hour a week to do it for him.
Medgadget: How has the new system been received by users?
Greg Stilson: We currently have users who use the reading feature on a regular basis and have found tasks that it reads with great accuracy. As the feature is used more frequently, we will be able to further improve it based on data from common tasks.
Medgadget: Do you have any future plans for the system?
Greg Stilson: Absolutely. This spot reading feature is only the beginning of Aira’s AI path. We obviously want to improve the reading capabilities, but down the road we plan to further augment the agent and Explorer experience with many AI benefits, from object recognition to, potentially, navigation assistance. When it comes to ways that AI can help those who are blind or have low vision, the sky is the limit.
Link: Aira