Amazon has announced a new feature to make Alexa more accessible to people with visual impairments. The assistant can now identify objects held up to the Echo Show, the version of the smart speaker that has a screen.
Called Show and Tell, the feature allows Alexa to analyze and name products that are not easily recognized by touch, such as canned and boxed foods.
To use it, just approach the Echo Show and say phrases like "Alexa, what am I holding?" or "Alexa, what's in my hand?" If necessary, the assistant provides audio guidance to help the user position the item in front of the device's camera.
The feature, available to Echo Show users in the United States, was created in conjunction with Amazon employees who are visually impaired. The company also says that customer feedback was essential to its development.
"We heard that product identification can be a challenge and something customers wanted Alexa's help with," said Amazon. "Whether a customer is picking up a bag of food or trying to determine which item was left on the counter, we want to make those moments simpler by helping to identify the items and providing customers with the information they need in that moment."
This is not Amazon Echo's only accessibility feature. Alexa can speak more slowly, and the Echo Show screen can highlight shortcuts such as weather, news, and alarms. The device also offers an option to display captions alongside the assistant's responses.
Source: VentureBeat.