3 December marks the International Day of Persons with Disabilities, a day to spread awareness of disability issues and promote the rights and well-being of people with disabilities. One significant and ongoing discussion surrounding disability is about digital accessibility.
According to research conducted by Amazon Devices and Services, more than three-quarters (77%) of adults living with a disability use technology to help them with everyday tasks, doing so an average of 13 times per day.
Earlier this year, Amazon announced the release of Eye Gaze on Alexa, our first feature that helps customers with mobility or speech impairments to control Alexa with their eyes. This feature is now available to Amazon customers in the UK on Fire Max 11 tablets.
Eye Gaze on Alexa: The latest accessibility feature for Fire Max 11
Alexa’s new feature is a simple and affordable way for customers with speech and mobility disabilities to gain more independence and perform specific actions, such as playing music or turning on lights.
Caregivers can also customise Eye Gaze on Alexa dashboards with different Alexa actions, colours, and icons that appear on the tablet screen as tiles, making the feature adaptable to a customer’s unique needs.
Customers can now enable Eye Gaze on Alexa on a Fire Max 11 tablet by going to Settings > Accessibility > Eye Gaze. This feature is available at no additional cost.
Customers with smart home products can also use Alexa to control them hands-free: monitoring the home with smart cameras, turning on appliances such as kettles, adjusting the heating, and managing entertainment.
“The simplicity of interacting with Alexa is gold”
Robin Christopherson, Head of Digital Inclusion at UK charity AbilityNet, who is blind, is keen to highlight how technology can open doors for people with disabilities.
Robin said: “We can't underestimate the importance of the ease of use of smart speakers for people with a range of difficulties, who would otherwise have to have a much more complicated interaction. Just to be able to ask something and instantly get a succinct response or transaction is amazing.”
“For me, the utility and ease of use of the smart speaker is light years better than interacting with a website, for example, which is a really complicated proposition for someone like myself who can't see. We can't just glance at the middle of a webpage; we have to plough through lots of words, links, buttons, and things like that. The simplicity of interacting with Alexa is important, and for me that interaction is gold.”
Show and Tell
People who are blind or partially sighted can use the Show and Tell feature on any Echo Show to identify common packaged food goods that are hard to distinguish by touch alone, such as canned or boxed foods. Simply ask, “Alexa, what am I holding?” or “Alexa, what’s in my hand?” to get started, and Alexa will give you verbal and audio cues to help you place the item in front of the device’s camera.
VoiceView Screen Reader
Alexa’s VoiceView feature is a screen reader for any Echo device with a screen. When enabled, it lets you use gestures to navigate the device while VoiceView reads aloud what you touch on the screen.
Tap to Alexa
Enable Tap to Alexa on your Echo Show device to use Alexa without your voice. Instead, tap the touchscreen to access helpful features such as the weather, news, timers, and other information.
Adaptive Listening
The Adaptive Listening feature gives you more time to finish speaking before Alexa responds, making it easier to interact with Alexa and helping you get the most out of your experience.
Preferred Speaking Rate
If you find that Alexa talks too quickly or too slowly for your liking, just ask Alexa to adjust the speaking rate to your preference.
Discover more about how Echo devices support accessibility.