Disabled users can now control Android phones using facial gestures

SAN FRANCISCO: With a raised eyebrow or a smile, people with physical or speech disabilities can now operate Android smartphones hands-free, Google announced on Thursday.

Two new tools use phones’ front-facing cameras and machine learning to detect eye movements and facial gestures. Users can scan their phone’s screen and select an option by smiling, raising their eyebrows, opening their mouths, or looking to the left, right, or upwards.


“To make Android more accessible for everyone, we’re launching new tools that make it easier to control your phone and communicate using facial gestures,” Google announced.

The Centers for Disease Control and Prevention estimates that 61 million people in the United States live with disabilities, which has driven Google and its competitors Apple and Microsoft to make their products more accessible to people with disabilities.

“Every day, people use voice commands, like ‘Hey Google’, or their hands to navigate their phones,” the tech giant said in a blog post.

“However, that’s not always possible for people with severe motor and speech disabilities.”

The changes are the result of two new features. One, called “Camera Switches,” lets users interact with their smartphones using their faces instead of taps and swipes.

The other is Project Activate, a new Android application that lets users trigger an action with a gesture, for example having the phone play an audio message, send a text message, or make a phone call.

“Now it’s possible for anyone to use eye movements and facial gestures that are customized to their range of movement to navigate their phone – sans hands and voice,” Google declared.

The free Activate application can be downloaded in Australia, Britain, Canada, and the United States from the Google Play store.

Apple, Google, and Microsoft have steadily introduced innovations that make the internet more accessible to people with disabilities, or to those for whom aging has made certain tasks, such as reading, more difficult.

Voice-controlled digital assistants built into smartphones and speakers can help people with vision or mobility issues tell computers what to do.

Other software recognizes text on websites or in images and reads it aloud, and automatically generates captions showing what is being said in videos.

An “AssistiveTouch” feature that Apple built into the software powering its smartwatch lets the touchscreen be controlled by detecting movements such as a clenched fist or finger pinches.

“This feature also works with VoiceOver so you can navigate Apple Watch with one hand while using a cane or leading a service animal,” Apple announced in a blog post.

Computing giant Microsoft describes accessibility as essential to empowering everyone through technology.

“To enable transformative change accessibility needs to be a priority,” Microsoft declared in a blog post.

“We aim to build it into what we design for every team, organization, classroom, and home.”
