
Android Apps to Be Controlled with Facial Expressions

Google is bringing to Android a technology that lets users control the cursor with facial and head movements. Applications can now be controlled with facial expressions.

At the I/O 2023 event, Google introduced Project Gameface, an open-source technology that lets users control the computer cursor with head and facial movements, making gaming and other tasks more accessible. The project was inspired by the story of Lance Carr, a paralyzed game streamer with muscular dystrophy, who collaborated with Google on its development. Now, this technology is coming to Android.

Project Gameface Comes to Android

At the I/O ’24 event, Google announced that the code for Project Gameface is being made open source for Android developers. Developers can now build accessibility features into their applications, letting users control the cursor with facial expressions or head movements. For example, users can open their mouth to move the cursor, or raise their eyebrows to click and drag.
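To make that idea concrete, here is a minimal Kotlin sketch of how an app might translate facial-expression scores into cursor actions. GestureConfig, mapExpressionToAction, the blendshape names, and the thresholds are hypothetical choices for illustration, not code or values from Project Gameface itself.

```kotlin
// A minimal sketch of the kind of mapping an app might define: facial
// blendshape scores (0.0 to 1.0) are compared against user-tunable thresholds
// and translated into cursor actions. All names and thresholds here are
// illustrative assumptions, not part of Project Gameface.
data class GestureConfig(
    val mouthOpenThreshold: Float = 0.4f,  // "open mouth" starts cursor movement
    val browRaiseThreshold: Float = 0.5f,  // "raise eyebrows" triggers click-and-drag
    val cursorSpeed: Float = 12f           // cursor speed, user adjustable
)

sealed interface CursorAction {
    data class Move(val speed: Float) : CursorAction
    object BeginDrag : CursorAction
    object None : CursorAction
}

// Compare each expression's score against its threshold and pick an action.
fun mapExpressionToAction(
    blendshapes: Map<String, Float>,       // e.g. "jawOpen" to 0.72f
    config: GestureConfig = GestureConfig()
): CursorAction = when {
    (blendshapes["browInnerUp"] ?: 0f) > config.browRaiseThreshold ->
        CursorAction.BeginDrag
    (blendshapes["jawOpen"] ?: 0f) > config.mouthOpenThreshold ->
        CursorAction.Move(config.cursorSpeed)
    else -> CursorAction.None
}
```

In this kind of design, the thresholds and cursor speed live in a user-editable configuration, which matches the customization Google describes below.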

Project Gameface uses the device’s camera and the facial-expression data from Google’s MediaPipe Face Landmarker API to control the cursor. Google says, “It seamlessly tracks facial expressions and head movements through the device’s camera, turning them into intuitive and personalized controls. Developers can now build applications where users can customize their experiences by adjusting facial expressions, movement size, cursor speed, and more.”
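For the camera-and-expressions side of this, the sketch below shows one plausible setup, assuming the MediaPipe Tasks Face Landmarker API for Android, that streams blendshape scores from live camera frames. The createFaceLandmarker function and onBlendshapes callback are hypothetical, and the camera plumbing that feeds frames into the landmarker is omitted.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker

// Configures MediaPipe's Face Landmarker task to process live camera frames
// and report blendshape scores (per-expression values between 0 and 1).
// Camera wiring (e.g. CameraX frames into the landmarker) is not shown.
fun createFaceLandmarker(
    context: Context,
    onBlendshapes: (Map<String, Float>) -> Unit  // hypothetical hook, e.g. feed mapExpressionToAction above
): FaceLandmarker {
    val options = FaceLandmarker.FaceLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder()
                .setModelAssetPath("face_landmarker.task") // model file bundled with the app
                .build()
        )
        .setRunningMode(RunningMode.LIVE_STREAM)  // continuous camera input
        .setOutputFaceBlendshapes(true)           // expression scores, not just landmark points
        .setResultListener { result, _ ->
            // Each blendshape category carries a name like "jawOpen" or
            // "browInnerUp" and a score describing how strongly it is present.
            val scores = result.faceBlendshapes()
                .orElse(emptyList())
                .firstOrNull()                    // first detected face, if any
                ?.associate { it.categoryName() to it.score() }
                .orEmpty()
            onBlendshapes(scores)
        }
        .build()
    return FaceLandmarker.createFromOptions(context, options)
}
```
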

Overall, Project Gameface aims to make technology more inclusive and accessible to all users, regardless of physical ability. By enabling people with disabilities to control their devices through natural movements, Google is paving the way for a more inclusive digital future.
