Face Detection and Gestures May Complement Touchscreens

By Wesley Fenlon

Most of our mobile devices now have front-facing cameras that can recognize our faces. Can we use those cameras to automate how we interact with technology?

We're still getting better at touch controls. As touch becomes the standard form of computer interaction on smartphones, tablets, and now Windows 8 laptops, we're still learning how to build touch sensors that eliminate tracking delays, and how to design touch software that makes interaction more accurate and efficient. At best, it's an incredibly intuitive technology--there's no hardware standing between you and direct control of a virtual environment--but some developers are banking on gesture recognition, not touch, as the future (or at least a future) of tech.

Technology Review recently profiled PredictGaze, a startup working on gesture technology that uses cameras to analyze faces and hand signals. For example, a camera attached to your TV could see you make a finger-to-lips "shush" gesture to mute the television. But the real appeal of PredictGaze is its combination of machine learning and computer vision, which allows it to act on its own and predict what your behavior indicates. Another TV example: walk out of the living room, and the camera knows no one is watching TV anymore. It pauses the video or turns the screen off.
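PredictGaze hasn't published how its system works, but the pause-when-nobody's-watching behavior can be sketched as a simple decision rule layered on top of a face detector. In this hypothetical sketch, a per-frame face detector (OpenCV's Haar cascades are one common choice) would supply a stream of booleans, and the playback logic waits for several consecutive empty frames before pausing, so a brief glance away doesn't interrupt the show:

```python
def playback_state(frames_with_face, patience=3):
    """Hypothetical presence-aware playback logic.

    frames_with_face: sequence of booleans, one per camera frame,
    True when a face was detected in that frame. Returns the final
    playback state, "playing" or "paused". The video pauses only
    after `patience` consecutive face-free frames, to avoid
    reacting to momentary detection dropouts or a quick look away.
    """
    state = "playing"
    missing = 0  # consecutive frames with no face seen
    for face_seen in frames_with_face:
        if face_seen:
            missing = 0
            state = "playing"   # viewer is back: resume
        else:
            missing += 1
            if missing >= patience:
                state = "paused"  # room looks empty: pause
    return state
```

For instance, `playback_state([True, True, False, False, False])` returns `"paused"` (the viewer left), while `playback_state([True, False, True])` stays `"playing"` because the single empty frame is within the patience window. The `patience` threshold and the face detector itself are assumptions for illustration, not details from PredictGaze.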

Photo credit: Flickr user dpup, via Creative Commons.

If touch is the most direct way we can control electronics right now, the next big thing will either be something more direct--like a brain-computer interface--or it will be predictive and automated. PredictGaze can detect faces and identify genders and smiles. Its applications extend beyond the living room; Technology Review mentions the software could watch you while driving and make sure you don't fall asleep at the wheel, or be used in a store to read customers' reactions to products.

Though there's some truly impressive technology behind motion systems like Microsoft's Kinect, that tech hasn't amounted to much yet. We're using gestures in place of mouse controls or using facial recognition to unlock our smartphones. PredictGaze is hardly the only company working on software that recognizes human faces and movements.

The challenge is making that software smart--really smart--and accurate. When we mess up controls with a touchscreen, it may be a touch sensor issue or a design issue, but our own finger or thumb is likely partially to blame. With predictive technology, the entire burden of failure lands on the system. The first time it mutes our TV while we're sitting there watching it, we pick up a remote and go back to manual control.