Disney Research's Touché Project Adds Complex Touch to Everyday Objects

By Norman Chan

The trick is to track changes in capacitance at multiple frequencies instead of just an on-or-off state.

The Internet of Things is a movement to connect everyday objects to the cloud, typically by attaching them to hardware with sensors and data connectivity. Devices like the Ninja Box can add motion or environmental sensors to something like a door or window and then perform programmed actions when a sensor reaches a specified threshold. For example, a device can monitor a refrigerator door and trigger a camera whenever the door is opened or the temperature inside the refrigerator rises above a certain point. But this kind of interaction is facilitated through a hardware proxy--in the refrigerator example, the door itself isn't detecting changes to its state. That's what scientists at Disney's research arm want to change with their Touché project.
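A minimal sketch of that threshold-trigger pattern, assuming hypothetical read_door_sensor(), read_fridge_temp(), and trigger_camera() helpers standing in for whatever hardware the proxy device actually exposes:

```python
import random
import time

TEMP_THRESHOLD_C = 7.0  # assumed trigger point, purely for illustration

def read_door_sensor() -> bool:
    # Stand-in for the proxy device's door sensor; randomized here.
    return random.random() < 0.1

def read_fridge_temp() -> float:
    # Stand-in for the proxy device's thermometer; randomized here.
    return random.uniform(2.0, 9.0)

def trigger_camera() -> None:
    print("door opened or temperature too high -- camera triggered")

for _ in range(10):  # short poll loop; real hardware would run forever
    # The proxy hardware, not the door itself, detects the state change.
    if read_door_sensor() or read_fridge_temp() > TEMP_THRESHOLD_C:
        trigger_camera()
    time.sleep(0.1)
```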

Being presented today at the Conference on Human Factors in Computing Systems, the Touché project is a new way of using capacitive sensors--like the ones found in smartphone displays--so that they can detect more than just touch on a screen. On a smartphone, a transparent layer of capacitive material detects changes in the electrical charge constantly flowing through the membrane. When your finger touches the screen, the charge is disrupted and the system picks up the signal change at a fixed frequency. Even on multi-touch screens, the signal detection is binary: it just knows if and when the screen is being touched. What Touché adds is capacitive signal detection along a range of frequencies. The project's research paper (PDF) explains why this is useful:

Objects excited by an electrical signal respond differently at different frequencies, therefore, the changes in the return signal will also be frequency dependent. Thus, instead of measuring a single data point for each touch event, we measure a multitude of data points at different frequencies. We then use machine learning and classification techniques to demonstrate that we can reliably extract rich interaction context, such as hand or body postures, from this data. Not only can we determine that a touch event occurred, we can also determine how it occurred.
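A simplified Python sketch of that pipeline--not Disney's implementation--sweeps a band of frequencies to collect a return-signal profile per touch event, then trains an off-the-shelf classifier on labeled profiles. The resonance model, band edges, and class labels below are all invented for illustration, and scikit-learn's SVC stands in for whatever classifier the researchers actually used:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
SWEEP = np.linspace(1e3, 1e6, 200)  # illustrative frequency band, 200 steps

def swept_profile(resonance_hz: float) -> np.ndarray:
    """Toy model of a swept measurement: the return signal dips most
    near a resonance that depends on how the object is being touched.
    Real profiles would come from the sensing hardware."""
    dip = 1.0 / (1.0 + ((SWEEP - resonance_hz) / 50e3) ** 2)
    return 1.0 - dip + rng.normal(0.0, 0.02, SWEEP.size)

# A fixed-frequency sensor keeps one number per event; Touché keeps them all.
# Label 0 = one-finger touch, 1 = full-hand grasp (hypothetical classes).
X = np.vstack([swept_profile(3e5) for _ in range(50)] +
              [swept_profile(7e5) for _ in range(50)])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf").fit(X, y)

# Classifying a fresh profile answers *how* the touch occurred, not just *if*.
new_event = swept_profile(3.1e5).reshape(1, -1)
print("predicted interaction class:", clf.predict(new_event)[0])
```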

With the ability to detect complex interaction, touch control can become much more nuanced--and natural--than pointing at icons on a screen. For example, Touché applied to a doorknob could detect not only when the handle is being touched, but also what type of grasp the user is employing to unlock the door. A table could sense the body postures of people sitting around it by detecting elbows placed on its surface. And a smartphone could detect whether a user is tapping its screen, pinching it, or simply holding it to read text or watch a video. This contextual information can be used to infer how someone is using their device and change the behavior of certain touch actions accordingly.
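As a usage sketch of that contextual layer--with hypothetical class names and responses, building on the classifier sketch above--the application side becomes a simple dispatch on the predicted interaction class:

```python
# Hypothetical mapping from recognized interaction classes to behaviors;
# the class names and responses are invented for illustration.
ACTIONS = {
    "tap": "treat as a normal touch on the UI",
    "pinch": "zoom the current view",
    "full_grasp": "user is just holding the phone; ignore stray touches",
}

def on_touch_event(predicted_class: str) -> None:
    # A Touché-style classifier reports *how* the screen was touched,
    # so the app can respond to context rather than a binary touch flag.
    print(ACTIONS.get(predicted_class, "unrecognized interaction; ignore"))

on_touch_event("full_grasp")
```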

Furthermore, Touché works with a single electrode and doesn't require the capacitive film that a smartphone needs for its screen. That means it can work on many types of surfaces, including even liquids and the human body (both of which conduct electrical signals). The researchers envision use cases where we'll be able to use touch gestures on our own bodies, like drawing a figure on a palm, to control electronic devices. Disney Research's demo video below lays out a summary of the project:

Disney Research was founded in 2008, after the acquisition of Pixar, as an informal partnership with universities to apply academic discoveries to industrial and commercial applications. Its research areas span robotics, computer graphics, behavioral science, and human-computer interaction--all fields with applications in Disney ventures like film, theme parks, and commercial toys. The Touché project was developed in conjunction with Carnegie Mellon University and the University of Tokyo.