What's the most counterintuitive way to interact with a touchscreen device like a smartphone or tablet? We'd have to go with not touching it. Samsung is playing around with exactly that in an experiment to see how effectively people can control a Galaxy tablet with nothing but the power of their minds and, of course, a brain-computer interface, which uses several electrodes to monitor electrical activity within the brain.
Just last month we wrote about a new implant that's a promising step for translating brain activity into computer interaction. Samsung's study isn't so invasive or complicated, since it involves wearing an EEG cap rather than having a wireless device drilled into your head. But turning brain impulses into virtual actions is still no simple task.
Technology Review describes how the study allowed participants to open apps:
"To use EEG-detected brain signals to control a smartphone, the Samsung and UT Dallas researchers monitored well-known brain activity patterns that occur when people are shown repetitive visual patterns. In their demonstration, the researchers found that people could launch an application and make selections within it by concentrating on an icon that was blinking at a distinctive frequency."
The researchers point out that pushing interaction forward is an important part of the evolution of technology, just as the move from buttons to touchscreens was. But major challenges remain for brain-computer interfaces. "The initial focus for the team was to develop signal processing methods that could extract the right information to control a device from weak and noisy EEG signals, and to get those methods to work on a mobile device," writes Technology Review.
Samsung's Galaxy Note was paired with a prototype EEG cap that skips the classic wet or gel electrodes used to encourage conductivity through the scalp. The dry electrodes are much faster to set up--about 10 seconds versus as long as 45 minutes--but they're less accurate. Participants were able to make a basic app selection every five seconds or so with 80 to 95 percent accuracy, which is plenty accurate, but far, far slower than manipulating a touchscreen by hand.
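To put that accuracy-versus-speed tradeoff in perspective, BCI researchers often summarize a system with the Wolpaw information transfer rate, which folds the number of choices, the accuracy, and the selection time into a single bits-per-minute figure. The article doesn't say how many icons were on screen, so the four-choice setup below is purely a hypothetical illustration using the article's reported 80-95 percent accuracy and five-second selection time:

```python
from math import log2

def wolpaw_itr(n_choices, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate, in bits per minute."""
    p = accuracy
    bits = log2(n_choices)                     # bits per perfect selection
    if 0 < p < 1:                              # penalty for errors
        bits += p * log2(p) + (1 - p) * log2((1 - p) / (n_choices - 1))
    return bits * (60.0 / seconds_per_selection)

# Hypothetical: 4 on-screen icons, one selection every 5 seconds.
print(round(wolpaw_itr(4, 0.80, 5.0), 1))  # prints 11.5
print(round(wolpaw_itr(4, 0.95, 5.0), 1))  # prints 19.6
```

Even the optimistic end of that range is on the order of 20 bits per minute, a tiny fraction of what a finger on a touchscreen conveys, which is why this remains an experiment rather than a product.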