Understanding the Uncanny Valley: Robotic Motion Breeds Discomfort

By Wesley Fenlon

We're okay with robots that move like robots and humans that move like humans, but robots that try to move like humans just don't quite sit right with our brains.

In the last decade, digital art and animation--particularly 3D computer-generated movies--have grown sophisticated enough to stumble into the uncomfortable territory of the uncanny valley more and more frequently. Some filmmakers, like Robert Zemeckis, have jumped headfirst into that weird zone between CG and reality, giving us creepy, unforgettable experiences like the soulless eyes of the Polar Express children.

According to a recent international study led by researchers at the University of California, San Diego, it's motion--not just dead eyes and weird CG hair--that tells our brains the uncanny valley should feel disconcerting. Using fMRI scans and an android, the researchers concluded that we're fine with anything that tries to look human--until it starts to move. If those movements aren't distinctly human, our brains don't know how to process what they're seeing.

The uncanny valley is an interesting phenomenon, especially when it involves robots or androids. The more humanlike these devices are, the more we like them: you wouldn't give a robotic arm in a manufacturing plant a second glance, but what about the robotic arm in Iron Man that's a bit of a klutz but ultimately saves Tony Stark's life? Just a hint of personality changes everything. It's the same reason we love R2-D2, C-3PO, WALL-E and Keepon. But as soon as robots get too human, things get awkward.

The goal of the study was to determine whether our reaction to too-human robots owed more to appearance or to motion. To find out, the researchers grabbed 20 subjects who had little exposure to robots and showed them three videos. Each video depicted the same set of basic actions--waving, nodding and taking a drink of water--but with a different performer: one featured a human, another the Japanese android Repliee Q2, and the third Repliee Q2 again, stripped down to its bare robot parts. And so the stage was set with a human, a human lookalike and an unmistakably mechanical robot.

The android unsurprisingly created the strongest reaction, and the fMRI scanner used to map brain activity revealed what it is about robotic movement that triggers our uncanny valley response:

The biggest difference in brain response the researchers noticed was...in the parietal cortex, on both sides of the brain, specifically in the areas that connect the part of the brain’s visual cortex that processes bodily movements with the section of the motor cortex thought to contain mirror neurons (neurons also known as “monkey-see, monkey-do neurons” or “empathy neurons”).

The brain “lit up” when the human-like appearance of the android and its robotic motion “didn’t compute.”

“The brain doesn’t seem tuned to care about either biological appearance or biological motion per se,” said Ayse Pinar Saygin, an assistant professor of cognitive science at UC San Diego and an alumna of the same department. “What it seems to be doing is looking for its expectations to be met – for appearance and motion to be congruent.”
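If you like to think in code, here's a minimal toy sketch of that congruence idea in Python. To be clear, this is not the study's model or analysis: the agents, the 0-to-1 cue values and the simple absolute-difference "mismatch" score are all invented for illustration.

    # Toy sketch of the "expectation congruence" idea, NOT the study's model:
    # the uncanny signal is framed as the gap between the motion we expect
    # given an agent's appearance and the motion we actually observe.

    # Cue values are invented: 1.0 = fully humanlike, 0.0 = fully mechanical.
    AGENTS = {
        "human":            {"appearance": 1.0, "motion": 1.0},
        "android":          {"appearance": 1.0, "motion": 0.0},  # Repliee Q2-like
        "mechanical robot": {"appearance": 0.0, "motion": 0.0},
    }

    def prediction_error(appearance: float, motion: float) -> float:
        """Humanlike looks predict humanlike motion; the 'uncanny' signal
        is how far the observed motion falls from that expectation."""
        expected_motion = appearance
        return abs(expected_motion - motion)

    for name, cues in AGENTS.items():
        err = prediction_error(cues["appearance"], cues["motion"])
        print(f"{name:>16}: mismatch = {err:.1f}")

The point of the toy: the human and the mechanical robot both score 0.0 because their cues agree, while the android scores 1.0, the incongruent case. That is roughly the pattern the researchers describe, with the mismatched condition being the one that "lit up" the parietal cortex.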

To put the experiment in duck test terms: if it looks like a duck and quacks like a duck but doesn't move like a duck, our brains freak out. The researchers theorize that our perceptual systems may adapt as androids become more commonplace--or they may not, pushing us away from robots that mimic humanity as closely as possible. They're currently looking into replicating the test with EEG equipment, which would let focus groups or scientists gauge audience reactions to animations and robots before spending millions on development; fMRI machines are too expensive to be practical for frequent tests.

What's your strongest memory of an uncanny valley moment? The Polar Express? Final Fantasy: The Spirits Within? Or something more recent, like not-quite-Jeff-Bridges in Tron: Legacy?