We love reading about human-robot interactions: how an air hockey-playing robot studies human opponents, how Data practices the art of stand-up comedy, and how robots may eventually change how we define emotions. But a big part of human-robot interaction is more about us than them. We tend to assign personalities, feelings, even motivations to robots when they perform the simplest tasks. In a new report, the New York Times explores why that is.
And the answer, again, has more to do with humans than robots. "When a robot moves on its own, it exploits a fundamental social instinct that all humans have: the ability to separate things into objects (like rocks and trees) and agents (like a bug or another person)," writes BoingBoing science editor Maggie Koerth-Baker for the Times. "Its evolutionary importance seems self-evident; typically, kids can do this by the time they’re a year old.
"The distinction runs deeper than knowing something is capable of movement. 'Nobody questions the motivations of a rock rolling down a hill,' says Brian Scassellati, director of Yale’s social robotics lab. Agents, on the other hand, have internal states that we speculate about. The ability to distinguish between agents and objects is the basis for another important human skill that scientists call 'cognitive empathy (or 'theory of mind,' depending on whom you ask): the ability to predict what other beings are thinking, and what they want, by watching how they move."
As an example, the Times cites a 2011 study in which a "robot" was placed in a room with a human. The robot was actually just a piece of balsa wood attached to some gears, and someone controlled the contraption with a joystick from outside the room. The study participants didn't know the "robot" was human-controlled, and the "vast majority assumed the stick had its own goals and internal thought processes. They described the stick as bowing in greeting, searching for hidden items, even purring like a contented cat."
The study of human-robot interaction will be important in all types of robotics, but especially in military and crisis bots designed to save lives. Koerth-Baker writes about the challenge NASA JPL's RoboSimian will face when it's deployed to help rescue humans in disasters. Or maybe the challenge won't be for the robot, but for the EMTs and crisis personnel who have to use the robots. Should they become attached, or look at the robots as nothing but tools? Will they be able to stay completely detached, even if they want to?
Empathy might show up whether they want it or not. A recent study by Julie Carpenter, who did her dissertation on ordnance disposal bots, found that soldiers get attached to the mine disposal robots they use on the battlefield.
"Soldiers told Carpenter their first reaction to a robot being blown up was anger at losing an expensive piece of equipment, but some also described a feeling of loss.
" 'They would say they were angry when a robot became disabled because it is an important tool, but then they would add ‘poor little guy,’ or they’d say they had a funeral for it,' Carpenter said. 'These robots are critical tools they maintain, rely on, and use daily. They are also tools that happen to move around and act as a stand-in for a team member, keeping Explosive Ordnance Disposal personnel at a safer distance from harm.' "