Emotions Reconsidered: How Robots May Experience Feelings

By Wesley Fenlon

Studies into how humans experience emotions, and how robots are increasingly able to read those emotions, point towards Marvin the Paranoid Android becoming real before we know it.

Robert Downey Jr. owned the screen in all of his scenes in 2008's Iron Man, but the next-most popular characters in the film weren't human beings--they were robots. Tony Stark's robotic assistants, which he constantly chides and quips at, are imbued with plenty of personality through simple sound effects and exaggerated mannerisms, drooping sheepishly when they fail Stark. Those read as emotions--performances of emotions, really, since Iron Man is a movie. But a very interesting, and very detailed, article from science publication Nautilus asks whether robots could be capable of the kinds of emotions Iron Man's robots exhibit. The answer starts with reconsidering how we define emotions.

"Having feelings, we usually assume, and the ability to read emotions in others, are human traits," writes Nautilus' Neil Savage. "We don’t expect machines to know what we’re thinking or react to our moods...Special and indecipherable, except by us—our whims and fancies are what makes us human. But we may be wrong in our thinking. Far from being some inexplicable, ethereal quality of humanity, emotions may be nothing more than an autonomic response to changes in our environment, software programmed into our biological hardware by evolution as a survival response."

Neuroscientist Joseph LeDoux compares emotions to survival circuits ingrained in living things, from humans down to amoebas. A stimulus in the environment flips that circuit and makes us react in a certain way to encourage survival. "Neurons firing in a particular pattern might trigger the brain to order the release of adrenaline, which makes the heart beat faster, priming an animal to fight or flee from danger. That physical state, LeDoux says, is an emotion."
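LeDoux's framing--stimulus flips a circuit, the circuit produces a bodily state, and that state is the emotion--can be sketched in a few lines of code. This is purely an illustrative toy; the function name, stimuli, and state values are invented here, not taken from LeDoux's work.

```python
# Toy sketch of a "survival circuit": an environmental stimulus
# automatically produces a physiological state. In LeDoux's framing,
# that physical state -- not some ineffable inner experience -- is
# the emotion. All names and mappings below are illustrative.

def threat_circuit(stimulus: str) -> dict:
    """Map an environmental stimulus to a physiological state."""
    if stimulus in {"predator", "loud_noise"}:
        # The circuit fires: adrenaline release, a racing heart,
        # the body primed to fight or flee.
        return {"adrenaline": "high", "heart_rate": "fast",
                "mode": "fight-or-flight"}
    # No threat detected: the body stays at baseline.
    return {"adrenaline": "baseline", "heart_rate": "normal",
            "mode": "rest"}
```

The point of the sketch is how little machinery the definition requires: nothing in it depends on the organism (or machine) understanding what it is "feeling."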

Obviously not all organisms share the same circuitry--our brains, and emotional reactions, are more complex than an amoeba's or even another mammal's. But there are other elements of how we express our emotions (and how we're coming to understand them) that bring us a step closer to seeing how robotic "emotions" could be real.

"We are also beginning to understand that the mechanics of how we express emotion are deeply tied into the emotion itself," writes Savage. "Oftentimes, they determine what we are feeling. Smiling makes you happier, even if it’s because Botox has frozen your face into an unholy imitation, author Eric Finzi says in his recent book The Face of Emotion...But if our emotional states are indeed mechanical, they can be detected and measured, which is what scientists in the field of affective computing are working on. They’re hoping to enable machines to read a person’s affect the same way we display and detect our feelings—by capturing clues from our voices, our faces, even the way we walk...They’re trying to break down feelings into quantifiable properties, with mechanisms that can be described, and quantities that can be measured and analyzed...Some are breaking down emotion into mathematical formalism that can be programmed into robots, because machines motivated by fear or joy or desire might make better decisions and accomplish their goals more efficiently."
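The affective-computing program Savage describes--reducing feeling to quantities that can be measured and analyzed--is often modeled along two axes, valence (pleasant vs. unpleasant) and arousal (calm vs. excited). The sketch below shows the shape of that idea; the feature names, thresholds, and formulas are invented for illustration and are not from the article or any real system.

```python
# Hypothetical sketch of the affective-computing approach: turn
# observable signals (vocal pitch, volume, facial expression) into
# measurable emotional quantities. The scaling constants and labels
# here are made up for illustration.

def estimate_affect(pitch_hz: float, volume_db: float,
                    smile_intensity: float) -> dict:
    """Estimate valence/arousal from a few toy features."""
    # Arousal rises with vocal pitch and loudness, clamped to [0, 1].
    arousal = max(0.0, min(1.0, (pitch_hz - 100) / 200 + volume_db / 100))
    # Valence rises with smiling (0 = neutral face, 1 = broad smile).
    valence = max(0.0, min(1.0, smile_intensity))
    if valence > 0.5:
        label = "excited" if arousal > 0.5 else "content"
    else:
        label = "angry" if arousal > 0.5 else "sad"
    return {"valence": round(valence, 2),
            "arousal": round(arousal, 2),
            "label": label}
```

A raised, loud voice with a smile lands in the "excited" quadrant; a flat, quiet voice with a neutral face lands in "sad." Real systems replace these hand-rolled formulas with trained models, but the principle--emotion as measurable quantities--is the same.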

Normally, we'd argue that even if a robot is taught to accurately read emotion--by studying someone's facial movements when they're happy or sad, by detecting changes in pitch or volume of a voice when someone's angry or excited--it wouldn't necessarily feel, or understand, those emotions the same way a person would. But if you think of an emotion as an innate reaction to an environmental trigger, robots "feeling" those emotions in a similar way isn't so implausible. Nautilus' story delves into research in both of the detection methods mentioned above, and how computers are becoming more adept at both of them.
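Under the trigger-reaction definition, a machine emotion is easy to sketch: a sensor reading flips an internal state, and that state biases behavior, just as adrenaline biases an animal toward fight or flight. The robot, triggers, and state names below are invented for illustration--a toy, not anything described in the Nautilus piece.

```python
# Toy robot whose "machine emotion" is an internal state flipped by
# an environmental trigger (here, a low battery), which then biases
# its next action. All names and thresholds are illustrative.

class Robot:
    def __init__(self):
        self.battery = 1.0
        self.state = "neutral"

    def sense(self, battery_level: float):
        """React to the environment: low power flips an anxious state,
        much as danger flips fight-or-flight in an animal."""
        self.battery = battery_level
        self.state = "anxious" if battery_level < 0.2 else "neutral"

    def act(self) -> str:
        """The emotional state motivates behavior."""
        return "seek_charger" if self.state == "anxious" else "continue_task"

r = Robot()
r.sense(0.1)   # battery nearly dead -> "anxious"
r.act()        # -> "seek_charger"
```

Whether that counts as the robot "feeling" anxious is exactly the definitional question the article raises--but mechanically, it is the same loop as LeDoux's survival circuit.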

Here's how the argument is presented in Nautilus' feature: "Could a machine actually have emotions? Arvid Kappas, a professor of psychology who runs the Emotion, Cognition, and Social Context group at Jacobs University in Bremen, Germany, believes that it comes back to the definition of emotion. By some definitions, even a human baby, which operates mostly on instinct and doesn’t have the cognitive capacity to understand or describe its feelings, might be said to have no emotions. By other definitions, the trait exists in all sorts of animals, with most people willing to ascribe feelings to creatures that closely resemble humans. So does he believe a computer could be emotional? 'As emotional as a crocodile, sure. As emotional as a fish, yes. As emotional as a dog, I can see that.'

"But would robots that felt, feel the same way we do? 'They would probably be machine emotions and not human emotions, because they have machine bodies,' says Kappas. Emotions are tied into our sense of ourselves as physical beings. A robot might have such a sense, but it would be of a very different self, with no heart and a battery meter instead of a stomach. An android in power-saving mode may, in fact, dream of electric sheep."