Predicting the Ethical Dilemmas of Fully Autonomous Cars

By Wesley Fenlon

The last thing you want is a car that drives the speed limit as it's taking you to the hospital.

Right now, Google and several other major tech companies are working hard to build cars that can drive themselves. Autonomous cars already exist, of course, but they always have human drivers on board for safety; they're not yet at the point where they can chauffeur us around a la I, Robot. Usually we'd be thinking about the technology that will make that degree of automation possible, but today we're thinking about a different issue--what will govern the behavior of those cars. Patrick Lin, director of California Polytechnic State University's Ethics + Emerging Sciences Group, wrote a post for The Atlantic highlighting exactly how difficult programming autonomous cars will be.

Think about how most humans drive. We obey the majority of traffic laws--stop at red lights, drive in the proper lane, and so on. Programming autonomous cars to follow the letter of the law will be the easy part, and in some ways they'll be better than human drivers: they'll never forget to use a turn signal or mistakenly assume they have the right-of-way at an intersection.

As much as the law guides how we drive, though, instinct and judgment play a bigger role. And programming self-driving cars to make judgment calls will be very, very tough. "If a small tree branch pokes out onto a highway and there’s no incoming traffic, we’d simply drift a little into the opposite lane and drive around it," Lin throws out as an example. "But an automated car might come to a full stop, as it dutifully observes traffic laws that prohibit crossing a double-yellow line. This unexpected move would avoid bumping the object in front, but then cause a crash with the human drivers behind it."
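To get a feel for why that's tricky, here's a rough sketch of what a strictly rule-bound controller might do in Lin's branch-in-the-road scenario. It's hypothetical Python--the function name and the rules are invented for illustration, nothing resembling a real autonomous-driving stack--but it shows how "follow the law" turns into "stop dead in the lane."

```python
# Hypothetical sketch: a strictly rule-bound controller facing Lin's
# branch-in-the-road scenario. All names and rules here are invented for
# illustration; no real autonomous-driving stack works this simply.

def plan_around_obstacle(obstacle_in_lane: bool,
                         oncoming_traffic: bool,
                         lane_marking: str) -> str:
    """Return a maneuver for a small obstacle partially blocking the lane."""
    if not obstacle_in_lane:
        return "continue"
    # A human driver would drift briefly across the line when it's clearly safe.
    if not oncoming_traffic and lane_marking != "double_yellow":
        return "drift_around"
    # A letter-of-the-law controller refuses to cross a double yellow line,
    # so it stops instead -- the very move Lin warns could cause a rear-end crash.
    return "full_stop"

print(plan_around_obstacle(True, False, "double_yellow"))  # -> full_stop
```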

Programming autonomous cars will make us ask, and attempt to answer, all sorts of ethical questions. Should autonomous cars be allowed to go over the speed limit? Most human drivers exceed it by 5 or 10 miles per hour. Will robo-cars adhere to the posted limit? What if they're on a Texas highway with no other cars visible for miles? What if an autonomous car is carrying an injured passenger? Will it break the speed limit to deliver that person to a hospital?
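Every one of those questions eventually has to become a line of code. Here's a hypothetical sketch of what such a speed policy might look like--the function name, the numbers, and the medical_emergency flag are all invented for illustration--but someone would have to pick them:

```python
# Hypothetical sketch of a speed policy that would have to answer the
# questions above: how much over the limit, and under what circumstances?
# The numbers and the "medical_emergency" flag are invented for illustration.

def target_speed_mph(posted_limit: int,
                     traffic_nearby: bool,
                     medical_emergency: bool) -> int:
    """Pick a cruising speed; every branch encodes a policy decision."""
    if medical_emergency:
        # Should the car speed to the hospital? Someone has to choose a cap.
        return posted_limit + 15
    if not traffic_nearby:
        # Empty Texas highway: match the limit, or the prevailing 5-10 mph over?
        return posted_limit + 5
    return posted_limit

print(target_speed_mph(75, traffic_nearby=False, medical_emergency=False))  # -> 80
```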

Theoretical ethical dilemmas also pose problems for autonomous vehicles.

If a car ends up in a scenario where a collision is unavoidable, how does it respond? If swerving one way means hitting three pedestrians and swerving the other means hitting five, does the car make a judgment call? Lin argues that it doesn't matter how unlikely such a situation is--programmers still have to account for it, and that puts a lot of pressure on them.

"Human drivers may be forgiven for making an instinctive but nonetheless bad split-second decision, such as swerving into incoming traffic rather than the other way into a field," Lin writes. "But programmers and designers of automated cars don’t have that luxury, since they do have the time to get it right and therefore bear more responsibility for bad outcomes."

Lin also brings up the interesting issue of ownership, and whether a car should value the life of its owner more than the lives of others. How would that apply to, say, public buses? Sooner or later, lawmakers will have to start shaping regulations for self-driving vehicles, and the companies that make those vehicles will have a lot to think about before they're roadworthy.