The ethical dilemmas facing self-driving cars

Self-driving cars promise an improved experience on the road: developers claim they can increase traffic efficiency, reduce pollution, and eliminate up to 90 percent of traffic accidents. Unlike human drivers, they don't get distracted, fall asleep or text behind the wheel, and they don't drive drunk.

But as autonomous cars move closer to hitting the roads, a new ethical dilemma is coming to light: how will they make the potentially life-or-death decisions that are instinctive to human beings?

"You're also going to have to program ethics in a way that society hasn't dealt with. How do you deal with a situation where somebody's crossing the road in front of you and to avoid it you have to swerve off a bridge? That is something you'll have to write into the software," Thompson said. "How you make those decisions is very complicated."

A new study, published in Science, looked at how people thought self-driving cars should act. More than 75 percent of participants in one survey favored cars that would sacrifice one passenger rather than kill 10 pedestrians. But overall, the study found people prefer to ride in a driverless car that protects the occupants at all costs.

According to Thompson, the study's findings indicate that people lean toward "classic utilitarianism."

"People, when they think about how they want the system to be designed, want the most good for the most number," Thompson said. "(But) when they think about their own vehicles, they want the most good, maybe for 'me.'"

The ethical dilemma also raises legal questions. In the event of an accident, lawyers could blame car manufacturers for the way they "programmed morality into the machine."

These issues are important to think about and discuss, Thompson said. Whether the government issues federal standards or the decisions are left strictly up to the manufacturers, Thompson said we as a society "haven't really thought these through."