Who decides the morals of a driverless car? | Science| In-depth reporting on science and technology | DW | 25.10.2018


Who decides the morals of a driverless car?

What are the moral decisions that driverless cars should make when an accident is inevitable? And who decides on those programmed answers? A new study opened that question to the public and found some surprising results.

While technical aspects of driverless cars have been widely reported and discussed, the debate has now reached the morality of autonomous vehicles.

Researchers at the Massachusetts Institute of Technology (MIT) published a new survey this week detailing preferences for the ethical decisions that autonomous vehicles may need to make when faced with unavoidable accidents. Such information could be used to inform the software within driverless cars, as well as the policy and laws in countries where they operate.

Read more: Driverless cars now really without a driver in California tests

[Image: German crosswalk sign] Caption: When accidents are unavoidable, how should driverless cars decide which pedestrian to run over?

The results showed a preference for sparing human lives over those of animals, for sparing larger groups of people over one or a few, and for saving children before older people.

However, these results were not universal. Cultural differences were found in the survey, such as a weaker preference to save young people over the elderly in many Asian countries.

Millions of players participated in 'Moral Machine'

The survey included a multilingual online game, which researchers named the "Moral Machine." Participants were asked their preferred outcomes in a series of hypothetical dilemmas faced by autonomous vehicles.

The result of the game was a compilation of almost 40 million decisions made by more than two million participants around the world. Additional survey data was collected from respondents in 130 countries. Overall, 491,921 participants offered their demographic data to be studied. The researchers analyzed the data as a whole and by demographic subgroups such as gender, income, age, education and religious views.
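To illustrate the kind of subgroup breakdown described above (this is a toy sketch with made-up records, not the researchers' actual dataset or pipeline), tallying how often each demographic group chose to spare the younger party in a dilemma might look like:

```python
from collections import defaultdict

# Hypothetical records: (age bracket of respondent, whether they chose
# to spare the younger party in a dilemma). Illustrative data only.
responses = [
    ("18-29", True), ("18-29", True), ("18-29", False),
    ("60+", True), ("60+", False), ("60+", False),
]

def spare_younger_rate(records):
    """Fraction of respondents in each subgroup who spared the younger party."""
    counts = defaultdict(lambda: [0, 0])  # subgroup -> [spared, total]
    for group, spared in records:
        counts[group][1] += 1
        if spared:
            counts[group][0] += 1
    return {group: spared / total for group, (spared, total) in counts.items()}

print(spare_younger_rate(responses))
```

The same tally, run per country instead of per age bracket, is one simple way cultural differences like the weaker Asian preference for sparing the young could surface in aggregate data.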

[Image: still from the Moral Machine game] Caption: MIT's 'Moral Machine' game is still available for you to play

The main differences found were based on geographic and cultural divides, though all of the regions preferred sparing law-abiding bystanders when having to choose between hitting them or jaywalkers.

Read more: Uber suspends autonomous car testing after fatal crash

While few codified policies have been passed for driverless cars, the German Ethics Commission on Automated and Connected Driving proposed the following in 2017: "In the event of unavoidable accident situations, any distinction based on personal features (age, gender, physical or mental constitution) is strictly prohibited. It is also prohibited to offset victims against one another. General programming to reduce the number of personal injuries may be justifiable."
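A decision rule consistent with the German commission's proposal can be sketched in a few lines. This is a minimal illustration of the quoted principle, not any vehicle's real control software; the function name and input format are assumptions made for the example. Personal features are deliberately absent from the input, and only the number of people at risk is compared:

```python
def choose_path(paths):
    """Pick the path that endangers the fewest people.

    `paths` maps a path label to the number of people at risk on it.
    Per the commission's rule, personal attributes (age, gender,
    physical or mental constitution) are not part of the input at all,
    so the policy cannot distinguish between victims; it can only
    reduce the number of personal injuries.
    """
    return min(paths, key=paths.get)

print(choose_path({"swerve_left": 3, "swerve_right": 1, "straight": 2}))
# -> swerve_right
```

Note that such a rule says nothing about ties, which is one reason the dilemmas posed by the Moral Machine remain hard to settle in code.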

The game remains accessible online, so you can make some of the hard decisions yourself!
