
MIT survey poses ethical questions over driverless cars

MIT wants to create discussion and public engagement on the ethics of autonomous driving

Ethical questions involving autonomous vehicles are the focus of a new survey by MIT. Picture: MIT

A survey by MIT researchers has revealed some distinct global preferences concerning the ethics of autonomous vehicles, as well as some regional variations in those preferences.


More than two million online participants from over 200 countries took part in the survey, which focused on what MIT calls versions of a “classic ethical conundrum”: the Trolley Problem. This involves scenarios in which an accident involving a vehicle is imminent, and the vehicle must opt for one of two potentially fatal options. In the case of driverless cars, that might mean swerving toward a couple of people, rather than a large group of bystanders.


Driverless moral decisions


“The study is basically trying to understand the kinds of moral decisions that driverless cars might have to resort to,” said Edmond Awad, a postdoc at the MIT Media Lab and lead author of a new paper outlining the results of the project. “We don’t know yet how they should do that.”


The survey found three of the most emphatic global preferences: sparing the lives of humans over the lives of other animals; sparing the lives of many people rather than a few; and preserving the lives of the young rather than the old.


“The main preferences were to some degree universally agreed upon,” said Awad. “But the degree to which they agree with this or not varies among different groups or countries.” For instance, the researchers found a less pronounced tendency to favour younger people, rather than the elderly, in what they defined as an “eastern” cluster of countries, including many in Asia.

“What we have tried to do in this project, and what I would hope becomes more common, is to create public engagement in these sorts of decisions”

To conduct the survey, the researchers designed a “Moral Machine,” a multilingual online game in which participants could state their preferences concerning a series of dilemmas that autonomous vehicles might face. For instance: should autonomous vehicles spare the lives of law-abiding bystanders, or, alternatively, law-breaking pedestrians who might be jaywalking? (Most people in the survey opted for the former.)


The game compiled nearly 40 million individual decisions from respondents in 233 countries. The researchers analysed the data as a whole, while also breaking participants into sub-groups defined by age, education, gender, income, and political and religious views. There were 491,921 respondents who offered demographic data.


Clusters of preferences


The researchers did not find marked differences in moral preferences based on these demographic characteristics, but they did find larger “clusters” of moral preferences based on cultural and geographic affiliations. They defined “western,” “eastern,” and “southern” clusters of countries, and found more pronounced variations along these lines. For instance, respondents in southern countries had a relatively stronger tendency to favour sparing young people rather than the elderly, especially compared to the eastern cluster.


Awad suggested that acknowledging these types of preferences should be a basic part of informing public discussion of these issues. Since all regions show a moderate preference for sparing law-abiding bystanders rather than jaywalkers, for example, knowing these preferences could, in theory, inform the way software is written to control autonomous vehicles.


“The question is whether these differences in preferences will matter in terms of people’s adoption of the new technology when [vehicles] employ a specific rule,” he said.


Beyond the results of the survey, Awad suggested that seeking public input on issues of innovation and public safety should become a larger part of the dialogue surrounding autonomous vehicles.


“What we have tried to do in this project, and what I would hope becomes more common, is to create public engagement in these sorts of decisions,” he said.


A paper, The Moral Machine Experiment, will be published in Nature.

