Should your driverless car sacrifice your life?
Friday, 24 Jun 2016 10:41:45 | Bernie Hobbs

Cars will need to be programmed to make life-and-death decisions (Getty Images)
When driverless cars hit our streets, they will rely on a combination of sensors and sophisticated programming to navigate safely - but how do we want them to handle thorny situations?
Key points:
- Scientists asked people what driverless cars should do in a range of fatal accident scenarios
- They made different choices if they were inside the car, or observing
- The findings illustrate a thorny issue for programmers: save lives overall, or prioritise passengers?
If, for example, either a pedestrian or a passenger will get hurt, the car will have to make a judgement call. And we might not like the pre-programmed decision that it makes.
But is this the car's fault, or the programmer's? Or is it ours?
A study published today in the journal Science found we want the cars to act in different ways depending on whether we're inside them or not.
Almost 2,000 participants in a Massachusetts Institute of Technology study were presented with different traffic scenarios.
Faced with a situation that will cause certain injury, they overwhelmingly supported the car taking the action that injured the fewest people. The car should run over an individual, for example, instead of ploughing into a larger group of people.
But the answers were very different when participants' families — and especially their kids — were passengers in the driverless car. The participants would apparently not rush out to buy a car that sacrificed their own family's safety to save pedestrians.
So what are the programmers to do?
Let's talk about this, now
We can't have it both ways. Autonomous cars will be pre-programmed to deal with this type of scenario, and they will either act for the greater good (injuring the fewest people overall) or they will prioritise the passengers.
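The difference between the two policies can be sketched in a few lines of code. This is purely illustrative, not a real autonomous-driving system: the `Outcome`, `greater_good` and `passenger_priority` names are invented for this example, and real crash outcomes are of course far messier than integer injury counts.

```python
from dataclasses import dataclass


@dataclass
class Outcome:
    """One possible manoeuvre, summarised by who it would injure."""
    passengers_harmed: int   # occupants of the car
    pedestrians_harmed: int  # people outside the car

    @property
    def total_harmed(self) -> int:
        return self.passengers_harmed + self.pedestrians_harmed


def greater_good(options: list[Outcome]) -> Outcome:
    # "Greater good" policy: pick the manoeuvre injuring the fewest people overall.
    return min(options, key=lambda o: o.total_harmed)


def passenger_priority(options: list[Outcome]) -> Outcome:
    # "Passenger first" policy: protect occupants, breaking ties by total harm.
    return min(options, key=lambda o: (o.passengers_harmed, o.total_harmed))


# A dilemma like those in the study: plough into a group of five,
# or swerve into a barrier and injure the single passenger.
options = [
    Outcome(passengers_harmed=0, pedestrians_harmed=5),  # hit the group
    Outcome(passengers_harmed=1, pedestrians_harmed=0),  # hit the barrier
]

print(greater_good(options))        # chooses the barrier: one person injured
print(passenger_priority(options))  # chooses the group: no passengers injured
```

The two policies agree in most situations; the dilemma only bites when, as above, sparing pedestrians requires harming the passenger.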
And it's more than a moral dilemma — there is little incentive for driverless car manufacturers to opt for a 'greater good' model if it won't sell.
Should regulators force the adoption of greater good programming, just as in some places only vaccinated children are allowed to enrol in school?
Or should we let the market decide?
Driverless public transport vehicles could be programmed with a greater good focus, while private buyers opt for cars that put passenger safety first.
With evidence suggesting that driverless vehicles could cut accidents by up to 90 per cent, and the technology already being tested on real streets, this study doesn't just highlight the challenge of programming ethical algorithms.
It shows that we need to have this discussion, and have it quickly.