November 2, 2027.
You’re in your self-driving car on a two-lane, one-way road. A truck, too wide for the lane, tries to overtake you on the right.
Drunk driver.
Your car senses the danger and lunges forward, a bit to the left, toward eight students on the walkway.
Three options.
1: Ram into them while braking hard.
2: Swerve left into a wall, almost certainly killing you.
3: Choose between [1] and [2] at random.
What should it do?
Because situations like these can never be engineered away entirely, the answer to this question will largely determine public adoption, and by extension, the future of autonomous motor vehicles. The dilemma is particularly interesting because roughly 90% of motor vehicle accidents involve human error, and self-driving cars are supposed to be technology’s answer to the road safety problem. Logically, then, immediate worldwide adoption should be underway, since two computers negotiating the same road are far less likely to collide than a computer and a human. But that isn’t happening, and probably won’t for another 20 years: income gaps, current production capacity, and similar constraints make an overnight switch infeasible.
Jean-Francois Bonnefon and his team at the Toulouse School of Economics published a paper on exactly this dilemma. “It is a formidable challenge to define the algorithms that will guide AVs [Autonomous Vehicles] confronted with such moral dilemmas,” the researchers wrote. They surveyed hundreds of people on Amazon’s Mechanical Turk, presenting each participant with a series of scenarios that varied the number of passengers in the car, the number of pedestrians on the road, the presence of children, the mean age of the passengers, and so on. The results weren’t surprising: people are generally willing to sacrifice the driver to save a larger number of lives, but only as long as they aren’t the one in the driver’s seat. Before this technology gets anywhere near mainstream adoption, that contradiction has to be resolved.
Do manufacturers build different models, with different algorithms, for different levels of morality, a tunable setting of sorts, as in the sketch below? Do we pass laws forcing the car to sacrifice its owner every time? Do we let the car decide at random? Self-driving vehicles may be the logical future of the world’s transport system, but figuring out how to build ethical autonomous machines is one of the challenges that has to be solved first.
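To make that first question concrete, here is a deliberately simplistic sketch of what a tunable “morality setting” could look like. Everything in it is invented for illustration: the Outcome fields, the fatality estimates, and the altruism parameter are all assumptions, and no real autonomous-vehicle planner exposes anything like this.

```python
# Hypothetical sketch: "different algorithms for different levels
# of morality" as a single tunable weight. All names and numbers
# here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Outcome:
    label: str
    pedestrian_deaths: float  # expected pedestrian fatalities
    occupant_deaths: float    # expected occupant fatalities

def choose(outcomes: list[Outcome], altruism: float) -> Outcome:
    """Pick the outcome with the lowest weighted expected harm.

    altruism = 1.0 weighs everyone equally (utilitarian);
    altruism < 1.0 discounts pedestrians relative to occupants
    (self-protective); altruism > 1.0 does the opposite.
    """
    def harm(o: Outcome) -> float:
        return altruism * o.pedestrian_deaths + o.occupant_deaths
    return min(outcomes, key=harm)

# The scenario from the opening: brake into the students, swerve
# into the wall, or flip a coin (modeled as the average of the two).
# The fatality probabilities are made up for the example.
options = [
    Outcome("brake into pedestrians", pedestrian_deaths=8 * 0.3, occupant_deaths=0.0),
    Outcome("swerve into wall", pedestrian_deaths=0.0, occupant_deaths=1.0),
    Outcome("choose at random", pedestrian_deaths=8 * 0.15, occupant_deaths=0.5),
]

print(choose(options, altruism=1.0).label)  # -> swerve into wall
print(choose(options, altruism=0.3).label)  # -> brake into pedestrians
```

The two calls at the bottom are the whole point: the same car, facing the same scenario, sacrifices its owner under one setting and the pedestrians under another. Whether anyone, the manufacturer, the lawmaker, or the owner, should get to turn that dial is precisely the question Bonnefon’s survey leaves open.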