Self-driving cars and ethics: would you drive a car that would sacrifice you instead of others?

I stumbled upon a nice article titled "Why Self-Driving Cars Must Be Programmed to Kill".

Not many people are asking this question yet, but it has to be asked.

How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?
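To make these options concrete, here is a minimal sketch of how such a crash-time policy could be expressed in code. Everything in it (the Policy names, the choose_maneuver function, the simple casualty counts) is hypothetical and invented purely for illustration; no manufacturer has published such logic.

```python
import random
from enum import Enum

class Policy(Enum):
    # Hypothetical policy names, one per option from the question above.
    MINIMIZE_CASUALTIES = 1   # sacrifice occupants if it saves more lives
    PROTECT_OCCUPANTS = 2     # protect the people inside the car at all costs
    RANDOM = 3                # choose between the two extremes at random

def choose_maneuver(policy, casualties_if_straight, casualties_if_swerve):
    """Return 'straight' or 'swerve' for an unavoidable accident.

    casualties_if_straight: expected deaths if the car stays on course
                            (e.g. the crowd on the road).
    casualties_if_swerve:   expected deaths if the car swerves
                            (e.g. the occupants, against a wall).
    """
    if policy is Policy.MINIMIZE_CASUALTIES:
        # Pure utilitarian rule: the fewest expected deaths wins.
        if casualties_if_swerve < casualties_if_straight:
            return "swerve"
        return "straight"
    if policy is Policy.PROTECT_OCCUPANTS:
        # Never trade the occupants' lives, regardless of the count outside.
        return "straight"
    # Policy.RANDOM: pick one of the two extremes by coin flip.
    return random.choice(["straight", "swerve"])

# The scenario from the article: 10 people ahead, 1 occupant in the car.
print(choose_maneuver(Policy.MINIMIZE_CASUALTIES, 10, 1))  # -> 'swerve'
print(choose_maneuver(Policy.PROTECT_OCCUPANTS, 10, 1))    # -> 'straight'
```

Writing it down like this makes one thing obvious: the code itself is trivial; the hard part is deciding which policy to ship.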


Who would buy a car programmed to sacrifice the owner?

Here is the nature of the dilemma. Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time, but it can avoid killing the 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?


[Image: Ethical cars]

What do you think?

I honestly don’t know how to answer this question.

But then, I think about my own behavior… If I were behind the wheel in the situation shown in the picture above, I would probably drive into the wall or into the passer-by on the side.

You don’t think too much about whom to kill when the adrenaline is rushing…


