Geekzone: technology news, blogs, forums


523 posts

Ultimate Geek
+1 received by user: 153
Inactive user


Topic # 183938 4-Nov-2015 08:21

Found this article very interesting. What are your thoughts? Should self-driving cars be programmed to kill?

Self-driving cars are already cruising the streets. But before they can become widespread, carmakers must solve an impossible ethical dilemma of algorithmic morality.


How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random? (See also “How to Help Self-Driving Cars Make Ethical Decisions.”)

The answers to these ethical questions are important because they could have a big impact on the way self-driving cars are accepted in society. Who would buy a car programmed to sacrifice the owner?


These problems cannot be ignored, say the team: “As we are about to endow millions of vehicles with autonomy, taking algorithmic morality seriously has never been more urgent.”



http://www.technologyreview.com/view/542626/why-self-driving-cars-must-be-programmed-to-kill/

3372 posts

Uber Geek
+1 received by user: 652

Trusted

  Reply # 1420380 4-Nov-2015 08:32

Personally, kill whoever is in the wrong... if a person is jaywalking on the road (or a group of people), kill them.  Don't drive onto the footpath killing someone who isn't doing anything wrong.  Don't kill the people in the car (unless they are in the wrong, which would be hard; unless there were a s/w issue, but in that case the car wouldn't think it was in the wrong anyway).

However, if the people it's killing are kids (assuming it can detect kids, and I'm sure Google could), never kill the kids (unless you have kids in your car).

OK, yeah, that's a tough one...

We need to hurry up and get transporters to avoid all this (of course, every time you step into one of those you would die and a clone would take your place...)



523 posts

Ultimate Geek
+1 received by user: 153
Inactive user


Reply # 1420383 4-Nov-2015 08:34

My personal preference, if I were one of the programmers of these cars, would be to take out the cyclist first. ;-p

Glurp
8230 posts

Uber Geek
+1 received by user: 3788

Subscriber

  Reply # 1420390 4-Nov-2015 08:43
14 people support this post

This is a fascinating and serious subject and I was going to contribute, until I saw the stupid remark about cyclists. Grow up, people.
 




I reject your reality and substitute my own. - Adam Savage
 




523 posts

Ultimate Geek
+1 received by user: 153
Inactive user


  Reply # 1420398 4-Nov-2015 08:52

reven: Personally, kill whoever is in the wrong... if a person is jaywalking on the road (or a group of people), kill them.  Don't drive onto the footpath killing someone who isn't doing anything wrong.  Don't kill the people in the car (unless they are in the wrong, which would be hard; unless there were a s/w issue, but in that case the car wouldn't think it was in the wrong anyway).

However, if the people it's killing are kids (assuming it can detect kids, and I'm sure Google could), never kill the kids (unless you have kids in your car).



Many other variables too.

- What if the car is stolen, does it still protect its occupant?
- What if you're being followed by a serial killer?
- What if driving through a riot? People stoning/petrol bombing the car? Does it stop, or move on regardless?

So many variables at play.

Who does the code reviews on the software? Will car owners be rooting their cars like their phones? How will this be controlled? After the VW emissions episode, I guess proprietary software is a no-no. Will car owners have access to the source code?



523 posts

Ultimate Geek
+1 received by user: 153
Inactive user


  Reply # 1420418 4-Nov-2015 09:07

Rikkitic: This is a fascinating and serious subject and I was going to contribute, until I saw the stupid remark about cyclists. Grow up, people.
 


What if the cyclist is:

- pointing a gun at the car?
- a wanted criminal?

Or what if it's a choice between the cyclist on the pavement and 10 kids standing in the road?

Would that make it OK?

3194 posts

Uber Geek
+1 received by user: 911

Trusted

  Reply # 1420420 4-Nov-2015 09:09

....

BDFL - Memuneh
61314 posts

Uber Geek
+1 received by user: 12062

Administrator
Trusted
Geekzone
Lifetime subscriber

  Reply # 1420432 4-Nov-2015 09:28
5 people support this post

DizzyD:
Rikkitic: This is a fascinating and serious subject and I was going to contribute, until I saw the stupid remark about cyclists. Grow up, people.
 


What if the cyclist is:

- pointing a gun at the car?
- a wanted criminal?

Or what if it's a choice between the cyclist on the pavement and 10 kids standing in the road?

Would that make it OK?


You made a generalised comment before. This comment doesn't add to it.

Please stop trolling.






1243 posts

Uber Geek
+1 received by user: 530


  Reply # 1420464 4-Nov-2015 09:58
2 people support this post

But surely these "ethical" decisions are contrary to the first two laws of robotics and a self-driving car having to make a choice between harming its occupants or harming other road users would irreparably damage its positronic brain?

3343 posts

Uber Geek
+1 received by user: 1089

Trusted
Vocus

  Reply # 1420471 4-Nov-2015 10:09

IMO the algorithm should try to cause the *least* harm overall.  There are very few scenarios in reality where, when a car is being driven safely, avoiding e.g. a pedestrian would cause more harm to the vehicle occupants than the potential harm to the pedestrian.  I think people are introducing a false dichotomy into the equation.
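As a rough sketch of what a "least harm overall" rule could mean in code (every name, probability, and severity score below is invented for illustration, not taken from any real vehicle software): score each candidate maneuver by its expected harm to everyone affected and pick the minimum, with no special weighting for occupants versus pedestrians.

```python
# Illustrative only: a "minimize expected overall harm" decision rule.
# All maneuver names, probabilities, and severity scores are made up.

def expected_harm(maneuver):
    """Sum of (probability of injury * severity) over everyone affected."""
    return sum(p * severity for p, severity in maneuver["risks"])

def choose_maneuver(maneuvers):
    # Pick the candidate action with the lowest total expected harm,
    # treating occupants and pedestrians identically.
    return min(maneuvers, key=expected_harm)

candidates = [
    # (probability of injury, severity) pairs for each person at risk
    {"name": "brake_in_lane", "risks": [(0.2, 3)]},        # occupant, minor risk
    {"name": "swerve_to_footpath", "risks": [(0.9, 10)]},  # pedestrian, severe risk
]
print(choose_maneuver(candidates)["name"])  # brake_in_lane
```

Under these toy numbers the occupants accept a small risk rather than transferring a large one to the pedestrian, which is the point above: in realistic scenarios with a safely driven car, the "sacrifice the occupants" branch rarely wins.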

3343 posts

Uber Geek
+1 received by user: 1089

Trusted
Vocus

  Reply # 1420475 4-Nov-2015 10:12

Of course the cop-out option is that in this hypothetical extreme situation, the vehicle hands control to the human driver.



523 posts

Ultimate Geek
+1 received by user: 153
Inactive user


  Reply # 1420476 4-Nov-2015 10:13

andrew027: But surely these "ethical" decisions are contrary to the first two laws of robotics and a self-driving car having to make a choice between harming its occupants or harming other road users would irreparably damage its positronic brain?


The problem here is that you can't (or shouldn't) program a robot to be "ethical". Doing so only compounds the problem.

I guess the real question here is: should these cars be programmed to obey the rules of the road above anything else? I.e., never be programmed to jump pavements to avoid an accident.
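If "obey the rules of the road above anything else" were taken literally, it would amount to a lexicographic priority: discard illegal maneuvers first, then minimize harm among whatever remains. A minimal sketch, with all names and values invented for illustration:

```python
# Illustrative only: road rules outrank harm minimization.

def pick_action(actions):
    # Discard anything that breaks the road rules (e.g. mounting the
    # pavement); only then minimize expected harm among what's left.
    # If nothing legal remains, fall back to the full set.
    lawful = [a for a in actions if a["stays_on_road"]] or actions
    return min(lawful, key=lambda a: a["expected_harm"])

options = [
    {"name": "emergency_brake", "stays_on_road": True, "expected_harm": 2.0},
    {"name": "swerve_onto_pavement", "stays_on_road": False, "expected_harm": 0.5},
]
print(pick_action(options)["name"])  # emergency_brake
```

Here the pavement swerve would cause less harm but is ruled out before harm is even considered, which is exactly the trade-off the question is pointing at.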



523 posts

Ultimate Geek
+1 received by user: 153
Inactive user


  Reply # 1420495 4-Nov-2015 10:14

ubergeeknz: Of course the cop-out option is that in this hypothetical extreme situation, the vehicle hands control to the human driver.


The assumption there is that the occupant can drive a car. 

4591 posts

Uber Geek
+1 received by user: 2088

Trusted
Subscriber

  Reply # 1420498 4-Nov-2015 10:16

andrew027: But surely these "ethical" decisions are contrary to the first two laws of robotics and a self-driving car having to make a choice between harming its occupants or harming other road users would irreparably damage its positronic brain?


Tsk. You never read the later books, then? The Zeroth Law?




iPad Air + iPhone SE + 2degrees 4tw!

These comments are my own and do not represent the opinions of 2degrees.


4591 posts

Uber Geek
+1 received by user: 2088

Trusted
Subscriber

  Reply # 1420499 4-Nov-2015 10:17

DizzyD:
ubergeeknz: Of course the cop-out option is that in this hypothetical extreme situation, the vehicle hands control to the human driver.


The assumption there is that the occupant can drive a car. 


And that there are manual controls to take over with.






13130 posts

Uber Geek
+1 received by user: 6162

Trusted
Subscriber

  Reply # 1420501 4-Nov-2015 10:17

This is why I believe autonomous vehicles have limited applications, e.g. military, agricultural, or specific transport such as at an airport. They are not even close to being ready for general use in urban or metro situations.




Mike
Retired IT Manager.
The views stated in my posts are my personal views and not that of any other organisation.
Mac user, Windows curser, Chrome OS desired.
The great divide is the lies from both sides.

