Please ignore and accept apologies in advance for any typos
Rikkitic: Admittedly this was a long time ago, but when Apollo 11 landed on the moon, the computer panicked and screamed 'I'm going to die!', while the human saved the day.
DizzyD: Found this article very interesting. What are your thoughts? Should self-driving cars be programmed to kill?

Self-driving cars are already cruising the streets. But before they can become widespread, carmakers must solve an impossible ethical dilemma of algorithmic morality. How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random? (See also “How to Help Self-Driving Cars Make Ethical Decisions.”)

The answers to these ethical questions are important because they could have a big impact on the way self-driving cars are accepted in society. Who would buy a car programmed to sacrifice the owner? These problems cannot be ignored, say the team: “As we are about to endow millions of vehicles with autonomy, taking algorithmic morality seriously has never been more urgent.”
http://www.technologyreview.com/view/542626/why-self-driving-cars-must-be-programmed-to-kill/
roobarb: DizzyD: Found this article very interesting. What are your thoughts? Should self-driving cars be programmed to kill? Self-driving cars are already cruising the streets. But before they can become widespread, carmakers must solve an impossible ethical dilemma of algorithmic morality.
Let us summarise:
In order for X to do Y, Z must do the impossible.
Seeing as Z cannot do the impossible, X cannot do Y.
tdgeek: We just apply basic rules of physics to avoid a crash if we can and stop as quickly as we can.
roobarb: tdgeek: We just apply basic rules of physics to avoid a crash if we can and stop as quickly as we can.
We don't even do that, which is why people flip their cars into ditches on country roads trying to avoid rabbits. However, the car still dutifully applies the basic rules of physics, hence the unintended end-state.
jarledb: Interesting discussion about Tesla vs Google self-driving cars on Quora
ubergeeknz: I'm just going to leave this here
http://existentialcomics.com/comic/106