richms:
petes117:
Yes that's true... but what about accidents? If a self-driving car swerves to avoid a collision and save its passenger's life, but kills a pedestrian, who is to blame? Or if the car knew it would have to choose between saving the passenger and the pedestrian, who should it save?
The person who owns it should be saved. Anyone thinking this stuff will sell if it does anything other than that is an idiot.
That makes sense from an economic standpoint, yes. But how far should the car go to protect its passengers? Where do you draw the line? What if the pedestrians killed are a mother and her baby... could the owner live with themselves, even if the car is to blame?
That's the issue with combining AI and robotics: life-changing split-second decisions like that would be taken out of our direct control, and not everyone would answer them (or react, if they had control) in the same way.
The same principle very much applies to military use: at what point are casualties from, say, a drone strike deemed "acceptable losses"? Humans have enough trouble making that decision, let alone an autonomous robot. I highly recommend checking out the movie 'Eye in the Sky' from 2015 if you guys are interested in how a situation like this might play out.