From ScienceDaily.com (June 15):
Imagine you are in charge of the switch on a trolley track. The express is due any minute; but as you glance down the line you see a school bus, filled with children, stalled at the level crossing. No problem; that's why you have this switch. But on the alternate track there's more trouble: Your child, who has come to work with you, has fallen down on the rails and can't get up. That switch can save your child or a bus-full of others, but not both. What do you do?
Death in the driver's seat
So should your self-driving car be programmed to kill you in order to save others? There are two philosophical approaches to this type of question, Barghi says. "Utilitarianism tells us that we should always do what will produce the greatest happiness for the greatest number of people," he explained. In other words, if it comes down to a choice between sending you into a concrete wall or swerving into the path of an oncoming bus, your car should be programmed to do the former.
Deontology, on the other hand, argues that "some values are simply categorically always true," Barghi continued. "For example, murder is always wrong, and we should never do it." Going back to the trolley problem, "even if shifting the trolley will save five lives, we shouldn't do it because we would be actively killing one," Barghi said. And, despite the odds, a self-driving car shouldn't be programmed to choose to sacrifice its driver to keep others out of harm's way.
Every variation of the trolley problem -- and there are many: What if the one person is your child? Your only child? What if the five people are murderers? -- simply "asks the user to pick whether he has chosen to stick with deontology or utilitarianism," Barghi continued. If the answer is utilitarianism, then there is another decision to be made, Barghi added: rule or act utilitarianism. [read more]
Something to think about. Keep in mind that with self-driving cars the “driver” (I put driver in quotes because is he/she really driving if the car is fully autonomous? Isn’t that person actually a passenger? Anyway…) isn’t the one making these decisions. The decisions are made by the programmer who wrote the code for the car manufacturer. It’s his/her ethics the car will follow. And if that programmer also has to follow government regulations from the National Highway Traffic Safety Administration, then who knows what will happen.
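Just to make the contrast concrete, here's a toy Python sketch of how a programmer might encode the two frameworks the article describes. To be clear, this is purely hypothetical: the Maneuver class, the death estimates, and both decision functions are things I made up for illustration, not anything from a real autonomous-vehicle system.

```python
# Hypothetical sketch -- not any real vehicle API. It only illustrates how
# the two ethical frameworks from the article lead to different code paths.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_deaths: int           # expected deaths inside the car
    bystander_deaths: int          # expected deaths outside the car
    actively_redirects_harm: bool  # does the car *choose* a new victim?

def utilitarian_choice(options: list[Maneuver]) -> Maneuver:
    # Minimize total expected deaths, no matter whose they are.
    return min(options, key=lambda m: m.occupant_deaths + m.bystander_deaths)

def deontological_choice(options: list[Maneuver]) -> Maneuver:
    # Never actively redirect harm onto someone, even to save more lives.
    permitted = [m for m in options if not m.actively_redirects_harm]
    # If every option would redirect harm, fall back to staying the course.
    return permitted[0] if permitted else options[0]

scenario = [
    Maneuver("stay the course", occupant_deaths=0, bystander_deaths=5,
             actively_redirects_harm=False),
    Maneuver("swerve into wall", occupant_deaths=1, bystander_deaths=0,
             actively_redirects_harm=True),
]

print(utilitarian_choice(scenario).name)    # "swerve into wall" -- fewest deaths
print(deontological_choice(scenario).name)  # "stay the course" -- no active killing
```

Same scenario, two different "right" answers, which is exactly the article's point: whoever writes (or regulates) that function is the one doing the ethics.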