If a Google Car Has to Kill Someone, Who Should it Be?

- Pattern-chaser
- Premium Member
- Posts: 8268
- Joined: September 22nd, 2019, 5:17 am
- Favorite Philosopher: Cratylus
- Location: England

Re: If a Google Car Has to Kill Someone, Who Should it Be?

AwkwardPanda wrote: ↑December 1st, 2021, 11:54 pm Maybe the car will require a "morality customization" program. It presents a series of questions and choices to the owner of the car about what the car should do in different moral scenarios. The car will not be able to drive before the owner selects their moral preferences. That way, when a self-driving car ends up killing someone, the owner of the car will be prosecuted as they would be now. It would give the company decent deniability for the crime, and it would allow the owner of the car to think over such moral scenarios instead of choosing on the spot.

Just as we - those of us who use Windows PCs - don't own Windows, and can't control it, so a 'self-driving' car is beyond the 'owner's' control too. It is not reasonable to hold the 'owner' responsible for the actions of a car that they are unable to control. That isn't just. But then, who is responsible in the case of an 'accident'?

"Who cares, wins"
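The "morality customization" program proposed above can be sketched as a simple preference questionnaire that gates the car's operation: the car records an explicit owner choice for every scenario before it will drive. This is a hypothetical illustration only; the scenario names and options are invented, not taken from the thread or any real vehicle.

```python
# Sketch of a "morality customization" gate: the car will not drive until
# the owner has recorded a choice for every moral scenario.
# All scenario names and options below are invented for illustration.

SCENARIOS = {
    "unavoidable_pedestrian_vs_occupant": ["protect_occupant", "protect_pedestrian"],
    "swerve_into_property_vs_brake_hard": ["swerve", "brake"],
    "animal_in_road_vs_full_stop": ["full_stop", "continue"],
}

class MoralityProfile:
    def __init__(self):
        self.choices = {}

    def record(self, scenario, choice):
        # Reject unknown scenarios and invalid options.
        if scenario not in SCENARIOS:
            raise ValueError(f"unknown scenario: {scenario}")
        if choice not in SCENARIOS[scenario]:
            raise ValueError(f"invalid choice for {scenario}: {choice}")
        self.choices[scenario] = choice

    def complete(self):
        # The car may only drive once every scenario has an answer.
        return set(self.choices) == set(SCENARIOS)

profile = MoralityProfile()
profile.record("unavoidable_pedestrian_vs_occupant", "protect_pedestrian")
print(profile.complete())  # False: two scenarios are still unanswered
```

The point of the gate is the one AwkwardPanda makes: the owner's answers are recorded in advance, which is exactly what would shift liability from the manufacturer to the owner.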
- figliar0
- Posts: 15
- Joined: November 7th, 2021, 4:52 pm
Re: If a Google Car Has to Kill Someone, Who Should it Be?
AwkwardPanda wrote: ↑December 1st, 2021, 11:54 pm Maybe the car will require a "morality customization" program. [...] It would give the company decent deniability for the crime, and it would allow the owner of the car to think over such moral scenarios instead of choosing on the spot.

I do not think it is wise to make someone take responsibility for the results of a program's execution. There are program bugs, security vulnerabilities, and so on. Even the programmer of a piece of software does not give you a warranty of any kind, so how can the user be responsible for it?
- AverageBozo
- Posts: 502
- Joined: May 11th, 2021, 11:20 am
Re: If a Google Car Has to Kill Someone, Who Should it Be?
Pattern-chaser wrote: ↑December 2nd, 2021, 8:50 am Just as we - those of us who use Windows PCs - don't own Windows, and can't control it, so a 'self-driving' car is beyond the 'owner's' control too. [...] But then, who is responsible in the case of an 'accident'?

Would you have a different view if the person in the driver's seat were able to override the car's control in an instant, at any time?
- Terrapin Station
- Posts: 6227
- Joined: August 23rd, 2016, 3:00 pm
- Favorite Philosopher: Bertrand Russell and WVO Quine
- Location: NYC Man
Re: If a Google Car Has to Kill Someone, Who Should it Be?
WanderingGaze22 wrote: ↑November 9th, 2021, 4:56 am Should it do a full-quick stop to avoid hitting a galloping deer knowing there is a speeding car right behind it?

Whether a person or a computer is operating a car, if the car behind cannot accommodate a "full, quick stop," then it is driving too close. You can't follow another driver so closely that you are only safe as long as they don't do something like suddenly jam on their brakes.
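The "driving too close" point is plain stopping-distance arithmetic: the following car needs at least its reaction distance plus its braking distance of gap. A rough sketch, using common textbook assumptions (1.5 s reaction time, 7 m/s² deceleration on dry road; these figures are illustrative, not from the thread):

```python
# Minimum gap needed to survive the car in front doing a "full, quick stop":
# reaction distance (speed x reaction time) plus braking distance (v^2 / 2a).
# The reaction time and deceleration are illustrative textbook assumptions.

def stopping_distance_m(speed_kmh, reaction_s=1.5, decel_ms2=7.0):
    v = speed_kmh / 3.6                 # km/h to m/s
    reaction = v * reaction_s           # distance covered before braking starts
    braking = v * v / (2 * decel_ms2)   # constant-deceleration braking distance
    return reaction + braking

for speed in (50, 100, 130):
    print(f"{speed} km/h -> {stopping_distance_m(speed):.0f} m")
# 50 km/h -> 35 m, 100 km/h -> 97 m, 130 km/h -> 147 m
```

A computer driver has a shorter reaction time than a human, but the braking term grows with the square of speed either way, which is why tailgating at highway speed is unsafe no matter who, or what, is driving.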
- Pattern-chaser
- Premium Member
- Posts: 8268
- Joined: September 22nd, 2019, 5:17 am
- Favorite Philosopher: Cratylus
- Location: England
Re: If a Google Car Has to Kill Someone, Who Should it Be?
AverageBozo wrote: ↑December 2nd, 2021, 5:32 pm Would you have a different view if the person in the driver seat were able to override the car's control in an instant at any time?

Your question takes us outside the OP, which refers to cars that 'drive themselves', not cars that are driven and controlled by a human with robotic assistance.
"Who cares, wins"
-
- Posts: 502
- Joined: May 11th, 2021, 11:20 am
Re: If a Google Car Has to Kill Someone, Who Should it Be?
Pattern-chaser wrote: ↑December 3rd, 2021, 10:19 am Your question takes us outside the OP, which refers to cars that 'drive themselves', not cars that are driven and controlled by a human, with robotic assistance.

Are there "self-driving" cars that are not equipped with a manual override for a human occupant to use? If not, then I was certainly outside of the OP.
- figliar0
- Posts: 15
- Joined: November 7th, 2021, 4:52 pm
Re: If a Google Car Has to Kill Someone, Who Should it Be?
AverageBozo wrote: ↑December 2nd, 2021, 5:32 pm Would you have a different view if the person in the driver seat were able to override the car's control in an instant at any time?

But is it possible to override control instantly? The person in the seat would have to hold the steering wheel all the time, and would have to pay attention all the time. That person would need to disable the autopilot in the blink of an eye, or would have to rely on some kind of auto-switch. If all this is needed, what is the purpose of the autopilot?
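The worry about "instant" override has a quantitative side: even a quick human takeover takes time, and the car keeps moving during it. A small sketch of how much road is consumed while control is handed over (the takeover times are illustrative assumptions, not measured values):

```python
# Distance travelled while a human occupant takes over from the autopilot.
# Takeover times below are illustrative; studies of real drivers vary widely.

def distance_during_takeover_m(speed_kmh, takeover_s):
    return speed_kmh / 3.6 * takeover_s  # km/h to m/s, times seconds

for takeover_s in (0.5, 2.0, 5.0):
    d = distance_during_takeover_m(120, takeover_s)
    print(f"{takeover_s} s at 120 km/h -> {d:.0f} m travelled")
```

Even a half-second takeover consumes about 17 m at 120 km/h, which supports figliar0's point: an override that only works if the occupant is already holding the wheel and paying full attention removes most of the benefit of having an autopilot at all.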
- Pattern-chaser
- Premium Member
- Posts: 8268
- Joined: September 22nd, 2019, 5:17 am
- Favorite Philosopher: Cratylus
- Location: England
Re: If a Google Car Has to Kill Someone, Who Should it Be?
figliar0 wrote: ↑December 3rd, 2021, 3:32 pm But is it possible to override control instantly? [...] If all this is needed, what is the purpose of autopilot?

Exactly. We are discussing what might or could happen when self-driving cars 'drive themselves', without human interference.
"Who cares, wins"
- AverageBozo
- Posts: 502
- Joined: May 11th, 2021, 11:20 am
Re: If a Google Car Has to Kill Someone, Who Should it Be?
figliar0 wrote: ↑December 3rd, 2021, 3:32 pm But is it possible to override control instantly? [...] If all this is needed, what is the purpose of autopilot?

Excellent post. The only purpose I can envision for such an arrangement would be for the manufacturer to eliminate or reduce its exposure to lawsuits. Such a car would likely be made only for as long as it could be sold to early adopters, or to the public in general.
- Mounce574
- Premium Member
- Posts: 156
- Joined: October 8th, 2021, 2:24 am
- Location: Oklahoma
Re: If a Google Car Has to Kill Someone, Who Should it Be?
"If it ain't broke, don't fix it." NF from Motto
- Good_Egg
- Posts: 782
- Joined: January 27th, 2022, 5:12 am

Re: If a Google Car Has to Kill Someone, Who Should it Be?

I'd question the assumption that if somebody dies in an accident then there should automatically be a lawsuit to make somebody compensate the victim's relatives. Maybe that's US culture for you?

The common law tradition has long distinguished deliberate harm from negligence from accident.

If the owner and manufacturer of the self-driving vehicle have each made every reasonable effort to avoid a collision, why should either be punished? Why does every collision have to be somebody's culpable fault? Whatever happened to the notion of accidents?
- Pattern-chaser
- Premium Member
- Posts: 8268
- Joined: September 22nd, 2019, 5:17 am
- Favorite Philosopher: Cratylus
- Location: England
Re: If a Google Car Has to Kill Someone, Who Should it Be?
Good_Egg wrote: ↑August 5th, 2023, 4:41 am I'd question the assumption that if somebody dies in an accident then there should automatically be a lawsuit to make somebody compensate the victim's relatives. Maybe that's US culture for you? [...] Why does every collision have to be somebody's culpable fault? Whatever happened to the notion of accidents?

Yes, the question here is: is there such a thing as an accident, or is someone always 'to blame'? The law in many countries is moving toward the understanding that there are no 'accidents'; there is always blame, and thereby liability. Accidents generate no income... Personally, I disagree, but I doubt that will have any effect on the rest of the world.
"Who cares, wins"