If the car is on the way to the snack bar, all bets are off.
I’d want that vehicle to be programmed like KITT. KARR would go after people.
I don’t plan on ever having a self-driving car, but it is extremely unlikely that the brakes would suffer a catastrophic mechanical failure without one of the car’s other computers picking up on the weakness. There are better hypotheticals than this one.
We don’t need “self-driving” cars!!!!
PHOOEY!!!!
Ping.
Go straight.
I want to control where I go and how I get there...this makes as much sense as being on a motorcycle going 70 miles an hour without holding on to the handlebars and gears...
Liberalism is a mental disease: ‘Trying the same thing over and over again hoping for different results’...
What if there’s a mob blocking an Interstate?
Rock the Snack Bar. Didn’t the Clash perform that?
Put out the wings.
~ typical left-winger sentiment ~
One of the problems I see with self-driving cars is much more likely than the stupid scenario of the brakes failing at just the moment when there is a choice between killing two different groups of people:
When there are enough self-driving cars, I’m sure there is going to be plenty of automatic traffic-related re-routing.
Who gets the best routes? And in fact, will more “prestigious” drivers get things like solid green lights and other cars diverted out of their way... “automatically”?
Kinda like Obummer care
Now imagine you’re riding in the back of a self-driving car. How would it decide?
What an idiotic “what if” scenario.
Self-driving cars run so many thousands of smart algorithms per second that any such car with brakes about to fail would never even be on the road in the first place. It would park itself before it ever got started.
As leery as I am (as any of us are) about riding in a self-driving car, I’d sooner do that than ride with some idiot who knows their brakes could fail and, as a result, would have to play the “what if” game by deciding who lives and who dies.
A more reasonable scenario: a two-lane road with a cliff to your right. Suddenly, a group of people dashes out onto the highway. Too late to brake. Do you:
A) Swerve left into oncoming traffic, possibly killing yourself, your family, and the occupants of the car you hit head-on?
B) Swerve right off the cliff, definitely killing yourself and the family riding with you?
C) Plow right into the people on the highway, killing them?
I don’t know about the rest of you, but I would not ride in a self-driving car programmed for A or B. I want my family protected. I choose the manufacturer who programs C.
Yet deliberately programming a car to choose C would open the manufacturer up to all kinds of legal risk. It is because of programming issues like this, and their legal ramifications, that we will not have self-driving cars until the law catches up with the technology.
“As you speed toward a crowded crosswalk, you’re confronted with an impossible choice: veer right and mow down a large group of elderly people, or veer left into a woman pushing a stroller.”
I’m hitting the old people. They’ve led full lives, and they might die tomorrow anyway for all I know. That baby in the stroller hasn’t had a chance yet.
Do I have time for eeny-meeny-miney-moe?