Free Republic
News/Activism

To: fishtank

Now imagine you’re riding in the back of a self-driving car. How would it decide?

_______________________________________

What an idiotic “what if” scenario.

Self-driving cars run thousands of checks per second; any such car with brakes about to fail would never be on the road in the first place. It would park itself before it ever got started.

As leery as I am (as any of us are) about riding in a self-driving car, I'd sooner do that than ride with some idiot who knows their brakes could fail and as a result would have to play the "what if" game by deciding who lives and who dies.


28 posted on 01/18/2017 10:24:10 AM PST by Responsibility2nd


To: Responsibility2nd
The biggest issue with how the self-driving car decides between two disastrous scenarios isn't the technical aspect of it -- it's a question of who then takes on the liability for the decision that is made.

I work in a field that is dealing with the infrastructure and operational aspects of self-driving cars, and I've said for a long time that the main impediment to adopting self-driving car technology is the legal challenge of transforming from user-based liability/insurance to manufacturer-based product liability.

43 posted on 01/18/2017 10:36:41 AM PST by Alberta's Child ("Yo, bartender -- Jobu needs a refill!")



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson