Now imagine you’re riding in the back of a self-driving car. How would it decide?
_______________________________________
What an idiotic “what if” scenario.
Self-driving cars run so many thousands of diagnostic and control checks per second that any car with brakes about to fail would never be on the road in the first place. It would pull over and park itself before the trip ever started.
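Roughly the idea, as a toy sketch (the hooks `read_brake_status` and `request_minimal_risk_stop` are made-up names for illustration, not any real vehicle API): a health monitor checks brake parameters many times per second and hands off to a safe-stop maneuver the moment anything is out of limits.

```python
# Hypothetical sketch of the self-monitoring described above: a health monitor
# that refuses to keep driving when a brake fault is detected.
# All names and thresholds here are illustrative, not a real vehicle API.

from dataclasses import dataclass
import time


@dataclass
class BrakeStatus:
    hydraulic_pressure_kpa: float   # measured brake-line pressure
    pad_wear_fraction: float        # 0.0 = new pads, 1.0 = fully worn


MIN_PRESSURE_KPA = 550.0            # illustrative limit
MAX_PAD_WEAR = 0.9                  # illustrative limit


def brakes_healthy(status: BrakeStatus) -> bool:
    """Return True only if every monitored brake parameter is within limits."""
    return (status.hydraulic_pressure_kpa >= MIN_PRESSURE_KPA
            and status.pad_wear_fraction <= MAX_PAD_WEAR)


def monitor_loop(read_brake_status, request_minimal_risk_stop, hz: float = 100.0):
    """Run the health check many times per second; on a fault, hand control
    to a minimal-risk maneuver (pull over and park) instead of continuing."""
    period = 1.0 / hz
    while True:
        status = read_brake_status()        # sensor read, supplied by the caller
        if not brakes_healthy(status):
            request_minimal_risk_stop()     # e.g. pull over and park safely
            break
        time.sleep(period)
```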
As leery as I am (as any of us are) about riding in a self-driving car, I'd sooner do that than ride with some idiot who knows their brakes could fail and would then have to play the "what if" game of deciding who lives and who dies.
I work in a field dealing with the infrastructure and operational aspects of self-driving cars, and I've long said that the main impediment to adopting the technology is the legal challenge of shifting from user-based liability/insurance to manufacturer-based product liability.