There are also moral decisions, value questions, presented to human drivers on occasion. I'm wondering how these will be programmed into cars.
That is another HUGE concern. I’m not sure there are any good answers.
Examples:
If an accident is inevitable and unavoidable and there is a choice between hitting one person or hitting two people, the “obvious” choice would be to hit one person, thus minimizing the damage. But what if the one person is in the car and the two people are on the road (a mother holding a child who has suddenly jumped into the street)? Should the car crash itself into a brick wall and kill the passenger or hit the two people in the street?
Who gets to decide?
If it were me in the car, I might (*MIGHT* - depending on additional circumstances) be willing to sacrifice myself. But if it were my child in the car, I’d want to save my child instead of the irresponsible person who jumped in front of the car.
The Trolley Problem is easy, and half the video games out there already have a variant of it. Distance. If forced to hit A or B, you "aim" for the one that's farther away, giving you more time to slow the vehicle, thus doing less damage and giving whatever you hit the highest chance of survival.
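That distance heuristic can be sketched in a few lines. This is a toy model, not anything from a real AV stack: it assumes constant braking deceleration and uses the standard kinematics relation v² = v₀² − 2ad; the names `impact_speed` and `choose_target` are illustrative.

```python
import math

def impact_speed(v0, decel, distance):
    """Speed (m/s) at which the car reaches an obstacle `distance` meters
    away, starting at v0 m/s under constant braking deceleration `decel`.
    Clamped at zero if the car stops before reaching it."""
    return math.sqrt(max(0.0, v0 ** 2 - 2 * decel * distance))

def choose_target(v0, decel, obstacles):
    """Given a dict of obstacle name -> distance (m), pick the obstacle
    that would be hit at the lowest speed -- i.e., the farther one."""
    return min(obstacles, key=lambda name: impact_speed(v0, decel, obstacles[name]))

# At 20 m/s with 8 m/s^2 of braking, stopping distance is 25 m:
# an obstacle 10 m away is hit at ~15.5 m/s, one 25 m away at 0 m/s.
```

The point the heuristic makes is simple: because impact speed falls off with braking distance, the farther target always comes out ahead in this model, no value judgments required.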