Fair point, but if the alternative is hitting something with an equal or higher likelihood of killing the occupants, I'm not sure what else the car is supposed to do (don't say "avoid the situation in the first place," that's obvious).
Nobody's going to buy an automated vehicle that might intentionally kill them to save a pedestrian.
I think the alternative is pretty much a no-go. If some poor pedestrian standing off to the side gets hit by a car to preserve the owner, the lawsuits will be huge. Killing bystanders is not going to be acceptable.
Let's look at a picture[1] from an article[2] that made it to HN[3]. If any program chose A as an acceptable solution, the manufacturer should be sued into oblivion. Killing the bystander will kill the driverless car.
People drive SUVs, which have a much higher likelihood than ordinary cars of killing the occupants of other vehicles or pedestrians. No lawsuits. Just for perspective, the lawsuit you are talking about is the family of one dead pedestrian, killed in a freak accident, suing a car company. I mean, how long could a car company possibly hide the effects of proprietary, closed-source software engineered specifically to flout the law?
...also, from your picture, and as I have learned from TWD: if you see a herd of zombies on the road, your self-driving car should probably be programmed to drive straight through them.
That's not equivalent. We are talking about a vehicle that is programmed by the manufacturer to do something. The SUV driver is responsible for his/her actions. With a self-driving car, we have machines that decide what to do, and killing the bystander tends not to be a sympathetic item in front of a jury.
As to the second part... I'm not sure how the software would judge the depth of the herd to see if avoidance is a better answer than just pushing through. Also, a couple of zombies could really mess up a Tesla versus an SUV or a Jeep. Zombie Apocalypse mode would be a much harder problem.
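Tongue in cheek, the decision might reduce to something like this (a toy sketch; the sensor inputs, thresholds, and function names are all invented for the joke):

    # Hypothetical sketch: swerve around a herd or push through it.
    # All inputs and thresholds here are made up for illustration.
    def choose_maneuver(herd_depth_m, clear_path_around, max_survivable_depth_m=5.0):
        """Pick a maneuver given an estimated herd depth in meters."""
        if clear_path_around:
            return "avoid"  # going around spends less of the car
        if herd_depth_m <= max_survivable_depth_m:
            return "push_through"  # shallow enough to plow through
        return "stop_and_reverse"  # too deep; a Tesla won't make it

    print(choose_maneuver(herd_depth_m=3.0, clear_path_around=False))  # push_through

The hard part, of course, is the first argument: estimating herd depth from sensors is exactly what I'm not sure the software can do.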
"The SUV driver is responsible for his/her actions."
Hey, that is what Ford said about the Bronco II.
Anyway, I agree with you in principle that if presented with clear evidence in court, it would be as you say. My contention is that it would take many years and a whistle-blower before anything like that would come to light. And there is a much greater likelihood of deaths occurring accidentally due to the code than by design.
Depth of herd is an issue, to be sure, and this even assumes the car could tell the difference between zombies and humans... perhaps we need a Waze for herd avoidance.
> Hey, that is what Ford said about the Bronco II.
Programming is treated a bit differently from tire specs.
I get the feeling a lot of programmers are going to be put on witness stands pretty early in the rollout. Lawyers already like to interview sysadmins.
> Depth of herd is an issue, to be sure, and this even assumes the car could tell the difference between zombies and humans... perhaps we need a Waze for herd avoidance.
Classically, an IR sensor should tell the car the difference between zombies and humans: the dead have no body heat, so they read at ambient temperature. Herd avoidance is probably a better tactic, though, because of resource depletion.
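As a toy sketch (the readings and the margin are invented, and "zombies run cold" is zombie lore, not a spec):

    # Toy sketch: tell a live human from a walker by body heat.
    # Sensor values and the margin are invented for illustration.
    def is_human(ir_reading_c, ambient_temp_c, margin_c=5.0):
        """A figure reading well above ambient is presumably alive."""
        return ir_reading_c > ambient_temp_c + margin_c

    print(is_human(ir_reading_c=33.0, ambient_temp_c=20.0))  # True: human
    print(is_human(ir_reading_c=21.0, ambient_temp_c=20.0))  # False: walker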
The "kill the pedestrian, not me" example is brought up... but what is a real life situation wherein the only two choices in retrospective was 'die' or kill a pedestrian?
Most of the time, you'll avoid killing the pedestrian and suffer non-critical injuries yourself, thanks to airbags and other safety features. Or it'll result in a dodge or just car damage, since the other cars are better prepared to react to you than a hobbled pedestrian carrying groceries is.
I think oftentimes the car will 'choose' to risk injuring you rather than near-certainly killing a pedestrian.