You don't "choose" because, as I said, it is unsafe to program your system to make choices. You brake because that's what a car does.
Your problem isn't real because the car doesn't exist in a logical world with N or M discrete things; it exists in a real world where it can be mistaken about what's happening outside it. Letting it make choices like that would end badly if it hallucinated (occupant+1) grandmas in front of it and decided to heroically sacrifice itself and you.
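To make that concrete, here's a minimal sketch of the "brake, don't choose" policy. Everything here is hypothetical (the Detection/plan names and the 0.5 threshold are made up, not any real vehicle's API); the point is what's absent: no head-count per path and no occupant-vs-pedestrian weighting, so a hallucinated detection costs you braking distance rather than a swerve.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        kind: str          # e.g. "pedestrian", "vehicle", "unknown"
        confidence: float  # perception is probabilistic, never ground truth

    def plan(detections: list[Detection]) -> str:
        # Brake on any plausible obstacle; never rank outcomes or victims.
        if any(d.confidence > 0.5 for d in detections):
            return "brake"
        return "continue"

    print(plan([Detection("pedestrian", 0.97)]))  # -> brake
    print(plan([Detection("unknown", 0.51)]))     # -> brake; a false alarm is cheap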
> But if any self-driving car out there steers to avoid collision, then it is already facing the 'trolley problem'.
Even the tweet you've cited says:
> there is nothing in the street which you want to collide with. the correct response in every case is to *evade* the thing that's in the street.
(emphasis mine)
So braking is clearly not the only option.
I truly hope that you do not work on software or hardware that is in any way close to areas like this. You seem completely blind to the real-world issues that driving (among other things) forces onto a system. Cars have brakes and steering wheels, and any real-world system will use a combination of the two to keep the occupants and those outside the vehicle safe. Pretending that there will never be situations where conflicting choices have to be made is ... well, I just find it unbelievable that anyone reading HN could try to deny it.
I should point out that the guy I linked is an AI lawyer, so the replies aren't actually that valuable as input in this case… Also, I think he uses "evade" to mean "not hitting something", so braking still counts.
I've had other discussions with literal self-driving-car company engineers who told me it's not a real problem as usually defined. I can't link those, but here's one where someone asks the Aurora people about it.
Braking is the best option because you're not the only moving thing on the street. Braking in response to a car in front of you is normal, but evasive maneuvers at speed aren't, and you don't know what other people will do in response to them.
Oh, but I will let you turn or reverse as long as you signal first. I just don't think you should do it at speed with no warning even in a "least bad option" situation.
> You seem completely blind to the real-world issues that driving (among other things) forces onto a system.
Sorry for being a theoretical murderer, but you aren't talking about real-world issues, you're talking about a trolley problem! That's defined by:
1. There are 2+ discrete paths you can take. (semi-true for cars)
2. There really is something on each path that you'll hit. (semi-true; in reality they'd react to you in good and bad ways)
3. Your knowledge about all this is correct. (not true; an SDC's world-knowledge is not perfect)
4. You are going fast enough to be dangerous. (semi-true; SDCs drive at safe speeds more often)
5. You must go forward. (not true; SDCs can brake or reverse)
Conditions 3 and 5 are the big problems that make this unrealistic.
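If you want the conjunction spelled out in code, here's a hypothetical sketch (all field names made up for illustration): the dilemma only exists when all five conditions hold at once, and per 3 and 5 an SDC never satisfies all of them.

    from dataclasses import dataclass

    @dataclass
    class Scenario:
        discrete_paths: int           # condition 1
        obstacle_on_every_path: bool  # condition 2
        perception_is_certain: bool   # condition 3
        dangerous_speed: bool         # condition 4
        must_go_forward: bool         # condition 5

    def is_trolley_problem(s: Scenario) -> bool:
        # All five conditions must hold simultaneously.
        return (s.discrete_paths >= 2
                and s.obstacle_on_every_path
                and s.perception_is_certain
                and s.dangerous_speed
                and s.must_go_forward)

    # Conditions 3 and 5 fail for a self-driving car:
    sdc = Scenario(discrete_paths=2, obstacle_on_every_path=True,
                   perception_is_certain=False, dangerous_speed=True,
                   must_go_forward=False)
    print(is_trolley_problem(sdc))  # -> False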
Maybe a real-world version would be driving on a mountain road with a boulder about to fall on you? In that case, I agree braking would not be safe.
Action A: kill N people, including occupant
Action B: kill M people, not including occupant
Both pathways are predictable. Which one do you choose?
ps. solve for: N == M, N > M, N < M
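For what it's worth, written down literally that's a pure body-count rule: trivial for N > M and N < M, and with no principled answer for N == M, where any tiebreak is a policy decision rather than a computation. A deliberately naive, hypothetical sketch (it assumes N and M are known exactly, which is exactly the assumption disputed above):

    def choose(n: int, m: int) -> str:
        # Action A kills N people including the occupant;
        # action B kills M people, occupant spared.
        if n < m:
            return "A"  # fewer deaths, occupant among them
        if m < n:
            return "B"  # fewer deaths, occupant survives
        # n == m: body count alone gives no answer; sparing the
        # occupant here is an arbitrary policy choice, not a result.
        return "B"

    for n, m in [(1, 2), (2, 1), (2, 2)]:
        print(f"N={n}, M={m} -> action {choose(n, m)}")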