A lot of people mention that LiDAR should have seen her coming. Generally speaking, that's true, but it's harder than it sounds. In a single frame the system would hopefully detect something on the left of the lane, but it wouldn't know what that something is (if they use machine learning, the classifier is probably not trained on "person shoving a bike packed with plastic bags").
Even with object detection in place, you still need a trajectory for that object, and if you don't know the exact angle at which you're viewing it, that's difficult. At any driving speed there are plenty of things that can go wrong in that calculation, especially for slow-moving objects.
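To make the slow-object problem concrete, here's a toy sketch (my own illustration, not Uber's pipeline; the 10 Hz frame rate and 10 cm centroid jitter are assumed numbers) that estimates velocity by differencing two noisy per-frame positions:

    import random

    FRAME_DT = 0.1      # 10 Hz LiDAR sweep (assumed)
    POS_NOISE = 0.10    # ~10 cm of centroid jitter per frame (assumed)

    def measured(true_pos):
        """Simulate a noisy detected object position (1D for simplicity)."""
        return true_pos + random.gauss(0.0, POS_NOISE)

    def naive_velocity(p_prev, p_curr, dt=FRAME_DT):
        """Finite-difference velocity between two consecutive frames."""
        return (p_curr - p_prev) / dt

    true_v = 1.4  # m/s, a pedestrian walking across the lane
    estimates = [
        naive_velocity(measured(i * true_v * FRAME_DT),
                       measured((i + 1) * true_v * FRAME_DT))
        for i in range(1000)
    ]
    mean_v = sum(estimates) / len(estimates)
    var = sum((v - mean_v) ** 2 for v in estimates) / len(estimates)
    print(f"mean {mean_v:.2f} m/s, stddev {var ** 0.5:.2f} m/s")
    # stddev ~= sqrt(2) * POS_NOISE / FRAME_DT ~= 1.4 m/s: the noise is
    # the same size as the signal, so "walking into the lane" and
    # "standing still" look alike frame to frame; you have to filter
    # over many frames, and that costs reaction time.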
Also remember the car sees a lot of oncoming traffic (for which you don't want to emergency-brake) and cross traffic moving onto the road that will let you pass before entering your lane (for which you might only want to slow down a bit).
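That decision hinges on where the object will be when the car gets there, which makes it very sensitive to the noisy velocity estimate above. A minimal toy policy, all numbers assumed (the 17 m/s roughly matches the ~38 mph reported for the Uber car):

    from dataclasses import dataclass

    LANE_HALF_WIDTH = 1.8  # m (assumed)
    EGO_SPEED = 17.0       # m/s, roughly the reported ~38 mph

    @dataclass
    class Track:
        ahead: float    # distance in front of the ego car (m)
        lateral: float  # signed offset from our lane centre (m)
        v_lat: float    # lateral speed toward our lane centre (m/s)

    def decide(t: Track) -> str:
        """Toy policy: where will the object be when we reach it?"""
        time_to_reach = t.ahead / EGO_SPEED
        lateral_then = t.lateral - t.v_lat * time_to_reach
        if abs(lateral_then) < LANE_HALF_WIDTH:
            return "BRAKE"   # predicted inside our lane on arrival
        if abs(t.lateral) < 3 * LANE_HALF_WIDTH and abs(t.v_lat) > 0.1:
            return "SLOW"    # nearby and moving; hedge
        return "IGNORE"      # e.g. oncoming traffic in its own lane

    # Pedestrian 40 m ahead, 4 m left of centre, walking in at 1.4 m/s:
    print(decide(Track(ahead=40.0, lateral=4.0, v_lat=1.4)))   # BRAKE
    # Same pedestrian, but lateral velocity underestimated as 0.4 m/s:
    print(decide(Track(ahead=40.0, lateral=4.0, v_lat=0.4)))   # SLOW
    # Oncoming car in the opposite lane, no lateral motion:
    print(decide(Track(ahead=40.0, lateral=-3.5, v_lat=0.0)))  # IGNORE

Underestimate the lateral velocity by 1 m/s, well within the noise from the previous sketch, and the same pedestrian drops from BRAKE to SLOW.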
Maybe the bags reflected the LiDAR light, causing a wrong classification... Maybe the detection noise was too high and the quick fix was to turn down the sensitivity.
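That last scenario is just one confidence threshold in code; a purely speculative sketch of the trade-off:

    def accept(detection_score: float, threshold: float) -> bool:
        """Keep a detection only if the classifier is confident enough."""
        return detection_score >= threshold

    # Raising the threshold silences noisy false positives (no more
    # phantom braking), but it also drops marginal true detections --
    # say an unusual silhouette the classifier only scores at 0.55.
    for threshold in (0.3, 0.5, 0.7):
        print(threshold, accept(0.55, threshold))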
Whatever happened exactly (and I hope we get more information), this is a terrible accident, because the object in question was not part of some controlled test, but a human being, who is now dead.
Still, the technology is very promising, but companies should be more careful with testing: keep safety drivers' eyes on the road (is that IR glow in the cabin footage from the monitoring camera, or from a smartphone?), and maybe do more testing in controlled environments.
As a software engineer, I would only release an autonomous car into the wild once I trust it enough to walk into its path on a dark road (with enough space to brake; deliberate suicides cannot be avoided)... I wonder if the Uber engineers would?
I don't know how you can watch this video and then say that they should "maybe do more testing in controlled environments".
This was a complete failure. Uber's cars should be immediately banned from testing on public roads, and there should be an investigation into their practices to ensure public safety while they test their vehicles. If they rushed things trying to play catch-up with Waymo, they should be punished.
It's a "maybe" because I don't have any more data points than what's present in the video - only assumptions.
But I strongly suspect this could have been avoided by more thorough testing (e.g. use a contraption to shove various inanimate objects into the path of the car at various speeds [for both the object and the car] under various lighting conditions); see the sketch below.
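Even just enumerating the combinations would be a start; a hypothetical test matrix (my numbers, not any real test plan):

    import itertools

    # Hypothetical closed-course test matrix; every combination should
    # be a pass/fail gate before the car returns to public roads.
    OBJECTS = ("cardboard box", "bicycle dummy",
               "pedestrian dummy with bags")
    OBJECT_SPEEDS = (0.5, 1.4, 3.0)   # m/s: shuffling, walking, jogging
    CAR_SPEEDS = (10, 17, 25)         # m/s (~22 to 56 mph)
    LIGHTING = ("day", "dusk", "night")

    scenarios = list(itertools.product(OBJECTS, OBJECT_SPEEDS,
                                       CAR_SPEEDS, LIGHTING))
    print(f"{len(scenarios)} runs")   # 108 distinct scenarios
    for obj, ov, cv, light in scenarios:
        print(f"shove {obj} at {ov} m/s into the path of a car "
              f"doing {cv} m/s, {light}")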
And if a proper investigation finds the people responsible for this accident (engineers applying dirty fixes, managers neglecting safety, the driver simply not paying attention; you name it), they should carry the consequences the same as a human driver would.
And yes, their self-driving cars should stay off the road until the fault has been found and fixed. I never claimed the opposite. (Aren't they off already? IMHO it's the obvious thing to do; but then we're talking about Uber and their poor sense of responsibility...)
Eventually "maybe" has too weak of a connotation in this context (as a non-native speaker I sometimes find it difficult to pick the correct variant for the exact thing I want to implicate) - for which I apologise if I mislead you there.