This seems like a simple case. I would have expected the driver assist in my 2017 Subaru to have reacted to something in the road. I'm surprised that the much more sophisticated self-driving system did not.
I can't count the number of commercials I've seen for Fords and Nissans and Subarus and every kind of modern car that has this exact feature, and this is exactly how the advertising plays out. Someone sprints in front of the car, the car stops automatically, everyone is safe, pedestrian continues with their life. I've never used it in real life, but I assume it works how the ads show it.
If Uber can't match a $25,000 off-the-shelf floor model mass-market midsized sedan for collision avoidance, it's hardly a self-driving car.
The system in my car does this but only under a certain speed, much lower than what the vehicle in this video appears to be doing. It's intended for stop-and-go city driving or to avoid hitting a kid that sprints out of a driveway in a slow residential area, not to slam on the brakes to avoid hitting a deer or person at 40mph.
I wonder if this was affected by the fact that the pedestrian was in another lane up until the very last second. Perhaps the car detected the pedestrian but failed to consider her an obstacle since she wasn't in its direct path. Accounting for pedestrian crossing speed could be unexpectedly difficult if it caused automated cars to stop whenever a car in the next lane happened to "wobble" towards the automated car's lane.
I dunno, if it decided to ignore the pedestrian because she was in the other lane that's extremely troubling. The pedestrian was moving laterally across the road. If the car has detected that, it should infer that she might become an obstacle very shortly.
Driving is all about predicting the future. Think of every time you've been able to tell that someone is going to change lanes even though their blinker is off, or slowed down when a ball bounces into the street because you know there's going to be a kid following it. If the car isn't capable of that, it's not ready for public streets.
What's most troubling is that it doesn't even matter whether the Uber car thought the slow object in the other lane was a pedestrian, a car, a tree, whatever. Even if it thought the object was, say, the most "normal" thing it could be (another car), this would still be a special situation requiring action. Even without knowing the object's classification, the estimated speed alone is enough to decide. Why would a car be stopped in the middle of a lane on a fast road? It should be treated as an obstacle whose footprint could grow to the side. After all, it could be a police, tow, construction, or disabled vehicle, and a cop or tow worker might be about to walk to the side.
Actually, many states now require by law that you move as far as possible from a lane with a disabled vehicle, because so many accidents involving human drivers have happened that way.
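The classification-agnostic rule described above can be sketched roughly like this. All names and thresholds here are hypothetical illustrations of the idea, not anything from a real perception stack:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    lateral_offset_m: float  # distance from the center of our lane
    speed_mps: float         # object's measured speed
    lateral_vel_mps: float   # positive = moving toward our lane
    label: str               # "pedestrian", "vehicle", "unknown", ...

def is_hazard(obj: TrackedObject, ego_speed_mps: float) -> bool:
    """Classification-agnostic check: anything slow or stopped near
    our path, or drifting toward it, is worth slowing down for,
    whatever the classifier decided to call it."""
    near_path = abs(obj.lateral_offset_m) < 4.0        # within ~one lane width
    much_slower = obj.speed_mps < 0.5 * ego_speed_mps  # stopped or slow relative to us
    drifting_in = obj.lateral_vel_mps > 0.3            # closing on our lane
    return near_path and (much_slower or drifting_in)
```

Note that a slow-moving object one lane over trips this check no matter what label the classifier assigned it, which is the whole point: the decision doesn't need to wait for a confident classification.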
I am convinced Uber has basically been pretending to do the mountains of careful and sophisticated crap Waymo has actually gone to great lengths to do, and is just racing to put anything out so they can keep stringing investors along as far as they can before the jig is up. Well, the jig is up now.
I think target fixation also contributes to accidents involving emergency vehicles on the shoulder.
Back to Uber: the number I hear is that they are striving to do better than 13 miles per intervention needed to prevent an incident. For Waymo this figure is over 5,000 miles. I'm convinced too.
When combined with her walking a bicycle, I suspect this is an edge case they never tested. It may have tripped up their pedestrian detection or path prediction (or both) if they got inconsistent hits on the bike.
Therein lies what might be the core of the issue, though: "an edge case we hadn't tested." Perhaps how we DO AI right now just isn't quite ready for situations this complex yet? Maybe AI based on "oh, I've experienced something like this before" is not quite enough? Are we able to do better, getting a system that can infer what's about to happen and make decisions without experiencing it directly... or even indirectly? I know some AI systems are able to do this now in limited cases... but it still makes me wonder whether, 0.05% of the time, the complexity of this task is just a little outside the capabilities of our learning systems. And maybe we only see the results when something bad happens in that tiny window of time when the AI is unsure.
I am obviously no AI expert, just what I know from loosely following the field. But things like this cross my mind from time to time.
This is exactly what I've been thinking the whole time: something's not totally right here, so let's at least slow down to better assess the situation. Everything works better in slo-mo. My human brain would then have much more time to evaluate the scene. The same should hold for the sensors and processing systems of an autonomous vehicle.
And even if it doesn't, and the system still concludes it's noise/an obstacle that isn't going to move etc., a low-speed collision is preferable to a high-speed one. Unless you can be completely confident you aren't going to hit something, slowing down is not a bad default action.
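That "slow down when unsure" default can be expressed as a toy sketch. The thresholds are made up for illustration, not taken from any real controller, but since impact energy grows with the square of speed, even a partial speed reduction pays off:

```python
def target_speed(current_mps: float, collision_prob: float) -> float:
    """Defensive default: shed speed as uncertainty about an obstacle
    rises, instead of waiting for a confident classification.
    The thresholds here are illustrative, not from any real system."""
    if collision_prob >= 0.5:   # probably an obstacle: brake to a stop
        return 0.0
    if collision_prob >= 0.1:   # ambiguous: slow down proportionally
        return current_mps * (1.0 - collision_prob / 0.5)
    return current_mps          # confident the path is clear

# Kinetic energy scales with v^2, so shedding half your speed cuts
# the worst-case collision energy to a quarter.
```

The key property is that the ambiguous middle band never resolves to "proceed at full speed": uncertainty itself is treated as a reason to brake.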
Even if you imagine it thought it was another vehicle stopped in the other lane of traffic, the SDV should have slowed. You should never assume a stopped vehicle isn't hiding another danger; blowing past it at speed is what stupid humans do and what SDVs shouldn't.
If the car cannot predict "there's something in the next lane, could be in mine the next moment, better slow down just a bit", it has absolutely NO BUSINESS AT ALL driving around on public roads. At each and every occasion that I drove in a city, I have needed to take at least one such minor evasive maneuver due to someone suddenly getting into my lane, be it a car, a bike or a pedestrian. This is not even driving 101; if the car is unable to handle that, it is quite literally unfit for the road.
You would think that a reasonable project plan would attack the human safety/emergency situations first, and then move on to everything else. I guess a car that avoids accidents but doesn't drive across the country on its own does not make for interesting headlines. This is the consequence of headlines-driven AI: "let's solve something hokey that grabs attention so management will be pleased, instead of doing the more meaningful, long-term R&D."
2017 Legacy with "EyeSight." I have had one or two situations where the car emergency-braked on my behalf. It happens when the car in front of me turns off the road and is nearly clear. I don't brake because I know the car will clear the roadway before I get there, but the system doesn't see it that way and brakes. I anticipate that now and avoid following anything closely enough to cause braking, so I don't wind up as the front end of a rear-end collision.
The other thing this system does is provide adaptive cruise control. If I'm behind a vehicle that causes the system to slow down and then switch lanes (to where there is another vehicle in front of me), the car seems to think it can resume speed and slip between the two vehicles. I've come to expect that too, and I disengage the cruise control before switching lanes.
It also provides a warning when approaching the lane markings (unless I have signaled a lane change). Occasionally it triggers on seams in the pavement. It also provides steering input if the lane deviation increases. I've only experienced that when I tested it on purpose; I'm not sure it would reliably keep the vehicle in the lane.
Overall the system seems to be pretty good and, though not perfect, is a net asset.
That sounds awful. Driving is hard enough when you have to second-guess what other drivers and pedestrians are going to do. Why add to that by having to second-guess what your own car is going to do?
I've got to side with the people who want no auto-driving until we have always-better-than-human auto-driving. When cars only have back seats, and no driver controls beyond a way to state your destination, auto-driving will be acceptable.
I love our 2017 Outback w/ EyeSight. Luckily, we haven't had to test the auto braking system at high speeds but the lane assist and assisted cruise control are wonderful.
Not a Subaru, but my Passat engaged emergency braking once. It was kinda spooky to go for the brake and realize it was already depressed. Definitely prevented an accident. I was paying attention too, but someone jumped into my lane and slammed their brakes on for who knows why; there wasn't anything in the road...
In some areas that maneuver is a strategy to collect an insurance payout. If the car had many passengers (who would all get soft tissue injuries) this might have been a "swoop and squat."
> someone jumped into my lane and slammed their brakes on for who knows why
Insurance fraud, possibly. Depending on jurisdiction, if you rear-end someone you could automatically be 100% at fault (assuming the fraud is not discovered/proven).
Absolutely love my Outback + Eyesight. That said, the lane assist and automatic cruise control are not very sophisticated.
The automatic cruise control is great for freeway and some street driving, but don't expect it to brake very smoothly or like a human would. I consider it a way of outsourcing part of my concentration.
Lane assist is nice, but it won't auto-center -- if you were to take your hands off the wheel on a straightaway it would "ping-pong" back and forth. I mostly like it on long drives; it reduces the amount of effort on bends.