
> I'm... not so certain. Why?

In this specific case, because if a driver is watching the road in front of them, four seconds is an awful lot of time to watch yourself head toward and then accelerate into a cement barrier, all without touching the brakes. It's fine to want Autopilot to be better, but as a matter of law, it is the driver's responsibility to slam on the brakes in that situation and steer back into a legal lane.

More broadly, it's because the contract of an L2 system is that a human is in control at all times. L2 systems are assistive, not autonomous. They will never disobey a driver's presets, nor override a driver's real-time inputs, even if the car thinks it is safe(r) to do so. This is a core design principle behind every L2 system, and the reason the law treats the driver, not the system, as responsible in every scenario. (A minimal sketch of that arbitration rule follows.)
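To make the "driver input always wins" contract concrete, here's a toy sketch. All names and values are invented for illustration; this is not any vendor's actual control code, just the shape of the rule:

    # Toy model of L2 control arbitration: the driver's real-time input
    # always wins over the assistance system's command. Hypothetical code,
    # not any real vendor's implementation.

    def arbitrate(driver_brake, driver_steer, system_brake, system_steer):
        """Return the (brake, steer) commands actually applied."""
        # Any driver brake input immediately overrides system braking.
        brake = driver_brake if driver_brake > 0.0 else system_brake
        # Any driver steering torque overrides the system's steer command.
        steer = driver_steer if abs(driver_steer) > 0.0 else system_steer
        return brake, steer

    # Driver slams the brakes while the system is still steering slightly:
    print(arbitrate(driver_brake=1.0, driver_steer=0.0,
                    system_brake=0.0, system_steer=0.1))
    # -> (1.0, 0.1): the driver's braking is applied unconditionally

The point of the sketch is that the system's command never appears on the driver's side of the conditional: there is no code path where the machine outvotes the human.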

Now obviously, if there were a bug that caused an L2 system to override a user's input -- say, disregarding someone hitting the brakes, or overpowering the steering wheel -- that'd be a systemic failure and grounds for a recall. But we haven't seen any cases of that, and there are hardware precautions (e.g. limited torque in the Autosteer servo) to minimize that possibility; a toy version of that torque clamp is sketched below.
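The hardware precaution amounts to a clamp: no matter what the software requests, the servo can only apply torque a driver can physically overpower. The cap value here is made up for illustration:

    # Toy model of a hardware steering-torque limit. The numeric cap is
    # hypothetical; the idea is that it sits well below human arm strength.

    MAX_ASSIST_TORQUE_NM = 3.0  # invented cap for illustration

    def clamp_steer_torque(requested_nm):
        """Clamp the assist torque request to the hardware limit."""
        return max(-MAX_ASSIST_TORQUE_NM,
                   min(MAX_ASSIST_TORQUE_NM, requested_nm))

    print(clamp_steer_torque(50.0))   # -> 3.0: a runaway request is capped
    print(clamp_steer_torque(-1.5))   # -> -1.5: normal requests pass through

Because the limit is enforced in hardware, even a worst-case software bug can't command more steering force than the driver can fight through.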



