
Maybe I'm being ignorant about something here but isn't paying less attention the whole point?



Unless you assume that self-driving software is perfect, no, it really isn't. That's the whole problem - the drivers would get complacent, so when there's an issue, they'd be caught by surprise and wouldn't be able to react.


Isn't the point to pay _no_ attention? The difference is whether, when an accident occurs, the person in the car is at fault for not vigilantly watching everything.


You want the curve of total attention to be always above a baseline human in an unassisted car. The car can do some attention and the human can do some. But if the sum of the two falls below the threshold, you’re in trouble.
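The parent's threshold idea can be sketched numerically. All numbers below are made-up illustrations (normalized attention levels), not measurements:

```python
# Toy model: combined human + car attention must stay above the
# baseline of an unassisted human driver. Values are illustrative.

def total_attention(human: float, car: float) -> float:
    """Sum of human and automation attention, both normalized to [0, 1]."""
    return human + car

BASELINE = 1.0  # attention supplied by an unassisted human (normalized)

# Engaged driver with a capable assist: 0.5 + 0.6 = 1.1, above baseline.
print(total_attention(0.5, 0.6) >= BASELINE)  # True

# Complacent driver with the same assist: 0.3 + 0.6 = 0.9, below baseline.
print(total_attention(0.3, 0.6) >= BASELINE)  # False
```

The trouble is that the human term tends to shrink as the car term grows, so the sum can fall below the threshold even as the automation improves.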


You can already do that by just closing your eyes and letting Jesus take the wheel. No, the point is doing so while maintaining safety.

It is materially less safe to operate an ADAS while distracted than to drive manually. Humans are exceptionally good drivers on average, encountering only minor crashes on timeframes measured in years to decades. As such, if safety-critical ADAS errors occur more frequently than every ~100,000 miles and you are attentive in less than 100% of those occurrences, you are operating your vehicle multiple times more dangerously than the average driver (a number that includes drunks and distracted drivers).

That is why it is critical to deliberately downplay the system's capabilities, to avoid wishful over-reliance, and to enforce strict driver awareness (through techniques such as driver monitoring), so that multi-ton killing machines are not operated in ways that are multiple times more dangerous to the occupants, other drivers, and pedestrians. Without that, people are prone to over-generalizing safety capabilities, extrapolating from a single success to robust, continued success thousands to tens of thousands of times in a row.



