
I'd rather that than a computer hemming and hawing over whether a single pixel is an oncoming truck and not turning just in case.



A computer "hemming and hawing", as in that one accident where it couldn't decide whether it was a bicycle or a person, has nothing to do with the training. It's what the developers decided to do with input that had a low confidence score. There will ALWAYS be low-confidence ratings on real-world data, regardless of how good your training is.

Instead of saying "oh crap, there's SOMETHING there, we should stop," they said "huh, no, let's loop on testing it until we figure it out or run it over... whichever comes first."
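
To make the contrast concrete, here's a minimal Python sketch of the two policies being described. All names and thresholds are hypothetical, illustrative only, and not taken from any real AV stack:

    # Hypothetical sketch of the two policies described above; names and
    # thresholds are illustrative, not from any actual self-driving system.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str         # e.g. "bicycle", "pedestrian", "unknown"
        confidence: float  # classifier confidence, 0.0 - 1.0
        in_path: bool      # object lies in the vehicle's planned path

    def cautious_policy(det: Detection) -> str:
        """Brake whenever SOMETHING is in the path, even if unclassified."""
        if det.in_path:
            return "BRAKE"
        return "CONTINUE"

    def reclassify_policy(det: Detection, min_confidence: float = 0.9) -> str:
        """Keep going and wait for a confident classification before acting."""
        if det.in_path and det.confidence >= min_confidence:
            return "BRAKE"
        return "CONTINUE"  # low confidence: defer the decision to the next frame

    # An ambiguous object in the path that the classifier keeps flip-flopping on:
    obj = Detection(label="unknown", confidence=0.4, in_path=True)
    print(cautious_policy(obj))     # BRAKE
    print(reclassify_policy(obj))   # CONTINUE -- the failure mode described above

The point is that the failure isn't in the classifier at all; it's in which of these two decision rules sits on top of it.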


Also, if the car wants to stop too much because of low confidence, just turn the brakes off.



