
That seems very unlikely. 'AI' driving systems are subjected to millions of hours of simulated driving. A very good human driver will have a few thousand hours of experience after decades of driving.



Has your computer or phone ever glitched? Have you ever suffered a malware attack? Every way that computers are vulnerable, machine-driven systems are vulnerable too. We just saw the SolarWinds hack, which was delivered through malicious software updates. The same could happen to an autonomous driving system.

Humans are still far better at dealing with unforeseen circumstances than computers. An AI is trained on a dataset and can overfit to it, failing on inputs outside what it was trained on.
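To illustrate the overfitting point, here is a minimal sketch (with made-up data, not a driving model) of how a model flexible enough to memorize its training set can still generalize poorly: a degree-9 polynomial fit to 10 noisy samples of a sine curve reproduces the training points almost exactly but drifts from the true curve between them.

```python
# Sketch of overfitting: a model with as many parameters as data points
# memorizes the training set but generalizes poorly off it.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(10)

# Degree 9 with 10 points: the fit can pass through every training point.
coeffs = np.polyfit(x_train, y_train, deg=9)
train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)

# Evaluate between the training points against the noise-free true curve.
x_test = np.linspace(0.05, 0.95, 10)
y_test = np.sin(2 * np.pi * x_test)
test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)

print(train_err, test_err)  # near-zero training error, larger test error
```

The same failure mode is why an AV stack validated only on its training distribution can behave unpredictably on conditions it never saw.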


Releasing unverified AV code would breach almost every 'duty of care' law in existence. Releasing it in a way that compromises thousands of systems at a time would be the same. These are devops issues, not capability issues.

An AV system needs to pass the driving test your 16 year old sister passed, that's it.


That has already happened with network security software (SolarWinds) used at the highest levels of government including the Pentagon, White House, Congress, and most of the Fortune 500. There is no reason to believe that a massive AI driving system could not be hacked in a similar way through a malicious exploit or update. In fact, adversarial AI could be used to dupe these tests and achieve the same effect. If 50 million cars on the road were using one particular software suite, this is an attack vector that has to be considered. It is not an impossibility and you cannot hand-wave it away.


From what I can tell, the current L2 system developed by Comma.ai simulates the model and its changes in CI on every commit.
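A hypothetical sketch (not Comma.ai's actual CI code; the log data and tolerance are invented) of what such a per-commit check might look like: replay recorded inputs through the current model and fail the build if its outputs drift too far from what was logged.

```python
# Hypothetical CI-style regression test: replay logged sensor inputs
# through the model and compare against previously recorded outputs.

def model(speed_mps, lane_offset_m):
    # Stand-in "model": steer proportionally back toward lane center.
    return -0.5 * lane_offset_m

recorded_log = [
    # (speed m/s, lane offset m, steering output logged by the old model)
    (10.0, 0.20, -0.10),
    (15.0, -0.10, 0.05),
    (20.0, 0.00, 0.00),
]

TOLERANCE = 0.02  # max allowed drift from recorded behavior

def replay_regression_test(log):
    failures = []
    for speed, offset, logged_steer in log:
        new_steer = model(speed, offset)
        if abs(new_steer - logged_steer) > TOLERANCE:
            failures.append((speed, offset, logged_steer, new_steer))
    return failures

failures = replay_regression_test(recorded_log)
print(failures)  # an empty list means the commit preserved recorded behavior
```

Running this on every commit catches regressions against known-good drives, though it only covers conditions present in the logs.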


The average human driver doesn't need millions of pictures of a stop sign to identify one either. Training current AIs is painful and the results need to be verified because they can be flaky; current research doesn't come close to matching an organism that spent millions of years evolving the ability to move in a complex 3D world.


The AV problem is not as intractable as you imagine. Driving a car is not the same as dating a girl.


Which seems unlikely?

Do AI drivers face the same variety of conditions as humans? Or is it millions of hours on near-perfect roads in a temperate climate?


You can't compare the times, because AI and humans "learn" in vastly different ways, at vastly different rates.


Can an AI system learn on the fly (which, by the way, would invalidate the testing already done)? Because I'm fairly sure that, beyond sensor input "normalization", they can't, and humans can.





