
I own a Model 3. It often says "the camera is blinded" (e.g., when the sun shines directly at it), and often there's rain on the lens... So I'd argue Tesla's cameras are much worse than the human eye for driving under all conditions. I'm not sure there's enough redundancy for depth perception either. I tend to agree with the argument that humans prove it's possible to drive with just visual sensors, but Tesla's sensors, for better or worse, are not equivalent.

Then there's the part about the brain being what gets the sensor data and interprets it. I doubt Tesla's computer has all the capabilities of a human brain.

Driving behind a boat trailer, it's clear to me it's a boat, but Tesla's computer is confused. Bicycles can be a little iffy as well. In general, the way the car perceives the world seems far inferior to how a human does.

I'm a fan of the car, I'm mostly a fan of Tesla/Elon, but I don't think these cars with this technology are ever gonna be truly autonomous so they can drive with no human intervention... IMHO.




> Often it says "the camera is blinded"

It could very well be a software issue.


The human driver has a number of options to adjust their “cameras” (eyes) that the Tesla doesn’t. E.g., flipping down the sun shield, adjusting head position to reduce glare, or even putting on a pair of polarized sunglasses.

While there are technical solutions possible with more hardware, it’s not clear that Tesla could correct for some of these things in just software...


If these cases are uncommon enough, a Tesla could just stop on the side of the road and alert the driver. There will be a driver inside anyway. And there is a huge difference between "not L5, you have to keep an eye on the road at all times" and "drives itself, but once every 2 hours, or in extreme weather conditions, it safely warns you and forces you to take over". This could still be a great use case, even if it won't drive itself in that 1% of cases.


That is not L5. That’s L3.

And if the car camera is blinded by the sun, that raises questions about whether the car could safely pull over to the side of the road. I suppose with advanced simulation the car could perhaps predict glare from the sun and pull over ahead of time (“it’s approaching sundown and we’re driving west, you need to take over in 5 minutes”)
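The glare-prediction idea above can be sketched very roughly: given the sun's azimuth and elevation (from any ephemeris source) and the vehicle's compass heading, flag when the sun sits low and nearly dead ahead. A minimal sketch, assuming hypothetical threshold values; the function name and numbers are illustrative guesses, not anything Tesla actually ships:

```python
def glare_risk(sun_azimuth_deg, sun_elevation_deg, heading_deg,
               max_elevation=15.0, max_offset=25.0):
    """Hypothetical check: is the sun low and roughly ahead of the car?

    sun_azimuth_deg and heading_deg are compass bearings (0 = north).
    Thresholds are illustrative assumptions, not real calibration data.
    """
    if not (0.0 < sun_elevation_deg <= max_elevation):
        # Sun is below the horizon, or high enough that it
        # shouldn't shine straight into a forward-facing lens.
        return False
    # Smallest angular difference between sun bearing and heading,
    # wrapped into [0, 180].
    offset = abs((sun_azimuth_deg - heading_deg + 180.0) % 360.0 - 180.0)
    return offset <= max_offset

# Driving due west (270 deg) near sundown, sun low in the west:
print(glare_risk(268.0, 8.0, 270.0))  # True: warn the driver ahead of time
print(glare_risk(268.0, 8.0, 90.0))   # False: driving east, sun is behind
```

With real sun-position data, a planner could run this check a few minutes ahead along the planned route and issue the "take over in 5 minutes" warning before the camera is actually blinded.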



