Hacker News

Yep, the exposure control / sensor quality of the dash cam in the video was rubbish. My own Blackvues produce far, far better results than that. Just look at how nothing is illuminated by the street lights; this has the effect of making the poor rider appear "out of nowhere". I also agree the driver appeared to be on a smartphone most of the time, thus not in control of the vehicle, and so had no business being on the road, as these are systems UNDER TEST.

If that's the best Uber can produce, then they ought to hang their heads in shame. Unless it was doctored... as I find it hard to believe they'd put such rubbish-quality cameras in their trial vehicles.




Do you trust Uber to provide all the data, or would they selectively produce data favorable to them?

Do you trust Uber to provide unedited raw video, or would they process it to increase contrast, make it appear that nothing was visible in the dark areas of the frame, reduce the resolution, drop frames, etc.?
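To illustrate the point about processing: even a simple contrast stretch can make shadow detail vanish entirely. A minimal sketch, with hypothetical 8-bit luminance values (not taken from any real footage):

```python
# Sketch: how a contrast curve can crush shadow detail.
# The gain/pivot values and pixel samples are illustrative assumptions.

def apply_contrast(pixel, gain=3.0, pivot=128):
    """Simple linear contrast stretch around a mid-grey pivot, clamped to 8-bit."""
    return max(0, min(255, round((pixel - pivot) * gain + pivot)))

shadows = [40, 60, 80]  # faint but distinct detail in the dark areas
print([apply_contrast(p) for p in shadows])  # -> [0, 0, 0] -- all detail gone
```

Everything below the curve's clipping point maps to pure black, which would make a pedestrian in the shadows look genuinely invisible on the released footage.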


It's funny how the internal camera, which shows how distracted the driver was, has way better night vision than the external road camera...


The key here is contrast; plus, an IR light at 2 feet works great, but at 60 feet... not so much.
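The falloff is brutal if you model the illuminator as an idealized point source obeying the inverse-square law (a simplifying assumption; the distances here are just the ones from the comment, not measurements of any actual camera):

```python
# Sketch: inverse-square falloff of an idealized point IR source.
# Assumes a point source and no atmospheric losses.

def irradiance_ratio(d_near_ft, d_far_ft):
    """How many times dimmer the illumination is at d_far vs d_near (1/r^2)."""
    return (d_far_ft / d_near_ft) ** 2

print(f"{irradiance_ratio(2, 60):.0f}x dimmer at 60 ft than at 2 ft")
# (60/2)^2 = 900, i.e. roughly three orders of magnitude less light
```

So the same emitter that nicely lights a driver's face at arm's length delivers on the order of 1/900th of that illumination to something 60 feet down the road.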


The internal camera (let's be honest and call it the scapegoat camera, because that's the only practical use for human "safety drivers" when they are not permanently engaged) must be getting almost all of its light from IR, because we see none of the smartphone screen glare that the eye movements so clearly hint at.


I don't think the driver is looking at her smartphone. I think she's checking the car's monitor (as in a computer screen). Although to be fair, that should be showing the car's view of its surroundings so I don't know what's going on there.

Edit: Never mind. Someone posted a picture of the car's interior below, and there's no computer screen.


Link?


Sorry - I can't find it. This thread has grown rather large.


Ok, so this is getting old now, but I just came across the following, which shows what I'd expect the roads to look like. And geesh, Uber were really full of crap to release a video that pretty much had the effect of exonerating them.

https://arstechnica.com/cars/2018/03/police-chief-said-uber-...

Please check the videos out.


Yep, exactly.


> Yep that exposure control / sensor quality of the dash cam in the video was rubbish.

Is that the same cam used by the AI to detect obstacles?

I would expect a safe self-driving car to include IR cameras so it can be more cautious around moving warm-blooded creatures.

Surely some more detailed telemetry data would reveal whether the main issue is with the sensors or with the algorithm.


I highly doubt that camera is part of the perception pipeline.



