
This is quite cool. Why only cameras, though? What about Intel RealSense or ultrasound? Are you avoiding them due to cost?



Hello, and thanks! No, I'm not avoiding either of those due to cost. I've worked with ultrasonic sensors on a robot before and found them pretty unimpressive. I have seen a demo of some advanced ultrasonics, but the guy also told me he had patented the technology. Normal ping-pong ultrasonic sensors return such ambiguous information that they aren't very useful.

Regarding Intel RealSense, they use two different technologies. Some RealSense devices are stereo cameras, so they are "just cameras" in the sense of this discussion, but I'd rather run the algorithms myself and build a complete perception system than rely on the built-in computation of, for example, the RealSense tracking camera. The other technology they use is structured light, which works poorly or not at all in sunny environments.

I want to build a complete perception and navigation stack for outdoor robots, and aside from some sensors inside the robot, cameras are the technology I'm most interested in learning how to use. I just think cameras are the future. Passive vision is how almost every living thing on earth perceives the world.
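
To illustrate what "running the algorithms yourself" can look like, here is a minimal sketch (not the commenter's actual stack, and the file names and calibration numbers are made-up assumptions): computing a disparity map from a rectified stereo pair with OpenCV's block matcher, treating a stereo device as plain cameras instead of relying on its on-device depth computation.

  import cv2
  import numpy as np

  # Assumed file names for a rectified left/right stereo pair.
  left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
  right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

  # Block-matching stereo; numDisparities must be a multiple of 16.
  stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
  # compute() returns fixed-point disparities scaled by 16.
  disparity = stereo.compute(left, right).astype(np.float32) / 16.0

  # With focal length f (pixels) and baseline B (meters) from calibration,
  # depth = f * B / disparity for pixels with valid (positive) disparity.
  f, B = 700.0, 0.06  # example values, assumed for illustration
  valid = disparity > 0
  depth = np.zeros_like(disparity)
  depth[valid] = f * B / disparity[valid]

From there the depth map can feed whatever mapping or obstacle-avoidance layer the navigation stack uses, which is the flexibility you give up when the depth computation is baked into the sensor.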



