
That's what I was thinking too. Also, isn't this sort of thing pretty common now? As in, aren't there startups that have drones that follow you to take cool video of you [1]. I'd imagine they are doing some sort of realtime tracking.

[1] https://www.airdog.com/




None of the "follow me" drone folks are really doing object tracking yet. They basically "fake it" by sending a stream of guided waypoints offset from the GPS location of the phone (or, in AirDog's case, the bracelet) and hope it looks OK because of the GoPro's really wide-angle lens. There is a lot of room for improvement in this area, as you suggest, by feeding that realtime tracking data back into the flight controller.
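To make the idea concrete, here's a rough sketch of that GPS-offset approach. The function name and numbers are mine (hypothetical), and it uses a flat-earth approximation that's only reasonable over follow-me distances; it is not any vendor's actual code:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres

def follow_waypoint(target_lat, target_lon, offset_m, bearing_deg):
    """Guided waypoint at a fixed offset from the tracked GPS fix.

    Flat-earth (equirectangular) approximation: fine for the tens of
    metres a follow-me drone typically trails its subject by.
    """
    # Decompose the offset into north/east components.
    d_north = offset_m * math.cos(math.radians(bearing_deg))
    d_east = offset_m * math.sin(math.radians(bearing_deg))
    # Convert metres to degrees of latitude/longitude.
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(target_lat))))
    return (target_lat + dlat, target_lon + dlon)
```

In the real systems this waypoint would then be pushed to the autopilot (e.g. as a MAVLink guided-mode position target) every time a fresh GPS fix arrives from the phone or bracelet, which is exactly why it lags and jitters compared to true visual tracking.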

The one startup in the drone space that is really putting computer vision to work on drones is http://skyd.io - some ex-Google Project Wing folks. The demos are freaking amazing. Wait long enough on the website's background video and you can see a six-foot-wingspan UAV flying itself through a parking garage.

Source: I live and breathe drone stuff and write a UAV ground station app for iOS to go along with our Apple MFi-certified hardware for the long-range telemetry connection from iOS. It works with OP's flight controller, by the way :-)


OP's video doesn't really have anything to do with drones as such; it just loads an mp4 of drone footage. The fact that it's drone footage is completely incidental.

But lots of people are doing real computer vision and object tracking on drones and robots, and have been doing so for years.

https://www.youtube.com/watch?v=Gj-5RNdUz3I

https://www.youtube.com/watch?v=HqzMSxBOV-U

https://www.youtube.com/watch?v=C95bngCOv9Q


I know. I was specifically addressing the "follow me" class of drones mentioned that is getting a lot of press these days, such as AirDog, Hexo+, and the 3DR IRIS+ (all APM:Copter based). No one outside university or military settings that I know of has really integrated that data back into the flight controller as well as the Skydio demo suggests.


Unless they've had a massive and sudden change of heart, Airdog is based on the PX4/NuttX stack, not on the APM:Copter.


Sorry, it gets confusing, since the APM:Copter autopilot (I guess that's what they're calling it now) can run on PX4 hardware running NuttX too. So they're using the full PX4 flight stack as well?


Yep, although as I understand it they've worked on it so hard since the Kickstarter that I'm not sure how much is still recognisable as the core ETH code. At the very least their software is rooted firmly in the PX4/Lorenz design ethos. One of the guys behind the project is also the author of some very nice PX4 and MAVLink tools which you may well have used: https://github.com/DrTon

[we have a nightmare at work trying to distinguish between original PX4FMU running PX4, Pixhawks running PX4, and Pixhawks running APM codebase in conversations]


Ah, I have seen DrTon's commits on the project. He has been rather prolific in the last several months. I'd love to connect with you outside of HN and hear what you are working on. Shoot me an email (in profile) if you see this.


Are there many drones out there carrying enough processing power to do serious machine vision (SLAM, obstacle avoidance) in real time from the drone's onboard sensors?


I'm sure there are... It's a problem begging to be solved. The majority of the juice on a drone is taken up by the propellers, not the board running it. Though, to be fair, at that point weight is a bigger concern than power draw.
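A back-of-the-envelope calculation shows why. The numbers below are assumptions I picked for illustration (a small quad's hover draw and a Tegra-class vision board), not measurements:

```python
# Rough, illustrative power budget for a small camera quad.
motor_power_w = 150.0   # assumed: hover draw for the motors/props
compute_power_w = 10.0  # assumed: an embedded vision computer

total_w = motor_power_w + compute_power_w
compute_share = compute_power_w / total_w
print(f"compute is {compute_share:.1%} of the total draw")
```

Even doubling the compute board's draw barely moves flight time, whereas its mass directly costs thrust, so weight really is the binding constraint.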


I don't know the exact hardware of these drones, but I managed to make SLAM work (with near-realtime dense mapping) on an iPhone 4S and up, so I think it's feasible.



