Now that's a nice device, and it is a true time-of-flight sensor. There's a similar model with 1.2 m range. [1] I don't know whether it's a pulsed LIDAR or an RF-modulated beam. The ones that use a pulsed laser can have much more range and can work outdoors. They must far outshine the sun, but only at one wavelength and only for a nanosecond, which is quite possible.
> During the ranging operation, several VCSEL infrared pulses are emitted, then reflected back by the target object, and detected by the receiving array.
VCSEL stands for vertical-cavity surface-emitting laser, so I would say a pulsed laser.
I am suspicious about this statement for the following reasons:
* Some ST patents [0] say they are doing time of flight but have actually come up with a slight variation on reflected-signal phase-shift measurement (see the reply by Animats).
* There are internal functions like this [1] in the APIs they provide for sensors of this family. Of course, this might be something else (e.g. the phase shift of internal PLLs or the like).
* The shortest time in which this sensor performs a measurement is on the order of a few tens of milliseconds, and the high-accuracy modes take up to 100 ms. True time-of-flight systems (e.g. [2]) should have the answer ready on the order of nanoseconds (for the distances this sensor works at), and I haven't yet seen designs with post-processing significant enough to explain this latency (a back-of-envelope sketch follows this list).
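For scale, here is the back-of-envelope arithmetic behind that last point; the distances below are just examples I picked, not datasheet figures:

```python
# Rough numbers behind the latency argument above. Illustrative only;
# the ranges are assumptions, not sensor specifications.
C = 299_792_458  # speed of light, m/s

for distance_m in (0.1, 1.0, 4.0):  # typical short-range ToF distances
    round_trip_s = 2 * distance_m / C
    print(f"{distance_m:4.1f} m round trip: {round_trip_s * 1e9:6.2f} ns")

# A 4 m target answers in ~27 ns, yet the fastest ranging mode takes
# tens of milliseconds -- roughly a factor of a million longer.
```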
Don't get me wrong, I've worked with the VL6180X and VL53L0X (and look forward to working with the VL53L1X), and these sensors are the best in this size class. I just suspect that they are not directly measuring the time for the signal to bounce back, but are instead inferring it from some other measurement.
Actually, this made me think that I probably have photodiodes with wide enough bandwidth lying around, so I could check the transmitted signal on an oscilloscope.
Short-range, low-cost time-of-flight devices are usually modulated-beam things. You modulate the outgoing light with an RF carrier around 20 MHz or so, then detect that carrier on the receive side and measure the phase difference between the two to get distance. There's a neat trick borrowed from FM radio to do this: down-convert both the input and output signals with the same local oscillator. The resulting down-converted signals have the same phase difference but at a lower frequency, where you can count it easily.
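To make the phase arithmetic concrete, a minimal sketch; the 20 MHz carrier matches the figure above, everything else is illustrative:

```python
import math

C = 299_792_458   # speed of light, m/s
F_MOD = 20e6      # assumed RF modulation frequency, Hz

def distance_from_phase(phase_rad: float) -> float:
    """Distance implied by the TX/RX phase difference of the carrier.

    The light covers the range twice (out and back), so one full cycle
    of carrier phase corresponds to half a modulation wavelength.
    """
    return (phase_rad / (2 * math.pi)) * C / (2 * F_MOD)

# Phase wraps every 2*pi, so the unambiguous range is C / (2 * F_MOD).
print(f"unambiguous range: {C / (2 * F_MOD):.2f} m")        # ~7.49 m
print(f"90 deg shift     : {distance_from_phase(math.pi / 2):.3f} m")
```

The down-conversion trick doesn't change this arithmetic: mixing both signals with the same local oscillator preserves the phase difference while moving it to a frequency where a cheap counter can measure it.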
Are those actually referred to as "time of flight" devices, though? Phase-difference measurements are cheap and extremely effective, but I don't usually see them labeled that way.
Any laser scanner you'd put on a vehicle is going to be ToF, as far as I know. While this design is useful for indoor sensing, the laser illumination would be completely washed out once it has to compete with sunlight.
This would be better described as a structured-light scanner. The principle is more similar to what something like a Kinect uses than to a Velodyne/SICK/Hokuyo etc.
The scan rate and accuracy may not be the best, but the fact that they are using a linear photodiode array impresses me. That they were able to do all this for $35 is even more impressive, and I await the day I can purchase one from Chinese electronics hobbyist websites.
Me too! I've been looking for something like this but wasn't able to find anything. Very nice to see it here; perhaps I'll build one if I have time at some point, or order one if it becomes available.
Thanks for this. When I first started working in LiDAR nearly ten years ago the cheapest hardware you could acquire was $25,000 and completely proprietary.
It’s nice to see a simple hardware hack for playing with the technology. Maybe this will encourage some intrepid engineers to build the next wave of hobbyist point cloud collection devices.
There are lots of fun applications like home surveying to 3D model generation.
At the time Velodyne brought out their HDL-64E for ~$75,000 you could buy a SICK LMS-200 for about 10% of that.
That's why loads of the DARPA Grand Challenge vehicles were plastered in SICK LIDARs [1] back when Velodyne only had one prototype, and it was on a truck they were putting in the competition themselves [2].
Of course, Velodyne's product may be 10x the cost, but it has 64x the number of lasers, plus good range and direct-sunlight performance, so it's understandable that it took off.
Seems... dangerous? The project states it uses a "3mw 780nm Infrared IR Diode Laser". 3 mW is enough to damage your eye if you don't blink or turn away, and since it's infrared, you won't blink, or possibly even notice anything (there are no pain receptors in the retina), until it's too late.
Wikipedia lists "780–1400 nm (near-IR) - Pathological effect: cataract, retinal burn".
In the US, 5 mW is the legal limit for laser pointers. I think that can be used as a guideline: 3 mW of IR is much less energetic than a 5 mW green/blue laser, so perhaps it's not powerful enough to cause huge concern? But I agree that no amount of laser is safe for the eye. Police LIDAR guns are 905 nm at 50 mW, and those can technically damage you too.
The limit for visible light is as high as it is because of the blink reflex. An IR laser, having a beam you don't perceive as being bright, won't make you blink.
Well, actually it's near-infrared, so at this "high" power (3 mW) you will definitely see it. You can actually see past 800 nm if the source is bright, but I don't recommend trying.
The wavelength range where optical radiation is visible does not have sharp borders. Here, the wavelength band of 380 nm to 780 nm is used.
Agreed - I don't know enough about this particular project to say, but the lasers in LiDAR sensors do have the potential to be powerful enough to harm eyesight, so I would advise caution. This is not a concern for most commercial sensors, as they are constrained by regulations to a certain power for this very reason. I'm guessing there are no rules for DIY, but it's something to keep in mind if a product came out of this.
There's also a company called Scanse that has a very affordable LIDAR unit, and a ROS driver. Not open source hardware, but at $350, a pretty good deal. They use the Garmin Lidar Lite v3, which has an I2C interface, and all measurements are actually computed in the sensor. The sensor itself is around $120. They also use a shaftless gimbal motor, with a slip ring, which allows for continuous rotation.
scanse.io
They were also generous enough to leave an exploded CAD model of their design on their Kickstarter page, about halfway down, just in case you wanted to see what's going on inside, or perhaps make your own, as it's mostly commercial off-the-shelf parts with a custom PCB to control the motor.
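For anyone curious what "all measurements are computed in the sensor" means in practice: reading the Lidar Lite v3 over I2C is just a register poke and a two-byte read. A hedged sketch, based on the commonly published register map (default address 0x62, trigger via register 0x00, distance in cm at 0x8f); verify against Garmin's operating manual before relying on it:

```python
# Hedged sketch: polling a Garmin Lidar Lite v3 over I2C (e.g. from a
# Raspberry Pi). Register addresses follow the commonly published manual;
# double-check them against your own unit's documentation.
import time
from smbus2 import SMBus

ADDR = 0x62          # default 7-bit I2C address
ACQ_COMMAND = 0x00   # write 0x04 here to trigger a ranging measurement
STATUS = 0x01        # bit 0 = busy
DISTANCE = 0x8f      # high bit set = auto-increment read of 0x0f/0x10

def read_distance_cm(bus: SMBus) -> int:
    bus.write_byte_data(ADDR, ACQ_COMMAND, 0x04)         # start measurement
    while bus.read_byte_data(ADDR, STATUS) & 0x01:       # wait until not busy
        time.sleep(0.001)
    hi, lo = bus.read_i2c_block_data(ADDR, DISTANCE, 2)  # distance in cm
    return (hi << 8) | lo

with SMBus(1) as bus:
    print(f"{read_distance_cm(bus)} cm")
```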
What kind of precision can you get with a device like this? The image on the GitHub page looks very imprecise, with errors on the order of centimeters. If one were to scan a 3D model, could one get sub-millimeter resolution, for example?
I still don't quite understand how this works. Main components are:
- Laser
- Lens
- TSL1401 line image sensor
I was always under the impression that LIDAR includes a time-of-flight measurement, which does not appear to be the case here - the TSL1401 sensor has integration times and pixel transfer times in the range of dozens of microseconds, a timespan in which light travels dozens of kilometers.
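To put a number on that, taking 50 µs as a representative integration time (my assumption, in the datasheet's ballpark):

```python
# How far light travels during one exposure of the line sensor.
C = 299_792_458        # speed of light, m/s
integration_s = 50e-6  # assumed integration time, tens of microseconds

print(f"light travels {C * integration_s / 1000:.1f} km per exposure")  # ~15 km
# So the sensor cannot be timing the pulse; it must be measuring *where*
# the laser spot lands on the array (triangulation), not *when* it returns.
```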
"Lidar [...] combine[s] laser-focused imaging with the ability to calculate distances by measuring the time for a signal to return using appropriate sensors and data acquisition electronics."
So this is not LIDAR. Still impressive though, I love the simplicity!
That's probably an overly specific definition. Their source for the lidar definition is a NOAA webpage [0] with a broader definition that doesn't include specifics about implementation:
> LIDAR, which stands for Light Detection and Ranging, is a remote sensing method that uses light in the form of a pulsed laser to measure ranges (variable distances) to the Earth.
Most lidar does use timing but I'd argue that's a type of implementation instead of a necessary part of the broader category of lidar devices.
The LIDAR acronym doesn't specify a particular measurement technique, although time of flight is by far the most common. There are other designs with interesting differences, for example Strobe [0], which varies the frequency of the transmitted light and measures the frequency and phase delay of the returning light rather than the straight delay.
I believe this design still qualifies as LIDAR, just with significantly worse performance than a typical system.
As far as I know, "varies the frequency of the transmitted light and measures the freq and phase delay" is the standard way of measuring ToF in the case of lasers. You modulate the laser diode with signals over a wide range of frequencies and subtract the returned signal (after cleanup); the observation is done on a spectrum. Low frequencies give you wide range but low precision, and high frequencies give higher precision but lower range, but you can use the latter to enhance the precision of the former, as sketched below.
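A minimal sketch of that coarse/fine combination under simplifying assumptions (two modulation tones, noiseless phases): the low tone resolves which cycle of the high tone the echo falls in, and the high tone sharpens the answer.

```python
import math

C = 299_792_458  # speed of light, m/s

def distance_two_tone(phase_lo: float, phase_hi: float,
                      f_lo: float = 1e6, f_hi: float = 20e6) -> float:
    """Combine a coarse low-frequency phase with a fine high-frequency one.

    Each phase (radians) maps to distance modulo C / (2 * f). The coarse
    estimate picks the integer number of fine wavelengths; the fine phase
    supplies the precision.
    """
    wrap_lo = C / (2 * f_lo)              # unambiguous range of low tone
    wrap_hi = C / (2 * f_hi)              # unambiguous range of high tone
    coarse = (phase_lo / (2 * math.pi)) * wrap_lo
    fine = (phase_hi / (2 * math.pi)) * wrap_hi
    n = round((coarse - fine) / wrap_hi)  # which high-frequency cycle
    return n * wrap_hi + fine

# Example: a target at 12.30 m, beyond the ~7.49 m wrap of the 20 MHz tone.
d = 12.30
ph_lo = (d / (C / (2 * 1e6))) * 2 * math.pi
ph_hi = ((d % (C / (2 * 20e6))) / (C / (2 * 20e6))) * 2 * math.pi
print(f"{distance_two_tone(ph_lo, ph_hi):.3f} m")  # ~12.300
```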
For me, parallax-based sensors (in the sense of sweeping a laser line and looking at it with some kind of imager) are the first thing I imagine when hearing "LIDAR". For a long time, most industrial lidar sensors used that principle (the $25k+ ones ten years ago).
About 10 years ago, I worked on a project that used triangulation to measure distances with a laser. The specular reflection falls on a pair of photodiodes, and the relative amount of light on each diode indicates the distance to the reflection point.
A good visual aid that helped me is to imagine a measuring stick laid horizontally across your field of vision, but at a 45-degree angle, such that the left side is closer to you and the right side is further away.
If you abstract your vision to a 2D projection of a 3D volume, you can easily see that as your eyes follow the measuring stick to the left, the distances get smaller, because they actually are closer to you. Conversely, as your eyes follow the measuring stick to the right, the measurements get larger, because they actually are further away.
I did the exact same thing back in 2015, using the OmniVision sensor from Wiimotes, with the benefit that the sensor readily reports the positions of detected infrared spots, so no CPU-heavy post-processing is needed. The 20 MHz clock for the sensor was taken from CLKOUT on the controlling AVR CPU, so there were only three parts on the rotating section (laser, CPU, sensor).
I would recommend adding a slit and a bandpass filter to reduce the amount of non-laser light, because I had big problems with false positives from stray light (sunlight etc.).
This is super awesome! I've been taking Udacity's SDC course and wanted to play with some LIDAR mapping on my own. I've been debating for the longest time whether to get the Neato XV-21 LIDAR, but it doesn't really make sense to get obsolete hardware that people salvage from old vacuums. I wish they would sell the PCB in a crowd-funded run on Crowd Supply or Tindie.
I was going to buy the RPLidar when they did a fire sale on the first generation, but I read there was some refresh-rate error, so I didn't pull the trigger. Do you know if the Sweep (by Scanse) is comparable to the RPLidar (dev version)?
The thing you're looking for is the class of the laser. Class 1 is completely harmless, class 2 is harmless unless you intentionally stare at it, class 3 is immediately harmful and class 4 is the kind of laser that cuts bone and metal.
I'm not sure what class this is, but it should be mentioned somewhere.
The BOM[0] specifies that this is a "3mw 780nm Infrared IR Diode Laser" with a link to eBay[1].
I'm finding 780 nm 2.5 mW and 3 mW laser diodes rated at classes 3B and 3R. See [2] and [3]. The 3B rating is given for a high-end Edmund Optics laser, which most probably puts out the "full" 3 mW. A class 3B laser is "hazardous for eye exposure" [4].
So... hard to say, but not great?
I fondly remember a sticker that was on a lab's (very scary) laser: "Do not look into laser with remaining eye"...
Slip rings are noisy and wear out. A lot of scanning LIDAR systems use a rotating mirror with stationary electronics to avoid them. You could also use stationary magnets with coils on the rotating part to transmit power (basically a mini-alternator on the spinning side), and IrDA or something to communicate.
They use a slip ring. In some ways it's kind of ironic, because slip rings work in the same fashion as brushed motors, which is exactly the contact mechanism brushless DC motors were designed to eliminate. To achieve continuous rotation in a brushed motor, carbon brushes close different circuits as the rotor spins; this keeps the phase of the current constantly ahead of the phase of the rotor, producing torque. In a brushless DC motor, the same effect is achieved through a feedback loop, obviating the need for the brushes, which wear over time and leave carbon deposits.
With the latest Tesla crash in the news, seeing the point cloud got me thinking.
Since mirrors do not produce the diffuse reflections required for LIDAR to work, does this mean a box truck covered in mirrors would be able to render at least the LIDAR portion of an autonomous vehicle useless?
It would only apply to a mirror that's offset to point your LiDAR at the sky or at some odd angle. If it's a flat mirror facing you, it will just see itself approaching at 2x the distance/speed between the car and the mirror.
Do I interpret your comment correctly as saying there is some redundant system that is specifically looking for anomalies in sensor readings? Would a top of the line lidar system be able to "understand" that its current inputs were resulting in erroneous outputs?
I am reminded of one of the recent Japanese satellites that was effectively an infant mortality, because of some quirk in its redundant systems. I've forgotten the details, but more or less, there were cascading failures across the primary system, its secondary redundancy, and its tertiary redundancy. So the feedback loops designed by the engineers to be negative, and mitigating, ended up being positive, and therefore aggravating, in the cruel vacuum of space. It came down to some error in an orientation sensor, and somehow the redundant system actually ended up relying on information from the primary system: the thrusters designed to slow down the rotation relied on what the original sensor reported as the satellite's rotation.
As the story goes, "Does the machine ever give the right answer given the wrong inputs, Mr. Babbage?" Nearly 200 years later, we may finally be coming around to seeing that the woman who asked the question had a point, and that Charles simply dismissed it out of hand.
If you stand 1m away from a mirror, you will see "yourself" 2m away as a reflection. (1m you:mirror + 1m mirror:reflection) If you move 1m backward, you will see the reflection now 4m away (2m you:mirror + 2m mirror:reflection). Speed works the same way, if you step toward the mirror at 1m/s your reflection will approach you at 2m/s.
The same thing applies to LiDAR, either it sees the mirror and calculates the proper distance, or it sees the things reflected in the mirror at 2x actual distance and/or 2x relative speed.
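The same geometry as a trivial sketch, with made-up numbers:

```python
# Apparent distance/speed of a scene reflected in a flat mirror, as
# described above. Illustrative numbers only.
def apparent(distance_to_mirror_m: float, closing_speed_ms: float):
    # The optical path runs to the mirror and on to the virtual image,
    # so both the distance and the relative speed double.
    return 2 * distance_to_mirror_m, 2 * closing_speed_ms

print(apparent(1.0, 1.0))  # (2.0, 2.0): 1 m away reads as 2 m, 1 m/s as 2 m/s
```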
As somebody with software engineering experience but little hardware experience, how do I get started with building my own Lidar? I notice he provides STL files for the base plates and a component list: https://hackaday.io/project/20628/components. Still not too sure how to start, though.
There's a bit of work, especially on the PCB side (there are 3 separate ones). You would have to order PCBs based on the Gerber files, order all of the components, and have a way of soldering them (they are surface-mount, which adds to the complexity).
Overall, it would be much easier if the "final" PCBs were sold as a kit. Assembling a kit would be much easier (and more fun) than starting from scratch.
> Assembling a kit would be much easier (and more fun) than starting from scratch.
What is so fun about assembling a kit? Populating the PCB is probably the most mind-numbing part of the whole hardware-engineering business. Design and debugging are where it's at.
Oh man this is cool! Now we just need OpenSimpleLaserWindow and we can listen to conversations and track people's movements just like the CIA!
I forget where I read about this, but essentially, as you speak, your sound waves vibrate nearby windows, and these vibrations can be picked up with a laser and translated back into sound.
There are a few ways you can do this. The cheapest is to just get a laser and a photodiode; it's rough, but you can recover sounds. An interferometer works better: it's harder to make and more expensive, but still within hobby range, at least for decent quality. There are tons of videos and documents showing you how to make either. You can also do a lot of cool measurements with these setups.
The design is extremely simple. The hard part is aligning things; that's actually the hard part of most optics. If you're really OCD, you'll need to be even more so here.
This is great to hear. The new D400 devices look quite promising on paper but I have seen virtually no 3rd party reports on how well they perform in the real world.
We've only done some quick tests outdoors for now, as our device will only operate indoors. But it works well, even when pointed into the sun. Email in profile, send me a message if you want some sample video.
Well, do you want an eye-safe laser? And want to operate outdoors?
The Neato is about 5m indoors, 2m outdoors with a sun shade. That reflects a lot of effort fighting the "sun blindness" problem that having a powerful IR source shining in through your picture window causes for the vacuum cleaner.
Practically, you want to modulate the laser so that you can filter for an AC signal.
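A toy illustration of why that helps, with all numbers made up: modulate the laser, mix the received signal with the reference, and average. The huge but slowly varying sunlight term drops out, while the modulated return survives.

```python
import numpy as np

# Hedged sketch of AC (lock-in style) detection rejecting sunlight.
fs = 1e6        # sample rate, Hz
f_mod = 20e3    # assumed laser modulation frequency, Hz
t = np.arange(0, 0.01, 1 / fs)

laser_return = 0.02 * np.sin(2 * np.pi * f_mod * t)   # weak modulated echo
sunlight = 1.0 + 0.1 * np.sin(2 * np.pi * 100 * t)    # large, slowly varying
rx = laser_return + sunlight + 0.01 * np.random.randn(t.size)

# Mix with the reference and average: the DC/low-frequency sunlight term
# averages to ~0 over whole periods; the in-phase modulated term remains.
ref = np.sin(2 * np.pi * f_mod * t)
amplitude = 2 * np.mean(rx * ref)
print(f"recovered amplitude: {amplitude:.4f}  (true: 0.0200)")
```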
> "sun blindness" problem that having a powerful IR source.
Just to point out: it's not as bad as one would think, since sunlight carries much less energy in the IR than in visible light. The deeper into the IR you can go, and the narrower your filters on the receiving side, the better.
Another issue I had dealing with an IR comm link 30 years ago is that it sucks compared to RF, because your sensor aperture is very small compared to an antenna. And as you mentioned, the noise floor is much higher with IR than with RF.
You know the angle of the light entering the lens. The tangent of that angle gives you the ratio of (the distance between the lens and the laser) to (the distance between the laser and the target), which lets you work out the distance between the laser and the target.
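A minimal sketch of that trigonometry, with a hypothetical baseline, focal length, and pixel pitch (not the project's actual dimensions):

```python
# Hypothetical geometry for a single-point laser triangulation rangefinder.
BASELINE_M = 0.05        # assumed laser-to-lens separation
FOCAL_LEN_M = 0.016      # assumed lens focal length
PIXEL_PITCH_M = 63.5e-6  # assumed pixel pitch of a 128-px linear array

def distance_m(spot_pixel_offset: float) -> float:
    """Range from where the laser spot lands on the linear sensor.

    The spot's offset from the optical axis gives the incoming angle:
    tan(theta) = offset / focal_length. That angle, with the fixed
    baseline, fixes the target distance: d = baseline / tan(theta).
    """
    offset = spot_pixel_offset * PIXEL_PITCH_M
    return BASELINE_M * FOCAL_LEN_M / offset

for px in (1, 10, 50):
    print(f"spot at {px:3d} px -> {distance_m(px):6.2f} m")
```

Note how one pixel of spot movement spans meters at long range but millimeters up close; that is why these parallax designs lose precision quickly with distance.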
No. I'm sure you could argue that technically, if you go by the literal acronym, it is "LIDAR", but I really don't think that flies. When you use the term LIDAR, you imply a certain amount of robustness and error rejection that you only get using ToF.
Sticking the same parallax technology people have been using since the 80s on a board and calling it "LIDAR" isn't really honest.
In the last article about Uber, I asked why LIDAR wouldn't have lit that lady up. The response was "LIDAR might have been disabled"... as if that were any sort of acceptable scenario.
"Because I was driving with my eyes closed" or "Because I wasn't looking" would be the human equivalent. Imagine if the driver used the same excuse, when asked why she didn't see the pedestrian and take over.
So motorcyclists here have an acronym, "SMIDSY" - it stands for "Sorry mate, I didn't see you" - which is almost universally the first thing out of a car driver's mouth after they've driven into a motorcycle.
Sounds believable to me. The human eye doesn't have as broad of a field of view as the brain's post-processing would lead you to believe, so it's very easy for its search pattern to miss small objects.
That's because motorbikes almost universally move faster than traffic and between lanes. So unless you stare at your rear-view mirror constantly, you may well miss them approaching.
Alas, at least here, seeing a motorcycle riding at or under the speed limit is the exception, not the rule.
Not saying I agree, but he leaves room open for other alternatives, such as hitting a motorcycle while weaving within a lane. You won't hit a car that way, but you might take out a lane-splitting bike.
Here is another of my Lidar projects: https://github.com/iliasam/OpenLIDAR
That project has a wiki: https://github.com/iliasam/OpenLIDAR/wiki
There is also a big article about it in Russian: https://geektimes.ru/post/275442/