Man says CES lidar’s laser was so powerful it wrecked his camera (arstechnica.com)
191 points by 882542F3884314B on Jan 11, 2019 | 134 comments



So vitreous fluid is opaque to 1550nm light and will block these new powerful lasers from damaging the retina. But what about people who have recently had retinal detachment surgery and had the vitreous fluid replaced with an oil? Is this replacement oil also opaque to 1550nm?


So when a company says that it’s safe, does this include testing for all of these cases? While I haven’t had retinal detachment surgery, I’ve had LASIK done. I’m sure, it being a common procedure, that it’s considered in whatever battery of tests is performed. But what of these other procedures?

I’d hate to be walking down the street and suddenly be affected because my eyes are just “different” https://youtu.be/zkD9SBP9AX4


Laser vendor: can we sell laser-blocking AR glasses to monetize our electromagnetic pollution?

Smith: these are safety glasses, not a facial-recognition firewall.


If we are to have streets full of Lidar-equipped vehicles all constantly blasting out laser light, I can imagine this will become a real problem.

Will digital cameras improve so that they are less sensitive to lasers? Or will Lidar improve to use less-powerful lasers?


Most current self driving car lidars operate close to the 900 nm range, and they are required to output 1000 times less power than the 1550 nm lidar in this article. [1] So far these seem totally harmless to cameras.

1550 nm lidars for self driving cars are a relatively recent thing with only a handful of companies (AEye, Luminar, Blackmore) making them. The benefit is that, thanks to the extra power, they have longer range. Unfortunately they fry cameras and are very expensive.

It's also worth pointing out that the 1550 nm pulsed lidars like AEye and Luminar with fiber lasers may have much shorter pulses than the 5 ns pulses of, say, a 905 nm Velodyne lidar. So, not only is the average power 1000 times higher, but the peak power may be even higher.

[1] https://en.wikipedia.org/wiki/File:IEC60825_MPE_J_nm.png
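
To make the duty-cycle point concrete, a back-of-envelope sketch in Python; the average power and repetition rate here are illustrative placeholders, not any vendor's actual specs:

    # Back-of-envelope duty-cycle math (illustrative numbers, not vendor specs)
    avg_power_w = 0.5e-3     # assumed average optical power: 0.5 mW
    pulse_width_s = 5e-9     # 5 ns pulse, as for a typical 905 nm unit
    rep_rate_hz = 100e3      # assumed 100 kHz pulse repetition rate

    duty_cycle = pulse_width_s * rep_rate_hz     # 5e-4
    peak_power_w = avg_power_w / duty_cycle      # 1 W peak from 0.5 mW average
    print(f"duty cycle {duty_cycle:.0e}, peak power {peak_power_w:.1f} W")
    # Shortening the pulse at the same average power raises the peak proportionally;
    # 1000x the average power at similar timing means at least 1000x the peak.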


The relevant part of the article: "But it also has a big advantage: the fluid in the human eye is opaque to 1550nm light, so the light can't reach the retina at the back of the eye. This means lasers can operate at much higher power levels without posing an eye safety risk. AEye uses 1550nm lasers. And unfortunately for Chowdhury, cameras are not filled with fluid like human eyes are. That means that high-power 1550nm lasers can easily cause damage to camera sensors even if they don't pose a threat to human eyes."


I imagine that kind of absorbed energy is doing something to the fluid in the eye. Maybe cataracts, for example.


The exposure limits for lasers in the 1500nm range are almost certainly set by the intensity at which they cause photochemical cataracts. The geometry just doesn't work out for the exposure limits to be set by anything else: a non-visible laser 1500x as powerful as a legal-limit visible-light laser would still only heat the cornea up a hundredth as much as the legal-limit visible-light laser would heat the retina (intensity at the cornea is 1/200000th that at the retina, see my comment below [4]). Physical damage to the retina starts at a temperature rise of around 10 degrees C, so an exposure-limit non-visible laser might change the temperature of your cornea by a tenth of a degree.

The normal range for corneal temperature is about thirty degrees Celsius wide [1, 2] because it's strongly affected by air temperature and air movement. Corneal temperature stays below 33-35 Celsius even in extremely hot environments (45 C). Studies on microwave burns support this, saying that you have to get rabbit corneas up to 41 degrees C for cataracts to even start to form, implying a reasonable safety factor [3]. Under normal conditions (still air at room temperature), corneal temperature hovers closer to 30 degrees C. A tenth or a hundredth of a degree just isn't going to do anything when the system normally has ten degrees of safety margin. The vitreous humor varies less, but still varies pretty significantly.

Conclusion: the exposure limits for 1500nm lasers are not set by thermal damage.
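
Redoing that arithmetic explicitly, using only the figures quoted above:

    # Redoing the parent's arithmetic with the figures quoted above
    focus_gain = 200_000         # retinal intensity / corneal intensity for a collimated beam [4]
    power_ratio = 1_500          # allowed non-visible power vs. the visible-light legal limit
    retina_damage_rise_c = 10.0  # approximate temperature rise where retinal damage starts

    relative_heating = power_ratio / focus_gain              # 0.0075, i.e. ~1/100th
    cornea_rise_c = retina_damage_rise_c * relative_heating  # ~0.08 C
    print(f"corneal temperature rise ~{cornea_rise_c:.2f} C vs. ~10 C of normal margin")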

It's generally safe to assume that you're not going to think about this for ten seconds and discover a danger that has been missed by every single person to ever contribute to the exposure limits by thought or by case study.

1: https://iovs.arvojournals.org/article.aspx?articleid=2127035

2: http://iovs.arvojournals.org/data/journals/iovs/933602/596.p...

3: https://en.wikipedia.org/wiki/Microwave_burn#Eyes

4: https://news.ycombinator.com/item?id=18887393


> It's generally safe to assume that you're not going to think about this for ten seconds and discover a danger that has been missed by every single person to ever contribute to the exposure limits by thought or by case study.

But then what would HN be for?


> It's generally safe to assume that you're not going to think about this for ten seconds and discover a danger that has been missed by every single person to ever contribute to the exposure limits by thought or by case study.

Not safe to assume the risks don't exist just because they aren't mentioned in an Ars Technica article.

I would have thought it would be safe to assume a company wouldn't mount a laser on a car that would permanently ruin people's cameras, but here we are.


> Not safe to assume the risks don't exist just because they aren't mentioned in an Ars Technica article.

https://www.lesswrong.com/posts/zsG9yKcriht2doRhM/inadequacy...

I don't believe that the risks don't exist. I believe that what risks exist are extremely unlikely to be something that can be pointed out with a one-line comment on HN. I don't believe that because of this article. I believe that because lasers are unbelievably useful and widely-deployed in industry and tend to cause immediate, visible, and unmistakable damage, so organizations like OSHA have studied them extensively and failures are expensive enough that operators put actual effort into minimizing risk.

> I would have thought it would be safe to assume a company wouldn't mount a laser on a car that would permanently ruin people's cameras, but here we are.

I would have thought that it would be safe to assume that people wouldn't strap 200kW motors to two-ton lumps of metal and send them hurtling around under purely manual control with no physical limits or safety barriers separating them from foot traffic, but here we are.

If the price we pay for eliminating the leading cause of violent death worldwide is that we have to stop pointing cameras at everything, so be it.


Exactly, it's the same with unshielded microwaves, and it's astonishing that someone would think it's safe to crank the power on these lasers with that justification.

> When injury from exposure to microwaves occurs, it usually results from dielectric heating induced in the body. Exposure to microwave radiation can produce cataracts by this mechanism,[28] because the microwave heating denatures proteins in the crystalline lens of the eye (in the same way that heat turns egg whites white and opaque). The lens and cornea of the eye are especially vulnerable because they contain no blood vessels that can carry away heat.

https://en.wikipedia.org/wiki/Microwave#Effects_on_health


But wouldn't that be true of all electromagnetic radiation that's absorbed by the fluid? I can't imagine this being more harmful than looking at a sunset from the pov of total energy being pumped into the eyeball. Or walking outside at mid-day among shiny buildings.


It is quite easy to believe that, as emitted from the lasers in the lidar units, the beams are generally safe. But could other optical mechanisms, such as eyeglasses, accidentally focus these dispersed beams in such a way that they become dangerous again?


But if eyeglasses could do that, then people would get retinal burns from the sun, and so on. Remember that all glasses do is adjust the path of light reaching the lens of the eye so that when it arrives at the retina it is identical to the path of light focused by a normal, healthy eye. Which gives us another way of knowing it isn't a problem: people without glasses are safe, and glasses only reproduce what their eyes already do.


Also, remember that this is short-wavelength IR at about 1550nm, which is between two and three times the wavelength of visible light at approximately 400-700nm. The energy in a photon is E = hf = hc/l, where h is Planck's constant, c is the speed of light, and l is the wavelength. Notice how the photon energy falls as the wavelength increases (it's inversely proportional) - this is good!
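
A quick sanity check of that formula in Python, using standard physical constants:

    # E = hc/lambda for the endpoints of the visible band and for 1550 nm
    h = 6.626e-34   # Planck's constant, J*s
    c = 3.0e8       # speed of light, m/s
    eV = 1.602e-19  # joules per electron-volt

    for wavelength_nm in (400, 700, 1550):
        e_photon = h * c / (wavelength_nm * 1e-9)
        print(f"{wavelength_nm} nm: {e_photon / eV:.2f} eV per photon")
    # 400 nm: ~3.10 eV, 700 nm: ~1.77 eV, 1550 nm: ~0.80 eV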


The human eye might not be affected, but are any animals or insects susceptible at these wavelengths?

Plus, are there any weather conditions under which the light becomes visible?


There are absolutely no weather conditions that can transform this near-IR into visible light.


It's worth noting that the Blackmore system is continuous-wave and much lower power than AEye and Luminar, so I don't know if the same thing would happen to cameras.


So why do they use pulsed light? What’s the benefit of time-of-flight over the continuous-wave method? Can the signal-to-noise ratio of time-of-flight ever approach that of the correlation methods?


If it can do that to a camera, what's it doing to eyes?

edit: > "Cameras are up to 1000x more sensitive to lasers than eyeballs," Dussan wrote. "Occasionally, this can cause thermal damage to a camera's focal plane array." - I'm still not entirely sold.


The article actually has the answer:

> Other lidar makers use lasers with a wavelength of 1550nm. This tends to be more expensive because sensors have to be made out of exotic materials like indium-gallium arsenide rather than silicon. But it also has a big advantage: the fluid in the human eye is opaque to 1550nm light, so the light can't reach the retina at the back of the eye. This means lasers can operate at much higher power levels without posing an eye safety risk.


I asked a neuroscientist I know on Twitter who specializes in eyes. His opinion is that it's not quite so simple as "it's safe".

https://twitter.com/BWJones/status/1083847128663769088

> That energy longer than 1400nm is generally absorbed by the cornea and lens, but it is still energy, and it is not a hard bandpass filter per se. Safety is relative at higher wattages.


Wonderful, the eye-boiling future I have to look forward to...


> the fluid in the human eye is opaque to 1550nm light

Sure, but does this hold true for most land-based animals and birds? How much margin does this give us? Reflections can shift by half a wavelength - is that still opaque too?

I'm not saying this isn't safe, but suddenly having a tonne of high power laser sources pointing everywhere at all times might actually have some consequences still...


If I am reading the chart labelled "The visible and UV spectra of liquid water" at http://www1.lsbu.ac.uk/water/water_vibrational_spectrum.html (about 2/3rds down the page) correctly, water is fairly opaque at 1550nm. If my hurried wikipedia education on the topic is correct, that chart is saying it's quite opaque at that frequency, even at bat eyeball scales. Corrections welcomed from people with more training in this field.

(I'm not sure I'm not getting some crossed units when I tried to resolve that into numbers normals like me would understand; is that really 1/e^1000 transmitted per centimeter travelled? e^1000 is a big number. There's even bigger ones on that chart. Then again, if a two-atom-thick layer of gold is enough to make something look like gold and completely obscure what's underneath, I guess that might make sense and my intuition is just off, because when I convert that into this sort of scheme I get big numbers there, too.)
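
For anyone who wants to redo the math: a short Beer-Lambert sketch in Python, assuming an absorption coefficient for water near 1550nm of roughly 10/cm (a ballpark from published measurements, not read off that chart):

    import math

    # Beer-Lambert: fraction transmitted T = exp(-alpha * d)
    alpha_per_cm = 10.0  # assumed absorption coefficient of water near 1550 nm (ballpark)
    for depth_cm in (0.1, 0.5, 1.7):  # 1.7 cm is roughly the depth of a human eye
        t = math.exp(-alpha_per_cm * depth_cm)
        print(f"{depth_cm} cm of water transmits {t:.1e} of incident 1550 nm light")
    # ~3.7e-1, ~6.7e-3, ~4.1e-8: very opaque at eyeball scales,
    # though nowhere near 1/e^1000 per centimeter.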


If memory serves me well, water absorbs that wavelength, so it would basically function sort of like a microwave oven, but at such a much lower energy level that you can basically ignore it.


How will LIDAR work under the rain or snow then?


The same way regular vision works under snow. Snowflakes are opaque to visible light too, so it'd work the same way human eyesight works for drivers currently.


So maybe we don't need lidar, because cameras can see like the human eye in the snow too. Depth could be obtained the way insects do it, by adding many triangulation points.


Mammals, lizards, amphibians, and birds all have water-based eyes, but I wonder what this would do to insects.


> the fluid in the human eye is opaque to 1550nm light, so the light can't reach the retina at the back of the eye.

So...it'll just fry the front of the eye?


The eye focuses light. Formally, it transforms incoming photons so that their position on the retina is a smooth function of their direction. Diffraction and atmosphere mean a laser will be spread somewhat in position, so the incoming photons may cover a 2mm spot on the front of the eye. However, all of those photons have almost identical direction, so they end up concentrated in a single spot on the retina, up to the limits of the eye's optics. 20/20 human vision can resolve down to about 0.02 of a degree, so a point source should image to roughly a 0.02-degree circle; the visual-nerve blind spot is 7.5 degrees wide and is caused by a 1.5mm obstruction on the retina, so a point source like a laser should land in a spot on the retina about 0.005mm across. 2mm diameter / 0.005mm diameter is 400, and area is proportional to the square, so this is something like a 160,000x decrease in area, with an equivalent increase in intensity. Result: not even enough to warm your skin, but it instantly boils a bit of your retina.
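
The same estimate, rerun in Python with the figures above:

    # Rerunning the parent's estimate
    beam_dia_mm = 2.0                 # laser spot on the front of the eye
    mm_per_deg = 1.5 / 7.5            # blind spot: 1.5 mm on the retina spans 7.5 degrees
    spot_dia_mm = 0.02 * mm_per_deg   # 20/20 resolution limit imaged onto the retina
    area_gain = (beam_dia_mm / spot_dia_mm) ** 2
    print(f"retinal spot ~{spot_dia_mm:.3f} mm, intensity gain ~{area_gain:,.0f}x")
    # ~0.004 mm and ~250,000x; rounding the spot up to 0.005 mm gives the 160,000x above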

(I'm rather happy with how well my estimate corresponds to the figure from Wikipedia: "The eye focuses visible and near-infrared light onto the retina. A laser beam can be focused to an intensity on the retina which may be up to 200,000 times higher than at the point where the laser beam enters the eye.")


Water absorbs that wavelength if memory serves me well, so it's more like boil than fry -- kind of like what happens in a microwave oven. But the energy levels involved are so small it shouldn't be a concern.


Ok, this suggests a solution: use a similar fluid in front of cameras.


The fluid is opaque so all the energy is dumped in the fluid, heating it up?

Also why should I listen to some rando salesman named Dussan? Is he an ophthalmologist?


From the article: "Cameras are up to 1000x more sensitive to lasers than eyeballs," Dussan wrote. "Occasionally, this can cause thermal damage to a camera's focal plane array."

Another site claims "However, there are no MPEs for sensors such as CMOS or CCD chips. This means a show may be perfectly safe for eyes, but could possibly damage a camera sensor. One reason is that camera lenses may gather more laser light, and concentrate it to a finer point. Another reason is that CMOS or CCD sensors are more easily damaged than the eye."[1]

[1] http://www.laserpointersafety.com/ilda-camera-info.html


I'd love to see some independent results from someone who doesn't have skin in the game. Whilst damage might be more obvious in a camera CCD, even small amounts of cumulative damage over time could be doing serious harm without it being immediately obvious.

I've seen f/1.6 lenses, whereas the eye is apparently f/8.3 - f/2.2, so yes, the eye takes in less light, but it's substantially harder to swap out than an EF-S mount.

edit: also, what about animals with good night vision? They are more sensitive and take in more light - are they in danger?


They address in the article that the frequency they're using can't damage the retina because the fluid in the eye is opaque to light at this frequency.


A camera is not the human eye. Just because they are both technically optical sensors does not mean they are equivalent really at all.


Seems like something the FCC should regulate. It's the radio spectrum.


FCC regulates only to 3THz (~100µm wavelength). The IR laser emission is near 200THz. Regulatory jurisdiction here is the FDA under 21 CFR 1040. Unfortunately MPE for the regulation is all based on the biology of the eye, so that is where you'd have to find your argument. Probably the best avenue to argue would be that the peak power of the short pulse is above MPE although in a practical sense the total energy may not be a problem. My guess is that in this case there would be allowable measurement methodology that both passes and fails the standard, and the legal argument would center around that.


> Crucially, self-driving cars also rely on conventional cameras. So if those lidars are not camera-safe, it won't just create a headache for people snapping pictures with handheld cameras. Lidar sensors could also damage the cameras on other self-driving cars.


Birds certainly might not adapt to become less sensitive. I would certainly hope that research will be done to check how this type of EM pollution will affect nature. It might not, but it also might over time.


Another option is that machine vision will improve to the point where Lidars are less of a necessity. This is in fact the thing Tesla is attempting to do, although they are far from pulling it off!


> Another option is that machine vision will improve to the point where Lidars are less of a necessity.

Highly doubt this will happen anytime soon. Machine vision has come a long way in the past decade, but nonetheless it is far from being reliable enough for driving. I think lidar is pretty much a necessity for safety's sake.


It is all relative (safety, that is). If you’re driving on a new road that hasn’t been mapped with the super-high-res lidar, it presents a problem. Machine vision absolutely is the holy grail here. Neither you nor anyone else can say that either machine vision or lidar is necessary, as neither of them currently powers a self-driving car. There isn’t currently such a thing that can do, say, a cross-country road trip on arbitrary highways. Not even Waymo.


>If you’re driving on a new road that hasn’t been mapped with the super high res Lidar, it presents a problem.

Not even remotely true. LIDAR is used to SLAM in real time. Think of it as radar on steroids. For successful, reliable autonomous driving you need both machine vision and lidar. If you only use machine vision, you end up like the Teslas that slam into stationary fire trucks. So you use machine vision for reading signs, lane localization, etc., and lidar to prevent accidents.

> a cross country road trip on any highways.

That's not the agreed upon definition of self driving. Also, you should probably take a look at this: https://en.wikipedia.org/wiki/Self-driving_car#Testing

Waymo on average travels 5,127.9 miles before a human disengagement is necessary, measured over 635,868 miles. Cruise: 5,224 miles. Tesla: 2.9 miles.

> Not you or anyone else can say either machine vision OR lidar are necessary

Well, since pretty much every single company that exists right now in the autonomous car space, with the exception of Tesla, uses LIDAR, I am inclined to think that's the way to do it.

All in all, machine vision is way too unreliable for anything above SAE level 2. Due to the nature of machine learning algorithms used (CNN) you can't be sure that a firetruck will always be recognized as a firetruck. The only way to reliably "see" a firetruck is via lidar/radar.


Fortunately, the military industrial complex has the solution to this. For some time now, there have been worries over lasers being used to blind sensors and people so a number of approaches have been developed to defend against this. Lasers only emit at a set of limited wavelengths so one can block out just these wavelengths. Another more exotic approach is to block the light coming in as soon as the laser has been detected. LCDs are an obvious solution to this, though I have seen microshutter arrays proposed for this in a Popular Science/Mechanics magazine a while back. Although, it's probably better to prevent LIDAR from destroying cameras in the first place.


I’m pretty sure you mean the defense industry; the military-industrial complex isn’t the defense industry, it doesn’t make anything ;)

And the solution is far simpler than you think: the domes for the sensor packages are simply better rated, with true 99.999% absorption for both UV and IR (beyond FLIR ranges). Some of them also have an inner layer that darkens if light penetrates that deep, creating a dark spot that blocks the light completely, since replacing a dome is cheaper than replacing a cryogenic IR sensor array.

But beyond that, you can blind the sensors with most pen lasers, as well as damage them with visible light. The reason it’s not much of an issue is that, due to simple physics, you won’t be able to focus a beam narrow enough to hit the sensor; atmospheric refraction alone would prevent that from happening, and any laser capable of damaging the sensor at that point would likely be able to damage the drone itself.

The IR dome protection was mostly implemented to protect the sensor if the drone is painted with an illuminator.


Visible laser light kills cameras all the time. I think this is more likely what happened here.

https://photofocus.com/2013/09/14/beware-lasers-can-kill-you...


That he took a photo of a visible light laser and it wasn't the car's IR laser?

Or just that lasers in general are known to do this and the car's laser probably did it?



“Lidar sensors could also damage the cameras on other self-driving cars.“ So eventually driverless cars will spoil each other's camera sensors and be unable to recognize pedestrians or traffic signs. How can cameras be made immune to high-power laser beams?


By using short-pass optical filters on cameras, for example like this one (probably much cheaper in volume): https://www.edmundoptics.com/document/download/396359

I've used band-pass filters which reject everything except a narrow band, so that the camera sees just the IR LED (chosen to fall in the middle of the passed band); they cost a few USD in small volumes.


Laser camera damage is a known thing. That AEye would buy him a new camera was pretty cool. Not sure how scalable that is once people take already-broken cameras that can still get "damaged", find an AEye car, and then claim a replacement.

If you went to a Diwali laser show you would get the ultimate camera death mix, fine powder and lasers :-).


> That Aeye would buy him a new camera was pretty cool.

I wonder if they had warning signs telling people about the active lasers and possible camera damage. If they didn't, I feel like replacing the camera is just the right thing to do. It was CES, people are bound to have cameras.


Silicon is transparent above 1.1µm and this laser is at 1.5µm, so perhaps the infrared-cut filter that sits above the sensor was damaged. Perhaps Sony is using an absorptive IR filter and it was damaged by heat; if it were a reflective filter, I doubt it could be damaged.


Some shortpass filters only block 700 nm to 1100 nm and are not rated for performance above 1100 nm, since the silicon sensor cannot "see" light above 1000 nm anyway (but may be damaged by sufficiently strong amounts of it). So, perhaps Sony was using an IR block filter that blocks near IR from 700 nm to 1000 nm but doesn't block 1550 nm perfectly.


Yes, that is probably the case: there's no need to block beyond 1.1µm for image quality, though perhaps now they should extend it for protection. The laser must have passed through and damaged the metal interconnects on the backside of the sensor.


It's probably thermal damage


It sucks that UX issues (for lack of a better term) like this exist. It’s one of the main reasons we don’t have the Concorde flight program any longer: it couldn’t do any domestic flights because of the noise pollution from breaking the sound barrier, etc. Joking aside, it’s interesting how these issues have to be considered when innovating at the bleeding edge.


In comparison, if you consider CES as a modern day world’s fair, imagine the dangers of experiencing a Tesla electricity exhibit or some of the earlier X-ray tech with hour long exposure to high radiation.

https://abcnews.go.com/beta-story-container/Health/Wellness/...


That sounds fantastic, can we put this into something I can wear?


We're thinking along the same lines. I was wondering how this could be adapted to just make you invisible to cameras. It would be a fun project to deploy at a protest or something where everybody is filming everyone else, or maybe I just don't want to be caught on 500 building cameras as I walk down the street.


Depends on the infrared band they're using but normal glass or acrylic should stop most of it.

Edit: looks like they're using SWIR (~1500nm) so glass/acrylic/polycarbonate will pass it just fine.


Just imagine in a few years there will be thousands of devices using lidar in a city. Will that be enough to knock out endless surveillance cameras?


Yep, by the same companies that will bring you _Surveillance Lidar_!


Surveillance lidar is a thing.

[1] Quanergy Perimeter Security System https://www.youtube.com/watch?v=IFCgEzdrbQM

[2] Quanergy's LiDAR-based security solution https://www.youtube.com/watch?v=PpYDWb2yX_M


We can call it "Sidar"; play on the word cider, and the logo for the company or brand would be a jar of cider with an eye floating in it.


Wonder if you could knock out speed cameras and red light cameras with this in a way which wouldn't be noticed.

Good to know they claim you can do it with a laser that's not dangerous to the human eye.


Back in the day I did this to an old CyberShot with a laser pointer; it permanently had two purplish marks on the images afterwards. I would think the laser would need to be somewhat static or slow-moving to cause the damage, though, not scanning at high speed.

You can see the purplish marks just slightly down and left of centre here - https://i.ibb.co/THMS42D/dsc00006.jpg


This is actually how several LIDAR scanning stations are allowed to be called class I (or maybe II) devices. The lasers inside would do bad things to your eyes (easily class 3). Because they're pulsed and spinning, the odds of you getting multiple hits to your eyeballs are quite low.


If the wavelength of the laser is a known 1550nm, maybe companies like Lee, Tiffen, etc. can come out with a filter to protect against this. Street photographers will be exposed to driverless cars without warning. It would be a niche market for sure, but if I were doing street work where driverless cars were known to operate, a ~$100 filter would be worth it to protect my >$2500 camera body.


For the short term, sure. But long term, the lidar system will have to be adjusted to not cause harm to the public.


Cool. This will be fine as long as 2 self driving cars never go near each other.


Aren't the optical paths highly selective for direction, time, and wavelength? Let's assume coincident-wavelength for our worst case. At any given point in time, it's as if each car has a laser pointer aiming a single dot somewhere, and that single dot is also the only point it's receiving light from (in stark contrast to a camera, which is gathering light from its entire field of view). Even if the LIDARs can see each other, there's no 0-attenuation path from the output of one to the input of another unless a pair of LIDARs have chosen, out of their entire FOV, to aim directly at each other.

So, if we define "traversal time" as the time required for the dot of car A to sweep the aperture of car B, in each traversal time there's 1 chance in N^4 of perfect alignment, where N is the ratio of the full FOV to the dot size. Maybe it happens once in a blue moon, but if the sensor can withstand full illumination for more than one traversal time, you would have to start compounding 1/N^4 events in order to fry a sensor. I wouldn't count on it.
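
A toy version of that model in Python; N, the scan rate, and the exposure time are all invented numbers, just to show how quickly 1/N^4 shrinks:

    # Toy version of the argument above; every number here is made up
    n = 1000                    # full FOV width / dot width
    p_align = (1 / n**2) ** 2   # both units pick the cell containing the other: 1/N^4
    traversals_per_sec = 20     # assumed sweeps of A's dot past B's aperture
    seconds_facing = 60         # a minute spent pointed at each other in traffic

    trials = traversals_per_sec * seconds_facing
    p_fry = 1 - (1 - p_align) ** trials
    print(f"P(perfect alignment within a minute) ~ {p_fry:.1e}")  # ~1.2e-9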

Well, I wouldn't count on that particular mechanism, at least. I'm sure there will be cases where a coating degrades and lets broad-band sunlight in, or a sensor parks on the sun, etc. Sensors will get fried, but not because the engineers making them were too stupid to consider interference.


from TFA:

>Crucially, self-driving cars also rely on conventional cameras. So if those lidars are not camera-safe, it won't just create a headache for people snapping pictures with handheld cameras. Lidar sensors could also damage the cameras on other self-driving cars.


Shouldn't this also be a problem with other non-passive sensors, like radar? If we have a traffic jam of cars each sending radio waves from multiple sensors, there has to be quite some interference, no?


Radars pointed at each other may have some interference resulting in temporary erroneous readings but they won't be permanently damaged like the camera in the article.


In a heterogenous environment this is not necessarily true; a radar with a powerful transmitter could damage a radar with a very sensitive receiver, particularly at close ranges.

There are military jamming devices that can quite handily permanently damage radars not designed to withstand them.


Tangent: the Soviet MiG-25P interceptor had a 600kW radar which was lethal to small animals if used on the ground.


Oh I see, that makes sense. I wonder if any automotive radars are that powerful.


Almost certainly not. The FCC approval process would likely catch that. Lasers are also far more focused than microwaves (typical spot size of a cheapo laser pointer is 12cm at 100m away; police LIDAR is good enough to target a single car out to at least 1/4 mile).


Self-driving cars use cameras too!


Selecting a pulse repetition interval and spreading sequence from a PRNG should mostly eliminate interference like that, for both lidar and radar, I think.
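
A minimal sketch of the idea, assuming each unit gates its receiver on its own PRNG-jittered transmit schedule (an illustration of the principle, not any real lidar's protocol):

    import random

    # Each unit jitters its pulse times from its own PRNG seed and only
    # accepts returns that line up with its own transmit schedule.
    def pulse_times(seed, n=1000, base_ns=1000, jitter_ns=200):
        rng = random.Random(seed)
        t, times = 0, []
        for _ in range(n):
            t += base_ns + rng.randrange(jitter_ns)
            times.append(t)
        return set(times)

    mine, theirs = pulse_times(seed=1), pulse_times(seed=2)
    confusable = len(mine & theirs) / len(mine)
    print(f"other unit's pulses landing exactly in my slots: {confusable:.2%}")
    # Only a tiny, random fraction collides, and it changes every frame,
    # so correlated returns from the other unit never build up.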


Is there any research material to support these claims? I can't imagine the energy and optics involved would have been sufficient to do this. Seems like coincidence to me. The sun has far more energy than what we're talking about here.

I'm happy to be proven wrong though as I wouldn't want something unsafe on the streets as much as anyone else.


> In an email to Ars Technica, AEye CEO Luis Dussan confirmed that AEye's lidars can cause damage to camera sensors—though he stressed that they pose no danger to human eyes.

So, yes.


Taking a picture of the sun will destroy most cameras. So, I don’t think that supports your argument.


Even with really quick exposures? I've taken pictures of the sun before (before, during, and after last year's eclipse) with my cell phone camera and it seemed fine.


Edit: added link [0] to lensrentals.com story about gear damaged during solar eclipse of 2017.

Your cell phone camera has a wide-angle lens. It's at longer focal lengths where one might expect sensor damage, or even melted shutter curtains, from shooting directly into the sun when it's high above the horizon.

[0] https://www.lensrentals.com/blog/2017/09/rental-camera-gear-...


Your cellphone camera lens likely doesn’t have the requisite light-gathering capacity to burn out the CMOS image sensor in your phone. CCDs (commonly used in DSLR/M43 cameras) are far more sensitive, as I understand it.


I don't believe DSLRs or M43 cameras have used CCDs since last decade.


The Pentax 645D, introduced in 2010, is a DSLR with a CCD sensor. But yes, it is extremely rare.

[1] https://en.wikipedia.org/wiki/Pentax_645D


Why do modern cameras use CMOS if CCD is more sensitive?

I guess the answer will be "because it's cheaper to manufacture", but is there any other reason?


Most DSLRs these days use CMOS sensors.


Interesting free article: Laser-induced damage threshold of camera sensors and micro-optoelectromechanical systems https://www.spiedigitallibrary.org/journals/optical-engineer...

Cameras also get regularly damaged at concerts and light shows with visible lasers. Those lasers are class 2 lasers, though, and can output more light while not being blocked by the IR filter in cameras.


The sun is also much farther away


I tweeted at him to ask what size lens and sensor he has so I could estimate how much energy/flux was hitting his sensor. Anyone have an idea of how many lidars it would take to saturate an area and make distance measurements useless? It's tougher with lidar because you can't easily frequency-shift like with radar.


If it's a $2000 Sony camera, it's definitely going to be a 35mm sensor. His lens on the other hand I wouldn't be able to guess.


It's a Sony ILCE-7RM2 with a 35mm lens at f/4: https://twitter.com/jitrc/status/1083190800710684673
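
With those numbers you can at least bound the light-gathering area. A rough sketch in Python; the irradiance value is a pure placeholder, since no beam power figure has been published in the thread:

    import math

    # Bounding how much 1550 nm power that lens could collect; the irradiance
    # value is a placeholder guess, not a measurement.
    focal_mm, f_number = 35, 4
    aperture_dia_m = (focal_mm / f_number) / 1000           # 8.75 mm entrance pupil
    aperture_area_m2 = math.pi * (aperture_dia_m / 2) ** 2  # ~6.0e-5 m^2

    irradiance_w_m2 = 100.0  # hypothetical beam irradiance at the lens
    collected_w = irradiance_w_m2 * aperture_area_m2
    print(f"~{collected_w * 1e3:.1f} mW gathered and focused onto a few pixels")  # ~6.0 mW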


I'm puzzled by the fact that it's only burned in two spots. This is a mirrorless camera so the lidar would be going into the lens and hitting the sensor constantly, not just during the exposure. I wonder what's happened here.


Somebody mentioned it's a pulsing one.


I strongly dispute the claim that 1550nm is eye-safe. 1550nm is also the band used for long-distance, high-powered DWDM transport systems for internet backbone purposes, and is common in "long reach" SFP+ modules for use at 80 to 120km on dark fiber without amplification. There are common eye-safety precautions, one of which is that you NEVER aim the ferrule of an optic or patch cable at your face with a live 1550nm signal on it. This is known by everyone who lights long-distance dark fiber.

This company claims they have a 1550 band laser operating in free space air, spraying all over the place, with a 1000 meter range? Oh, great.

It's not in the visible spectrum so you can't tell if it's damaging your eyeballs, either.


Is LIDAR just a stop-gap solution until we get better structure-from-motion type algorithms? Humans don't need lasers to drive. Binocular vision seems to suit the task just fine. Autonomous vehicles have the advantage that they can leverage hyperspectral imaging. Why the obsession with LIDAR?


Writing software for competent self driving is turning out to be really hard, and no one is succeeding. On the other hand, building better and cheaper lidar hardware is a well-defined problem that companies can pour money into. Their software still sucks, but everyone’s occupied with wrangling the $100k lidar.


I agree... but I also want to point out that it's totally possible that writing software to make self-driving cars work acceptably well might be easier if the software has better sensors than a human does, so it might actually be the best way to try to solve the problem.


Could some rebel make things like fake road bumps that lidars can't pick up, but humans can?


Protecting AI against spoofing attacks is a pretty active area of research right now.


Protecting neural networks from adversarial attacks, I think you mean. Protecting good A.I. from spoofing attacks is exactly the same as protecting human drivers from spoofing attacks.


Or the other way around - trick the lidar into thinking a road bump is there when there is no bump.


Interesting. I also remember hearing about a device which zap mosquitos out of the air with lasers and a lot of careful accuracy. Is the future going to include nefarious actors with devices which near-instantly destroy all cameras (...or eyes?) in sight?


So someone has already built an "Eye Seeking Laser": https://www.youtube.com/watch?v=Q8zC3-ZQFJI


Well, time for me to put on a pair of reflective sunglasses to never take off.


"Do not stare into laser with remaining eye."



You want to build your mosquito-zapping lasers to use a frequency that won't go through your eye to hit your retina.


1550nm anti-paparazzi devices coming in 3... 2... 1...

(Or maybe not, owing to the potential liability.)



Only a problem for mirrorless cameras.


With lidar becoming more and more popular, are there ways to protect cameras in its path, e.g. dashcams?


[dupe]


"I'd rather share the road exclusively with other humans that fear death just as much as I do."

Assuming you fear death enough to give your full attention to driving safely, how do you hope to accomplish this wonderful utopia? There will always be drunk, distracted, aggressive, tired, epileptic, vision impaired drivers as long as humans are driving. The true solution is to remove all human drivers and only have computers that don't need to worry about unpredictability of other cars, because the cars communicate directly.


[flagged]


Everyone seems to be downvoting you but I agree. Our tech is not even nearly in the universe of good enough to strap to a 2 ton vehicle traveling at 100kph along random roads.

And by tech I'm including security, because that's the part that I'm worried about too. There's so much incentive for bad actors to mess with these systems, and the consequences are catastrophic.

Just imagine setting up a hidden 1550nm lidar jammer/spammer to confuse cars on a freeway. You could make cars crash and nobody would even know it was there. Even if it made only 0.1% of the cars just slightly more likely to crash, it could still cause deaths.

Also RCE and DOS attacks now the cars have remote start and other stupid "features" that increase the attack surface.

I'm sure there's many more malicious scenarios you can imagine.

Everyone on here seems so keen for these self-driving cars to take over, but there's just so many avenues for abuse and our road laws are not set up to handle all the complications.


I'm probably not the strawman you thought you were talking to, but I'll bite. I'm nervous about the prospect of buggy/vulnerable self-driving programs, but I'd say I'm even less comfortable sharing the road with the dozens of texting drivers I see every day, and the several drunk drivers I report each year, along with many more that I just don't see, and the multitudes who can't be bothered to use their turn signals or check blind spots on the highway.


I don't ever recall being asked if I consent to share the road with you...

Humans are bad drivers, self driving cars do not have to be very good to be substantially better than humans.


Directing your fabricated outrage at HN is only going to earn you some downvotes.

You don't have to buy in to progress if you don't want to. But getting angry about it will only give you a headache.


Given that automobile accidents are currently one of the most likely ways that you will meet your demise, would you agree that decreasing the likelihood of accidents is a worthwhile pursuit?

As I understand it, increased safety is one of the primary motivators for self-driving cars. I think it's fairly obvious that this goal has not been realized yet. But it's one motivator for a lot of people, one which you don't seem to acknowledge.


> I don't recall ever being asked whether I consent to sharing the road with non-human drivers. I don't

Society does not require individual people's consent in determining which vehicles drive on the road. If it did, there would be no large pickup trucks, because I would never have allowed them.


I hate giving Elon credit, but there is some wisdom in the idea of only using optical receivers for self-driving input.


> I hate giving Elon credit

Why? Not only is he a genius (citation needed), but he's also highly invested in big-risk companies working toward a healthy future. I wish more people were like Elon.


Car LIDARs do/would operate at a different frequency and power level than the LIDAR at issue here, so no.


People that hate giving Musk credit don't, in general, call him by his first name.


I hope AI evolves fast enough to kill LIDAR even before its birth. I really hate lasers in my eyes. I once got into a fight with some street laser sellers because of it, and some blood was spilt. I don't like fighting and I'm not violent at all, but I hate those things, and a future of thousands of lasers coming from all the cars on the street would be the Horror. Luckily it's almost all vaporware, and carmakers are going to lose a few billion dollars on this mistake before pivoting to raw pixel-based A.I.


If I had to bet my money, I would bet LIDAR will be like Betamax home video players. Or the Sony Discman. It will die rather young :D, thank god.



