Dirty lenses (2016) (kurtmunger.com)
101 points by pmoriarty on Jan 26, 2019 | 34 comments



This article is missing a huge factor: aperture. If you go higher than f/11, the debris starts to become much more visible. All of the pictures here are at f/5.6 or wider.

Each pixel basically takes a weighted average of all the light that travels from a pixel-sized area in the scene to the lens. As the article's picture with the dark patch shows, this averaging makes the debris form a diffuse dark area over a large part of the image. As the f-number increases, the weighting narrows, until in the limit 100% of the light follows a single line from scene to lens to pixel. At this theoretical f/infinity you'd see all of the lens's imperfections in perfect detail, since the camera is essentially a pinhole.
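A very rough way to see the scaling, as a back-of-the-envelope sketch: the focal length and speck size below are made up, and the speck is treated as if it sat right at the aperture plane, so the darkening at a pixel is just the fraction of the aperture area the speck blocks.

    # Hypothetical numbers; treats the speck as sitting at the aperture plane.
    focal_length_mm = 50.0
    dust_diameter_mm = 0.5                      # made-up speck on the glass

    for f_number in (2.8, 5.6, 11, 22):
        aperture_diameter_mm = focal_length_mm / f_number
        # Fraction of the aperture area blocked by the speck.
        blocked = (dust_diameter_mm / aperture_diameter_mm) ** 2
        print(f"f/{f_number}: aperture {aperture_diameter_mm:.1f} mm, "
              f"~{blocked * 100:.2f}% of the light reaching a pixel is blocked")

Going from f/2.8 to f/22 makes the blocked fraction roughly 60 times larger, which is why the same speck goes from invisible to an obvious blotch.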


You'd run into the diffraction limit for any reasonable pixel size long before the pinhole diameter gets smaller than the wavelength of visible light. The rest of your comment I fully agree with.


Also fairly rare to be shooting at higher than f/11 anyway, isn't it?


In normal situations, yes. On occasions where you want to maximise DoF and focus stacking isn't an option (say, macro shots of moving insects), it's a useful approach.

The DoF when shooting macro can be fractions of a millimetre.


My understanding is that the diffraction limit correlates with "sensor" size. f/11 is about the limit for the APS-C sensors common in DSLRs. f/16 is usable for 35mm film and full frame. With large format film cameras, f/64 is usable, e.g. Group f/64. https://en.wikipedia.org/wiki/Group_f/64

There is a diffraction limit calculator at the bottom of this page, https://www.cambridgeincolour.com/tutorials/diffraction-phot...
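For a rough feel of why the limit tracks "sensor" size, here's a small sketch using the standard Airy-disk diameter (about 2.44 x wavelength x f-number) and the common "diagonal / 1500" circle-of-confusion rule of thumb; the divisor and the diagonals are illustrative values, not authoritative figures.

    # Rule-of-thumb estimate: diffraction starts to matter once the Airy disk
    # outgrows the circle of confusion you can tolerate for that format.
    WAVELENGTH_MM = 0.00055                    # ~550 nm green light

    sensor_diagonals_mm = {
        "APS-C": 28.4,
        "35mm film / full frame": 43.3,
        "4x5 large format": 162.6,
    }

    for name, diagonal in sensor_diagonals_mm.items():
        coc = diagonal / 1500                  # tolerable circle of confusion, mm
        # Solve 2.44 * wavelength * N = coc for the f-number N.
        n_limit = coc / (2.44 * WAVELENGTH_MM)
        print(f"{name}: diffraction becomes visible around f/{n_limit:.0f}")

This lands in the same ballpark as the figures above (roughly f/14, f/22 and f/81), which is about as close as a rule of thumb gets.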


A good companion to this article is the one by Lensrentals where they actually show what happens when the lens is stopped down, particularly the effect on flare.

https://www.lensrentals.com/blog/2008/10/front-element-scrat...


This sounds extremely specific. Do you know where I can find a formula describing this weighting mechanism or modeling light and camera and lenses in general? Thanks!


This is true for lower apertures, but at higher f-numbers the spots become much more noticeable, especially at high magnification. This is especially annoying for telescopes (it tends to look like this https://farm9.staticflickr.com/8679/16584213212_5b33222d6e_c...), which is why astrophotographers use a trick called flat calibration to remove artifacts from dust or smudges in the optical train. The trick is surprisingly simple: take a picture of an evenly illuminated surface (usually a flat box or a clear sky) and make sure your ADU values stay in the linear range of the sensor (i.e. don't clip the highlights or the darks). Then divide your original image by the flat (corrected = image * mean(flat) / flat). This is of course easier with a monochrome sensor and works best with RAW data, but it can definitely be done with a color CMOS as well. In case you ever find yourself with hard-to-correct dust on the sensor, this trick might save an otherwise poor image.
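A minimal sketch of that division step, assuming the light frame and the flat frame are already loaded as NumPy float arrays (for example from RAW or FITS files); the array names and the toy data are made up for illustration.

    import numpy as np

    def apply_flat(light, flat):
        """Divide the image by the mean-normalised flat to cancel dust shadows and vignetting."""
        light = light.astype(np.float64)
        flat = flat.astype(np.float64)
        # Normalising by the flat's mean keeps the overall brightness unchanged.
        return light * flat.mean() / np.clip(flat, 1e-6, None)

    # Toy demonstration: a uniform scene with a fake 30%-dark dust shadow.
    rng = np.random.default_rng(0)
    dust = np.ones((100, 100))
    dust[40:60, 40:60] = 0.7
    light_frame = 1000.0 * dust + rng.normal(0, 5, (100, 100))
    flat_frame = 20000.0 * dust + rng.normal(0, 50, (100, 100))

    corrected = apply_flat(light_frame, flat_frame)
    print(corrected[50, 50] / corrected[10, 10])   # ~1.0: the shadow is gone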


Huh.

Optics are modeled by differential equations. They describe a massively parallel phenomenon: light waves passing through and being modified by layers of media that, essentially, perform calculations on them. Warps and scales and so on. Clearly these particular differential equations are highly robust to architectural changes.

There's a parallel there to something else I've read about on Hacker News... what are those things called? Oh yeah, deep networks.

Are there any deep nets out there that attempt to mimic optics? Or is anyone applying ideas from robust differential equations to deep nets?


The words you are looking for are "Fourier transform".

Effectively, the lens computes a Fourier transform for you [1].

One can probably compute an FT with a NN, but we already have the FFT :)

[1] http://web.mit.edu/2.710/Fall06/2.710-wk10-a-sl.pdf
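A quick numerical way to see the claim: in the Fraunhofer/2-f approximation, the intensity in the lens's focal plane is the squared magnitude of the 2-D Fourier transform of the field at the aperture, which you can mimic with an FFT. The grid size and aperture radius below are arbitrary illustration values.

    import numpy as np

    n = 512
    x = np.linspace(-1, 1, n)
    xx, yy = np.meshgrid(x, x)
    aperture = (xx**2 + yy**2 <= 0.05**2).astype(float)    # small circular aperture

    focal_plane = np.fft.fftshift(np.fft.fft2(aperture))
    intensity = np.abs(focal_plane) ** 2

    # The result is an Airy-like pattern (bright central disc plus rings),
    # i.e. what a point source focused by an ideal lens looks like.
    print(intensity.shape, intensity.max())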


There's the ACDC paper by Moczulski et al. [1]. They describe a structured layer that can approximate any linear transform and discuss an optical implementation of it (see section 1.1, "lightning fast deep SELL" :).

> Our deep SELL offers several possibilities for analog physical implementation. Given the great demand for fast low energy neural networks, the possibility of harnessing physical phenomena to perform efficient computation in deep networks is worthy of consideration... (see paper)

[1] https://arxiv.org/pdf/1511.05946.pdf


This article[0] appeared on HN the other day. I think it describes Richard Feynman using exactly the approach you're talking about with an early distributed/parallel architecture.

See the eighth paragraph of the "An Algorithm For Logarithms" section.

[0]: http://longnow.org/essays/richard-feynman-and-connection-mac...


>Clearly these particular differential equations are highly robust to architectural changes.

Not really. There's an incredible amount of redundancy in the information received by the lens, because the exposure time needs to be short. The person writing the article purposefully didn't take pictures at higher apertures and different focus distances. Optics aren't modeled by differential equations; if you want to model optics, you need to choose a model that depends on your application. Modelling lens optics with quantum mechanics alone would be a massive pain in the ass, but completely ignoring it would also give you horribly wrong ideas. Also, I don't really get what you mean by "mimicking optics". Mimic which functions exactly?


Not directly that, but there are companies working on optical computing for machine learning:

https://www.optalysys.com

http://www.lighton.io


I took some photos of Mt Fuji from a distance with a 28-70, at probably something like 60mm f/8. To my horror, when I got home and fired up Lightroom there were several very visible dark blotches against the bright clouds around the summit.

I learned my lesson and will often use an air spray squishy thing to get rid of dust from the lens.


This was probably dust on the sensor, or the back element of the lens at least.


I had dust inside a push-pull zoom and never noticed anything. I think the main issue with dirt is that if sunlight hits the dust on the lens it can cause some additional flaring.

I don't use lens caps either, but I put a clear filter on the lens. Resale value is hurt by scratches...


It is really, really hard to scratch a [modern] lens. Tony Northrup demonstrated this by repeatedly stabbing a lens with a knife, and it never left a scratch.

UV filters, though, degrade image quality. Why buy nice glass just to put a "protective" filter on it that gets in the way of the image?!

The real concern is decentering the lens by dropping it, but no filter will protect it from that.


> The real concern is decentering the lens by dropping it, but no filter will protect it from that.

I've been saved by an ND0 filter when I bashed the lens against a rock. It destroyed the filter's screw thread, but the lens wasn't damaged.

I also once, while whale watching, got 'steam blasted' by a humpback whale that exhaled while scratching its back on the boat's keel. I was able to quickly unscrew the steamed-up filter and carry on photographing.

Edit - This is also why I religiously use lens hoods - if nothing else, they form a sacrificial protector.


I think filters are useful as dust / junk collectors, especially modern ones with repellent coatings. The less you clean your real lens, the better. The front glass is tough, but the lens isn't airtight, so when you clean or dust it there is a real chance of introducing dust or water into the rest of the lens.

Clear filters can’t hurt IQ other than introducing flare or ghosts in backlighting. It’s possible to crack or scratch your front element with minimal to no effect on your picture; a clear piece of glass isn’t going to do anything.


>Clear filters can’t hurt IQ other than introducing flare or ghosts in backlighting.

Or causing fringing, or reducing sharpness, or reducing contrast...

The absolute _best_ that a filter can do for IQ is nothing. But that's basically impossible: every air/glass interface causes reflections, and no piece of glass is optically perfect. So it's going to hurt your IQ; the question is merely how much, and whether it's worthwhile.
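For a sense of scale, here's a back-of-the-envelope Fresnel estimate for an uncoated filter at normal incidence; the refractive index is a typical value rather than data for any particular filter, and real multi-coated filters do far better (though never exactly zero).

    # R = ((n1 - n2) / (n1 + n2))^2 per air/glass surface, normal incidence.
    n_air, n_glass = 1.0, 1.52

    r_per_surface = ((n_air - n_glass) / (n_air + n_glass)) ** 2
    transmitted = (1 - r_per_surface) ** 2      # through both surfaces of the filter

    print(f"~{r_per_surface * 100:.1f}% reflected per uncoated surface")
    print(f"~{(1 - transmitted) * 100:.1f}% total loss through an uncoated filter")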

I used to use filters. Then I decided I was probably never going to get much of anything reselling my lenses, so I might as well just use them instead of trying to baby them. Haven't had a problem with that decision, but YMMV.


Not a knife but a crab claw, which did result in some visible (on the glass) scratches: https://youtu.be/YcZkCnPs45s?t=581


The only time I put a filter on is when I shoot on the beach. Dried up saltwater spray is not good. Otherwise, I just use the lens cap to protect the lens. It's not safe to leave the cap off anyway even if you do have a filter: you could point your camera at the sun and damage the sensor (or shutter if it's a film camera).


On the other hand, a finger smudge on your cell phone camera lens will destroy contrast and the resulting picture, especially if there is any degree of backlight in the image.

Make sure you wipe the lens with your t-shirt every once in a while.


Nitpicking here. The element you can touch is not the lens. It is a protective cover.


Nitpicking more: the outer clear protective layer still passes the image, and is therefore also a lens, despite not playing a large part in focusing.


I don't have any intuition as to why this would be a different situation. Anyone got an explanation?


It is much smaller.


I live in the tropics, and have fungi growing inside my lenses. They leave visible spots all the time, but Photoshop takes care of that.


Oof. Use a drybox, dude.


What I find fascinating about articles like this is that they demonstrate the high quality of common objects (well, maybe not that common for those who do not like to take photos with DSLRs, but anyway).

For some reason, I have almost been 'conditioned' to adopt an 'everything we manufacture is really crappy' philosophy (maybe this comes from working too much with software), so it is really nice when you learn about these things.

Also, as an owner of a DSLR, this is news to me, and it makes me less anxious about not having the best cleaning equipment with me all the time.


I wonder how oily/fatty smudges behave, because on eyeglasses they are either very noticeable or, if you try to wipe them off but only end up smearing them all over the place, not really noticeable but extremely tiring (probably from the subtle degradation and from light coming in from the side being dispersed across what you see).


Dust on the sensor shows up a lot more than lens scratches.


The closer they are to the focal plane, the more they show in the final image. Hence a scratched rear element is going to affect image quality.



