
I try not to be this guy, but actually your terminology is a little off.

Film cameras have grain. Literally: there are tiny physical grains of photo-sensitive material on the film. Those grains, distributed semi-randomly across the medium, determine the limit of the film's resolution and give it part of its organic, non-digital feel.

Digital cameras have noise. That's the term for randomness injected into a signal by physical effects: Brownian motion, heat, quantum effects on photons, etc. Noise means that while each pixel on the sensor is, in principle, quantizing a measurement of the real world, that measurement carries some random error that can never be fully eliminated.

A digital image recorded from a digital sensor has no grain. There was no film, and no grains of photosensitive media involved. (It might have simulated grain from some "film effect" look applied to the image.)

The distinction matters because they look different. It's one of the key reasons film looks like film and digital looks digital.
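A minimal sketch of that visual difference, assuming only numpy (the patch size, noise levels, and grain parameters are made up for illustration, not taken from any camera or film model): sensor noise is independent per pixel, while grain is spatially clumped because each grain has physical size.

    # Sketch: per-pixel sensor noise vs. clumped, grain-like error (assumed parameters).
    import numpy as np

    rng = np.random.default_rng(0)
    image = np.full((256, 256), 0.5)          # flat mid-grey test patch

    # Digital sensor noise: independent per-pixel error, roughly Gaussian,
    # scaled with the signal (shot noise) plus a small read-noise floor.
    shot = rng.normal(0.0, 0.02 * np.sqrt(image))
    read = rng.normal(0.0, 0.005, image.shape)
    noisy_digital = np.clip(image + shot + read, 0.0, 1.0)

    # Simulated film grain: discrete "grains" with finite size, so the error
    # is spatially correlated. Here random grains are blurred with a 3x3 box
    # filter as a crude stand-in for grain size; real grain models are more
    # elaborate.
    grains = (rng.random(image.shape) < 0.05).astype(float)
    blurred = sum(np.roll(np.roll(grains, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    grainy_film = np.clip(image + 0.15 * (blurred - blurred.mean()), 0.0, 1.0)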




I thought it was clear I was using "grain" as a blanket term for the effect a tool has on its output, in the same way an analogue camera's film stocks have their own literal grain.

In the sense that the chain of iPhone sensor > internal AI processing > Apple Camera app post-processing choices imbues what you shoot with an iPhone Camera "grain".


Sorry, yes, I didn't mean to criticize, just to clarify.



