This is very neat, but I’m a bit surprised by their headline examples:
> Materials admitting diffractive optical phenomena are visible: (a) a Bornite ore with a layer of copper oxide causing interference; (b) a Brazilian Rainbow Boa, whose scales are biological diffraction grated surfaces; and (c) a Chrysomelidae beetle, whose colour arises due to naturally-occurring multilayered interference reflectors in its elytron.
These are all local surface properties, in the sense that light hits a surface, does something wave-like, and interferes with itself on the way out. I would expect a shader could do a credible job of emulating all of these.
I would be more excited to see examples of interference between adjacent rays, like diffraction from the edge of an object. For example, if you close your eyes partway and look through your eyelashes on a bright sunny day, you see blurring and colors due to diffraction.
edit: My inner pedant feels obligated to clarify: by “interferes with itself” I mean interferes with slightly displaced versions of itself that are propagating in the same direction. If a plane wave is incident on an oil slick or a beetle, multiple reflections of the plane wave interfere with each other, but the result is, at least over distance scales comparable to the feature size of the surface (many wavelengths), a plane wave. And you can treat the reflected wave as a bundle of parallel (or not-quite-parallel) rays just like you treat the incident wave.
When edges are involved, the surface varies abruptly on a scale that is small compared to the wavelength, and the resulting wave is no longer a plane wave that propagates in roughly one direction.
> I would expect a shader could do a credible job of emulating all of these.
The key is that a shader needs to know the coherence of the light arriving at the surface in order to model reflection accurately. (See Figure 13 in the paper for an example of how this matters.) Otherwise the shader has to guess at the light's coherence.
The main contribution of the paper is a much more efficient approach than was previously known for finding the light's coherence, even in the presence of complex light transport (multiple reflections, etc.), and one that further allows the application of traditional ray-optics computer graphics techniques for sampling light paths through the scene. (For example, previously if one wanted to track coherence, it was necessary to sample light paths starting from the light sources rather than from the camera, as is done in path tracing.)
I think this kind of effect is not supported by their algorithm. See the limitations section on page 10:
> The second assumption above is worth further discussion. When a generalized ray is partially occluded by matter, or falls upon the boundary between different matter (i.e., objects with different materials), interference between the different interactions arises. For example, on partial occlusion, interference between the matter interaction (of the partially occluded energy) and free-space propagation (of the unoccluded energy) results in free-space diffraction: diffracted lobes that “bend” around the matter. We ignore these edge effects in this work, and leave these for future work.
The interaction of light with multiple thin layers of dielectric material is first-year physics: you can compute the reflections at each interface and the phase shifts through the layers as a function of wavelength and angle, and reassemble the result into a single reflection coefficient. That coefficient depends only on the angle of incidence and the wavelength, and the angle of reflection equals the angle of incidence, just as for a specular reflector.
So a shader should be able to handle this, in a physics-based way, as long as the path tracer can handle the wavelength dependence in a reasonable manner.
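For a single layer, that computation fits in a few lines. Here's a minimal sketch (my own, not from the paper) for s-polarized light, using the closed-form Airy summation of the multiple reflections; the indices and film thickness are made-up illustrative values, and extending it to many layers is the usual transfer-matrix exercise:

```cpp
// Single thin-film reflectance: Fresnel reflections at each interface plus
// the round-trip phase through the layer, summed in closed form (Airy).
// Illustrative sketch only; ignores absorption and total internal reflection.
#include <cmath>
#include <complex>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// Fresnel amplitude reflection coefficient at an interface, s-polarization.
double fresnel_rs(double ni, double cos_i, double nj, double cos_j) {
    return (ni * cos_i - nj * cos_j) / (ni * cos_i + nj * cos_j);
}

// Reflectance of a film (index n2, thickness d in nm) between media n1 and
// n3, for vacuum wavelength lambda (nm) and incidence angle theta1 (radians).
double thin_film_reflectance(double n1, double n2, double n3,
                             double d, double lambda, double theta1) {
    // Snell's law gives the propagation angle inside each medium.
    double s = n1 * std::sin(theta1);
    double cos1 = std::cos(theta1);
    double cos2 = std::sqrt(1.0 - (s / n2) * (s / n2));
    double cos3 = std::sqrt(1.0 - (s / n3) * (s / n3));

    double r12 = fresnel_rs(n1, cos1, n2, cos2);
    double r23 = fresnel_rs(n2, cos2, n3, cos3);

    // Phase picked up in one round trip through the film.
    double beta = 2.0 * kPi * n2 * d * cos2 / lambda;
    std::complex<double> phase = std::exp(std::complex<double>(0.0, 2.0 * beta));

    // Airy formula: the geometric series of multiple reflections in closed form.
    std::complex<double> r = (r12 + r23 * phase) / (1.0 + r12 * r23 * phase);
    return std::norm(r);  // reflected power fraction |r|^2
}

int main() {
    // Sweep the visible spectrum for a ~300 nm film at normal incidence;
    // the wavelength dependence is exactly what produces the iridescence.
    for (double lambda = 400.0; lambda <= 700.0; lambda += 50.0)
        std::printf("%.0f nm -> R = %.3f\n", lambda,
                    thin_film_reflectance(1.0, 2.0, 1.5, 300.0, lambda, 0.0));
}
```

Since the result depends only on wavelength and angle, it drops straight into the specular lobe of a shader.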
> Perhaps in the middle of the object. Near the edges it probably breaks down.
How so? The results from Maxwell’s equations are valid at any angle of incidence. And, as noted, even the algorithm in this paper can’t model the edges of an object.
> Anyway, if someone asks you to make a physically correct rendering of a scene, would you reach for a shader?
Yes, absolutely.
To be clear, I’m not talking about a fragment shader in a traditional rasterizer. I’m talking about a shader that executes when a ray hits an object. As I recall, POV-Ray was fixed-function, albeit a rather flexible fixed function. Modern ray tracers, path tracers, etc. have shaders.
Yes, it's possible to do physically based diffraction-grating effects in the traditional pipeline. Here's a tutorial on it: https://www.alanzucconi.com/2017/07/15/the-nature-of-light/ (I implemented this effect recently; the article is great overall, but you can get rid of the loop with some simple algebra, sketched below.)
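Here's a sketch of what I mean (my own reading of the trick, not the tutorial's code): instead of trying a fixed number of diffraction orders, invert the grating equation to find the few orders whose reinforced wavelength actually lands in the visible band. The spacing and geometry values are illustrative:

```cpp
// Grating equation: d * (sin(thetaL) - sin(thetaV)) = m * lambda, so order m
// reinforces lambda = u * d / m, where u = |sin(thetaL) - sin(thetaV)|.
// Only orders with 380 <= u*d/m <= 780 contribute visible light, and that
// range of m can be computed directly rather than discovered by iteration.
#include <algorithm>
#include <cmath>
#include <cstdio>

int main() {
    const double d = 1600.0;  // grating spacing in nm (roughly a CD's track pitch)
    const double u = 0.9;     // |sin(thetaL) - sin(thetaV)| for some geometry

    // lambda = u*d/m lies in [380, 780] nm  <=>  m in [u*d/780, u*d/380].
    int mMin = std::max(1, (int)std::ceil(u * d / 780.0));
    int mMax = (int)std::floor(u * d / 380.0);

    for (int m = mMin; m <= mMax; ++m)
        std::printf("order %d reinforces lambda = %.0f nm\n", m, u * d / m);
}
```

You still walk the handful of contributing orders, but you skip all the ones that can't produce visible light, which is usually most of them.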
However, as pointed out in sibling comments, this makes assumptions about the coherence of the light coming in.