Raytracing is not exactly a lightweight calculation. My first raytracer was TurboSilver 3D on the Amiga, in 1990 (actually one of the first commercial raytracers ever produced). "Photorealistic" images at 7.5 MHz. For an image like this, you'd set up the scene, hit the render button, and grab a quick lunch. When you came back, the scene would be about two-thirds rendered, and you'd watch it for a while, thrilled by every new pixel that pushed itself onto the screen. Then you'd go get coffee and hope the scene was done when you got back.
Now, the same scene (say, 320x200px) renders in an eyeblink on my phone, driven by a high-level universal scripting language I can tweak at will. This is beyond amazing. It's fucking transcendent.
(Oy vey, I feel old. Where'd I put my dentures? BTW: get off my lawn, etc.)