Sure. One way is to gather splats under various real-world light conditions, then map those to the closest simulated light condition. (I.e., make the data animate over the time of day.)

The data requirements might become massive, but there are ways to do the interpolation where it isn't so bad. If a static scene is 2 GB, eight keyframe captures spaced across the day come to 16 GB, and interpolating between adjacent keyframes fills in the hours between. So a rough time-of-day approximation fits in under 16 GB, which is renderable on modern GPUs.
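
A minimal sketch of that interpolation, assuming each capture is stored as a numpy array of per-Gaussian appearance coefficients with matched Gaussian ordering across captures (all file names and shapes here are hypothetical):

    import numpy as np

    # Hypothetical layout: one capture per time of day, with the Gaussians in
    # corresponding order across captures so their appearance attributes can
    # be blended directly. Eight keyframes at 2 GB each is the 16 GB above.
    keyframe_hours = np.array([0.0, 3.0, 6.0, 9.0, 12.0, 15.0, 18.0, 21.0])
    keyframe_colors = [np.load(f"splats_{int(h):02d}h_colors.npy")
                       for h in keyframe_hours]  # each (num_gaussians, coeff_dim)

    def colors_at(hour):
        """Blend the two captures bracketing the requested hour (wraps at midnight)."""
        hour = hour % 24.0
        i = int(np.searchsorted(keyframe_hours, hour, side="right")) - 1
        j = (i + 1) % len(keyframe_hours)
        span = (keyframe_hours[j] - keyframe_hours[i]) % 24.0
        t = ((hour - keyframe_hours[i]) % 24.0) / span
        return (1.0 - t) * keyframe_colors[i] + t * keyframe_colors[j]

Geometry, opacity, and the rest of the splat parameters would need the same treatment; linear blending is the crudest option, but it keeps memory at the keyframe total.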

Then it’s “just” a matter of spending several years optimizing it while waiting for H100s to become consumer-grade devices.




It’s not actually that difficult. By differentiating through the spherical-harmonic coefficients of each point/Gaussian, we can approximate materials and their response to lighting.
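
For context, each Gaussian's view-dependent color is encoded as spherical-harmonic coefficients, which training adjusts by gradient descent through the rasterizer. A minimal sketch of evaluating that color, using only the degree-0 and degree-1 bands and the sign/offset conventions common in splat renderers (the coefficient layout here is an assumption; real splat files carry higher bands):

    import numpy as np

    SH_C0 = 0.28209479177387814  # degree-0 SH basis constant
    SH_C1 = 0.4886025119029199   # degree-1 SH basis constant

    def sh_to_rgb(sh, view_dir):
        """Evaluate one Gaussian's view-dependent color from its SH coefficients.

        sh: (4, 3) array of RGB coefficients, one row per SH basis function.
        view_dir: unit vector from the Gaussian toward the camera.
        """
        x, y, z = view_dir
        rgb = (SH_C0 * sh[0]
               - SH_C1 * y * sh[1]
               + SH_C1 * z * sh[2]
               - SH_C1 * x * sh[3])
        return np.clip(rgb + 0.5, 0.0, 1.0)  # +0.5 offset, as in common splat renderers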


Sure, if you want to reinvent N dot L. (In other words, yes, you can do that, but then the result will look just as fake as every other “photorealistic” scheme.)
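
(“N dot L” being the Lambertian diffuse term, the cosine falloff at the heart of most analytic lighting models. A minimal sketch:)

    import numpy as np

    def lambert(normal, light_dir, albedo, light_color):
        """Classic N dot L diffuse shading (Lambert's cosine law)."""
        n = normal / np.linalg.norm(normal)
        l = light_dir / np.linalg.norm(light_dir)
        return albedo * light_color * max(float(np.dot(n, l)), 0.0)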

The only hope is to measure actual photons hitting actual sensors, which is why Gaussian splatting looks so real to begin with.


> gather splats under various real-world light conditions, then map those to the closest simulated light condition

Is that a better alternative, from your perspective?


Oh yes. The key to making realistic-looking video is to sample from the real world. The more closely you do that, the more realistic it looks. The limit case is a phone camera recording a video.



