
I don't know this field of research, hence my question: why is such a high framerate considered a feature at all? Does it help with the learning rate?



If you have a simulation where realtime is 60 fps and you could run it at 1M fps, you could simulate a little over 4.5 hours of sim time per wall-clock second. That would definitely speed up training.
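To make the arithmetic concrete, here is a minimal Python sketch using the numbers from the comment (1M fps simulator, 60 fps realtime); the variable names are just illustrative:

    sim_fps = 1_000_000   # steps the simulator can execute per wall-clock second
    realtime_fps = 60     # steps that make up one second of simulated time
    speedup = sim_fps / realtime_fps           # ~16,667x realtime
    hours_per_second = speedup / 3600          # ~4.63 simulated hours per second
    print(f"{speedup:,.0f}x realtime = {hours_per_second:.2f} sim-hours per wall-clock second")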


Feels like less ≠ more. If you want to fill in these gaps, you might as well interpolate.


He's not saying "break realtime into microsecond chunks."

He's saying: run through 4.5 hours' worth of 16-millisecond chunks of time in a second. This is good for regression testing or for producing training data quickly.
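A minimal sketch of that idea: step a fixed-timestep simulation as fast as the hardware allows while recording each state. The `step` function, `DT`, and `TARGET_SIM_HOURS` here are hypothetical stand-ins, not any real simulator's API:

    import time

    DT = 1 / 60                     # one realtime frame: ~16.7 ms of simulated time
    TARGET_SIM_HOURS = 4.5

    def step(state, dt):
        # Hypothetical stand-in for the actual simulator update.
        return state

    steps = int(TARGET_SIM_HOURS * 3600 / DT)   # ~972,000 fixed-size steps
    state, trajectory = None, []
    start = time.perf_counter()
    for _ in range(steps):
        state = step(state, DT)
        trajectory.append(state)                # record for training data / regression tests
    wall = time.perf_counter() - start
    print(f"simulated {TARGET_SIM_HOURS}h of 16 ms chunks in {wall:.2f}s wall-clock")

The wall-clock time depends entirely on how fast `step` runs; the point is that the simulated clock advances by a fixed 16 ms per step regardless.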



