
The movie studio uses a whole compute cluster to run the physics simulation when rendering the film. Your computer has one CPU and one GPU.

Your computer is simply very limited in how much attention to detail it can afford, because it has far less processing power and has to get its job done in real time.

By contrast, Disney is using a server farm that can spend 3 seconds analyzing the mechanics of a tuft of hair wiggling in the breeze for every 1 second of footage.




Additionally, multiplayer games often have to run a low-geometry simulation server-side to deter cheating and to handle prediction/latency-sync issues.

While certainly not the direct cause of poor water physics in computer games, I would guess this has played a big role in holding back advancements in the area: purely cosmetic, locally-simulated physics that doesn't influence game mechanics can end up conflicting with the physics that does have to be duplicated on the server.
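To make that split concrete, here's a minimal sketch. Everything here is hypothetical (PlayerState, authoritative_step, and cosmetic_ripple_height are made up, not any engine's real API): the server runs a cheap authoritative step that the client mirrors for prediction, while purely cosmetic physics lives only on the client.

    #include <cmath>
    #include <cstdio>

    // Hypothetical state for one player; the server only tracks what affects gameplay.
    struct PlayerState { double x = 0.0, vx = 0.0; };

    // Authoritative step: run on the server for every player, and mirrored on the
    // client for prediction. It has to stay cheap, so no detailed geometry here.
    void authoritative_step(PlayerState& p, double input, double dt) {
        p.vx = input * 5.0;  // simple kinematics
        p.x += p.vx * dt;
    }

    // Client-only eye candy: ripple height near the player. Because it never
    // influences gameplay, the server never simulates or verifies it.
    double cosmetic_ripple_height(double x, double t) {
        return 0.05 * std::sin(8.0 * x - 3.0 * t);
    }

    int main() {
        PlayerState predicted;  // the client's locally predicted copy
        for (int tick = 0; tick < 3; ++tick) {
            double t = tick / 60.0;
            authoritative_step(predicted, /*input=*/1.0, 1.0 / 60.0);
            std::printf("x=%.3f ripple=%.3f\n",
                        predicted.x, cosmetic_ripple_height(predicted.x, t));
        }
    }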


To put this in perspective, 60fps == ~16.7ms per frame.

That's 16.7ms to run your WHOLE game engine; rendering only gets whatever fraction of that is left over.
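For illustration only (update_simulation and render are hypothetical stand-ins, not any real engine's API), a bare-bones fixed-budget frame loop looks something like this:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    void update_simulation(double /*dt*/) { /* physics, AI, game logic... */ }
    void render() { /* draw the frame */ }

    int main() {
        using clock = std::chrono::steady_clock;
        const auto frame_budget = std::chrono::microseconds(16667);  // ~16.7ms at 60fps

        for (int frame = 0; frame < 600; ++frame) {  // ~10 seconds of play
            const auto start = clock::now();

            update_simulation(1.0 / 60.0);  // the simulation takes part of the budget...
            render();                       // ...and rendering gets whatever is left

            const auto elapsed = clock::now() - start;
            if (elapsed < frame_budget) {
                std::this_thread::sleep_for(frame_budget - elapsed);  // wait out the frame
            } else {
                std::puts("over budget: this frame will be visibly late");
            }
        }
    }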


The other thing to remember is that Disney is completely at liberty to reject and re-render scenes when something doesn't look perfect. Arguably, this even applies to the examples chosen for the demo video.


Has anyone tried to make a real game in the same way?

I.e., a game hosted on a remote desktop with direct access to a GPU server farm over a very fast network.


This was first attempted by G-Cluster in 2000, and Crytek tried it as well. It was popularized by OnLive, who started working on it in 2003 and blew up in 2010.

The general term for it is cloud gaming: https://en.wikipedia.org/wiki/Cloud_gaming


The answer is no. People are bringing up OnLive and Nvidia, but those services just run the same game you'd run locally, on remote hardware.

What you're asking is basically whether anyone has run a game on a supercomputer/cluster so that the graphics are far better than what could be done locally. The answer is no, nobody has. It would be pretty neat, though. It would also be super expensive to make, with no way to recoup the cost.


It certainly could happen - http://www.nvidia.com/object/cloud-gaming.html - but how much will someone pay per hour to play one game? It feels like a time warp to the pre-AOL days, when online games cost more per hour than minimum wage.


Here is an example that isn't quite what you are looking for but is somewhat related. It's a game played by rendering high-quality screenshots of the game world on a server. I enjoyed it a lot.

https://extrasolar.com/




Not a field I work in, but I'd guess it would never work. You'd have network latency to contend with, or people would need to self-host a server farm, which probably makes your market pretty small :-)


I wouldn't say never. The golden rule of computer games code is that you get to cheat a lot.
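For example, most in-game "water" isn't simulated fluid at all: it's a flat mesh displaced by a handful of sine waves, which looks plausible for a few multiplies per vertex. A minimal sketch of that cheat (the wave constants here are arbitrary, not from any shipping game):

    #include <cmath>
    #include <cstdio>

    // The classic cheat: fake water as a sum of sine waves instead of running
    // a real fluid simulation. Cost: a few multiplies per vertex, per frame.
    double water_height(double x, double z, double t) {
        return 0.30 * std::sin(0.9 * x + 1.1 * t)
             + 0.15 * std::sin(1.7 * z - 0.8 * t)
             + 0.07 * std::sin(2.3 * (x + z) + 2.0 * t);
    }

    int main() {
        // Sample one point of the "ocean" over a few frames.
        for (int frame = 0; frame < 3; ++frame) {
            double t = frame / 60.0;
            std::printf("h(1,2) at t=%.3f: %.4f\n", t, water_height(1.0, 2.0, t));
        }
    }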




