WebGL Water Simulator (madebyevan.com)
259 points by Exuma on Jan 10, 2015 | hide | past | favorite | 72 comments



Fun trick: pause the simulation with the space bar. Then drag your mouse across the surface of the water; you'll notice it still creates ripples, even in the paused state.

Drag the mouse rapidly back and forth in a very tiny area so that it builds up a layered 'ripple' that grows and grows. If you spend about two minutes doing this, you can make the ripple go something like 10 feet high, completely off the screen.

Then unpause the simulation for a massive tsunami.


Rather than dragging, just click in the same spot hundreds of times to make a huge tower of water, then unpause to see a single ring of concentric waves and perfect wave reflection/interaction effects.


Or paste this into the console:

     water.addDrop(x, y, radius, strength)
(0, 0) is the center and the bounds are -1 to 1. A radius of 0.1 is a reasonable spike; a radius of 1 is a huge swell. A strength of 1 is really big; 10 is ridiculous. Try negative numbers too!

Try:

     water.addDrop(0, 0, 0.1, 5)


water.addDrop(5, 9, -0.03, 1) is cool, especially as it starts to settle


HOLY CRAP.... Dude


If you click a lot, the level of water gets higher. Each click adds some water to the tank.

edit: The white sphere doesn't use the real water level to decide where it should float (when gravity is activated)


Or automate the process:

    for i in $(seq 1 100); do cliclick c:.; done


But if you put the ball half-way through the surface, it seems the water waves don't bounce off it. Or is it just me?


If you do it in the middle, you get some really nice patterns


This was really really fun! :D


I got a very nice heart shape by dragging: http://i.imgur.com/0675p4W.png


Doing this in a browser is a neat trick, but the actual state of the art in realtime GPU fluids is slightly more impressive. Nvidia's FleX middleware is a good example:

https://www.youtube.com/watch?v=1o0Nuq71gI4


<= NOT a graphics programmer, in this era anyway. Back in my graphics days getting Phong shading working on a 286 was amazing to me.

So, does that FleX system BASICALLY imply that in another few decades the individual granularity of those particles will become so small, and the particles so numerous, that we'll basically be modeling liquids and solids at the molecular level? It seems to me that it's only a lateral step from there to mimicking the fluid-like effects of explosive forces being applied to solids.


> ... that in another few decades the individual granularity of those particles will become so small and so numerous that we'll basically be modeling liquids and solids at the molecular level?

Also not a graphics programmer, but I am a scientist who has modeled liquids and solids at the molecular level. A lot of this is doable today, but I doubt it'll ever be applied directly to a macroscopic graphics engine. There are quite simply too many atoms: there's 1 mole (6e23 atoms) in 18 g of water, while today's best chips have several billion transistors (5e9). Even at one transistor per molecule, the numbers just don't add up.
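A back-of-the-envelope sketch of that mismatch, using the rough figures quoted above:

```javascript
// Atoms in 18 g of water vs. transistors on a modern chip,
// using the rough figures from the comment above.
const avogadro = 6.022e23;        // atoms per mole
const atomsIn18g = avogadro * 1;  // 18 g of water is about 1 mole
const transistors = 5e9;          // "several billion" per chip

// Even at one transistor per molecule, the chip count is absurd.
const chipsNeeded = atomsIn18g / transistors;
console.log(chipsNeeded.toExponential(1)); // "1.2e+14" -- roughly 120 trillion chips
```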

There are lots of multi-scale modelling techniques though, where you directly model the system on the atomic scale in just a few small volumes, e.g. at the tip of a crack just before it propagates. It's not impossible that sort of thing will make an appearance, though I'd still be surprised. Graphics programming is all smoke and mirrors; if it's easier to approximate something using an unphysical process that looks right, people will favour that over an exact simulation that takes far more computing power.


Yeah. You bring up an excellent point. Smoke and mirrors. It won't be molecular...not with semiconductors anyway. Maybe when quantum holographic systems are operating our transporters it will be possible :D.

UNTIL THEN, all it has to do is get granular enough to fill the resolution of the current generation of monitor technology. Just as ray tracing really only requires enough granularity to appear seamless at the current resolution.


Number of transistors is not really relevant here, since it still doesn't tell us anything about how many atoms we could simulate per second. If anything, we should be rather looking at FLOPS that can be generated by a given chip.


My team made this back in 1996, on a Pentium Pro 180. I had to dump the frames off every 4 hours for three days straight. How things have changed.

https://www.youtube.com/watch?v=kU8pfwlPF4I


That video is breathtaking. Like probably most people, I goofed off with N-body systems, and the size and performance of the systems shown there are light-years ahead in comparison.


I love videos like this, do you know of any more good ones? I could watch this stuff all day.


Technical demos, or fluid-dynamics eye candy? Check out what the heavy-duty non-realtime systems can do if you want the latter.

https://www.youtube.com/watch?v=WD_K0Koi1MU


Great stuff, but man, what a reality check. The most cutting-edge physics and graphics effects and it's used to sell beer and candy bars.


Some of those examples were clearly not real, but some, like the Snickers bar being split, seemed pretty real.

I wonder if those very realistic shots are usable in UK adverts without disclaimers? As an example, see ads for mascara which get regulated if they use computer enhanced lashes and no disclaimer.


You might be surprised how fake the real thing can look:

https://www.youtube.com/watch?v=Ow-dDfarZuY

I'd say without a doubt that shots like 0:25 or 3:00 were CGI...


Actually, I was once talking to a friend who does photography for print ads, and I was surprised to find out that most of the highly detailed stills of, say, lipstick or champagne coming out of a bottle are actual photographs.


Why were you surprised by that? How did you assume they were made otherwise?


I assumed many things were renders or that stuff was combined later in photoshop.

I didn't think that shots of champagne overflowing from a bottle were made by pouring real champagne over and over again until they got the perfect shot.

I was also surprised to find out that when the background had a nice motion blur, it was sometimes actual motion blur, made with a rotating rig that held both the camera and the object being shot.


Check out the annual Siggraph preview videos, https://www.youtube.com/results?search_query=siggraph+2014+p...



Sanity check: Wasn't this posted on HN before?

Regardless. It's awesome that a browser with WebGL can achieve that kind of speed and behavioral complexity. Is ray tracing the reflections part of OpenGL? Or is that a separate library?


It has been posted several times before.


This is extremely well done.

The only minor nit I can come up with is the lack of surface tension. When you pull the ball up through the surface of the water, some of the water should stick to the ball. Maybe the ball is made out of lotus leaves, though.

I'd also expect some more bubbles when violently stirring the pool with the ball.


Unfortunately, the effects you describe are nontrivial. The method of water simulation used here doesn't lend itself to effects like surface tension or breaking waves, because the water surface is represented as a 2D height map giving the displacement from equilibrium (i.e., the z coordinate as a function of x and y). With that representation, it's simply not possible to represent water clinging to the sphere or dripping off it.

A particle-based system could simulate the effects you describe (by keeping track of the positions of some large number of water "molecules" and exchanging forces between pairs of particles), but the simulation would be far too costly to run in real time on typical hardware today. It also has other downsides: with a height-map representation it is trivial to calculate the normal to the water surface, which is necessary for the lighting effects, but with a particle system you would need to reconstruct the surface of the water in an additional step each frame, using something like the marching cubes algorithm.
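One reason the height-map form is so convenient: the normal falls straight out of finite differences. A minimal sketch (variable names are illustrative, not from the demo's source):

```javascript
// Surface normal at grid cell (x, y) of a height map via central
// differences. `h` is a width-by-height grid of vertical displacements
// stored row-major in a flat array; `dx` is the grid spacing.
function surfaceNormal(h, width, x, y, dx) {
  const at = (i, j) => h[j * width + i];
  const dhdx = (at(x + 1, y) - at(x - 1, y)) / (2 * dx);
  const dhdy = (at(x, y + 1) - at(x, y - 1)) / (2 * dx);
  // The surface z = h(x, y) has normal (-dh/dx, -dh/dy, 1), normalized.
  const len = Math.hypot(dhdx, dhdy, 1);
  return [-dhdx / len, -dhdy / len, 1 / len];
}
```

A particle system has no such shortcut, which is why a surface-reconstruction step like marching cubes is needed there.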


So the surface of the water is a real time displacement map?


A real-time calculated displacement map would be a more accurate description. These simulations typically use real wave-front physics (Navier-Stokes), even if simplified.


That's beautiful. I used to write physics engines back when we had 100 MIPS, and could only dream about doing stuff like that in real time.

A nice use for this would be to emulate a ripple tank, as used in high school physics labs.


I've been using this to test whether I've got WebGL acceleration working properly for ages. I'm pretty sure I've seen this posted a couple of times before though :)


Wow, this runs at a smooth 60fps in iOS8!


This is 4 years old.


That's awesome, and holy cow I think this is the first time I've ever seen Chrome on Linux actually render WebGL. Did they finally enable it by default in recent updates?


I've been playing with WebGL demos on Ubuntu for years, first Chrome then Firefox. It isn't new. You might also be impressed to find out that Steam is on Linux now too.


I still have to enable it in chrome://flags as of the last official GNU/Linux release. It is enabled by default in Windows releases in my experience.


Can't wait to see some more crazy stuff done in WebGL! Working fine on my MBA 2014, but I think it's the first time I heard the fan spinning :)!


On Android Firefox:

    Error: Rendering to floating-point textures is required but not supported.


Android Firefox and Chrome both implement the floating-point-textures extension, but your device's GPU also has to natively support it.

If you're interested, you can check what your device supports with a tool like this: https://play.google.com/store/apps/details?id=com.darkrockst...

Not all of the GLES extensions will be exposed to WebGL, but most WebGL extensions directly map to a GLES one (GL_OES_texture_float in this case).
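To see why the demo errors out on a given device, a quick probe from the console can help. This is a hypothetical helper, not part of the demo; both extension names are real WebGL 1 extensions:

```javascript
// Probe the two capabilities the error message is about: creating
// float textures (OES_texture_float) and rendering to them
// (WEBGL_color_buffer_float, a separate capability in WebGL 1).
function floatTextureSupport(gl) {
  return {
    create: gl.getExtension('OES_texture_float') !== null,
    render: gl.getExtension('WEBGL_color_buffer_float') !== null,
  };
}
```

In a browser console: `floatTextureSupport(document.createElement('canvas').getContext('webgl'))`. Note that some drivers support float render targets without exposing WEBGL_color_buffer_float, so a framebuffer-completeness check is the more reliable test.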


On a Galaxy S5, it works glitchily in Chrome and not at all in Firefox.


Maybe Mozilla blacklisted the extension: Qualcomm drivers are apparently quite awful.

https://dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and...


Same on Firefox Developer Edition 36.0a2 (2015-01-10).


I remember trying to run this on my phone previously and it not working. I also remember it causing my CPU fan to fire up when I played with it on my laptop. The fact that the iPhone 6 runs this so well is pretty dang cool.


Same here! I remember trying it on an older iPhone and android phone and it was choppy. Just tried it on my iPhone 6 and it's so smooth. I love seeing that progression in technology first hand.


Does anyone happen to know what the effect is called whereby the sky is reflected more strongly at shallow angles, and whether that's the same effect that makes rough surfaces reflective at very shallow angles?



> This demo requires a decent graphics card and up-to-date drivers. If you can't run the demo, you can still see it on YouTube.

Works great on my iPad mini 2. Pretty incredible.


Yes it does. I'm SOL. On my SGS3 with both Chrome and FF I get:

    Error: Rendering to floating-point textures is required but not supported.
 
And on my Ubuntu 14.10 laptop with X1400 I get:

    Uncaught Error: link error: error: Too many vertex shader texture samplers


Shouldn't the ball cause ripples when moving underwater?


It looks like it's only simulating the surface. Moving a ball of that relative size in that volume of water would cause all sorts of surface features if the full volume were being simulated.


Surprisingly, this demo works great on my iPad 4.


Yeah, I just tested it on my mid-range Android phone with Chrome. It works surprisingly well; though it lags a bit, it's still usable. Really, all WebGL stuff should work on mobile, since it was designed with that in mind. Ironically, it probably wouldn't work on my old desktop with its DirectX 9-capable Nvidia card.


I think this was the fourth duplicate post here.


This is an old piece, but it doesn't cease to amaze me. Well done; creating this must have taken a lot of time and skill.


The author of this simulation seems to have forgotten the effects of refraction. Besides that, it's great!


Refraction definitely works; try moving the camera around, and you can see the effects of refraction on the ball.


It appears to work on the top but not on the sides


The sides seem to be a cut-away view, not a surface...


Right. Refraction only occurs on the surface of the water, like the waves; the views through the sides are what you'd see from under the water at that point, not looking through a viewing window.


Well, if you want to get really technical, what you'd see from underwater would be much worse than what a camera sees, since your eyes require air to properly focus.

So if they wanted to make it what you would see underwater, they should just blur the crap out of it.


Are you talking about total internal reflection? http://upload.wikimedia.org/wikipedia/commons/5/5c/Total_int...

I think I can find an angle where you can't see the skybox.


Does this use the Navier-Stokes equations? How are the ripples generated?


Nope, it's much simpler than that. It's just a grid of vertical positions, and each frame every cell moves toward the average position of its neighbors from the previous frame. See http://freespace.virgin.net/hugo.elias/graphics/x_water.htm for details.
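That scheme fits in a few lines. A sketch following the linked article (not the demo's actual source; the real thing does this per texel in a fragment shader):

```javascript
// One step of the classic two-buffer ripple algorithm: each interior
// cell moves toward the average of its four neighbours, with the
// previous buffer acting as the velocity term. A damping factor below
// 1 makes the waves slowly die out.
function stepRipples(current, previous, width, height, damping = 0.99) {
  const next = new Float32Array(current.length);
  for (let y = 1; y < height - 1; y++) {
    for (let x = 1; x < width - 1; x++) {
      const i = y * width + x;
      const sum = current[i - 1] + current[i + 1] +
                  current[i - width] + current[i + width];
      next[i] = (sum / 2 - previous[i]) * damping;
    }
  }
  return next; // becomes `current` next frame; `current` becomes `previous`
}
```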


Wow. It's amazing to see how far WebGL has come.


So close to a WebGL Wave Race!


You can even move the ball.


If you drag the ball round and round, it doesn't create a vortex.



