I have to say, the most impressive part about this is the fact that my MacBook doesn't even rev up like a jet engine the way it would if I were, say, watching a Flash video.
Maybe you have a graphics driver that's been blacklisted, so you're falling back to software rendering? Try Googling for steps to override the WebGL driver blacklist in Firefox and Chrome (probably a command-line flag, about:flags, or about:config setting).
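If it helps, the overrides I'm aware of look roughly like this (treat the exact flag and pref names as a starting point; they may differ between versions):

    # Chrome / Chromium: launch with the blacklist override flag
    chrome --ignore-gpu-blacklist

    # Firefox: in about:config, flip this pref to true
    webgl.force-enabled = true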
And here I am trying to get a bunch of 2D lines rendering quickly in a 2D canvas for some charts. So strange to me that something 'simple' like drawing 20,000 lines chugs along at ~20fps while a water simulation with caustics runs at 30fps just fine. Goes to show how much you gain from a GPU when you use it for something it's designed to do.
Unfortunately it's a lot of smaller charts, and they're embedded in an SVG-based diagram interface as foreignObjects, which doesn't support a WebGL context (in Chrome at least). I'd love to play with WebGL, but it's not usable for this task.
Writing the whole interface in a WebGL canvas could be really fun, but far too much work for an MVP, and would kind of mess with making it work on iPad etc.
According to https://developer.mozilla.org/en/WebGL#Gecko_6.0_notes OES_texture_float was added in Firefox 6. This is strange, though, because IIRC the path tracing demo that was posted recently used floating-point textures, perhaps through a different means (I'm not familiar with how extensions and texture formats work in WebGL).
Edit: I just got the demo to work in Chrome, and it says that, at least for caustics, OES_standard_derivatives is also required.
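For anyone wondering how a page detects this: extensions are queried off the WebGL context at runtime. A rough sketch (the canvas id and the 'experimental-webgl' fallback are just assumptions about how the demo is wired up):

    // Grab a WebGL context, falling back to the older 'experimental-webgl' name.
    const canvas = document.getElementById('c') as HTMLCanvasElement; // 'c' is a made-up id
    const gl = (canvas.getContext('webgl') ||
                canvas.getContext('experimental-webgl')) as WebGLRenderingContext | null;
    if (!gl) {
      throw new Error('WebGL not available at all');
    }
    // getExtension() returns null when the browser/driver doesn't expose the extension.
    const floatTextures = gl.getExtension('OES_texture_float');
    const derivatives = gl.getExtension('OES_standard_derivatives');
    if (!floatTextures || !derivatives) {
      console.log('Missing extension(s); the water/caustics path cannot run.');
    }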
Seconding the call for a YouTube video. I'm on Chrome 13, Windows 7, with an nVidia GeForce Go 7600, the latest driver, and OpenGL 2.1.2, and I get the error:
You might laugh at my hardware, but from my own OpenGL work I know the majority of users out there have crappy old chips or god-awful Intel graphics that the vendors can't be bothered to supply decent drivers for any more, let alone drivers supporting the latest OpenGL features. Most people don't surf with a gaming-grade GPU. If WebGL is this fussy about hardware, it could severely limit its uptake.
Hardware compatibility is the least of WebGL's problems. Right now there's essentially zero chance of WebGL powering any modern graphics engine. It unfortunately seems doomed to a life of "hey-look-at-my-cool-effect". (Which is neat, don't get me wrong. But it will never reach Doom 3 quality, and that's a tragedy IMO.)
Example: WebGL doesn't even support compressed textures. The spec actually mandates "We will not support compressed textures, and this is by design". Facepalm...
Well, my stuff isn't Doom 3 level, so I guess it's useless, but I'm building visualization tools for the brain imaging community with WebGL and for that it works great. I have no doubt WebGL will do fine; it's an evolution of the web, the same as we saw with phones. Look at 2006: did anyone seriously think phones would be able to do much? Then a few years later Carmack shows a cool Rage demo engine running on iOS that looks about as good as the PS3.
As for compressed textures: they haven't been ruled out forever. This is still version 1.0 of the spec.
I was of course talking specifically about WebGL for games...
But it seems like you'd run into many of the same hindrances. For example, how will you get around the 5MB local storage limit? It would be unwieldy to re-download the whole visualization dataset each time, no?
P.S. I have nothing but respect for the medical visualization community. A lot of game graphics techniques were derived directly from work done in medical research (for example, BSP rendering and voxel rendering).
I offer multiple options: one, they can open local files; two, obviously, they can get the data from the server every time. For most things they don't look at the same data multiple times. An average MRI is pretty low resolution and fits in under 50MB of storage (average 10MB, 50-60MB for fMRI scans), so it's not too expensive to re-download. Surface data is even smaller; my average brain surface is also under 10MB. Most of my users are on university networks, which helps a lot.
Also, applications in the Chrome Web Store can get around the local storage limitation. I'd like browsers to offer a quota system where the user can assign a maximum amount to their applications. That would solve those issues.
Compressed textures are an intellectual property minefield. Blame the patent system, again...
However, the lack of compressed textures, or any other specific feature isn't WebGL's biggest problem. It really is hardware compatibility. WebGL is the first web technology to expose significant hardware variations (with bonus buggy drivers!) directly to web content, and web developers aren't equipped to handle that. Deploying a WebGL app with any hope of wide compatibility will require testing on at least 3 and more likely 10+ different hardware/driver/os configurations, and the fact that a GPU is required makes automated testing difficult (VMs don't work, remote desktop solutions often don't work, driver crashes hang machines).
The least of WebGL's problems? Failing to run at all with no fallback isn't a big problem? :P
Why is texture compression so badly necessary? Aren't downloaded PNG or JPEG files that you upload to the GPU uncompressed "good enough"? I don't think anyone's asking for COD5 to be in WebGL.
Texture compression isn't just for smaller downloads. In fact JPEG is fine for reducing download sizes. However, when the JPEG is converted to a texture it is completely decompressed. Texture compression formats aren't as good at compression as JPEG, but they are specially designed so that they never need to be decompressed; they remain compressed in graphics memory and the GPU decodes them on the fly as they are sampled. Not only does this save graphics memory, it also saves memory bandwidth, which actually makes rendering faster.
Given this, it makes sense to store your textures as JPEG and convert them to compressed textures at load time. That way you get the excellent compression ratio of JPEG for the download and the speed boost of compressed textures for rendering. For example, id Tech 5 (RAGE) does this for their megatextures (except they use JPEG XR instead of JPEG).
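To make the two upload paths concrete, here's a rough sketch; `gl`, `image`, and `dxt5Data` are placeholders, and the S3TC extension call is only illustrative of the kind of hook WebGL currently refuses to expose, not something you can count on existing:

    declare const gl: WebGLRenderingContext;   // existing WebGL context (assumption)
    declare const image: HTMLImageElement;     // a decoded JPEG (assumption)
    declare const dxt5Data: Uint8Array;        // transcoded DXT5 payload (assumption)

    // Path A: what WebGL 1.0 gives you today. The browser decodes the JPEG and the
    // texture sits in VRAM as raw RGBA, so a 2048x2048 image costs ~16 MB.
    gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);

    // Path B: the desktop-GL style upload this thread is missing. The data stays
    // compressed in VRAM (DXT5 is ~4 MB for the same 2048x2048 texture) and the
    // GPU decodes texels on the fly as they are sampled.
    const s3tc = gl.getExtension('WEBGL_compressed_texture_s3tc');
    if (s3tc) {
      gl.compressedTexImage2D(gl.TEXTURE_2D, 0, s3tc.COMPRESSED_RGBA_S3TC_DXT5_EXT,
                              2048, 2048, 0, dxt5Data);
    }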
S3TC (DXT) and PVRTC have patent problems. The Khronos Group is working on designing and popularizing new formats that aren't patent-encumbered. So far the best we've got is ETC, which doesn't support alpha channels and isn't yet supported by PC hardware...
Ah, OK. Vertex texturing is an extension. It's not part of the core WebGL spec, and only certain hardware supports it. Unfortunately, it makes really cool effects a lot easier, so demos keep using it. I wish they'd stop. It's giving WebGL a worse rep for compatibility than it deserves.
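For reference, a page can at least check for it up front instead of failing mid-demo; a minimal sketch, assuming `gl` is an existing WebGL context:

    declare const gl: WebGLRenderingContext;  // existing WebGL context (assumption)

    // Vertex texture fetch is optional: hardware is allowed to report zero
    // vertex-stage texture units, in which case heightfield-in-a-texture tricks
    // (like this water surface) simply can't run.
    const vertexTextureUnits = gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS);
    if (vertexTextureUnits === 0) {
      console.log('No vertex texturing on this GPU/driver; need a fallback.');
    }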
When I bought my computer I didn't think graphics were important. Now I wish I had spent some money on a good graphics card. I might have to buy one so I can develop stuff like this.
For those who are interested: if you have a laptop with an Intel-based graphics card, you are probably using something called Mesa. After researching for a long time, it seems Mesa is some sort of graphics driver that works on Linux. Apparently, if you have Mesa 7.10+ and Firefox 6, WebGL works, though I haven't figured out how to upgrade Mesa. The default on Ubuntu 10.04 is 7.4 or something like that. Apparently Mesa is pretty buggy and can crash a lot.
If anyone has more info about this subject, I would appreciate it.
Mesa is a software 3D driver. It uses the CPU instead of your GPU. I've never had an Intel IGP on Linux, but it should probably be using Intel drivers for 3D, not Mesa.
Mesa was originally a pure-software implementation of OpenGL (with the trademarks filed off), but since it was the only complete, open-source GL stack, a lot of open-source hardware drivers re-use parts of it, including the official Intel graphics drivers.
To the grandparent poster: Like any system component, the best way to upgrade Mesa is to upgrade your OS. Sadly, the latest version of Ubuntu is 11.04, which comes with Mesa 7.10.2, and Firefox blacklists every version of Mesa before 7.10.3, so you'll probably have to wait for 11.10 (or just force-enable webgl).
Thanks for the extra info. I wasn't aware that hardware drivers used it, I just remember back in the day Mesa meant no Soldier of Fortune demo and Beryl :)
I guess this is going to be the next race for browser compatibility. So far I've tried Firefox, Safari, and Chrome, all on OS X Lion. I'm hoping a newer version of Chrome will work; if not, I guess this is all still mostly academic.
One of the great joys of using the Firefox Nightly is that stuff like this usually works. Unless you need total extension compatibility, these builds have seemed rock-stable over the last month or so.
I’m on Lion and vanilla Chrome works for me. Safari also works if I turn on the Develop menu (in the Advanced tab of the preferences) and use it to turn on WebGL.
Yep, the newest version of Chrome does work, and it's the only browser that does. It uses about 40% of an i5 dual-core CPU on a mid-2010 15" MacBook Pro.
I'm on a relatively ancient non-unibody MBP with a Penryn Core 2 Duo, 512 MB GeForce 8600M GT and Snow Leopard and it works on vanilla Chrome for me as well. And it uses 30%–40% here too — interesting since the i5 benches a lot better.
Doesn't seem to work here however on the Dev branch (15.0.849.0). Not that that's surprising: random stuff breaks regularly, and there's no way to go back...
Something really interesting happens if you move the ball to the surface, let the water become calm, then move the ball horizontally along the surface.
Eventually we'll want more from our standard, flat websites, and WebGL is part of that. Not every site, of course, but as processing power inevitably increases, so will our desire to be inventive with it.
Keep in mind we're entering a post-PC era. Our mobiles and tablets already include a bit of GPU acceleration for certain CSS transitions. This is where WebGL will shine, in my opinion.
Doesn't work on Intel graphics (at least under linux with F15-era drivers). "This demo requires OES_texture_float extension." Float textures are pretty bleeding edge even for fancy GPUs.
Hrm, the site says that the latest versions of both Firefox (Stable & Aurora) and Opera, as well as Chromium 6, don't support WebGL. Is there any browser this is supposed to work in?
That's just the initial surface disturbance. It starts at the center of where the sphere breaches the water and has an amplitude. The disturbance radiates outward from that center, but there's no "sticking" to other meshes happening.
Another HN item that can only be seen by those using Google Chrome. When will it stop? Are we really so dumb and foolish as to go right back to the "Works only in IE6" days? WHY is it that because it's Chrome and not IE6, that makes it alright?
Chrome isn't the only browser that works:
- Safari 5.1 (released with Lion) works if you enable WebGL in the developer menu.
- Firefox 6.0 works with this demo. 5.0 tells me a feature is missing.
- Opera's WebGL preview build fails in the exact same way as Firefox 5.0, which indicates to me that a newer build might work.
WebGL is an emerging standard supported by FOUR major browser vendors (Google, Apple, Opera, Mozilla). Microsoft is the only one not currently adopting it. It's a continuation of them doing their own thing without a care for interoperability. Silverlight 5 implements GPU-accelerated 3D, so Microsoft likely considers WebGL a competing product.
Microsoft considers the (relatively) open-ended hardware access to be insecure, and the crashes other users are reporting right here on this article prove them right, for now. GPU drivers have never before been considered attack surfaces requiring hardening against malicious attackers. Stand by for the first major attack using this vector.
Awesome.