Huh, didn't think this would be posted. This is a little demo effect I threw together last week when I was bored. It's built on top of WebGL (the effect itself is really just in a shader), and the reason it looks like a PNG is that it's a self-extracting PNG as described in my article here: http://daeken.com/superpacking-js-demos
Impressive: Firefox held up fine and rendered it, but after I opened it a second time, all the text on my system (that includes the browser, but also the terminals, Awesome, etc.) was FUBAR - even after restarting the X server!
I have no idea what kind of bug corrupts only text (so it doesn't seem to be a bug in the graphics drivers, since AFAIK they're agnostic to what's text) yet survives restarting the display manager.
I've had similar experiences with graphical corruption in X, and Firefox was almost always involved. Same radeon driver as you. And it seems to be distro-agnostic: I've seen it with Arch, Gentoo and Mint. All I know is that it's a recent thing: if I run an older version of my Gentoo build, it doesn't happen. But otherwise I see the same results.
I have no idea what to point to, because it's really weird. Scrambles /everything/ that goes through video. (And until now, I figured it was a quirk of my bizarre hardware. Guess not.)
Fantastic. I've been trying to pinpoint the cause of this for ages. It's absolutely dreadful in awesome wm, and at some point I laid the blame on (cairo-)XCB, or some Xft bug, but it cropped back up, only less often, under GNOME/Xfce.
I'm using the OSS radeon drivers and no, fortunately it was fine after a reboot :)
Corrupted memory was my first thought, but affecting only text is odd - I mean, the drivers aren't even supposed to know those particular pixels are text, right?
Not necessarily, no - I think graphics cards are aware that certain things are text. For example, I have a ThinkPad T400 with two graphics cards (ATI and Intel); on the ATI, the text in consoles is not anti-aliased, but on the Intel one it is. So I'm pretty sure the graphics card (or at least the driver) is aware of what is text and what is not, to some extent.
On Arch / Chromium / an Intel card: after starting Chromium with the WebGL graphics-card blacklist disabled, it just shuts my monitor down and I have to reboot.
I think the comments highlight very nicely why some people are not all that thrilled about having OpenGL in their browsers in the form of WebGL. Graphics drivers seem to be extremely difficult to "get right", especially if any degree of performance is wanted.
My considered opinion (from the industry) is that it's really shocking how under-invested the drivers are.
If a fraction of the effort that went into making the card went into making the drivers and maintaining and fixing them, we'd all be massively better off.
The problem may be too much investment in drivers. Modern graphics card drivers are little operating systems unto themselves; they even have built-in compilers. They are incredibly complex, yet written in unsafe languages for maximum performance, with tons of ugly hacks for compatibility.
What we really need is a simplification of graphics APIs so that graphics drivers don't need to be so complex. That may be impossible without convergence in hardware though.
That graphics drivers are a mess internally says more about the insufficient resourcing to build them properly, cleanly, and with ongoing refactoring than it does about the rather trivial and well-understood APIs the driver implements.
You think OpenGL and DirectX are "trivial"? I'm sorry, but an API that involves passing text strings of complete programs for the driver to compile, link, and asynchronously run on a vector coprocessor is not trivial to implement, not to mention the complex NUMA memory management going on.
The keywords were picked arbitrarily, and are probably not very representative, but it seems like at least nVidia is actually investing in driver/software development.
I think we need to start using a [WebGL] tag or something for these... obviously some browsers don't take it well, and it would keep a lid on all the (kinda pointless) "Doesn't work"/"works for me" comments.
In some cases, it's even worse. Low-end netbooks running Chrome will load the page, and then the entire machine locks up as the WebGL starts rendering and takes over the weak GPU. An easy way of DoSing someone on a weak computer is to send them a WebGL link.
This can't be used for XSS or the like, really. The way it works is that the PNG is first interpreted as HTML by the browser (hence the filename) and then it loads itself into an image tag, causing it to be interpreted as a PNG. Once it's loaded into the image tag, the image is drawn to a canvas so that the code -- embedded in the PNG -- can be extracted and executed.
While fun, the only real security concern here is that it's really good at pissing off IDSes.
Edit: I linked my article describing the technique in another comment here if you want to see how horrible it really is. I'm always both proud of and disgusted by myself for this technique.
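For anyone curious, the pixel-packing half of the trick can be sketched roughly like this. This is a minimal Python sketch with my own function names, not the author's code: it stores arbitrary script bytes as one row of 8-bit grayscale pixels in a syntactically valid PNG, then recovers them by walking the chunks. The real file is additionally crafted so the same bytes parse as HTML, and the browser-side extraction goes through canvas getImageData rather than parsing chunks by hand - neither of which is shown here.

```python
# Sketch of packing script bytes into a valid PNG and pulling them back out.
# Assumption: names (png_chunk, pack_script, unpack_script) are illustrative,
# not from the linked article.
import struct
import zlib

def png_chunk(ctype: bytes, data: bytes) -> bytes:
    """Build one PNG chunk: length, type, data, CRC-32 over type + data."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data) & 0xFFFFFFFF))

def pack_script(script: bytes) -> bytes:
    """Encode arbitrary bytes as a 1-row, 8-bit grayscale PNG."""
    # IHDR: width, height, bit depth, color type (0 = grayscale),
    # compression, filter, interlace.
    ihdr = struct.pack(">IIBBBBB", len(script), 1, 8, 0, 0, 0, 0)
    raw = b"\x00" + script  # scanline filter byte 0 (None) + pixel data
    return (b"\x89PNG\r\n\x1a\n"
            + png_chunk(b"IHDR", ihdr)
            + png_chunk(b"IDAT", zlib.compress(raw))
            + png_chunk(b"IEND", b""))

def unpack_script(png: bytes) -> bytes:
    """Recover the payload by concatenating IDAT chunks and inflating."""
    pos, idat = 8, b""  # skip the 8-byte PNG signature
    while pos < len(png):
        length = struct.unpack(">I", png[pos:pos + 4])[0]
        ctype = png[pos + 4:pos + 8]
        if ctype == b"IDAT":
            idat += png[pos + 8:pos + 8 + length]
        pos += 12 + length  # 4 length + 4 type + data + 4 CRC
    return zlib.decompress(idat)[1:]  # drop the filter byte

payload = b"alert('hello from a PNG');"
assert unpack_script(pack_script(payload)) == payload
```

Because the pixel data rides inside IDAT's zlib stream, the payload survives any byte-exact transfer of the image - which is also why lossless decoding via canvas is what makes the in-browser extraction possible.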
How do I get a refund on my Arch setup? Seriously, WebGL talks to the driver almost directly. All it takes is a browser/driver bug and the whole display stack goes down, possibly bringing the OS with it.
There's your problem. Are you using an ATI card, by the way?
Make sure you have restricted drivers installed. If you're using an ATI card you'll have to do the further step of going to about:flags in Chrome and making it ignore the software rendering blacklist.
NoScript is the new "I run Linux, I don't have to worry about..." But after I enabled scripting it worked fine - very neat-looking demo. Also, I admit that I enjoy reading about the problems of others here...
This crashed my OS X Lion machine running Chrome; I lost all my work and had to reboot. Just tested Firefox too - it crashes as well. Safari doesn't load anything, just a bunch of PNG binary text? Might want to work on compatibility ;)