> There are tradeoffs, but industry consensus hasn't really stifled innovation in a meaningful way.

Stifled innovation in what context? I wouldn't describe the web stack as an industry consensus, given its rejection (for apps) by mobile devs, mobile users, video game devs, VR devs and so on. If you mean within the web stack, then plenty of proposals have died in the crib because they couldn't get consensus, for reasons that are often obscure and hard to understand from the outside. And there were certainly innovations in the earlier, more open days of the web that have never been replicated: Flash's timeline-based animation designer died, for example, and nothing has really replaced it.

Fundamentally, we can't really know what cool things might have existed in a different world where our tech stack was more open to extensions and forking.

> Why can't a common graphics API evolve through well-researched and heavily scrutinized proposals?

Why can't everything be done that way? It's been tried, and the results are a tragedy of the commons. These ideas don't originate in the web world. The incentive to develop the tech isn't limits in web specs; those get addressed a decade or more after the innovation happens. What we actually see is that the web stuff is derivative, downstream from the big players. WebGPU traces its history through Vulkan to Metal/D3D12, and from there to AMD's Mantle, where it all started (afaik). So this stuff really starts as you'd expect: with some risk-taking by a GPU designer looking for a competitive advantage. Then the proprietary OS design teams pick it up and start abstracting over the GPU vendors. Then Khronos/Linux/Android realize that OpenGL is going to become obsolete, so they'd better have an answer to it. And finally the HTML5 people decide the same about WebGL and start work on making a sandboxed version of Vulkan (sorta).
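
To make the "sandboxed Vulkan" point concrete, here's a minimal sketch (mine, not from any spec) of the WebGPU browser API clearing a canvas. Note the explicit adapter/device handles and recorded command buffers, which mirror Vulkan's model rather than WebGL's implicit global state machine:

    // Minimal sketch: clear a canvas with WebGPU.
    // Assumes a browser that exposes navigator.gpu (the WebGPU API).
    async function clearCanvas(canvas: HTMLCanvasElement): Promise<void> {
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) throw new Error("WebGPU not available");
      const device = await adapter.requestDevice();

      const context = canvas.getContext("webgpu") as GPUCanvasContext;
      context.configure({
        device,
        format: navigator.gpu.getPreferredCanvasFormat(),
      });

      // Commands are recorded into an encoder and submitted as a
      // batch, much like Vulkan's VkCommandBuffer -- not issued one
      // by one against global state as in WebGL.
      const encoder = device.createCommandEncoder();
      const pass = encoder.beginRenderPass({
        colorAttachments: [{
          view: context.getCurrentTexture().createView(),
          clearValue: { r: 0, g: 0, b: 0, a: 1 },
          loadOp: "clear",
          storeOp: "store",
        }],
      });
      pass.end();
      device.queue.submit([encoder.finish()]);
    }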

N.B. what made Mantle possible is that Windows is a relatively open and stable environment for driver developers. Ditto for CUDA. Vendors can expose new hardware capabilities via vendor-specific APIs, and Microsoft either doesn't care or actively encourages it. That means there's value in coming up with a radical new way to accelerate driver performance. In contrast, Apple writes its own drivers and APIs, Linux doesn't want proprietary drivers, and ChromeOS doesn't even have any notion of an installable vendor driver model at all.
