
It is already dangerous anyway, as you have no control over driver bugs when the driver compiles and executes shader code.

As for blaming Apple: the reason WebGL 2.0 lost compute shaders, an OpenGL ES 3.1 feature from 2014 (!), was that Google dropped them from Chrome, after two failed attempts by Intel to land them, because WebGPU was just around the corner, two years ago!

This is what happens with committee APIs: everyone messes up, and we end up stuck with MVPs forever.

As for what we are getting instead, console vendors make it quite clear: server-side rendering with pixel streaming.




It sounds to me like you're saying:

(a) WebGPU could be dangerous anyway, so why not just expose Vulkan entirely with no safety?

(b) Committee-designed APIs are always doomed to fail, equally Google's, Apple's, and Intel's fault; they will never ship anything beyond an MVP anyway, 'so why bother', I presume?

(c) Console vendors are pushing server-side rendering with pixel streaming too, so ultimately we won't have any control over our devices anyway, and none of this matters.

If those are the arguments, I'd rather have WebGPU than not have it, personally.


You misread (c): server-side rendering is the only way we will ever get modern 3D APIs on the Web.

Looking forward to WebGPU support on ShaderToy; pity we will have to rewrite all the shaders into Rust-flavoured WGSL.
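
To give a concrete sense of the flavour, here is a rough sketch of a ShaderToy-style UV gradient in WGSL. The `resolution` uniform and its binding are my own assumption for illustration, not anything ShaderToy defines:

    // Hypothetical uniform; ShaderToy's iResolution would need a
    // binding like this in WebGPU.
    @group(0) @binding(0) var<uniform> resolution: vec2f;

    @fragment
    fn main(@builtin(position) pos: vec4f) -> @location(0) vec4f {
        // Normalised coordinates, like fragCoord / iResolution.xy in GLSL.
        let uv = pos.xy / resolution;
        return vec4f(uv, 0.5, 1.0);
    }

Note the Rust-style `fn`, `let`, and `->` return syntax in place of GLSL's C-style declarations.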



