It is already dangerous anyway, as you have no control over bugs in the drivers that handle your shader code.
As for blaming Apple: WebGL 2.0 lost compute shaders, a GL ES feature from 2014 (!), because Google dropped them from Chrome two years ago, after two failed attempts by Intel to land them, on the grounds that WebGPU was just around the corner!
This is what happens with committee APIs: everyone messes up and we end up stuck with MVPs forever.
As for what we are getting instead, it is quite clear from the console vendors: server-side rendering with pixel streaming.
(a) WebGPU could be dangerous anyway, so why not just expose Vulkan entirely with no safety?
(b) Committee-designed APIs are always doomed to fail; it is equally Google's, Apple's, and Intel's fault; they will never ship anything beyond an MVP anyway, 'so why bother', I presume?
(c) Console vendors are pushing server-side rendering with pixel streaming too, so ultimately we will have no control over our devices anyway, and none of this matters.
If those are the arguments, I'd rather have WebGPU than not have it, personally.
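For anyone who hasn't looked at it yet, here is roughly what that buys you: a minimal sketch (assuming a WebGPU-capable browser and the standard navigator.gpu entry point) of a WGSL compute shader that doubles a buffer of floats, i.e. the kind of GPU compute WebGL 2.0 never shipped. Identifiers like doubleIt and runCompute are purely illustrative.

    // Minimal sketch: a WGSL compute shader dispatched via WebGPU.
    // Assumes a WebGPU-capable browser; names are illustrative only.
    const shaderSource = `
      @group(0) @binding(0) var<storage, read_write> data: array<f32>;

      @compute @workgroup_size(64)
      fn doubleIt(@builtin(global_invocation_id) id: vec3<u32>) {
        if (id.x < arrayLength(&data)) {
          data[id.x] = data[id.x] * 2.0;
        }
      }
    `;

    async function runCompute(input: Float32Array): Promise<Float32Array> {
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) throw new Error("WebGPU not available");
      const device = await adapter.requestDevice();

      // Storage buffer the shader reads and writes in place.
      const storage = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC | GPUBufferUsage.COPY_DST,
      });
      device.queue.writeBuffer(storage, 0, input);

      // Staging buffer for reading the result back on the CPU.
      const readback = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.MAP_READ | GPUBufferUsage.COPY_DST,
      });

      const pipeline = device.createComputePipeline({
        layout: "auto",
        compute: {
          module: device.createShaderModule({ code: shaderSource }),
          entryPoint: "doubleIt",
        },
      });
      const bindGroup = device.createBindGroup({
        layout: pipeline.getBindGroupLayout(0),
        entries: [{ binding: 0, resource: { buffer: storage } }],
      });

      // Record the dispatch and the copy to the staging buffer.
      const encoder = device.createCommandEncoder();
      const pass = encoder.beginComputePass();
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);
      pass.dispatchWorkgroups(Math.ceil(input.length / 64));
      pass.end();
      encoder.copyBufferToBuffer(storage, 0, readback, 0, input.byteLength);
      device.queue.submit([encoder.finish()]);

      await readback.mapAsync(GPUMapMode.READ);
      const result = new Float32Array(readback.getMappedRange().slice(0));
      readback.unmap();
      return result;
    }

Whatever one thinks of the committee process, this kind of dispatch simply has no equivalent in WebGL 2.0 as shipped.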