
> What are the ‘false promises’ of WebGPU

That you can write a "compute shader" once, and it will run "anywhere." This isn't the case with any accelerated compute API, so why is WebGPU going to be different?

Reality will be Chrome Windows Desktop WebGPU, Chrome Android (newish) WebGPU, Mobile Safari iOS 18 WebGPU, iPad WebGPU, macOS Safari WebGPU, macOS Chrome WebGPU, iOS default in-app browser WebGPU, Instagram and Facebook in-app browser WebGPU...
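
To make that concrete, here is a rough sketch (TypeScript, assuming the @webgpu/types declarations; the function name and return shape are illustrative, not from any particular engine) of the backend probing you end up writing, because which of those environments actually exposes WebGPU, and with what adapter limits, varies:

  // Sketch only: probe for WebGPU, fall back to WebGL 2.
  // navigator.gpu is undefined where WebGPU isn't shipped or enabled
  // (older Safari/Android versions, some in-app browsers).
  async function pickBackend(canvas: HTMLCanvasElement) {
    const adapter = await navigator.gpu?.requestAdapter();
    if (adapter) {
      const device = await adapter.requestDevice();
      // Limits like maxComputeWorkgroupSizeX differ per adapter, so the
      // "same" compute shader can still need per-device tuning.
      return { kind: "webgpu" as const, device, limits: adapter.limits };
    }
    // WebGL 2 fallback: no compute shaders, but it runs nearly everywhere.
    const gl = canvas.getContext("webgl2");
    if (gl) return { kind: "webgl2" as const, gl };
    throw new Error("no usable GPU backend");
  }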

This isn't complicated! If that's reality, I'd rather have:

Apple Compute Shaders for Browser. Windows Chrome Compute Shaders for Browser. Android Chrome Compute Shaders for Browser.

Because I'm going to go through middleware like Unity to deal with either situation anyway. But look at which is simpler. It's not complicated.

> I’m trying to understand if and why you do.

I make games. I like the status quo where we get amazing game engines for free.

I cannot force open source developers to do anything; they are welcome to waste their time on any effort. If Bevy, for example, has great WebGL 2 support that runs almost without warts everywhere, even on iOS, it makes no sense to worry about WebGPU at all, given the nature of the games that use Bevy. "Runs on WebGPU" is make-believe that you can avoid the hard multiplatform engine bits. As for engines like Construct and LOVE: 2D games don't need compute shaders and aren't very performance sensitive, so they can use the browser as the middleware, and the games that are performance sensitive should just use a big commercial game engine. People have choices.






> That you can write a "compute shader" once, and it will run "anywhere."

Can you post a link to that quote? What exactly are you quoting?



