Try (best title ever) “Slow Death by Rubber Duck” by Rick Smith and Bruce Lourie. Excellent book, although they frame their enquiry a little too much like “Super Size Me”, and I'm not sure that was the best approach.
Having done both web and GL UX for a living, I think you might be overestimating the complexity of a GL implementation and underestimating the complexity of meeting the same specs with a web implementation.
While not quite the same thing, if you have the time, dip your toes into some immediate-mode UI, for example Dear ImGui. It's enjoyable, not a grind.
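For anyone who hasn't seen the style: the core idea of immediate mode is that the UI is an ordinary function you call every frame against your own state, rather than a retained widget tree. Here's a toy sketch of that pattern; the names mimic the shape of imgui code but are not the real imgui API.

```rust
// Toy immediate-mode loop. State lives in the app; widgets are function calls
// that both "draw" and report interaction. Not the real imgui API.

struct Ui {
    clicked: bool, // this frame's input (would come from the event queue)
}

impl Ui {
    // In real imgui this would also submit draw commands for the button.
    fn button(&mut self, _label: &str) -> bool {
        std::mem::take(&mut self.clicked)
    }
}

// One immediate-mode "frame": the entire UI is re-declared on every call.
fn ui_frame(ui: &mut Ui, counter: &mut i32) {
    if ui.button("increment") {
        *counter += 1; // handle the click right where the widget is declared
    }
}

fn main() {
    let mut counter = 0; // application state, owned by you, not by a widget tree
    for frame in 0..3 {
        let mut ui = Ui { clicked: frame == 1 }; // pretend the user clicks on frame 1
        ui_frame(&mut ui, &mut counter);
    }
    println!("counter = {counter}"); // prints "counter = 1"
}
```

The appeal is exactly this locality: there's no callback registration or synchronization between a widget tree and your data, which is a big part of why it feels less like a grind.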
Never did they say that they would implement the UI of Blender (I assume that's what you were referring to) using OpenGL. I think what they meant was that implementing just a UI in OpenGL isn't as hard as the other guy thought.
Do you think wasm has any chance of replacing the whole HTML/CSS stack with just WebGPU in a canvas? I have been playing around a bit with wgpu in Rust, and I can compile the same project either as a native binary or as a .js that just renders to the browser. It seems to work pretty well. Photoshop seems to be runnable in the browser now, and I've seen a lot of other cool stuff, but things like fluid simulations still seem to be very laggy.
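For what it's worth, the "same project, two targets" setup usually comes down to conditional compilation on the target architecture. A minimal sketch of the pattern; the function and strings here are made up for illustration, not a real wgpu API:

```rust
// Sketch of the one-codebase, two-targets pattern common in wgpu projects.
// Only the surface setup differs per target; the render loop is shared.

#[cfg(target_arch = "wasm32")]
fn surface_target() -> &'static str {
    // On wasm32 you'd look up a <canvas> element (e.g. via web-sys)
    // and create the wgpu surface from it.
    "canvas"
}

#[cfg(not(target_arch = "wasm32"))]
fn surface_target() -> &'static str {
    // Natively you'd open an OS window (e.g. via winit) instead.
    "native window"
}

fn main() {
    // Everything from here on compiles identically for both targets.
    println!("rendering to a {}", surface_target());
}
```

The wasm build is then a matter of compiling for `wasm32-unknown-unknown` and generating the JS glue, which is presumably the .js file you're seeing.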
Rendering everything into a canvas will realistically mean a total lack of accessibility features. Also, you won't be able to use the DOM inspector. I don't think that would be an improvement at all. If one could work with the DOM via Wasm (plus source maps so you can still use the debugger), it might be something.
The CUDA happy path is very polished and works reliably. The amdgpu happy path fights you a little but basically works. I think the AMD libraries starting to be packaged under Linux is a big deal.
If you don't want to follow the happy path, on Nvidia you get to beg them to maybe support your use case in future. On amdgpu, you get the option to build it yourself, where almost all the pieces are open source and pliable. The driver ships in Linux. The userspace is on GitHub. It's only GPU firmware which is an opaque blob at present, and that's arguably equivalent to not being able to easily modify the silicon.
AMD GPUs work great; the issue is that people don't want to mess with ROCm/HIP when CUDA is the well-documented workflow, along with the fact that ROCm was stagnant for a long time. AMD missed the first AI wave, but is now committed to making ROCm the best it can be.
The other problem is that there aren't any places to rent the high-end AMD AI/ML GPUs, like the MI250s and the soon-to-be-released MI300s. They are only available on things like the Frontier supercomputer, which few developers have access to. "Regular" developers are stuck without easy access to this equipment.
I'm working on the latter problem. I'd like to create more of a flywheel effect: get more developers interested in AMD by enabling them to inexpensively rent and develop on these GPUs, which will create more demand. @gmail if you'd like to be an early adopter.
As a Canadian, I can confirm 12 months is not enough, let alone 6 weeks… I am deeply saddened at the thought of mothers having to go back after so little time.
If it increases entropy as much as many suspect, and it only took a third of a couple of humans' lives to open that phase space, then the Universe has done what it wanted: hastened heat death.
As Hawking once explained, “Since events before the Big Bang have no observational consequences, one may as well cut them out of the theory, and say that time began at the Big Bang.” When cosmologists talk about the universe and its age, it seems to me, as a non-cosmologist, that they’re using terms of art related to their models.
Hawking’s reasoning is that if the observable universe expanded from a singularity, we cannot meaningfully theorize about what happened before then, since no possible observation could test such a theory. Therefore, a scientific model rooted in observation can describe nothing earlier than the Big Bang.
However, not everything unseen is untrue. If a singularity were to form somewhere in Andromeda tomorrow (and in all likelihood, one will), we will still have existed today.
Edit: The initial comment was meant as a lighthearted reply to the universe personification, but I ended up sensing a need to explain the reasoning.
It's not "personification", it's the universe tending toward increasing entropy overall. I don't think I've heard anyone claim that heat death "should have happened already" as an argument against it, and I'm not sure what that claim would even mean in reference to the original post.
There is already a singularity in Andromeda (the supermassive black hole at its center), so I don't know why one forming would matter.
I absolutely love the integration in VS Code with the official Go extension. I can debug a running web server with Delve with minimal config. Same for tests. Just experiment with the options; there are quite a lot, and unfortunately some are not very well documented, like the gopls ones, at least last time I checked.
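To give a sense of how minimal the config is, something like this `launch.json` covers both cases; adjust `program` to your entry point (the `"go"` debug type comes with the Go extension and uses Delve under the hood):

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug web server",
      "type": "go",
      "request": "launch",
      "mode": "debug",
      "program": "${workspaceFolder}"
    },
    {
      "name": "Debug tests in current package",
      "type": "go",
      "request": "launch",
      "mode": "test",
      "program": "${fileDirname}"
    }
  ]
}
```

There are many more options (attach mode, build flags, env), which is where the experimenting comes in.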