OpenGL isn't pretty, but it's at least cross-platform. And my impression was that OpenGL support is mostly handled by the GPU manufacturers, so I'm not sure how much Apple gains here by deprecating OpenGL.
Requiring developers to use an API locked to a particular platform feels pretty hostile to me. Doesn't matter if that API isn't perfect, or even far from it.
Although I agree it's a terrible decision for Apple only to have Apple-specific graphics APIs, please note that:
* Being deprecated does not mean that things will suddenly stop working. It will take a few more releases of macOS before this can be removed.
* Next to MoltenVK there is MoltenGL, an implementation of OpenGL ES 2.0 that runs on Metal [1]. That indicates it's at least feasible to wrap OpenGL applications in the future if necessary.
Furthermore, Apple will drop support for all Macs that don't support Metal (and therefore can't run Vulkan via MoltenVK) in this release of macOS [2]. Ouch, what a waste.
Nah. The GPU on Intel chips comes for free, and the eGPU thing, to me, is official notice that Apple thinks GPUs should be on the outside. I bet this generation of MacBook Pros is the last to have discrete graphics...
I sure hate it when Microsoft does it, but at least they have market share. Who wants to support Metal just to target the Mac? And last I checked I have the choice of OpenGL and Vulkan on Windows because these days MS doesn't control the hardware stack from top to bottom on their software platform.
>I sure hate it when Microsoft does it, but at least they have market share. Who wants to support Metal just to target the Mac?
Plenty of big 3D/CAD/etc. players? In lots of creative areas the Mac still dominates (despite stories about people moving to Windows, nobody's going anywhere; "nobody" here meaning that the switchers amount to quite few creatives overall).
Besides, with Metal they'll target iOS as well, which is a huge platform and the one where most of the mobile profits are.
CAD on Mac is pretty much non-existent, as is any professional 3D market - the market share isn't there, the hardware support is terrible, so few major players bother with supporting Macs. All this stuff is either Windows (CAD) or Linux (3D simulation, visualization) these days.
And with this deprecation Mac is pretty much dead as a platform for professional 3D.
Creative Suite has run better on PCs, and at a better price-performance ratio, for almost a decade.
Graphic Designers still like Macs for the most part I guess -- and I still see them in video production a lot, but that's starting to change pretty quickly.
That depends on who you ask. OpenGL is in the deprecated API section on MSDN[1]. Because of the ICD model, Microsoft can't prevent GPU vendors from adding OpenGL features, but they don't bother integrating it with modern Windows APIs. You can't create an OpenGL context on a DirectComposition surface or in a UWP app. It integrates poorly with the compositor. You can't get composition feedback, and most drivers will flicker or show artifacts when windows are resized. OpenGL apps don't get windowed-fullscreen optimizations and you can't control when they enter exclusive fullscreen mode. I don't think you can use windowed-stereoscopic or windowed-HDR either. All these issues push developers away from OpenGL and towards DirectX, which is what Microsoft wants.
It’s not deprecated because it’s not even there to begin with — Windows 10 doesn’t ship OpenGL by default; GPU vendors provide their own implementations.
Which AFAIK they’re free to do on MacOS as well, they just don’t seem to bother since Apple was doing that work for them
At least on Windows, the OpenGL implementation is part of the graphics driver. Why? Because by default Windows ships somewhere between rudimentary OpenGL support (at least up to Vista, I think; I am not sure about Windows 7 and 8.1) and none at all (Windows 10); the real implementation is what each GPU vendor provides as part of its graphics driver.
> Which AFAIK they’re free to do on MacOS as well, they just don’t seem to bother since Apple was doing that work for them
I'm not sure. NVIDIA provides updates for CUDA and an extremely limited number of updates for their graphics stack (AFAIK none at all for integrated graphics, for example).
OpenGL is pretty. Much prettier than these Metal and Vulkan abominations.
The difference is that OpenGL is designed to be easy for humans. glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd(); you can't beat that. The issue is that it's hard for the driver to optimize.
That's where Metal and Vulkan come into play. These are low-level APIs, sacrificing user friendliness for greater control over the hardware. They are designed for 3D engines, not for application developers.
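Spelled out, a complete colored triangle in that old style is just this (a sketch; it assumes a compatibility-profile GL context, since these calls are long deprecated):

    /* Legacy immediate mode: one call per vertex, no buffers, no shaders. */
    glBegin(GL_TRIANGLES);
    glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  0.5f, 0.0f);
    glEnd();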
Nope, glVertex3f was deprecated years ago by OpenGL itself. That is not the way the API works any more. [1]
Look into what it takes to write the minimum viable OpenGL program, written using non-deprecated routines, that puts a textured triangle on the screen. It sucks. On top of that, OpenGL is slow and gives you no way to create programs with smooth performance -- for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.
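For a sense of scale, here's a condensed sketch of the modern-GL setup for that textured triangle. Window/context creation and all error checking are omitted; a GL 3.3 core context with loaded function pointers (e.g. via glad) is assumed:

    /* Condensed sketch of the "modern GL" path to one textured triangle.
       Assumes an existing GL 3.3 core context and a loader such as glad. */
    #include <glad/glad.h>

    static const char *vs_src =
        "#version 330 core\n"
        "layout(location = 0) in vec2 pos;\n"
        "layout(location = 1) in vec2 uv;\n"
        "out vec2 v_uv;\n"
        "void main() { v_uv = uv; gl_Position = vec4(pos, 0.0, 1.0); }\n";

    static const char *fs_src =
        "#version 330 core\n"
        "in vec2 v_uv;\n"
        "out vec4 color;\n"
        "uniform sampler2D tex;\n"
        "void main() { color = texture(tex, v_uv); }\n";

    GLuint setup_textured_triangle(void) {
        /* 1. Compile and link shaders; there is no fixed function anymore. */
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vs_src, NULL);
        glCompileShader(vs);
        GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(fs, 1, &fs_src, NULL);
        glCompileShader(fs);
        GLuint prog = glCreateProgram();
        glAttachShader(prog, vs);
        glAttachShader(prog, fs);
        glLinkProgram(prog);

        /* 2. Upload vertex data into a VBO and describe its layout in a VAO. */
        float verts[] = { /* x, y, u, v */
            -0.5f, -0.5f, 0.0f, 0.0f,
             0.5f, -0.5f, 1.0f, 0.0f,
             0.0f,  0.5f, 0.5f, 1.0f,
        };
        GLuint vao, vbo;
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(float), (void *)0);
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(float),
                              (void *)(2 * sizeof(float)));
        glEnableVertexAttribArray(1);

        /* 3. Create and upload a texture (here just a single white texel). */
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        unsigned char pixel[4] = { 255, 255, 255, 255 };
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixel);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

        /* 4. Per frame: glUseProgram(prog); glBindVertexArray(vao);
              glDrawArrays(GL_TRIANGLES, 0, 3); */
        return prog;
    }

Compare that with the nine lines of immediate mode above.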
1990s-style OpenGL was good for the time. In 2018, OpenGL is a pile of poop.
> What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.

Maybe the games were not very complex? Professional game programmers building games with lots of shaders are very familiar with what I am talking about. See for example this thread:
State-based recompilation is a known issue in many GL drivers, particularly on mobile. E.g. changing blending settings may cause shaders to get recompiled. This can take up to a second.
Some engines work around this by doing a dummy draw to an offscreen surface with all pipeline configurations that they use at init time. This (usually) guarantees that all the shaders are pre-compiled.
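In sketch form the warm-up looks roughly like this; bind_offscreen_target(), pipeline_configs, apply_config() and draw_dummy_triangle() are hypothetical engine helpers, not GL calls:

    /* Warm-up pass at init time: render one invisible triangle per pipeline
       configuration so the driver compiles every shader variant up front. */
    bind_offscreen_target();                  /* tiny FBO nobody will ever see */
    for (int i = 0; i < num_configs; i++) {
        apply_config(&pipeline_configs[i]);   /* blend, depth, cull, shader, ... */
        draw_dummy_triangle();                /* forces driver-side compilation */
    }
    glFinish();                               /* make it happen now, not mid-game */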
I think the recompilations being talked about here are shaders generated by the OpenGL implementation behind your back. That is, your program never sees them as shader or program objects, because they implement some permutation of blend mode, depth test, culling type, etc.
While Vulkan is a bit verbose, it's not an order of magnitude difference if you follow modern OpenGL best practices. If you rely on default state, use the default framebuffer, and rely on implicit synchronization, you can squeeze it down to a few hundred lines, but that's not a good foundation to build practical apps on.
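To illustrate what explicit synchronization means in GL terms, here is a minimal fence-based wait (a sketch; real code would pool one fence per in-flight frame):

    #include <stdint.h>  /* for UINT64_MAX */

    /* After submitting GPU work that writes a buffer we want to reuse: */
    GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);

    /* ... later, before the CPU touches that buffer again: */
    GLenum res = glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT, UINT64_MAX);
    if (res == GL_WAIT_FAILED) {
        /* handle error */
    }
    glDeleteSync(fence);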
To give a ballpark figure, my Vulkan "base code" is less than 2x what my OpenGL boilerplate is for the same functionality. The big difference: the Vulkan code is easy to understand, but the GL code is not.
Comparing "Hello World" doesn't make much sense, OpenGL gets really darn complicated once you get past the basics.
In my opinion a similar difference exists between CUDA and OpenCL. OpenCL takes more code to get something simple going. But at least it doesn't break if you upgrade your gcc or use a different GPU vendor.
Each to their own, but over the last 6 months I've written a graphics engine in OpenGL + SDL. Once you truly understand modern OpenGL you realise how beautiful it is.
You will think it's less beautiful when you ship that game on several platforms and find that it has different bugs on each platform, on each hardware version, and on each driver version. And most of these bugs you can't fix or work around, you just have to bug the vendor and hope they ship a fix in a few months, which they usually won't because your game is too small for them to care about.
This happens in other APIs too (we definitely had it happen with DX11), it's just that OpenGL is a lot more complicated than anything else due to its history, so it has proportionally more bugs.
> glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd();
That's fine for a "hello triangle" program, but quickly becomes ridiculous for anything approaching a serious engine. There's a reason that glDrawArrays() has been around since 1995 (and part of the core specification since 1997).
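Even the 1990s API had the batched path. A minimal sketch of the same triangle with GL 1.1 client-side vertex arrays:

    /* Vertex arrays: one draw call for the whole batch instead of one
       function call per vertex. */
    static const float verts[] = {
        -0.5f, -0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
         0.0f,  0.5f, 0.0f,
    };
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, verts);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);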