Deprecation of OpenGL and OpenCL (developer.apple.com)
668 points by WoodenChair on June 4, 2018 | 452 comments



Ugggggh. As if graphics support on macOS weren't middling enough already. It's like they're trying to become as irrelevant as possible in that area.

I could understand if they were deprecating it in favor of Vulkan. That would be in-line with Apple's history of aggressively pushing forward new standards. But by no means do they have the clout to coerce developers into their own bespoke graphics API that doesn't work anywhere else. All they'll accomplish is further killing off the already-small presence they have in the gaming space.


Apple must truly hate gaming, or suffer from a serious case of Not Invented Here with their Metal stuff. As if any serious gaming studio would target Metal which doesn't run on Windows.

In fact, they couldn't get their act together and keep up with current versions, and as a result titles like Elite Dangerous were being shut down anyway. Reason: OpenGL stuck on an old version without support for compute shaders.

https://forums.frontier.co.uk/showthread.php/424243-Importan...

https://support.frontier.co.uk/kb/faq.php?id=228


As much as Apple would like to emulate the Microsoft of the 1990's, they're just so bad at it.

Embrace. Extend. Then extinguish.


Worse still, by ignoring a ubiquitous tech in favour of their own bespoke solution they are emulating Apple of the 90s!


Apple's model is Embrace. Replace. Extinguish.


I don't think there is any embracing.

Just Replace. Extinguish.


Or just plain extinguish.


To be fair, most games today are built using Unity3D, Unreal Engine etc, which all support Metal already. Hardly anyone writes their own game engines these days, and if they do they probably have the resources to support Metal. Overall still a bummer though.


The problem is still Apple forcing them to invest resources for no reason other than to advance their vendor lock-in. And if you're a developer of a small high-performance 3D graphics and GPU computing library like me, it's just a giant middle finger from Apple, and I will either need to drop OpenGL/OpenCL or drop Apple - there is no way that I can afford to offer both, especially since I'd need to buy Apple hardware to test things.


> most games today are built using Unity3D, Unreal Engine etc

On PC: Maybe by number of games, but not by the number of players.

(I count Fortnite as an outlier because it's technically not built on a third-party engine)


So, based on number of players, what is the most used engine today if not UE or Unity?

There's LoL, Dota 2 and Overwatch using their custom engines with huge numbers of players but... What else? CS:GO?


The Witcher 3 is using REDengine, GTA V RAGE, the Battlefields and SW:Battlefront {1,2} are using Frostbite IIRC, the two new Tomb Raiders are on Horizon, Rainbow 6 Siege and the Assassin's Creeds are on Anvil, Overwatch & SC2 have their own engines too, same for League of Legends, the CoDs are on a heavily customized id engine, Minecraft is custom, Bethesda have their own engines for Skyrim & Fallout, Path of Exile is custom too - all taken from Steam's 100 most played.


That's a nice list, quite complete. Many console exclusives also use custom engines by the way, e.g. Decima for Horizon:ZD, KillZone and Death Stranding, Naughty Dog has their own engine (don't know the name), etc.

The OP's point was that the companies that make these engines can afford to invest in supporting an additional back-end API, though. I think it's hard to argue that any of the companies that develop these engines would not be able to also add a Metal back-end. Many of them already work across a pretty wide range of back-ends anyway. Xbox One, PS4 and Switch all use completely different APIs, for example. I think most of the work is not in adding an additional back-end like Metal, but in tuning the back-end for some specific piece of hardware (NVidia vs. AMD vs. mobile GPU, etc).

Whether companies are actually willing to invest in a Metal back-end remains to be seen, but considering many of them license their engine for commercial use, I would be surprised if the major players simply ignore Metal.

I tend to agree with Jonathan Blow's comments on Twitter, that the low-level graphics API should be just that: as low level as possible, small, focused, and not necessarily intended (but still allowed!) to be used directly. Engines or higher-level APIs can be built on top of that, with the option to dive down to the lowest level when needed (which will probably be a rare occasion).

DirectX will definitely not be this API because it is Windows specific. Likewise for Metal because it is Apple-specific. Blow appears to be of the opinion that Vulkan is also not moving in the right direction, because it is becoming too complex, and trying to be too many things for too many applications at the same time.

If true, in a sense it's not that surprising Apple is doubling down on their own API. I think they should consider making the API open-source though, and develop something like MoltenVK in the other direction (Metal on top of Vulkan) for Windows/Linux.


>The OP's point was that the companies that make these engines can afford to invest in supporting an additional back-end API though.

OP doesn't know the diminishing profits of the AAA gaming industry.

OP would never have to justify to producers and directors why <1% of gamers should take 30% or more of the rendering architects' time.


Switch supports Vulkan.


Also, the Forza (custom) and Far Cry (CryEngine/DuniaEngine) series. Titanfall uses a modified Source engine IIRC.


CSGO and DOTA2 both run on different versions of the source engine.


The top 10 most played today in steam are using UE4 (2), Source 2, Source (2), and custom engines (5: AnvilNext, RAGE, Evolution). That's a lot of variety, there's almost no reuse.


Battlefield, CoD, AC, all the sports titles, GTA. More or less everything built by one of the big three uses their own in-house engine.


Err, at least some Battlefield games and FIFA 17 and 18 use the Frostbite engine. I don't think one can call it custom.


> what is the most used engine today if not UE or Unity?

My answer was aimed at that part. But yeah, maybe we should define what "custom" is.


> Hardly anyone writes their own game engines these days

More people than ever do.


> To be fair, most games today are built using Unity3D, Unreal Engine etc

What's the "etc"? Are there any other engines in that set?


With a bit of luck, Godot Engine. Sort of a dark horse, but I like it and my very smart corporate-programmer brother likes it. He says it's designed like a programmer would design it: everything's a node. I know I did a game in Unity (which has become overcomplicated) and had a surprisingly easy time jumping into Godot.


Cocos2D, CryEngine, MonoGame, Ogre, Unigine, etc…


You’re not giving history its due.

Go back in time six years ago. What were Apple’s choices?

(1) Continue to live with the deficiencies of OpenGL. Remember that, over time, it had come to fail at one of its primary purposes, which was to provide efficient access to GPU hardware. Further, sticking with OpenGL would be to accept the leadership of a group that had allowed its flagship standard to falter.

(2) They could marshal their resources and create the better API that the Khronos Group wouldn’t/couldn’t.

They really had no choice. Note that Vulkan wasn’t announced until after Metal was released.

The gripes in this thread should really be leveled at the Khronos Group, which fumbled their stewardship of OpenGL and, with it, the chance to lead open GPU APIs.


That timetable is being pretty generous to Apple. Metal, Vulkan, and DX12 are all reworked versions of Mantle.

The entire point of Mantle was to be a proof of concept that could be reworked into a cross-platform API (which became Vulkan); there was plenty of work already being done by Khronos in 2014 (and Apple knew this). And they just went out and released Metal anyway.

I also blame Microsoft for the same thing; early parts of the DX12 docs were taken word for word out of the Mantle docs, that's how similar they are. But Microsoft at least had a couple of decades of maintaining a competing API, whereas Apple went out and created a new one for some reason.


Talk about rewriting history: Mantle was never supposed to become Vulkan. It only happened because AMD was generous, and Khronos would otherwise still be pondering what OpenGL Next should look like.

It started as an AMD/Frostbite proprietary API.

https://community.amd.com/community/gaming/blog/2015/05/12/o...


They had no choice, really? Oh, give me a break.

AMD was basically in the same position when they started Mantle (at about the same time, no less).

They chose differently. Now we have Vulkan. No thanks to Apple.


While I get the concern, everybody's history here is backwards. Apple released Metal 2 YEARS before Vulkan. Why? Because OpenGL wasn't hacking it anymore and had become too asymmetric. Vulkan copied Metal, not the other way around.

I'm not sure they should have spun around and dropped Metal for Vulkan once it became available, or slowed down the pace of progress until the rest of the market caught up. Doesn't make sense.

Also, Apple is perhaps the largest GPU manufacturer in the world, with 200-250M GPUs shipped in 2017. That is 4-5X of Nvidia! And Apple is investing heavily in AI, from tools to devices to GPUs, so being able to customize may have tremendous value.

It is highly possible that Apple sees owning their interface stack as a means to keep their software-hardware warchest a couple of years ahead of the competition. In mobile, this has been paying off over the last 5 years, as they have constantly crushed all others by 2-3X.


Does it matter anymore? People are using less and less of the higher-level stuff of OpenGL. Most of the graphics code is now in the engine. OpenGL is getting very outdated; who, starting a project today, would choose it over Vulkan, DirectX or Metal? I would bet most small shops would prefer to use some sort of middle layer or engine from a third party. That pushes the problem of implementing the lower layers in Vulkan, DirectX or Metal to a small group of specialists.


Well, it matters for people who are writing an engine – or a web browser.


No, games aren't going to target Metal to support the Mac any more than printer manufacturers are going to go out of their way to support AirPrint to make printers Mac compatible.

What developers will do is go out of their way to support iOS, and supporting the Mac is just a side benefit. Just like almost every printer company supports the Mac as a byproduct of wanting to support iOS.


Hypothesis: There are more machines in consumer hands which support Metal than DirectX.

This may sound crazy, but remember there are billions of iOS devices out there in the world, and I don't think Xboxes plus Windows game machines count in the billions.

It's true Apple hasn't won the hard-core gamer market, but they are no longer the niche player that had to cater to Windows users.


If you're counting only gaming PCs (i.e. device used mainly for demanding 3D games) you should also count only gaming Macs/iPads/iPhones. How many are there in the world?


PCs use OpenGL and DirectX. All the Android devices use OpenGL ES. Older iOS devices use OpenGL.


>This may sound crazy, but remember there are billions of iOS devices out there in the world, and I don't think Xboxes plus Windows game machines count in the billions.

They should do, albeit not for gaming: most office machines run Windows with DirectX support. You won't be playing on them, though.


Android runs OpenGL ES and there are a lot more of those than iOS devices.


Are there more Android devices that actually have hardware that can play high-end games decently? The average Android phone is a low-end phone - with an average selling price of $225 for all Android phones, how can they not be?


Yeah, and they still support OpenGL. The most popular iPhone by a long margin is the 6, which is not exactly a graphics powerhouse.


Based on what statistics? Who is selling all of these high end Android phones? Even Samsung is selling mostly low end phones.

Also, looking at Apple's sales every year since the 6 came out, I doubt very seriously that Apple has sold more 6 phones than 6S, 7, 8, and X phones.

Also, if the 6 from 2015 is not a powerhouse, neither is the Samsung S8 that was introduced just last year...

http://bgr.com/2017/05/23/iphone-6s-vs-galaxy-s8-speed-test-...


OpenGL is part of the platform; they all support it. The stats page doesn't even include 'not supported' [1].

Being able to run anything slightly demanding is another thing, but you can't argue there's no support.

Also, the benchmark you linked is for application load times, which are heavily influenced by storage speed and load method (Android has to JIT compile sometimes) and see almost no impact from graphics performance other than the bus between CPU/memory and GPU.

[1] https://developer.android.com/about/dashboards/#OpenGL


Being able to run something suboptimally doesn't turn into sales. I'm sure that the owner of a $70 Blu R1 HD is not going to be spending money on high end games.


I've yet to see anyone build a hardcore Mac gaming machine.

Oh wait, you can't.


You could make a pretty safe argument that the iMac Pro is right up there with the best gaming PCs one could buy/assemble.


It really isn't. The fastest GPU available is a Vega 64 underclocked to basically the performance of a normal Vega 56. A 1080Ti is ~50% faster. Even if you connect an external 1080Ti it's constrained by TB3 bandwidth.


You can with an external GPU.....


Fortunately you can still use Vulkan on iOS and macOS through MoltenVK[1], a Vulkan implementation on top of Metal.

[1] https://github.com/KhronosGroup/MoltenVK
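
For the curious: wiring MoltenVK up is just normal Vulkan instance/surface creation with its macOS surface extension enabled. A minimal sketch (nsView is a placeholder for whatever your windowing code provides; error checking omitted):

    // Sketch only: create a Vulkan instance and a macOS surface via MoltenVK.
    // Assumes the MoltenVK/Vulkan SDK headers; nsView is a placeholder for the
    // NSView (backed by a CAMetalLayer) that your windowing code provides.
    #define VK_USE_PLATFORM_MACOS_MVK
    #include <vulkan/vulkan.h>

    VkInstance makeInstance() {
        const char* exts[] = {
            VK_KHR_SURFACE_EXTENSION_NAME,
            VK_MVK_MACOS_SURFACE_EXTENSION_NAME, // provided by MoltenVK
        };
        VkInstanceCreateInfo ci = {};
        ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        ci.enabledExtensionCount = 2;
        ci.ppEnabledExtensionNames = exts;
        VkInstance inst = VK_NULL_HANDLE;
        vkCreateInstance(&ci, nullptr, &inst); // check the VkResult in real code
        return inst;
    }

    VkSurfaceKHR makeSurface(VkInstance inst, void* nsView) {
        VkMacOSSurfaceCreateInfoMVK sci = {};
        sci.sType = VK_STRUCTURE_TYPE_MACOS_SURFACE_CREATE_INFO_MVK;
        sci.pView = nsView;
        VkSurfaceKHR surface = VK_NULL_HANDLE;
        vkCreateMacOSSurfaceMVK(inst, &sci, nullptr, &surface);
        return surface;
    }

From there on it's ordinary Vulkan; MoltenVK translates the calls to Metal underneath.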


Sure, but the whole point of both Vulkan and Metal is to bring out more performance by being lower-level. I'd assume that at least part of that benefit is lost when you use something like this.


A lot of the point of Vulkan is not having to rewrite your entire graphics stack for every OS you want to target.


But you have to rewrite it for every major hardware vendor, or else you won't get the performance you want.

GL should always work for the simple case, but instead you need to rewrite to avoid bugs in its many layers. And once you have Vulkan/Metal, industry-specific wrappers are better than the impossible-to-debug procedural GL junk.


I'm not sure I agree with the claim, but even if we take a full rewrite at face value, "every major graphics vendor" for desktop applications is NVIDIA and AMD/ATI. On mobile, you're probably using Unity or similar middleware and therefore not thinking about bare metal (no pun intended).


Actually it is more like every major GPU family, thanks to the extension fest of Vulkan.


That's the point of OpenGL; to my understanding, Vulkan is a modernized, lower-level spiritual successor to OpenGL.


Isn't that kind of the point of OpenGL?


Yes, but OpenGL is so outdated that the people who should use it the most (game and 3D application developers) were avoiding it due to a mismatch between the hardware and the API.

Vulkan was created to get that same portability, with an API that fits modern hardware.


That isn't remotely true. OpenGL is only outdated on macOS, where Apple hasn't updated it for 8 years.

OpenGL 4.6 isn't anything like OpenGL 1.0/2.0 even though you can still _run_ those old OpenGL 1.0/2.0 tutorials.

You can even do most of the cool stuff of Vulkan in OpenGL via AZDO techniques (example: https://developer.nvidia.com/opengl-vulkan )
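
To make "AZDO" concrete: the core of it is persistent-mapped buffers plus indirect multi-draw, which takes only a few calls. A sketch, assuming a GL 4.4+ context and function loader are already set up (bufferSize and drawCount are placeholder variables):

    // Sketch of the AZDO persistent-mapped buffer idiom (OpenGL 4.4+).
    // Fencing (glFenceSync) omitted for brevity.
    const GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;

    GLuint buf;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);
    glBufferStorage(GL_ARRAY_BUFFER, bufferSize, nullptr, flags); // immutable storage
    void* ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, bufferSize, flags);
    // `ptr` stays valid for the buffer's lifetime: the CPU streams vertex
    // data through it every frame, with no glBufferSubData/glMapBuffer churn.

    // Paired with indirect multi-draw, a single call then submits many draws
    // (assuming a VAO and a GL_DRAW_INDIRECT_BUFFER are bound):
    glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT, nullptr, drawCount, 0);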


> OpenGL is only outdated on macOS

Also on Windows.

Apparently, if you want to distribute your software to a wide audience, you can only rely on OpenGL 3.0 with a minimal set of extensions. Here’s an example: https://github.com/Const-me/GL3Windows#building-and-running

All the target systems had the latest Windows updates, and they all run Direct3D 11 software just fine (I mostly develop for D3D and I test on them). On some systems it works in 10.1 compatibility mode; MS calls that “feature levels”. Not a big deal in practice, the majority of D3D11 stuff still works OK.


No, you're getting stuck at 3.0 because you're hitting the deprecation strategy. You need to specifically request a post-3.0 context with wglCreateContextAttribsARB which you're not doing. Thus the system thinks you're an old legacy OpenGL app, and is giving you 3.0 as that was the last version before things were removed.

See: https://www.khronos.org/opengl/wiki/Tutorial:_OpenGL_3.1_The... for a tutorial.

My Pascal-based Nvidia GPU is showing OpenGL 4.6 on Windows 10. Nothing outdated here.
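
Roughly, the dance looks like this (a sketch, assuming a temporary legacy context is already current so wglGetProcAddress can resolve the extension; the 4.5 version numbers are just an example):

    // Sketch: explicitly requesting a modern core-profile context on Windows.
    // Assumes a dummy legacy context is current (wglCreateContext + wglMakeCurrent)
    // so that wglGetProcAddress works, and that wglext.h from Khronos is available.
    #include <windows.h>
    #include <GL/gl.h>
    #include <GL/wglext.h>

    HGLRC createModernContext(HDC dc) {
        PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
            (PFNWGLCREATECONTEXTATTRIBSARBPROC)
                wglGetProcAddress("wglCreateContextAttribsARB");
        if (!wglCreateContextAttribsARB)
            return nullptr; // driver doesn't expose the extension

        const int attribs[] = {
            WGL_CONTEXT_MAJOR_VERSION_ARB, 4,
            WGL_CONTEXT_MINOR_VERSION_ARB, 5, // ask for what you need
            WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
            0 // attribute list terminator
        };
        return wglCreateContextAttribsARB(dc, nullptr, attribs);
    }

Without this, you get the legacy/compatibility path described above.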


> No, you're getting stuck at 3.0 because you're hitting the deprecation strategy.

I think you’re wrong here. Two reasons.

1. If that were the case, I would be stuck with GL 3.0 regardless of the GPU. In reality, I’m only stuck with GL version < 4.0 on HD2000 and VMware. On my desktop PC (Maxwell at the time I wrote that demo), OpenGL 4.0 worked just fine in that very project. Even on an Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.

2. Please read Intel’s documentation: https://www.intel.com/content/www/us/en/support/articles/000... Specifically, please expand “2nd Generation Intel® Core™ Processors” section. As you see in that table, Intel says HD Graphics 3000/2000 only support OpenGL 3.1, which is exactly what I’m getting from the GLEW library I’m using in that project.

Also, you can see in that article that no intel GPU supports GL 4.6 mentioned by GP. Even the latest generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they support the latest DirectX 12 for several years already.


> 1. If that were the case, I would be stuck with GL 3.0 regardless of the GPU. In reality, I’m only stuck with GL version < 4.0 on HD2000 and VMware. On my desktop PC (Maxwell at the time I wrote that demo), OpenGL 4.0 worked just fine in that very project. Even on an Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.

Behavior depends on whether the device supports the 3.2+ compatibility profile, which is optional.

You're hitting the legacy path, that's well-defined ( https://www.khronos.org/registry/OpenGL/extensions/ARB/WGL_A... ). You need to use the method I mentioned to get real post-3.0 OpenGL.

> Also, you can see in that article that no intel GPU supports GL 4.6 mentioned by GP. Even the latest generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they support the latest DirectX 12 for several years already.

ok, so? 4.5 isn't really outdated, either. It still supports all the modern good stuff. And, as we've established at this point, it's not Windows stopping you from leveraging the full extent of the hardware you have. By contrast, macOS does stop you from using the hardware you've got to the fullest, as it's stuck on 4.1.


> depends on whether the device supports the 3.2+ compatibility profile, which is optional.

For the systems I have in this house it’s not required, i.e. I’m getting the same OpenGL version that’s advertised by the GPU vendors.

> You need to use the method I mentioned to get real post-3.0 OpenGL.

Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.

> ok, so? 4.5 isn't really outdated, either.

Right, but 3.1 (Intel Sandy Bridge) is. And 4.0 is outdated, too (Intel Ivy Bridge). Meanwhile, modern Direct3D works fine on these GPUs: 11.0 at feature level 10.1, and native 11.0, respectively.


> Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.

Go read the extension I linked, it explains the behavior you're seeing. Also go read the tutorial I linked, it's using GLEW and shows you how to create a context.

You have a bug if your intention is to get a post-3.0 OpenGL context. Whether or not you care is up to you. You may be perfectly happy being in the compatibility bucket. I don't know. But you're not in the explicit 3.1 or later path.

> Right, but 3.1 (Intel Sandy Bridge) is.

Sandy Bridge is a 7-year-old CPU. Of course it's outdated...? And D3D 10.1 is from 2007; it's also hugely outdated. You're not getting anything more modern out of the hardware with D3D than you are with OpenGL here. I don't even know what argument you're trying to make at this point.


Just a nitpick, OpenGL 4.0 is newer than DirectX 11.0 and on par in capabilities.


No. Both ATI and Nvidia drivers include recent OpenGL versions, so OpenGL support problems are limited to hardware that genuinely isn't capable.

In the old link you offer as an example, Intel HD3000 and HD4000 are bad, with bad drivers that lie about OpenGL versions (hence the need to downgrade the client), and fortunately obsolete. Current Intel integrated graphics have improved. And VMware is a virtual machine, not hardware; it should be expected to be terrible.


> Intel HD3000 and HD4000 are bad, with bad drivers that lie about OpenGL versions

Technically that’s probably true. However, if you drop support for Intel GPUs, your GL4+ software will no longer run on a huge number of older Windows laptops people are still using. For many kinds of software this is a bad tradeoff. That’s exactly why all modern browsers implement WebGL on top of Direct3D, and the overwhelming majority of multi-platform games and 3D apps use D3D when running on Windows.

> VMware is a virtual machine, not hardware; it should be expected to be terrible.

It’s only terrible for OpenGL. The virtual GPU driver uses the host GPU to render stuff, and it runs D3D11-based software just fine. I don’t use it for gaming, but it’s nice to be able to use a VM to reproduce and fix bugs in my software caused by an outdated OS, Windows localizations, and other environmental factors.


That's not why they do that at all. They don't need anything recent from OpenGL or Direct3D, which is why they target DX9. And DX9 specifically is targeted because it also works on XP, which D3D10 doesn't.

Intel's D3D drivers have historically been better than their OpenGL ones (which isn't saying much, since their D3D drivers are also trash), but now we're talking about the driver quality of one player, which has nothing to do with the API itself or OpenGL somehow being outdated on Windows.

But ANGLE also targets desktop OpenGL (and vulkan), and as OpenGL 4.3 adoption increases I'd expect increasingly more browsers to use it for WebGL 2.0 since you don't need translation there at all. OpenGL 4.3 provides full compatibility with OpenGL ES 3.0.

You seem to be pretty confused about how OpenGL versions line up with the D3D ones, too. For reference, OpenGL 3.1 is roughly equivalent to D3D 10.1. When you're complaining about only getting GL 3.1, you're also complaining about being stuck with D3D 10.1.


Yeah, but that software won't run inside Windows containers, like the store, or work with the Visual Layer Engine in W10.


You know that OpenGL is an API standard, and not a piece of software, right?


Yes? What does that have to do with anything that I've said?


It's still way faster than OpenGL in some scenarios.


Way faster in most scenarios actually. I'll give it that. (Any scenario I've tested anyway.)

But it's also more difficult for the lay programmer to use as well.


Just use a vulkan implementation of opengl.

OpenGL -> Vulkan -> Metal

I can feel the performance gains already.


Why is Apple investing in Metal at all then?


Same reason Microsoft invests in DirectX. Lock in software developers and consumers alike.


I can kind of understand iOS, but it’s not like there’s a thriving graphical computing market worth locking in on the mac side. All major titles already use game engines. They’d just be locking out smaller developers who can’t invest in porting all their shaders: it’s not gonna be worth the effort.

Granted, this may also be true for OpenGL.


But iOS is a lot bigger than macOS.

You notice that Apple supported OpenGL while they were making most of their money from desktop and laptop sales; but once iOS became so profitable, they decided to go their own way and start pushing Metal.

Lock in, or at least getting people to do more iOS first development (helped by lower profits on app store sales on Android, Android fragmentation, etc), helps Apple out a lot. You get the first version of apps and games, or more polished versions of apps and games, on iOS this way.


So why lock in macOS at all?


Because the developers who would be working on OpenGL on macOS are working on Metal instead, because that's where the value is for Apple.


Maybe... but why would you assume they continue developing for Macs at all? Small studios might not have the resources, and the market is tiny for many apps, e.g. indie games, modeling software, and ML (to be fair, Apple has repeatedly emphasized they don’t care about ML on the desktop by not offering NVidia cards...).

And again, I don’t see the benefit for apple over supporting cross platform apis to encourage development. It seems like a net loss for everyone but some line in their budget on driver maintenance.


They do make some money on Macs, and Mac software, but not nearly as much as on iOS.

Providing macOS gives a developer and designer platform for iOS. That is really important for them. So Metal being available on macOS is important for that reason. But it's also important in that the Mac platform is still important, just not nearly as important as iOS.

OpenGL doesn't really have much of a future. Everyone is moving towards the next-generation frameworks. It just happens that there was a lot of uncertainty about whether OpenGL could adapt or whether there would be a successor, and during that time Apple decided to invest in developing Metal. It wasn't until a couple of years later that Vulkan was released.

In the meantime, Apple has built up quite a lot of tooling around Metal.

And it's not like it's that difficult to write cross platform apps that target the Mac. If you write against major engines, they will already have support for the different backends. If you are writing your own software, you can still target OpenGL, or you can target Vulkan and use MoltenVK to run it on macOS.

And for the next several years, people writing portable software are going to have to either just target OpenGL, for compatibility with older graphics cards, or maintain at least two backends, OpenGL and Vulkan. Given that lots of games target DirectX first, and consider any of the cross-platform frameworks a porting effort, Apple probably doesn't consider it a big loss to add one more platform that needs to be ported to.

What's going to wind up happening is more software like ANGLE (https://github.com/google/angle), MoltenVK (https://github.com/KhronosGroup/MoltenVK), and gfx-rs (https://github.com/gfx-rs/gfx and https://github.com/gfx-rs/portability, see http://gfx-rs.github.io/2018/04/09/vulkan-portability.html for details) for providing one set of abstractions on top of several different backends.


Wouldn’t they be better off attracting developers instead of locking them in?


They could easily jump ship if that weren't the case. If you're working on Metal, your skills aren't of much use at Microsoft.


It's tailored to their hardware, more modern, and written in Objective-C, which makes it much easier for Mac developers to integrate into their projects, since Objective-C interfaces nicely with Swift and most scripting languages.


Metal's shading language is C++-based, not Objective-C.


Ditto for OpenGL ES (http://moltengl.com/)


I've been using a cross-platform GUI framework/engine to do app development on all the platforms: Linux, MacOS, Windows, iOS and Android - and it has been a joy to deploy one app on all of these systems.

One of the reasons this has been so feasible has been the fact that the engine (MOAI) uses GL ES to manage the framebuffer itself - giving the same look and feel on all platforms the host runs. This has been, honestly, revolutionary in terms of building a single app that runs everywhere.

This now becomes more complicated because the engine has to be modified for Apple platforms to use Metal, and represents another fork/splinter in the unity of the host itself.

I wonder if their decision to use a non-standard graphics API is due to them wanting to make this style of development a lot more difficult in the future - i.e. are Apple passively antagonizing the cross-platform framework builders in order to establish an outlier condition for their platforms? The cynic in me says yes, of course this is what they are doing ..


>All they'll accomplish is further killing off the already-small presence they have in the gaming space.

In the AAA game space you mean. Else, in the casual gaming space, iOS is perhaps the most popular platform -- and the new integration effort means all those games will be able to run on macOS as well soon.


And those games are horrible. Almost all of them are built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted. The biggest difference between those games and gambling is that you don't carry a slot machine in your pocket. For the most part the only exceptions to that are the games that were ported from desktop.


>And those games are horrible.

They work fine for me -- both as implementation and as gameplay.

>Almost all of them are built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted.

I think you confused casual gaming with Zynga or something. I was referring to smaller, non-AAA megatitles. Could be anything from a platform game, to Angry Birds, Monument Valley, Threes, Letterpress, racing games, RPGs and so on...


I'm not saying there aren't decent games on iOS. You can find gems like Monument Valley, Florence, or, as I mentioned, the games ported from other platforms like Limbo, Terraria, and so on. But take a look at the top charts on the iOS app store and compare that to the top games on Steam. With few exceptions, the games on iOS are riddled with ads, microtransactions, and are designed to be as addictive as possible.

The point is that the kind of games that thrive on the app store tend to be exploitative and low quality. Desktop gaming isn't immune from that, but it's a dramatically better platform.


Not if Microsoft has any say. I can't count the number of Windows Updates that re-installed the previously uninstalled Candy Crush Saga, Bubble Witch 3 Saga, and March of Empires (among other titles).


I played the fuck out of Angry Birds - on Android. How exactly does forcing developers to adopt a platform specific API help anyone? That was a rhetorical question BTW, don't even try to answer it. Apple are being arrogant as fuck with this.


>How exactly does forcing developers to adopt a platform specific API help anyone?

Well, platform specific APIs aren't lowest-common-denominator affairs, and get support for native platform capabilities faster (plus can be more optimized).


I understand their benefits. But why refuse to support standards as well? I don't think Apple is short on resources.


I think because then you don't give developers the extra motivation to use your platform APIs.


That just means the benefits don't outweigh the down side - lack of cross platform API, new learning curve to climb.


You talk of a subset. A lot of casual games on iOS are very good: Cut the Rope, Angry Birds, Bad Piggies, Simple Rockets. Civilization for iPad was very good. I can't remember all the stuff I've played, but a lot of games are not the Candy Crush kind.

Also, a lot went wrong when Apple opened up for ads and in-game purchases.


PUBG Mobile is one of the best games I have played in a long time. And it doesn't cost me a penny. Nor do I have to pay to win. (Actually, I may have to upgrade my phone to play better.)

But not every game is gambling. Fortnite seems to be doing great, and that shouldn't be a pay-to-win game.


PUBG was ported from desktop. It originally started out as an ARMA 2 mod, was turned into PUBG, and only much later ported over to mobile. It's a perfect example of the kind of game that can come out of the desktop gaming community. You don't get games like PUBG, Minecraft, Starcraft, Terraria, Civ, Kerbal, and so on without desktop gaming.

The games that grow out of the app store ecosystem are games like Candy Crush, Clash of Clans, Clash Royale, etc. I'm not saying good games don't exist on the platform, I'm saying the platform is conducive to low quality games. Almost all of the great games on the app store did not grow out of the platform.


Woah now, Clash Royale is a good game that is skill driven.

People take level 1 accounts into legendary arena which shows that skill dominates money.

How on earth the devs at Supercell tricked management into making a skill-based rather than money-based game is genuinely a mystery to me.


With that I certainly agree. But I do think the future is games built on top of a game engine. Unreal seems to have a massive improvement changelog every 6 months, and if Unity didn't exist I doubt anyone could compete within a reasonable budget.

The choice of game engine will become like the old days of OpenGL vs. DirectX. No one sane would write their own engine from scratch.


“99% of everything is shit”

There are plenty of great games. It helps to source them from a gaming community you trust.


>> built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted

I guess that explains why they are on Apple devices in the first place.


I don't expect there will ever be an AAA game presence on macOS at this point, given so few of their machines offer dedicated GPUs anymore.

And even in cases where they are available, for example Macbook Pros, the cost difference involved in stepping up from an integrated GPU to an entry-level dedicated card is greater than the cost of buying an Xbox or PlayStation.


None of which support either Vulkan or OpenGL.


XBoxes and PlayStations have been getting more game ports than macOS tho, doesn't take a genius to figure out why.


Consoles have always had more game ports than other computer systems.

Having a fixed set of hardware to target is a dream.


Additionally, I don't think indie developers have loads of time on their hands to port their niche games over to a new technology. I can see Unity supporting Metal, but smaller platforms (jMonkeyEngine) will have a slower adoption rate, and in that time hopefully open-source middleware will come out to handle legacy APIs.


Indie engines like Ogre3D just need to add a new backend to their already backend agnostic engine.


Sure, but this makes it a right pain in the arse to bring over apps from other platforms.


Well, at least in the Rust ecosystem there is https://github.com/gfx-rs/gfx which provides backends for Vulkan, DirectX 12, Metal and OpenGL. I'm not sure it's super relevant outside of the Rust ecosystem right now, but it's worth spreading the word about such solutions.


How is the cross-API shader support?


Surprisingly good. My tests have run properly across 4 OSes without much change. I can't verify for Apple though.


> across 4 OSes

Windows, GNU/Linux, Android; and what other OS?


https://github.com/gfx-rs/portability may be relevant outside of Rust, since it's a linkable C library.


All they'll accomplish is further killing off the already-small presence they have in the gaming space.

That of course depends on the definition of "gaming space".

In the classical desktop gaming space, Apple was never a player. They simply don't care about it. Hence the way they treated OpenGL on macOS.

But: Apple is arguably the biggest player in the mobile gaming space. That's what they care about. So instead of spending a large amount of money to attract a low number of AAA desktop titles to their OS they just tap into the vast (game-)developer base that they already have in iOS and make it easy for them to deploy and sell their games on macOS too [1].

The move to deprecate OpenGL and OpenCL in favor of Metal makes total sense in that regard.

[1] https://techcrunch.com/2018/06/04/apple-is-bringing-the-best...


And where is Vulkan standard? Windows games use DX. iOS is Metal. Android is OpenGL ES, etc. Gaming consoles have proprietary APIs.

I would very much prefer games to use Metal on macOS (StarCraft 2 is much smoother on the same hardware).


In Khronos' dreams. Even on Android, which adopted it in version 7, it is not widespread enough to show up on the developer dashboard.

https://developer.android.com/about/dashboards/

And good luck finding a Vulkan version beyond 1.0, with extensions that work properly across all mobile devices.

http://vulkan.gpuinfo.org/vulkansupport.php#android_devices

You get a spectrum all the way from 1.0.0 up to 1.0.66, with Vulkan 1.1 promised for Android P devices only.


Nintendo Switch


I believe, as of now, there are more AAA games running Metal than there are running Vulkan. Pretty much every new macOS game release runs Metal now, while a game running Vulkan on PC is considered a rarity.


Because on PC, there are working alternatives.


"But by no means to they have the clout to coerce developers into their own bespoke graphics API that doesn't work anywhere else"

To be fair, the overwhelming majority of game shops develop on engines, and leave the engines to deal with the platforms. Unreal Engine, Unity, etc, support Metal, among others.


Not everything that runs on OpenGL is a video game. There are tons of applications out there that just won't have the budget to do a rewrite (and even fewer were probably set up with the right architecture if they were depending on OpenGL in the first place).


Indeed, I immediately thought of 3D applications like Blender, Maya, etc. which use OpenGL.

It's a very weird move to me; even if the software in question is kept compatible with Apple's legacy OpenGL, these versions will be worse than their counterparts running on other platforms making use of shiny new OpenGL features.

It's like Apple is saying 'we don't care' to the 3D professional market. Also, doesn't Photoshop rely on OpenGL these days as well?


After AutoCAD for Mac was discontinued in 1994, people begged for 18 years to get it back, and now Apple says "eh, we didn't want that anyway."

I heard they have a WebAssembly/WebGL version now, betting that'll get wrapped up in a WebView and we can all pretend it's a native program still.

Speaking of WebGL, that's basically OpenGL ES 2.0, but I assume the implementation in WebKit is backed by Metal? What about other browsers like Firefox?


Firefox uses OpenGL to implement OpenGL ES. It also uses OpenGL for hardware accelerated compositing.


AutoCAD is a dead technology. Architects/Structural Engineers/MEP Engineers are moving to BIM platforms (Revit, ArchiCAD, etc)! Product/Automotive/Industrial design and engineering use PLM tools (Catia, SolidWorks, etc). Besides, AutoCAD didn’t/doesn’t need much graphics power at all. AFAIK it never really used OpenGL.


Translating between these two is not particularly hard. (Similar to Vulkan backend for OpenGL)


It is. Keep in mind you have to transform the shaders too, and make them safe so that they don't allow undefined behaviour.


LibANGLE


> Photoshop rely on OpenGL these days as well ?

Yes, though with Adobe's relationship with Apple they probably got all the handholding and resources to do the port to Metal.


I'm not sure about that. Or maybe Adobe just doesn't care. My 2017 MacBook Pro has horseshit graphical bugs in both Illustrator and Photoshop. I'm exclusively doing all my graphics work on my Windows 10 machine now (even though Windows and my Wacom tablet do not play nice together).


I'm in the same boat; my so-called Pro machine, the USB-C 2017 MBP, has had glitchy, completely unusable rendering on the latest version of Illustrator since October 2017. Adobe blames Apple, and presumably Apple blames Adobe, because neither of them is fixing it.

As if my deteriorating keys on this machine were not bad enough. This wasn't a good WWDC for me. My PC is working great despite being an obscure setup with mismatched GPUs; can't say I understand why the graphic designer workhorse machine, the MBP, is unusable with Illustrator while that thing works just fine.


> even though Windows and my Wacom tablet do not play nice together.

They don't? What's the matter? (Mine works great, but I'm not a heavy user, so I'm curious if it depends on the model or I just didn't run into it so far.)


Microsoft changed the pen behavior in one of the Creators Updates and now the pen buttons behave strangely (they randomly don't work in certain applications); the pen was also registered as a finger in legacy applications for a while... making Windows 7 the only really viable way to use a Wacom as a professional (speaking as one).

Let alone the inability to reconfigure things like N-trig pens to have hover right-click/middle-click functionality; it's been INCREDIBLY frustrating without any communication from Microsoft.


I've got the Intuos Pro from a couple of models back. Windows Ink randomly causes pressure sensitivity to drop out (especially since the Creators Update). On Windows 8 I never had trouble with the wireless adapter; now I have to run wired. Button clicks don't always register and sometimes will send the wrong input.

Overall it's rough; there are days when it seems better than others, but I'll randomly lose sensitivity, and multiple reboots appear to be the only pseudo-consistent means of getting it back.

That being said - It's still way more usable than Photoshop/Illustrator on my Mac.

I miss 15 years ago when I had CS2 + Intuos Pro 2 and everything just worked.


Photoshop CC 2015 already had partial support for Metal.


True, but they didn't remove OpenGL, they simply deprecated it (e.g. don't expect any updates to it, new tooling will not be built around it, etc). That shouldn't affect legacy apps.


Yes, and deprecation doesn’t mean a lot on the Mac. Apple often deprecates stuff and still leaves it in. They remove it only when there’s something to be gained.

(eg. linking with the system-provided OpenSSL has been deprecated for years, but AFAIK they still ship it.)


They mentioned in the State of the Union that this is the first step towards removing it.


Chicken and egg. If they remove it apps stop working, if apps don’t update they can’t remove it.


I’m not so sure they’re worried about apps breaking. They’ve certainly stuck to the “no more 32bit iOS apps” thing.


Apple can get away with that on iOS, but they're a lot more conservative with macOS.

To expand on your example, I maintain a legacy app that is stuck in 32-bit land because it relies on the QuickTime framework. QuickTime has been deprecated for seven years, and the transition to 64-bit has been in progress for over a decade, and yet my legacy app runs just fine even under the Mojave beta. There are multiple time bombs lurking in that app, and one of these days I'm going to have to rewrite it from the ground up, but I've been astonished at how long it has lasted.

Apple knows it would be bad karma to make a large number of legacy apps and games suddenly break on the Mac. They're not idiots; they have a perfectly good idea of the scale of mutiny that would ensue. So I'll eat my hat if OpenGL doesn't continue to work for at least the better part of the next decade.


They said in the Platform State of the Union that Mojave will be the last macOS that runs 32bit apps, so QuickTime.framework and your app are running out of time!


Huzzah! Thanks for the heads-up. I’m looking forward to catching up on the whole SOTU.


They specifically mentioned QuickTime in the release notes as well.


> QuickTime has been deprecated for seven years,

It has entirely been removed from the latest (since 10.12 IIRC?) SDKs so now you have to keep older SDKs just to build your app.


True — 10.12 is my recollection as well — but I’ve been bitten so many times by compiling under a new SDK, especially with an older build target, that I do that as a matter of course anyway.


I find it hard to fathom that people think a huge software company like Apple doesn’t have awareness of the impact of its changes or people responsible for compatibility.


If there isn't one already, I'm sure someone will implement OpenGL on top of Metal when it's needed badly enough. At least they're going closer to the hardware, not further away.


Already there: https://moltengl.com/moltenvk/

"MoltenVK is a runtime library that maps Vulkan to Apple's Metal graphics framework on iOS and macOS."


But MoltenVK is for Vulkan (i.e. not OpenGL) - and not 100% of Vulkan, AFAIK.


Both exist: MoltenGL is for OpenGL ES https://moltengl.com/moltengl/


OpenGL ES is not OpenGL though.

It's a subset mostly for mobile.


LibreOffice uses OpenGL and OpenCL extensively.


Why use LibreOffice on a Mac when Pages, Numbers and Keynote are free (as in beer)? I’m going to go out on a limb and make a baseless argument that the LibreOffice install base on the Mac is very low. On iOS it’s non-existent.


There is no way that Pages, Numbers and Keynote can open as wide a range of file formats as LibreOffice can. And there are way more features in LibreOffice.

On iOS, dunno, you probably have a point there.


Having used Pages and Word, please don't tell people to use Pages for everything.

It doesn't have the features you need when you're creating more complex documents. Last time I used it, it didn't even allow you to have different sections, which let you switch the rotation of individual pages.


Every game developer I know turns off the Metal rendering pipeline and uses the much more stable and refined OpenGL one, unless every tiny bit of performance needs to be squeezed out.

I’ve witnessed plenty of last minute builds be saved by a Unity game dev on a Mac just flipping their renderer settings.


Sounds like this might be the incentive Unity needs to fix their Metal implementation.


I don't think it's much of an incentive. According to Valve's hardware surveys roughly 3% of Steam's market is MacOS. Those type of numbers are similar across different distribution platforms that a Unity game dev will target. It'll be hard to nudge it away from low priority with that share.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...


The other figure to look at is the amount of money spent. If it's similar to the hardware percentage then you're right, if on the other hand, macOS users spend more on games then a rethink is in order.


Do you have a source for that? Because I can't really find a figure. Genuinely curious.


True. This is the only sane reasoning they could've had.


This is a problem that is entirely of Apple's own doing.

Microsoft could not care less about OpenGL on Windows. However, it works just fine.

You know why? As soon as you install your video card drivers, your OpenGL implementation is no longer from Microsoft. It comes from AMD, NVidia or Intel, with all needed optimizations for their hardware.

Apple insisted on not allowing this and on doing the OpenGL implementation themselves (which was always crappy and outdated).

Had they allowed the GPU vendors to provide their own implementations, this would have been a non-issue.


OpenGL is very much a second class citizen on Windows. Mass-market OpenGL apps like browsers currently use ANGLE to emulate OpenGL on top of D3D. Native OpenGL is used in professional apps that can make demands on GPU and driver setups.

(Many toolkits, like Qt and Cocos2d, also use ANGLE on Windows for OpenGL functionality)


Pro 3D apps from Autodesk tend to use DirectX. Certainly 3dsmax and Revit use DirectX over OpenGL and have done for a while.

The same is true for plenty of other “Pro” apps on Windows.

https://knowledge.autodesk.com/support/3ds-max/learn-explore...


This is to a large extent because browsers can't make the same assumptions about reasonable graphics drivers being installed as games can.


What makes you think that Apple refuses to allow GPU vendors to provide an OpenGL implementation?

The real question is why would a GPU vendor go through the expense of creating and supporting such an implementation when Apple doesn't even make a computer with slots that you can install their video cards into?

If producing an OpenGL implementation doesn't provide a competitive advantage for selling their products, why would they bother?


> What makes you think that Apple refuses to allow GPU vendors to provide an OpenGL implementation?

Because it is a fact that Apple develops their own drivers. Also, when did you last download a driver update from NVidia for your Mac?

https://arstechnica.com/gadgets/2018/02/vulkan-is-coming-to-...

The article is about Vulkan, but briefly mentions Apple's own outdated OpenGL stack.

http://renderingpipeline.com/2012/04/sad-state-of-opengl-on-...

From 2012. Which means it is still accurate, given how out of date drivers are.

There are more references, you can look it up.

> The real question is why would a GPU vendor go through the expense of creating and supporting such an implementation when Apple doesn't even make a computer with slots that you can install their video cards into?

They still have GPUs, which can be Intel, AMD or NVidia depending on year and model. Just because they are soldered on, doesn't mean they don't need drivers.

EDIT: Some more research seems to indicate that there are drivers developed by NVidia for the NVidia Quadro GPU line.


> Also, when did you last download a driver update from NVidia for your Mac?

Last week.

Nvidia releases drivers for cards that the drivers which ship with macOS don't support. I would also guess that the Nvidia drivers which ship with macOS are written by Nvidia under some agreement with Apple; the same is likely true of AMD and Intel.


Last week. Nvidia releases drivers for cards that the drivers which ship with macOS don't support.

That's a tiny part of Apple's lineup though.


Yes, but you need to switch to an older version of Xcode/developer tools if you want to program CUDA on a Mac. Specifically, I have to switch back to last December's release when I want to do any CUDA development on my 2015 MacBook (I don't think there are any later Macs that even support NVidia).


That is not a macOS specific problem. Because CUDA interacts with the compiler, you can have exactly the same problems on Linux with gcc.

(Ironically, using OpenCL avoids this problem)


Yes, once upon a time both Microsoft and Apple provided an implementation of OpenGL with their OS.

When Microsoft abandoned OpenGL for DirectX, GPU vendors produced their own OpenGL implementations because doing so provided a competitive advantage that allowed them to sell more product.

The question is, why would those GPU vendors do the same thing now that Apple is following the same path?

Apple doesn't even produce a computer with slots you can install their products into.

>EDIT: Some more research seems to indicate that there are drivers developed by NVidia for the NVidia Quadro GPU line.

Keep doing research, because NVidia provides downloadable Pascal drivers even though the last time Apple produced a computer with a PCI slot was the Cheese Grater Mac Pro which came out over a decade ago.

https://9to5mac.com/2017/09/27/nvidia-pascal-drivers-high-si...

Making sure nothing diminishes CUDA is very much in NVidia's competitive interest.


NVIDIA has mac drivers for their whole consumer line. For example https://images.nvidia.com/mac/pkg/387/WebDriver-387.10.10.10...


Good to know. It seems that they also provide updated CUDA drivers.

It doesn't seem to change anything on the OpenGL stack, unfortunately.


It just goes to show that NVidia thinks supporting CUDA everywhere is very much in their competitive interest, while creating and supporting an OpenGL implementation simply is not.


Yes. But tell me a Mac past 2015 that you can even use an nVidia card in.


They're doubling down on eGPU support, so there's that. But not as ideal as having something actually inside your machine.


Any Mac that can use an eGPU.


Only on desktop.

In case you missed the news, only DirectX is supported on UWP and store apps.

To the point that Microsoft has their own port of Angle.

https://docs.microsoft.com/en-us/windows/uwp/gaming/compare-...

https://github.com/Microsoft/angle

So unless they change their mind, say goodbye to ICD drivers on Windows as well.


Is UWP/Store actually taking off? I was under the impression that it was another flop.

I run Windows at home, but wouldn't if it went Store-only. For me, an open platform is the only thing Windows had going for it.


Yes, little by little.

Quite a few apps like Adobe XD are store only.

Next Office for Windows 10 is only available via the store.

Microsoft has taken the other approach, if apps don't come to the store, the store comes to the apps.

So thanks to the outcome of Project Centennial, they are now merging the UWP and Win32 worlds into Windows containers and making the store Win32 aware as well.

https://bramwolfs.com/2018/03/13/msix-the-platform-for-all-w...

Deep down session from BUILD here, https://channel9.msdn.com/Events/Build/2018/BRK2432


As a long-time professional game engine programmer, it is hard for me to see consternation over things like this, and avoid judging it as mainly ignorance. The amount of code in an engine that needs to touch the graphics API is tiny. A handful of methods for device init, filling buffers, uploading shaders, setting state, and saying "draw!" All of the graphics API code can easily fit in one medium-sized source file. As a proportion of the whole engine, it's very small. As a proportion of the whole game or app, it's negligible. It's also boilerplate. There are only so many ways to copy a stream of structured data into a buffer.
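
For illustration, the whole per-API surface tends to be an interface on this order (a hypothetical sketch; real engines differ in detail but not in size):

    // Hypothetical sketch of the backend interface an engine hides its
    // graphics APIs behind; GL, Metal, Vulkan or D3D each implement it once.
    #include <cstddef>
    #include <cstdint>

    struct BufferHandle  { uint32_t id; };
    struct ShaderHandle  { uint32_t id; };
    struct PipelineState { /* blend/depth/raster settings */ };

    class GpuBackend {
    public:
        virtual ~GpuBackend() = default;
        virtual bool init(void* nativeWindow) = 0;
        virtual BufferHandle createBuffer(const void* data, size_t bytes) = 0;
        virtual ShaderHandle createShader(const char* source) = 0;
        virtual void setState(const PipelineState& state) = 0;
        virtual void draw(BufferHandle vertices, ShaderHandle shader,
                          uint32_t vertexCount) = 0;
        virtual void present() = 0;
    };
    // A Metal version of this is one more medium-sized source file,
    // not a rewrite of the engine.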

Legacy software, blah, blah, blah. No legacy software runs forever, and least of all on Apple platforms. Who cares.


Professional game engines are not the only application for OpenGL, though. Many people would like to build cross-platform software without using major abstraction layers such as game engines. This could be research software, prototypes, small tools in general... I think there is a long list. For these people life might get harder.


Gaming is all about legacy software, especially single player games. As a gamer I'm very happy that almost all older games still work on Windows today (either directly or using an emulator like DOSBox).

(Of course that does not mean that the OS needs built in OpenGL support. If you can convince an old game to use some kind of OpenGL-Metal compatibility wrapper without needing access to the game's source code or support from the original developer, that's fine with me as well.)


I'm one of the people happy about this move, because my competition uses OpenGL on Mac and I just use software rendering. It was easy to see this deprecation coming...


Quite true.

People in FOSS friendly circles really don't get the games development culture, IP management or the contracting business related to ports.


OpenGL isn't pretty, but it's at least cross-platform. And my impression was that OpenGL support is mostly handled by the GPU manufacturers, so I'm not sure how much Apple gains here by deprecating OpenGL.

Requiring developers to use an API locked to a particular platform feels pretty hostile to me. Doesn't matter if that API isn't perfect, or even far from it.


Although I agree it's a terrible decision for Apple only to have Apple-specific graphics APIs, please note that:

* Being deprecated does not mean that things will suddenly stop working. It will take a few more releases of macOS before this can be removed.

* Next to MoltenVK there is MoltenGL, which is an implementation of OpenGL ES 2.0 that runs on (edit) Metal [1]. That indicates it's at least feasible to wrap OpenGL applications in the future if necessary.

Furthermore, Apple will drop support for all Macs that don't support Vulkan in this release of macOS [2]. Ouch, what a waste.

[1]: https://moltengl.com/moltengl/

[2]: https://9to5mac.com/2018/06/04/macos-10-14-mojave-supported-... (anything from before 2012 does not support Vulkan)


Did you mean Metal instead of Vulkan? :P


It seems like a clear signal that Apple is preparing to develop its own GPUs. They're already doing this on the iPhones.


Nah. The GPU on Intel chips is free, and the eGPU thing, to me, is official notification that Apple thinks GPUs should be on the outside. I bet this generation of MacBook Pros is the last to have discrete graphics...


You don't get free Intel GPUs on your ARM laptop chips...


This is the most obvious explanation I have read on HN... After I read it!


> And my impression was that OpenGL support is mostly handled by the GPU manufacturers

You would be correct, but not on OSX.


I mean, it wouldn't be that big of a problem if they adopted Vulkan, but they are pushing Metal :-/


>Requiring developers to use an API locked to a particular platform feels pretty hostile to me.

So like DirectX?


I sure hate it when Microsoft does it, but at least they have market share. Who wants to support Metal just to target the Mac? And last I checked I have the choice of OpenGL and Vulkan on Windows because these days MS doesn't control the hardware stack from top to bottom on their software platform.


>I sure hate it when Microsoft does it, but at least they have market share. Who wants to support Metal just to target the Mac?

Plenty of big 3D/CAD/etc players? In lots of creative areas the Mac still dominates (despite stories about people moving to Windows, nobody's going anywhere; "nobody" here meaning only a few creatives overall).

Besides, with Metal they'll target iOS as well, and that's a huge platform, and where most of the profits are for mobile.


CAD on Mac is pretty much non-existent, as is any professional 3D market - the market share isn't there, the hardware support is terrible, so few major players bother with supporting Macs. All this stuff is either Windows (CAD) or Linux (3D simulation, visualization) these days.

And with this deprecation Mac is pretty much dead as a platform for professional 3D.


Creative Suite has run better on PC, and at a better price-performance ratio, for almost a decade.

Graphic Designers still like Macs for the most part I guess -- and I still see them in video production a lot, but that's starting to change pretty quickly.


> I still see them in video production a lot, but that's starting to change pretty quickly.

I think the Final Cut "Pro" X was the inflection point - the change is ongoing.


Visualisation is largely done on Windows, mainly with 3dsmax. Has been for a while. Linux is used more in movie VFX.


Your view of visualization is a limited world: Windows and Max?


OpenGL is still an option on Windows, it's not deprecated.


That depends on who you ask. OpenGL is in the deprecated API section on MSDN[1]. Because of the ICD model, Microsoft can't prevent GPU vendors from adding OpenGL features, but they don't bother integrating it with modern Windows APIs. You can't create an OpenGL context on a DirectComposition surface or in a UWP app. It integrates poorly with the compositor. You can't get composition feedback, and most drivers will flicker or show artifacts when windows are resized. OpenGL apps don't get windowed-fullscreen optimizations and you can't control when they enter exclusive fullscreen mode. I don't think you can use windowed-stereoscopic or windowed-HDR either. All these issues push developers away from OpenGL and towards DirectX, which is what Microsoft wants.

[1]: https://msdn.microsoft.com/en-us/library/windows/desktop/ff8...


Deprecated means very different things when coming from Microsoft and Apple.


It’s not deprecated because it’s not even there to begin with — Windows 10 doesn’t ship OpenGL by default; GPU vendors provide their own implementations.

Which AFAIK they’re free to do on MacOS as well, they just don’t seem to bother since Apple was doing that work for them


> Which AFAIK they’re free to do on MacOS as well, they just don’t seem to bother since Apple was doing that work for them

As far as I am aware Apple develops the GPU drivers for OS X (though, I think, based on code that the GPU vendor provides).


OpenGL is not a driver though, it's a graphics API


At least on Windows, the OpenGL implementation is part of the graphics driver. Why? Because by default Windows has only rudimentary (at least up to Vista, I think; I am not sure about Windows 7 and 8.1) or no (Windows 10) OpenGL support - the rest is what the GPU vendor provides as part of its graphics driver.


> At least on Windows, the OpenGL implementation is part of the graphics driver.

It's distributed with the Graphics Driver, but most of it exists in a user space library, not in the driver proper.


Which AFAIK they’re free to do on MacOS as well, they just don’t seem to bother since Apple was doing that work for them

I'm not sure. NVIDIA provides updates for CUDA and an extremely limited amount of updates for their graphics stack (AFAIK none at all for integrated graphics, for example).


Not on UWP or store apps.


Exactly like DirectX. Great API to use if you don't give a shit about portability. If you do, it's useless.


No, not like DirectX, because DirectX is optional.


OpenGL is pretty. Much prettier than these Metal and Vulkan abominations.

The difference is that OpenGL is designed to be easy for humans. glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd(); you can't beat that. The issue is that it is hard for the driver to optimize.

That's where Metal and Vulkan come into play. These are low level APIs, sacrificing user friendliness for a greater control over the hardware. It is designed for 3D engines, not for application developers.


Nope, glVertex3f was deprecated years ago by OpenGL itself. That is not the way the API works any more. [1]

Look into what it takes to write the minimum viable OpenGL program, written using non-deprecated routines, that puts a textured triangle on the screen. It sucks. On top of that, OpenGL is slow and gives you no way to create programs with smooth performance -- for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.
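
(For a rough flavor of what I mean, here's a sketch of just the buffer and attribute setup on the non-deprecated path; it assumes GL headers, a current core-profile context and an already compiled and linked program `prog`, and it still leaves out the shader source, the texture upload and all error checking:)

    // Core-profile setup for a single textured triangle.
    GLuint vao, vbo;
    float verts[] = {
        // x, y, z, u, v
        -0.5f, -0.5f, 0.0f, 0.0f, 0.0f,
         0.5f, -0.5f, 0.0f, 1.0f, 0.0f,
         0.0f,  0.5f, 0.0f, 0.5f, 1.0f,
    };
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    // Describe the vertex layout: position at location 0, UV at location 1.
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float),
                          (void*)(3 * sizeof(float)));
    glEnableVertexAttribArray(1);
    glUseProgram(prog);              // compiling/linking prog is dozens more lines
    glDrawArrays(GL_TRIANGLES, 0, 3);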

1990s-style OpenGL was good for the time. In 2018, OpenGL is a pile of poop.

[1] https://www.khronos.org/opengl/wiki/Legacy_OpenGL


> for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.

What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.


Maybe the games were not very complex? Professional game programmers building games with lots of shaders are very familiar with what I am talking about. See for example this thread:

https://www.opengl.org/discussion_boards/showthread.php/1998...


> What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.

State-based recompilation is a known issue in many GL drivers, particularly on mobile. E.g. changing blending settings may cause shaders to get recompiled. This can take up to a second.

Some engines work around this by doing a dummy draw to an offscreen surface with all pipeline configurations that they use at init time. This (usually) guarantees that all the shaders are pre-compiled.
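
(A rough sketch of that warm-up trick; `PipelineState` and `statesUsedByGame` are invented names here, and a small offscreen FBO is assumed to already be bound:)

    // At load time, issue one throwaway draw per pipeline state combination
    // the game uses, so the driver compiles its internal shader variants
    // up front instead of mid-frame.
    for (const PipelineState& s : statesUsedByGame) {  // hypothetical list
        glBlendFunc(s.srcBlend, s.dstBlend);
        glDepthFunc(s.depthFunc);
        glUseProgram(s.program);
        glDrawArrays(GL_TRIANGLES, 0, 3);              // dummy offscreen draw
    }
    glFinish();  // force the driver to do the work now, not during gameplay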


Also, you can handle caching of compiled shaders yourself now (glProgramBinary).
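
(Roughly like this, error handling omitted; the blob format is driver-specific, so a real cache is keyed on GPU and driver version:)

    #include <vector>

    // After linking once, grab the driver-compiled blob...
    GLint len = 0;
    GLenum fmt = 0;
    glGetProgramiv(prog, GL_PROGRAM_BINARY_LENGTH, &len);
    std::vector<char> blob(len);
    glGetProgramBinary(prog, len, nullptr, &fmt, blob.data());
    // ...persist fmt + blob to disk; on later runs, skip compilation:
    glProgramBinary(prog, fmt, blob.data(), len);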


I think the recompilations being talked about here are shaders generated by the OpenGL implementation behind your back. That is, your program never sees them as shader or program objects because they implement some permutation of blend mode, depth test, culling type, etc..


The non-deprecated OpenGL code for a hello world triangle is still an order of magnitude less verbose than Vulkan though.


While Vulkan is a bit verbose, it's not an order of magnitude difference if you follow modern OpenGL best practices. If you rely on default state, use the default framebuffer and rely on implicit synchronization, you can squeeze it down to a few hundred lines, but that's not a good foundation to build practical apps on.

To give a ballpark figure, my Vulkan "base code" is less than 2x what my OpenGL boilerplate is for the same functionality. The big difference: the Vulkan code is easy to understand, but the GL code is not.

Comparing "Hello World" doesn't make much sense, OpenGL gets really darn complicated once you get past the basics.


Vulkan code is extremely front-loaded. HelloTriangle is much longer. A complete application can be significantly shorter.


In my opinion a similar difference exists between CUDA and OpenCL. OpenCL takes more code to get something simple going. But at least it doesn't break if you upgrade your gcc or use a different GPU vendor.


Each to their own, but over the last 6 months I've written a graphics engine in OpenGL + SDL. Once you truly understand modern OpenGL you realise how beautiful it is.


You will think it's less beautiful when you ship that game on several platforms and find that it has different bugs on each platform, on each hardware version, and on each driver version. And most of these bugs you can't fix or work around, you just have to bug the vendor and hope they ship a fix in a few months, which they usually won't because your game is too small for them to care about.

This happens in other APIs too (we definitely had it happen with DX11), it's just that OpenGL is a lot more complicated than anything else due to its history, so it has proportionally more bugs.


> glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd();

That's fine for a "hello triangle" program, but quickly becomes ridiculous for anything approaching a serious engine. There's a reason that glDrawArrays() has been around since 1995 (and part of the core specification since 1997).


Made me want to revisit the good old NeHe tutorials for a quick browse :)

http://nehe.gamedev.net/tutorial/creating_an_opengl_window_(...

I wonder how much of this stuff is deprecated now.


While this is startling, it seems pretty consistent with Apple's modus operandi in a lot of areas -- leap forward to where they think the industry is going and hope they're right. OpenGL is effectively being deprecated by its own developers in favor of Vulkan, which has an open source implementation for macOS and iOS, developed in part by Valve, built on top of Metal:

https://store.steampowered.com/news/37575/

If game developers -- and game engine developers -- targeting OpenGL now are in the process of moving to target Vulkan, and if MoltenVK ends up offering better performance on macOS than Apple's legendarily anemic OpenGL stack, isn't this likely to be better in the long run despite the short-term pain?


OpenGL is "learnable" by someone in the process of learning. Vulkan and Metal are much less approachable. This will put a huge damper on low-level graphics programming as a hobby.


I would disagree with that, especially with regards to Metal. It's a very approachable and well-designed API. It might not have the volume of resources that OpenGL does, but the docs themselves are good, and I have seen plenty of intro-level tutorials that are decent enough. Debuggability is also much better than OpenGL, which I think is important for newcomers. Debugging OpenGL issues is very, very painful, especially with macOS's lack of debug extensions. Metal is described as "low-level", but it's not quite at the level of Vulkan -- things are simpler and more "streamlined".

There's also the problem that a large chunk of OpenGL learning materials out there are hopelessly outdated, and IMO actively detrimental to learning modern graphics techniques. Judging from the types of questions I see around various forums, it seems to be VERY hard for newcomers to distinguish between "bad" and "good" OpenGL tutorials. In general, there's too much cruft for learners to focus in on the stuff that is actually part of "good OpenGL".


Any recommendations for 'good OpenGL'?


I think https://learnopengl.com/ is pretty good.


> OpenGL is "learnable" by someone in the process of learning.

If this is the case, it's only because there is much more material and tutorials written. Not because OpenGL is simpler or better.

I know because I watch noobs stumble with OpenGL all the time over at ##opengl on freenode. It usually takes them a week or two to get a triangle on screen, and they're super confused by semi-opaque concepts such as "vertex array objects" (they're well documented in the OpenGL wiki, but reading documentation seems to be out of fashion).

It would certainly help (them) if they had a good knowledge about 3d graphics in general before stumbling into OpenGL. But if they had, they'd be able to do it using Vulkan or Metal with no great difficulty. OpenGL isn't at all better here.


They need a new more developer-friendly API built on top of Vulkan as a replacement for OpenGL.


I believe (hope) that OpenGL continues to be that developer-friendly API. Building OpenGL on top of Vulkan shouldn't be too hard, and it means we don't have to pointlessly deprecate and recreate the huge number of OpenGL resources out there.


OpenGL isn't friendly to developers on either side. Tutorials are generally not trustworthy, there's no cross platform debugging tools, and errors just get you a "something happened vOv" code.

If you care about performance it will just do something slow at uncontrollable points in the background, like copy CPU-GPU-CPU memory, synchronize against the GPU, do format conversions, etc.

If you're the one implementing GL, it's gotten gigantic again since they simplified it. GL 4.3/4.6 core has compute shaders, which means you have to implement OpenCL twice but different this time.


There is already OpenGL-on-Metal and OpenGL-on-DirectX middleware, so this may not actually be an issue.


Then you could just use one of the existing engines like Unity.


Pretty sure Unity is more heavyweight than a few hundred kilobytes of a library.

It also doesn't come installed on my machine.


Depending on the project type, the final release package can be just a few hundred KB, like they do when targeting WebAssembly.

OpenGL also does not come installed on my machine.


Right, but it still involves several gigabytes of largely unused functionality to get a “hello world” and it pigeonholes you into a specific ecosystem. Unity is an entirely different offering from a graphics api.


> OpenGL is effectively being deprecated by its own developers in favor of Vulkan

OpenGL is not deprecated; releases have slowed down since there isn't really much new stuff to add, but 4.6 was released just last year, after Vulkan.


I know it's not an official deprecation, but it sure seems like the Khronos Group sees Vulkan as OpenGL's de facto successor.


I'm not sure why you say that, their official stance has always been that OpenGL will continue to evolve as GPUs evolve and need to expose new functionality.


I would be a lot more okay with this if Apple supported Vulkan, the more portable comparable API, rather than just the macOS/iOS-only Metal.

I also wonder what means for WebGL and its future. Right now, WebGL works in browsers on macOS, Linux, Windows, iOS, Android, which is incredible. There is no equivalent.

Sure, Apple has started working on WebGPU, but that’s not yet ready nor is it guaranteed to gain Linux, Windows, Android support.


MoltenVK is an implementation of Vulkan that runs on top of Metal, and so far yields exciting performance results: https://www.phoronix.com/scan.php?page=news_item&px=Dota-2-I...


WebGPU would gain traction if it was based on Vulkan. But it's not. However, Mozilla's Obsidian API is:

https://github.com/KhronosGroup/WebGLNext-Proposals/tree/mas...

Apple has so little to gain over Vulkan by developing its own API but so much to lose by not adopting Vulkan (gaming companies may actually prefer developing games on the cross-platform Vulkan to target macOS/iOS devices, too, at the same time, instead of using DirectX).


Obsidian didn't manage to materialize a Khronos working group, so it's not moving forward. Apple instead went with the W3C to form the GPUWeb group, based on their work on WebGPU. The Obsidian folks at Mozilla have decided to follow this path instead, see here:

https://github.com/KhronosGroup/WebGLNext-Proposals/pull/11#...

However, there's no real writeup of what the API will end up remotely looking like right now, so it's too early to speculate. WebGPU's original prototype used Metal's shading language for instance (since the prototype came from WebKit), but any real standard will probably change things up.

I believe the webgpu-servo folks have, in the mean time, begun working on lower level components/libraries to target Vulkan/DX12/Metal, for use by systems like WebGPU. Sort of like ANGLE by the Chrome team, but for newer GFX APIs.

TL;DR absolutely nothing is fleshed out at all yet and it seems plenty will probably change


> instead of using DirectX

Maybe Nadella will push for Vulkan support on Xbox, or maybe Xbox will die off, who knows. Unless one of those happens, DirectX is not going away. As soon as consoles come into the equation, you are stuck writing a PAL (or using an existing engine that already has one) because they use proprietary APIs and that's unlikely to change.

It's not ideal, but that's the reality. Apple is following the idiotic status quo, but it's not fair to single them out for it (that being said, at least Microsoft supports Vulkan and OGL on one of their platforms - but the 3rd-party driver developers are mostly responsible for the great support).


My bet is they spin the whole gaming division off or sell it to someone like Amazon. It’s not really a great fit anymore and IMO they need to be bolted to a company with a greater interest in the creative side of the business (i.e. running a movie/game studio) than Microsoft.


They don't have anything to lose by not adopting Vulkan, because all game engines that matter to professional game studios have already added Metal support.

Same applies to Photoshop and other relevant 2D and video editing professional tooling.

Professional game studios have always favored hardware-specific APIs that allow them to extract all the juice down to the last drop.

For example, OpenGL ES 1.0 + Cg on the PS3 was an adoption failure, with everyone adopting the PS3 specific APIs.


Seconded. This seems like a major step back for x-platform GPGPU. I always just assumed a natural transition from GL, CL support to Vulkan would occur at some point, but this is just a shame.


Maybe there is no equivalent, but WebGL is not a mature technology. Webgl stuff still breaks or has performance bugs whenever some part of the OS/Browser/GPU Driver/GPU Hardware sandwich changes. You can run the conformance tests yourself.


I don't see how a technology can get to a mature status if a major hardware company decides to not support it. The real question is WHY don't they support it? Is there a webMetal?


All we know is that Apple was bankrolling a portion of OpenGL development on OSX and now they feel otherwise. The OpenGL ARB is a committee and Apple has only one vote. Maybe they were not satisfied with the direction the spec was going. It's certainly not an unfounded belief.

>I don't see how a technology can get to a mature status if a major hardware company decides to not support it.

Alternate reading - They gave it their time and money, and it didn't work out.


There might be, but we rejected it.

https://news.ycombinator.com/item?id=13593272


As someone who read this with an editor full of OpenCL kernels, I think Apple must really have missed the point of these sorts of frameworks - heterogeneous computing.

If I wanted the best possible speed, latest features etc., I would write multiple back ends in things like CUDA.

I choose OpenCL because I can develop code on my Macbook pro, and run that on a computer with a discrete GPU on a different operating system, and have a fair amount of confidence that it would work.
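
(That's the pitch in one toy kernel: this is OpenCL C, and the same source runs on any conformant device, whether an integrated GPU, a discrete GPU or even a CPU driver:)

    // vector_add.cl - trivially portable across vendors and devices.
    __kernel void vector_add(__global const float* a,
                             __global const float* b,
                             __global float* out)
    {
        size_t i = get_global_id(0);
        out[i] = a[i] + b[i];
    }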

:/


> I choose OpenCL because I can develop code on my Macbook pro, and run that on a computer with a discrete GPU on a different operating system, and have a fair amount of confidence that it would work.

That was the promise, but it never became reality. When writing kernels for real-world applications, OpenCL breaks down in numerous ways. The result is usually neither stable, nor portable, nor fast and a pain to work with. There was never OpenCL support for 3rd party developers on iOS.

You say you are writing OpenCL kernels on a MBP and they are portable, maybe you got lucky? Lots of comments I see on the deprecation on OpenCL seem to come from people who like the idea of having OpenCL (and its promises, which are awesome), but never had the awful experience of actually working with it.

I remember the open letter from the Blender developers on the sad state of OpenCL support on Mac (http://preta3d.com/os-x-users-unite/) from 2015. Some GPU vendors (AMD, Intel and Qualcomm) continued to put resources into better OpenCL support over the last couple of years, but maybe too little, too late? It seems at least Apple had already given up on OpenCL by the time of this letter (and moved their resources completely to Metal), as nothing new has happened for OpenCL since then.

I'd prefer if we had a working OpenCL on many platforms. As we don't, especially not on Apple platforms, the step of deprecating it is regrettable, but at least honest.


I know that Apple is a commercial organisation and not a charity, but projects like Blender bring a lot to the platform.

It would be great to find out later that Apple had reached out to the Blender dev team with a strategy on how to move to either Metal or a Vulkan/Metal adapter.

Personally I was thinking about getting an eGPU just for Blender use. It would be a shame to have to leave macOS just to run Blender.


Agreed, I am in a similar situation. This is very sad. Also, while OpenCL is a bit verbose to interact with directly, Vulkan compute shaders are much, much worse. I realise that at some point I will have to start using them, but I'm not looking forward to it.


>because I can develop code on my Macbook pro, and run that on a computer with a discrete GPU on a different operating system, and have a fair amount of confidence that it would work.

I'm not an OpenCL programmer by trade, but I have dabbled in it (Wrote an AES decrypter in OpenCL) and I have never found this proposition to be true.


All things considered, I think there are some companies that are worse to the FOSS community than Apple, but I can't think of one that has Apple's degree of baldfaced cynicism to exploiting FOSS and open standards only to the degree that it benefits Apple, and then throwing them under the bus the instant they're no longer useful.

Apple loved HTML5 when they had to kill Flash and get web developers to support mobile, but then as soon as it became a threat to the App Store, Safari's compliance came to a screeching halt and now Safari is in last place, even behind Microsoft's browsers, in HTML5 support.

OpenGL was useful when it was a way to potentially lure people away from Windows, but as soon as Apple had the clout to not care about it and force developers onto its proprietary API, that's what happened.

I almost prefer old-Microsoft's honesty about wanting to kill FOSS, rather than this blatant acknowledgement of FOSS as a tool to be ripped off to improve one's ecosystem dominance and then promptly thrown aside. Makes you wonder what's going to happen if and when Apple no longer needs Clang/LLVM, or, hell, Unix.


> I almost prefer old-Microsoft's honesty about wanting to kill FOSS, rather than this blatant acknowledgement of FOSS as a tool to be ripped off to improve one's ecosystem dominance and then promptly thrown aside.

Well, to be honest, Apple has always been quite consistent here. They created their own ecosystem and made interoperability with other systems as difficult as possible, at software and hardware level. So this announcement is quite in line with that.


Didn't Apple steal all the code from one of the FOSS BSDs, close-source it and call it a day?


Mac OS X is based on the closed source NextStep; in many respects, the Darwin kernel is the continuation of NextStep's kernel. A fair amount of the user space code of OS X is from FreeBSD, which to the best of my knowledge continues along merrily open source as ever. Apple actually hired one of FreeBSD's lead developers to manage their BSD technology group.

And, I mean, c'mon. The move from NextStep to Darwin moved the kernel to open source. Webkit? Open source. CUPS? Still open source. Clang and LLVM and Swift? Open source, open source, open source. Apple maintains an open source page. They put stuff on GitHub.

I get some of the hostility toward Apple here; they're not always good at playing with others, there have been complaints about the way they do (or don't) contribute to projects they benefit from. But the narrative that Apple hates everything open under all circumstance and is all about proprietary everything all the time just isn't supported by reality.


OpenCL itself is something that was created by Apple, and then turned over to Khronos as an open source project.

https://en.wikipedia.org/wiki/OpenCL#History


Apple has a tiny market share when it comes to 3D applications - OpenGL is mostly the "pro" 3D world, be it CAD, 3D visualization, 3D simulation and similar. None of that runs on Macs, everything is Windows/Linux these days.

So there will be little "forcing" into their proprietary APIs - the few 3D developers that actually tried to support the Mac will drop the platform, because nobody is going to rewrite a major piece of software to use Mac-only Metal. Too much effort for little to no benefit.

Basically Apple just killed off any 3D support they may have hoped for on the Mac, including any hopes for anything VR related (so many Oculus/Vive fans were hoping to see Mac support - it is now even less likely than Linux support...).

There is a 3rd-party port of Vulkan and I am sure there will be 3rd-party OpenGL drivers (e.g. Mesa), but nobody is going to build a CAD system on top of that, IMO. Without official vendor support it is just too risky.


SteamVR is actively being developed for Metal (it's in beta right now), and Khronos has released a Vulkan implementation that runs on top of Metal.


Autodesk Fusion 360 runs on Macs.


Okay, that's one explanation. The alternate explanation is that Apple supports mature and robust technologies because they want what's in their users' best interest. Neither OpenGL nor OpenCL in their current form is robust. Certainly, that is not to deny that Apple might have a vested interest, but it's naive to think that everything is just black or white.

RE: HTML5 - Apple simply made a mistake. Jobs famously said that they don't want to support native apps because bad apps could bring down cellphone towers.


OpenGL and OpenCL aren't "robust" on macOS because Apple stopped updating their drivers after version 4.2, which was released circa 2011. The current version of the standard is 4.6, released in July 2017.


Robustness is orthogonal to versioning. They were funding opengl dev on osx, and decided to stop. You can insert your own reasoning but I believe the more reasonable assumption here is that they were not happy with the direction the spec was going. Apple is strongly biased towards vertical integration. Owning the spec + OS + driver + hardware is the best way of achieving a high level of robustness (Whether they actually do achieve that remains to be seen).


Aren't most of the games on MacOS running on OpenGL? This is going to kill all the older titles that are not maintained anymore. Terrible move, just to push Metal down people's throats. As if MacOS gaming weren't dead enough.


That’s a really worrying point, yes. Mac users can say goodbye to the majority of our steam library.


It will take a couple more releases for it to stop working.


I would be surprised if it disappears entirely. More likely some third-party hero (or perhaps Apple themselves) will spin off their GL implementation as a separate package - see XQuartz.


Apple releases about once a year, so two years before I am forced to switch to a PC?


At least you'll have the option with the games being part of steam. How many games are unplayable and never will be playable again on any phone that upgraded to iOS 11?


I'd assume it'd only be removed on iOS 14 or beyond.


Nope, most of them use some sort of game engine; Unreal and Unity dominate.

MacOS gaming is not only not dead, it's flourishing rapidly. Take a trip to the Steam shop: you will find a massive number of games are made for MacOS, and the numbers have been growing rapidly since Apple lost interest in OpenGL. But then, MacOS has reached 10% of the desktop market, making it much harder to ignore than in the past when it barely reached 3%. With iOS dominating mobile revenue, Apple has been the undisputed king of gaming for over a decade now.


"Nope most of them use some sort of game Engine, unreal and Unity dominate."

But, don't those engines use OpenGL on the backend? They need some sort of interface to the GPU, and AFAIK, OpenGL is it for both Mac OS and Linux.

Edit: Yes, at least Unity does. I can't find an authoritative source on Unreal.

https://forum.unity.com/threads/opengl-core-backend-default-...

Edit2: Found Unreal info...they switched to Metal in 4.14. So, you're OK if you develop games for Mac with Unreal.

I guess it's plausible that Unity will do the same going forward.


The backend plays a minor role, as I repeatedly say and am repeatedly downvoted by ignorant people for. On MacOS, Unreal supports not one but three technologies for graphics alone: OpenGL, Vulkan and Metal. There is also CUDA, OpenCL, PhysX and much more that goes far beyond the scope of 3D graphics.

A well-designed graphics engine never ties itself to a platform, however popular that platform may be - whether that platform is an OS, a graphics API or any kind of SDK.

In any case, Unreal is neither just a game engine nor a graphics engine; it's an entire ecosystem of tools and APIs. Unreal even extends C++ to provide GC and some rudimentary reflective abilities.

So there are a ton of things going on, from low level to very high level.

Overall, MacOS developers have been very quick to embrace Metal for new projects, and so was Unreal. Which is no big surprise, because Metal gives access to both MacOS and iOS, which is a variant/fork of MacOS.


Unity already supports Metal on OSX and iOS. See "Enabling Metal" here: https://docs.unity3d.com/Manual/UsingDX11GL3Features.html


Unity supports Metal, it was shown on stage during the keynote: https://blogs.unity3d.com/2018/06/05/wwdc18-book-of-the-dead...


If you look at the most recent Steam hardware survey though 96.3% of players are on Windows and 3.07% are on macOS. I don't know if that's "flourishing rapidly". It's certainly going up slowly. https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...


Well, I will just mention that I have been an iMac user since 2007, and I am replying to your message from Win 10 via Boot Camp. Mac gamers choose the Boot Camp route because it's an easy solution and gives access to more games and software. I am willing to bet that Mac gamers are at least 8% if not more. Boot Camp became popular way before Steam did, and way before we saw big game studios switch from pure Windows to Windows/MacOS. Windows is also known for being far more stable under Boot Camp than on a regular PC, and it's a reputation that my personal experience confirms. Because I am a developer I have decided to stick with Win 10 and Boot Camp, and it has been a smooth ride so far, but from time to time I go back to MacOS. Windows 10 also has been the first reliable OS that came from Microsoft.

Of course Boot Camp is not the only solution; there are also Wine-like solutions and VM solutions, but Boot Camp is by far the most popular for gaming.

Another reason to stick with Boot Camp is that games may offer MacOS support but are not quick to fix bugs and resolve issues, usually because they develop on Windows and port to MacOS with cross-platform APIs, using Windows rather than MacOS devs, which can cause all sorts of issues that take time to resolve.


Mac has twice the amount of users on Steam that Linux has. And game studios mostly ignore Linux. Apple making themselves precious is not going to win them any hearts in the game industry.


Oh yes, don't get me wrong, I totally agree!


To be fair, PC gaming is a small fraction of the overall gaming market. All the consoles have their own proprietary graphics APIs. It's OK if Apple's API is proprietary too. Consoles support OpenGL, but it's not optimal. If they all strictly adhered to standard graphics APIs, how would they differentiate themselves? Why make custom silicon like their Apple "Bionic" chip? It's going to need an API to go with it.


PC Gaming is about the same market size as console: https://newzoo.com/insights/articles/global-games-market-rea...


Lol, Metal is Mac-only and relies heavily on CoreWhatever dependencies, and thus can never be cross-platform, right? The only reason any game or CAD developer even supports Mac at all is because OpenGL is a cross-platform API that works great on Windows, Mac, and Linux, so they only have to write one type of shader program, etc. No game developer in the world will write both an OpenGL/DirectX/Vulkan and a Metal renderer for the purpose of staying up to date with Apple's "deprecations".

If you're Pixelmator or Apple's own Final Cut team, sure, use Metal. For anyone else that wants to make a living, supporting multiple platforms is a given, so you won't pay the slightest attention to this deprecation notice.


> No game developer in the world will write both an OpenGL/DirectX/Vulkan and a Metal renderer for the purpose of staying up to date with Apple's "deprecations".

Actually, most game developers do that. Pretty much every game (even ones with a custom, "non-AAA" engine) will have some kind of abstraction layer for dealing with graphics API's. Writing an additional backend for Metal is not a monumental undertaking -- it's a tiny fraction of the overall code you will end up writing. Also game consoles, for the most part, use their own graphics API's which are not portable to desktop PC's.


That's kind of true for games, but he's totally right about CAD and other software. Big 3D applications (Nuke, Houdini, Maya) all use OpenGL. Before OS X, Mac versions either didn't exist or were kept up about as well as IE or Word was. Over the past 5-10 years they've been reliably released for OS X at the same time the Windows and Linux versions were (they all originated on IRIX). They all have a lot of other development going on, and I don't see any of these companies making and maintaining a Metal version.

I'm sure the Mac versions of these tools aren't used by any studio with more than 5 people, but independent contractors, small studios, and individuals working at home really benefit (or else the vendors wouldn't have bothered to port and maintain them up until now).


Well, then maybe it is a good thing for wannabe pixelmators? Create a nice Mac-only tool, profit.


Yeah, but that really sucks for people who use powerhouse tools that took 15 years to develop because we were able to use Macs, but it's looking less and less likely going forward.

I really like the idea of creating more space for the little guy. I want them to do well, but honestly I haven't had the best experience. As someone who uses the professional tools during the day, but has only the occasional needs for personal use I was happy to pay for a Mac-only tool. In practice, for a tool I only grab every few months I often find there was a regression in some feature that I guess is only used moderately often (selection boxes), it doesn't currently support that feature (I can't remember, but it was something like HSV color space, more than 8bpp, or some file format like TGA or PNG--not features you'd add to v1, but nothing too obscure), or there's another paid upgrade.

Even beloved Mac tools like Panic's Transmit, I feel like I can't 100% rely on. At a moment's notice I'll have to jump to something else to finish the task.


While that may be somewhat true (and I disagree with the "tiny fraction" assessment, unless you are measuring in some metric other than time investment), I feel like it would still greatly increase your QA budget. Each additional back-end now requires extensive testing throughout the game development process.


In my experience, trying to get OpenGL to behave the same way (and with the same performance) on a bunch of different systems is actually more work than just maintaining multiple backends. Testing OpenGL on Windows does not in any way guarantee that it's going to work on Linux, or macOS, or whatever else -- even if you don't add a crazy OpenGL extension support matrix into the mix. So your QA budget is already going to include testing OpenGL on all those platforms.


I think they're expecting developers to use existing engines such as Unity and Unreal. Most game developers probably already use those engines and will not be affected.

On the other hand, they definitely made it harder for developers to create a new game engine.


Game engines, not so much; Apple has always been an "also-ran" when it came to games. Moreover, these days nobody is writing major new game engines - it is just way too expensive and difficult to justify vs. downloading Unity and starting to build your game. Furthermore, games have always been something that is on the market for a year or two and then is done for, with the developers moving on.

However, where it will have a major effect is the availability of professional software, such as CAD. The Mac has always been a pain in the ... platform to support because of their weird "Unix but not quite" ways of doing things, and now there will be no justification to support it anymore, especially in the OSS sphere. E.g. I fully expect PCB CAD tools like KiCad (and also the commercial Eagle, which has Mac support) to disappear from the Mac as soon as OpenGL is removed. Nobody has the resources to rewrite such software to use Apple-specific Metal. Another such project is OpenSceneGraph, a large building block for 3D visualization and simulation applications.


At least there's this kind of thing happening: https://www.khronos.org/vulkan/portability-initiative


If this means that macOS will lose opengl support even for X11 apps, a substantial part of academia will switch away from Apple. It's highly unlikely that software like ROOT or geant4 will ever get ported to something else.


I just recently bumped into John Carmack's stories about Steve Jobs, among them one in which he convinces Jobs to adopt OpenGL [1] (HN commentary [2]). Thought this might be an appropriate historic reference here, as OpenGL now appears to be on its way out on MacOS.

[1] https://m.facebook.com/permalink.php?story_fbid=214641282559... [2] https://news.ycombinator.com/item?id=17066846


I too thought about this as Carmack mentioned that he believes it was one of his most important achievements that Apple adopted OpenGL back in the day.


This is not the least bit surprising.

As one of the creators of Direct X at Microsoft commented when Metal was first announced, "Why help Android siphon off their game developers by propping up OpenGL?"

https://web.archive.org/web/20140606055700/http://www.alexst...


Turns out it was a good idea to base new stuff on this: https://github.com/bkaradzic/bgfx


Yeah, using things like that is the way to go.


I'm getting a bit worried about Apple dropping out of the business of producing professional tools and rigs. I always liked that you could walk into an Apple retail shop and get a decent Unix notebook (though I've opted for a different notebook for my last purchase because of the lack of display, keyboard, and port options). But with Apple pulling out of OpenGL, what little pro (or at least pro enough for me) F/OSS software for 3D (Blender) and other graphical stuff was running on Mac OS won't be any longer. I can't imagine Blender has the resources and inclination to port their software over to Metal, especially when Apple deliberately torpedoes their efforts.


Well, most games using OpenGL run like crap on macOS anyway. The Windows version always runs better in Windows via Boot Camp.

Maybe the forced Metal usage (however shitty it is to remove support for an open standard) will increase macOS ports of games.

But in reality, 50% of games will still be a Wine bottle.


Note that they're also deprecating OpenGL ES on iOS: https://developer.apple.com/ios/whats-new/#deprecationofopen...


Disclaimer: I'm not trying to be the Devil's advocate here, but just wanted to share an observation.

Apple always removes stuff which looks untimely or just plain stupid (headphone jacks, optical drives, USB/FireWire ports, optical in & out, Rosetta, APIs, etc.).

Always the same outrage has happened, but things normalize then. People, companies adapt, hell does not freeze over, company doesn't go bankrupt.

I feel that maintaining OpenGL & OpenCL felt like a burden to Apple. We all know that Apple likes to control everything from hardware to user interface, and GPU drivers are one of the most notoriously complex, overprotected parts of the software stack. In the OpenCL world, compilers and other stuff (I don't remember the terms clearly, sorry) also get in, and make everything much more complex.

Maybe this move will help them slim the drivers down to the basic hardware-software interface level and build Metal and related technologies on their own terms on top of this relatively simple interface.

I have a feeling that Metal can be translated bidirectionally, and relatively cheaply, to OpenGL (and maybe Vulkan and OpenCL too), so in the end things don't become extremely complex for everyone.

Apple doesn't feel that backwards compatibility is strictly necessary unless things can be translated and made to work with relatively good performance.

As a Linux and Mac user for 10+ years, these are my observations. They may be wrong, technically incomplete or else. Feel free to discuss, debunk, or downvote.


> Always the same outrage has happened, but things normalize then. People, companies adapt, hell does not freeze over, company doesn't go bankrupt.

But that doesn't mean we took the best path. There is always an alternative future if x hadn't happened at all. If x hadn't happened, we might be in a better place, rather than accepting it and doing the best we can.


> But that doesn't mean we took the best path. There is always an alternative future if x hadn't happened at all. If x hadn't happened, we might be in a better place, rather than accepting it and doing the best we can.

Of course. I didn't mean to say that Apple is designing the best possible future or making the best possible decisions. I'd rather have a future where these technologies are supported as first class citizens by everyone, and I can just cross compile this stuff with virtually no porting or optimization effort, however unfortunately this is not the world we live in right now.

As I said, mine is an observation rather than taking one side.


Metal is a nice low-level API, and you can have a tiny (<1MB) library that efficiently emulates modern OpenGL on top of Metal.

E.g. for WebGL purposes, all web browsers on Windows emulate OpenGL on top of Microsoft DirectX.


Very much in line with Apple's way of doing things. God forbid they'd adopt some sort of open standard - even though they've hugely benefited from them.


"God forbid they'd adopt some sort of open standard "

They did, when they adopted OpenGL and OpenCL. Years ago.

Apparently they've decided that those have outlived their usefulness.


"They did, when they adopted OpenGL"

Yes, and this was so much against their DNA that it required the chutzpah of a Carmack to get Apple's CEO to implement the decision.

Apple is and has always been a strict leech on open ecosystems.

They try to build closed proprietary stuff first to lock you to the platform, and when what they build is bested by open source, they embrace it and move on to the next opportunity.

They did the exact same thing with BSD: grab a beautiful piece of open source tech, bolt a metric ton of proprietary closed source tech on top of it, and call the whole thing open source to get love and applause from the OSS crowd.

Techies are essentially a gullible bunch.

Source: https://news.ycombinator.com/item?id=17066846


Did you also complain when they adopted USB-C?


Yes, when they decided to drop all other ports AND their iPhones still don't have USB-C, so you can't use iOS headphones on your Mac, and you cannot charge or sync via the default cable on your Mac.

Nobody has USB-C everything. Every owner is using a collection of dongles, and it is absolutely stupid.

The OLED bar is also stupid.

I just hope my 2015 MacBook never becomes obsolete, because I dislike the newer ones... and yes, I owned one, and I sold it after 2 weeks and got my 2015 back.


And if you use a USB-C device in the back left port on a 4-port 15", there's a good chance your MBP will fry it! Oh standards!


At this state, if I'm coding a game engine and want HWA graphics on all platforms, I'm better off using middleware/user-level frameworks like BGFX, GFX-RS, etc. that abstract away D3D, Metal and Vulkan. Best choice. All that time learning OpenGL will not go to waste completely.


I think it’s a great idea to alienate developers from every other operating system out there and ensure that great effort is needed to maintain and port apps from other platforms. This is what we expect from Apple anyway, right?

Who’s going to start working on an OpenGL to Metal wrapper?


I suppose it got flagged as a dupe due to duplicated URL. However, I believe it shouldn't be - "what's new in macOS" is completely uninteresting to me, while deprecation of OpenGL and OpenCL is a big news.


This is bad: a hobbyist will be faced with a huge burden to bring anything 3D cross-platform to their audience. In the past, it was possible to use Qt or a nasty GLU/GLUT wrapper to write portable code.

Way back in 2006, in CS175, we implemented almost all of the core 1.5 pipeline in C++. Software OpenGL implementations may not be the fastest, but they're more or less trivial (quaternions, a trapezoid-based triangle engine, the painter's algorithm, z-buffering, texture mapping, bump mapping, lighting and various shading models), and therefore accelerate-able with CPU and GPGPU SIMD ops.
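
(As a taste of how small that core is, here is a toy depth-tested triangle fill; it brute-forces edge functions over the whole framebuffer rather than walking trapezoids as we did, but the idea is the same:)

    #include <cstdint>

    struct V { float x, y, z; };

    // Signed area test: which side of edge ab is point p on?
    static float edgeFn(const V& a, const V& b, const V& p) {
        return (p.x - a.x) * (b.y - a.y) - (p.y - a.y) * (b.x - a.x);
    }

    // Fill one screen-space triangle with a z-test. O(w*h) per triangle,
    // so purely illustrative; a real engine only scans the bounding box.
    void fillTri(const V& v0, const V& v1, const V& v2,
                 float* zbuf, std::uint32_t* fb, int w, int h,
                 std::uint32_t color) {
        float area = edgeFn(v0, v1, v2);
        if (area == 0.0f) return;  // degenerate triangle
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                V p{x + 0.5f, y + 0.5f, 0.0f};
                // Barycentric weights; all non-negative inside the triangle.
                float w0 = edgeFn(v1, v2, p) / area;
                float w1 = edgeFn(v2, v0, p) / area;
                float w2 = edgeFn(v0, v1, p) / area;
                if (w0 < 0 || w1 < 0 || w2 < 0) continue;
                float z = w0 * v0.z + w1 * v1.z + w2 * v2.z;
                int i = y * w + x;
                if (z < zbuf[i]) { zbuf[i] = z; fb[i] = color; }
            }
        }
    }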


That hobbyist can still use Qt, because nowadays they support multiple graphical backends.


Qt supports multiple backends for its own rendering, but the hobbyists will still have to rewrite their rendering code to support that new backend. For a lot of my programs, that's about equivalent to rewriting in another language.


Not when the hobbyists took the right approach to use the Qt 3D APIs, like the new scene graph API.

Also there is no harm in learning to write modular graphics engines with multiple backends, hobbits get to learn how professionals do it.

Finally, if it is just a hobby, then what hobbyists can do best to improve productivity is to use whatever their OS offers out of the box.


I got a small kick out of the typo, with hobbits learning to write graphics engines. No doubt between second breakfast and elevensies


Yeah, my English went downhill with auto-correction keyboards. :)


Vulkan on Metal is already faster than OpenGL so this is a non-story. If you want cross-platform support use Vulkan, if you want ultimate performance target DirectX and Metal.


There is one thing I do not really understand about this. nVidia releases a macOS driver here http://www.nvidia.com/download/driverResults.aspx/73628/en-u...

Would it be possible for this driver to add OpenGL/OpenCL and Vulkan support to macOS like it is done on Windows? (Or am I completely misunderstanding how this works)


Microsoft deprecated OpenGL back in Windows Vista, over a decade ago. They still have to support it, because many major packages never switched to DirectX.


They don't support OpenGL at all, they tolerate it via ICD driver model, which isn't supported in the store app containers.


I wonder at what point developers will leave. I don't think they will. Changes from Apple might seem to hurt, but people adapt. The changes are not big enough, nor damaging enough to those who have invested themselves in Apple to switch. Normal users won't see much difference.

If you have Apple desktop, a laptop, a phone, connected accounts, apps and associated data, maybe a watch or a TV you are fully within the Apple ecosystem. You cannot leave without it costing you major hassle and stress. This might cause surface inconvenience but it's not going to be enough to push anyone out from that ecosystem. Apple has their users where it wants them, and the users are happy.


Related to the article (but not the subject here), as a user of "dark" themes:

The precise wording they use seems to indicate a view of dark themes as being less colorful. High-contrast themes and visibility-aiding limitations in theme color use have their places; so too do themes based around darker, more night-friendly colors. Thinking of a system as having only one true theme, or of light/dark as being the full vs. visually-impaired themes, is a dangerously limiting misconception.


Windows has done much better in this regard, even back in XP iirc.


Could they please pick a better name than "Metal"? I cringe at having to sort through the unwanted results when Googling issues in the future.


The wall just got 10 feet higher. I can't find any info on WebGL though... anyone find anything? Apparently iOS is getting hosed too.


So Adobe, Blender, Maxon (Cinema 4D), Maya, CaptureOne, DaVinci Resolve and 99% of the film, image and photo industry have to rewrite all their OpenCL kernels.

Also, Metal and Accelerate are currently completely unsuitable replacements for OpenCL for deep learning... Not that deep learning on OpenCL was a thing yet, but I was adding support for it in my own framework.


Does anyone know if Apple actually intends to remove OpenGL drivers at any point, and if so, when?


Very disappointed about the deprecation of OpenCL. We use it to achieve cross-platform GPU compute usage (Windows, Linux, Mac). Dropping OpenCL is not going to encourage developers to target the Mac for such software as ours.


Why is anyone surprised by this? Apple has a decades-long history of taking outdated technology and making it obsolete by forcefully removing it (which they haven't actually done yet with OpenGL):

Parallel Ports

Floppy Drives

CD/DVD Drives

Older USB, Firewire ports

Network ports

They were the first to remove all this stuff, and everyone was shocked. Now they are doing it with an API. Both Apple and Microsoft long ago created much more modern, highly performant graphics technologies beyond what OpenGL offers, and serious vendors support platform-specific APIs most of the time. If it takes Apple to do this and say to the world "wake up, OpenGL sucks," I view that as progress.

Despite this, I don't think your OpenGL app will fail to run on a Mac any time soon. I suspect it's years away before they actually remove it entirely.


This would have been a valid argument...if they supported Vulkan instead.

But they do not.


OpenCL -> Cross-vendor ROCm & Portable CUDA/HC via HIP

- https://github.com/ROCm-Developer-Tools/HIP

- https://github.com/RadeonOpenCompute/ROCm

"CUDA/HC Portable with HIP", "Microsoft C++AMP", "Apple/Khronos OpenCL": https://github.com/ROCm-Developer-Tools/HIP/blob/roc-1.8.x/d...


Would it be correct to assume that swathes of old games will just stop working when 10.14 arrives? I'm not a MacOS user, but as a gamer that would frustrate the hell out of me.


"Apps built using OpenGL and OpenCL will continue to run in macOS 10.14, but these legacy technologies are deprecated in macOS 10.14."

Should be okay for now.


How does this affect ANGLE, the WebGL implementation underneath Chrome? I believe ANGLE only has bindings for OpenGL (with Vulkan in alpha). It has no bindings to Metal.


I didn't know you could replace OpenCL with Metal. I thought for GPU computing there were just OpenCL and CUDA. Does Vulkan offer an alternative to OpenCL too?


Well, great. OpenCL 1.2 was the only API that was actually portable. Now we have nothing. Might as well drop OSX and just move to Vulkan.


Oh those mother fuckers.

Can't believe I'll actually have to switch to windows to develop my OpenGL/WebGL games.


Go fuck yourself, Apple. The only thing this is doing is driving devs off your platform.


My question: is it possible to create an OpenGL/WebGL emulation layer over Metal?


Some comments addressed MoltenVK and MoltenGL https://news.ycombinator.com/item?id=17232868


What does this mean in practice for macOS users?

Asking because I am not familiar with these libraries.


We got dark mode in return.


And didn't implement Vulkan either. Typical Apple.


And that is also the final push for Quartz Composer...


Python, Bash, OpenGL :( Scientists just want to use them!

(as of High Sierra: Python 2.7, Bash 3.2, ...)

New thing: DARK THEME, YES, OMG :(

Is Apple holding the future back???? Where is the innovation?


tar (bsdtar) on Apple: v2.8, current is 3.2. Feel free to continue...


Great... just when I was going to use OpenCL. :(


How does this affect Valve and Blizzard?


you're all completely correct: Apple is doomed. Since no one plays any games on iOS (where Metal is the only real option), forcing Metal adoption on MacOS is doomed from the start. iOS devices will be left in their moribund state, used only for a few limited tasks like DTP and blogging or whatever.

They should take a page from Microsoft and adopt an open, cross-platform technology like DirectX.


Everyone is blaming Microsoft for acquiring GitHub. Apple chose the right time to deprecate OpenGL in favor of Metal. Sneaky


The Bureau of Compatibility should demand that all major platforms support Vulkan or have their tax loopholes closed.


congrats, that's as cool as DirectX from a developer perspective.


Does this affect WebGL?


I feel better about moving to Mint every day.


To me this signals that the Second Moribundity of Apple is nigh.

The First Moribundity was the period in the 1990s when Apple was coasting on the DTP and Photoshop advantage over Windows they had, reducing the features of their desktops and not innovating. Spindler, a smart but ineffective leader, was replaced by Gil Amelio, a star in his prior field but unable to get Apple headed in the right direction. It took Jobs' return to right the ship that time.

The emphasis on thin but less functional and less serviceable laptops, the dropping of OpenGL, the cruft piling on top of OS X to no new net benefit for users, and their coasting on the desktop market all point to this IMHO.


Everyone is coasting in the desktop / laptop market. Refresh cycles are like 5-10 years on PCs now, and it’s not clear tablets even have a refresh cycle. You can’t get growth on PC hardware anymore, not even if you’re Apple. So it goes into maintenance mode. Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes.

Apple is all about mobile phones, the Apple Watch and AirPods right now. The MacBook is barely a blip on their product radar.


"Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes." I believe you should embrace the novelty concept called 'work' sometime. Then you may see some 'niche' application of PCs. Mobile devices can, yes, be sold in bigger numbers, may produce higher profit, many trivial applications like browsing for news and sharing our newest and greatest experiences by tweets and photos do not require, luckily, a desktop anymore and so these paramount usages could been shifted to the only important platform of mobile devices, yet the second grade activities of design, manufacturing, academia and so still rely on the archaic concept of desktop computers and they keep alive this dying artefact. Even traitor mobile developers (all of them, the fools!) dare to use desktop computers still instead of solely relying on mobile phones. But not long, soon the last aircraft engineer or corporate accountant will sell their desks and throw away the last of the keyboards moving to the only necessary platform of mobiles so the PC can go extinct finally!


I think the GP's point was that laptop sales have been outgrowing desktop sales at a high pace for a while now, and that in an overall stagnating market.


More that both laptops and desktops are dinosaurs compared to mobile so Apple isn’t pouring R&D into them. And it’s a smart move.


> Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes.

My personal impression is that while pre-built PCs are mostly sold to businesses, there has been a strong and growing trend over the past few years to build your own PC from parts. I really do observe that when mobility is not an issue, people now tend to move away from laptops towards self-built PCs (or, often, ones a good friend builds for them). I don't want to go into the details of what advantages these have over laptops, but will just mention customizability with respect to requirements (e.g. very silent, very high-power, ...) and repairability.

A small (but only small) contributing factor is that laptops do not have sufficient power for VR, so you really need a stationary PC for VR.

The strongest dampening factors on this trend are the rising prices of GPUs, because lots of cryptocurrency miners (in recent years in particular Ethereum miners) hoard them (though my impression is that this will ebb away somewhat as soon as attractive ASICs for Ethereum are released), and lately the rising price of RAM. On the other hand, the release of Ryzen made building PCs with either very high performance or a very good cost-benefit ratio feasible.


Is building your own PC really a recent trend?

A lot of early PCs were kits people built, and I remember PC gaming in the '90s and '00s was also heavily focused on custom-built machines.


Probably the last couple decades saw custom PC building decline in market share as prebuilt solutions became mainstream, but perhaps it's seeing a resurgence now that most consumers have moved from focusing on computers to focusing on smartphones as their personal tech hub.


Now that prebuilt PCs tend to be cheap and limited, anyone who wants something better needs to build the machine from parts.


Prebuilt PCs were cheap in the '90s and '00s as well. Premium brands like Alienware still exist for gamers.


But in the 90s laptops were insanely expensive. :-)


Not a recent trend but online communities act as a catalyst.


Custom built PCs are a niche. The vast majority of people just want to buy a box that works.


> Custom built PCs are a niche. The vast majority of people just want to buy a box that works.

That is why I wrote

"(or often rather: let a good friend build one)." :-)

Seriously: in my opinion (though others might disagree), an advantage of a self-built PC over one that some company produces is that you know exactly what components are inside (in particular, you can buy components you trust), which makes you far less dependent on driver support from the manufacturer (i.e. you can find drivers on the internet yourself if necessary). I have often had bad experiences with driver support from manufacturers (except for a few well-respected names).

Also, lots of cheap PC manufacturers install lots of crapware by default, while your self-built machine is a very clean install.

In summary, a self-built PC often "just works" far better than a pre-built one.


You're forgetting the most important reason to DIY: money. Where manufacturers are happy to charge you 2x the price to double your RAM or put in a non-shitty SSD, DIYing can shave off a few hundred Euro.

Just as an example, over at apple.com, the new iMac comes with 32GB of DDR4 2666MHz ECC RAM. To add 32GB (64GB total) they are charging 960€, or 2,880€ to add 96GB (128GB total).

Looking on Amazon or geizhals.eu (a price comparison site), 16GB ECC DDR4 2666MHz sticks cost 200-220€, so two of them come to ~450€ for 32GB. That means you can save ~500€ on RAM alone.

Same applies to SSD storage. And graphics cards. And CPU. And basically anything you could want to upgrade.


> Just as an example, over at apple.com, the new iMac comes with 32GB of DDR4 2666MHz ECC RAM. To add 32GB (64GB total) they are charging 960€, or 2,880€ to add 96GB (128GB total).

Apple specifically is well known for expensive pricing on better optional components (which you often have no choice but to pay, since on many models you cannot simply replace the component (RAM, SSD, etc.) yourself).


Good point. I should have used e.g. Dell or HP as an example; they do the same, albeit with (AFAIK) slightly less absurd price-hiking.


PC gaming is growing, and that's where most of the trend for self-built PCs comes from, along with most of the growth in the PC market. It is still dwarfed by laptop sales, though. As an anecdote: today I know hardly anyone who owns a desktop PC, while 10-15 years ago almost everyone had one.


The trend around me is to see the shops that used to sell PC parts closing down, while consumer stores fill 80% of their PC floor space with laptops, tablets and mobile devices.


> The trend around me is to see the shops that used to sell PC parts closing down

Couldn't this simply be explained by the hypothesis that people now mostly buy their PC parts on the internet instead of in a brick-and-mortar shop?


It could, but I am willing to bet that is even more niche than going to a physical shop.

The pool of experts capable of telling whether the hardware they are ordering is actually supposed to work together, instead of blowing up in some form when assembled, is quite small.


I'm sure it's rarer than I'd like to think, but if I managed to figure it out as a teen in the mid-'90s without much in the way of internet access or budget, it's probably still easier today.

Anyone with a modern internet connection and a bit more patience than money (or at least a willingness to learn) can hop on Reddit or PC Part Picker and get a pretty good idea of what is out there and works together.

Compared to the days of making sure you had the right number and type of ISA, PCI, and AGP slots, assembling a PC from parts today is a breeze. Shopping online keeps costs low and places like Microcenter are great for buying in person.

I only haven't built one in a while because my current 5-year-old workhorse media/editing/gaming/everything PC shows no sign of needing a full upgrade any time soon. Sure I bought a new GPU after a few years when I got bitten by the experimental VR bug but other than that, it was an afternoon buying parts and snapping them together, an evening installing software, and 5+ years of "just working".


The “enthusiast” segment of the market is small. Even then you don’t have nearly the choice in vendors you used to, so people mix and match parts from the same 5 or 6 vendors who’ve been there forever.

Cryptocurrency mining has kept the enthusiast hardware market afloat, I have a feeling...


GPU prices seem to be normalizing the past week, at least here in Europe using historical price data from Geizhals[0]. If things continue we will be back to "normal" 2017 prices very shortly.

[0]: www.geizhals.eu

Examples:

1070: https://geizhals.eu/?phist=1456548

1070 TI: https://geizhals.eu/?phist=1717563

1080: https://geizhals.eu/?phist=1449277

1080 TI: https://geizhals.eu/?phist=1587606


I disagree, I think Apple is still rather committed to the Mac even if it is no longer their #1 priority. Even if it's a small part of their profits, it's a big source of the strength of their brand (visually speaking, a Mac stands out against a PC far more than an iPhone stands out against any other smartphone), and that's something that's crucial for all of their products. Bad press about the Mac damages the brand, and if they don't continue to put out a good product (and correct their recent mistakes) they are going to keep getting bad press. I would argue that the Watch and Air Pods are more in the category of the Apple TV than they are the iPhone, because they're just a means of pushing up the average value of each iPhone customer. The Mac both does this and serves as a "halo" which is crucial even if it's less profitable. Honda certainly isn't making much money on NSX sales but that doesn't mean it's not still vital for them to produce it just for the sake of the brand (although I admit that's a bit of an extreme example).


The big signal for me that Macs are becoming second-class citizens is if they ever start supporting Xcode on something that isn't macOS. That is the only thing keeping app developers on their platform, and if/when that changes, you know the final nail is in.


I would be surprised if Apple doesn't throw in with Microsoft soon around mobile development. There's a lot of symbiosis there; Apple and Microsoft don't really compete in very many markets against each other anymore.

Visual Studio is actually an awesome development platform (I’m including VSTS in this). There’s not much secret sauce in Xcode to threaten Apple’s App Store revenue — it’s just a bunch of signing keys that Apple controls and manages.

I do think Apple will make a strategic decision to diversify from core Mac OS soon. The ever-shrinking space allocated to traditional computers at the Apple Store has convinced me of that. They have more floor space allocated to the Apple Watch than they do the Mac these days.


Sure; but users stopped buying computer hardware on specs last decade (or at least the mass market Apple is chasing did). Macs aren't even any more expensive than top-end PC laptops. They're just a brand choice.


They definitely aren't usually buying them because of typical CPU/RAM specs or whatever, but I think things like battery life, trackpad, keyboard, display, and general build quality are all still very relevant aspects of hardware that people pay attention to.

The slow upgrade cycle is definitely hurting things and generally slowing everything down, but it's not as terrible as it's often made out to be. It's been a few years since the 5K iMac launched and it's still a great desktop. The Mac Pro was botched, but at least they're fixing it. We've only had 2 years of weak MacBook Pro releases, and it's very possible that the next release later this year will address both major issues: a bad keyboard and no 32GB RAM option. The keyboard will matter more for most people, but the added RAM plus 2 more CPU cores and Vega graphics will be a big win for true pro users.

I don't think they've necessarily "coasted" on Macs just because the last couple years have been weak. It's more accurate IMO to say they've just fucked up a couple times in the last 5 years, specifically with the Mac Pro and the butterfly keyboard devices.


Oh I agree; Apple’s brand carries a ton of heft in the laptop market because they generally get the “product” features (keyboards, battery life, screen quality, etc) right. It’s only so noticeable because their track record has been so good — we would still have plastic laptops and trackpads with mushy membrane keyboards without Apple’s industrial design incorporating metal and glass.

And honestly, their desktop line looks very similar to many PC companies'. They make the Mac Mini (aka the Apple NUC), the all-in-one iMac, and the Mac Pro (OK, the Mac Pro sucks and there's no excuse for it).

Anyone doing serious workstation tasks these days is likely using a cloud-based solution. There are a few specialized exceptions (ML development stands out to me), but none big enough to build a product around, especially when doubling the entire PC and laptop category wouldn't move the needle for Apple at all.


Worldwide sales of desktop PCs (all kinds) run at approximately 200 million per year... and I can't find 5-year refresh cycles on any HP, Dell, etc. desktops or laptops.

They tend to refresh in line with corporate cycles, i.e. 18 months to 3 years, from what I see.


Contrast with ~1.5 billion (!) smartphone sales per year.


Sounds like a 'we sell more shoes than airplanes so we don't need airplanes no more' kind of argument.


Such figures won't last forever once smartphones are sufficiently mature and there is no performance difference from one model to the next.


The main reason to buy a new phone these days is surely "the old one broke"; after all, it is with us all the time. Planned obsolescence (non-replaceable battery, glued screen, glass back, etc.) means phones break all the time.

PCs stay put, and even laptops tend to be mostly stationary. No wonder my work laptop is now 6 years old and I feel no urge to request an upgrade.


> Planned obsolescence (non-replaceable battery, glued screen, glass back, etc.) means phones break all the time.

Don't forget planned insecurity two years after release through unpatched software vulnerabilities.


With shops still selling un-upgradable Android devices with 5.1 on them, I seriously doubt non-technical users actually care about it.

https://tinyurl.com/yakmsz6b (sorry, in German)


People drop their smartphones more often than their desktops.

But I expect cheaper devices to take more share, since the "high-end" won't give you many benefits over them. My 120 EUR Xiaomi already offers amazing cost-performance compared to most other Androids.


Which is pretty much where we are now. At this point smartphone sales are mostly based on making performance more affordable, and lots of marketing.


The difference is that in the 90s they were close to bankruptcy whereas today they are swimming in cash. That's not going to help them much in the long run if they have lost their vision, but they are definitely too large nowadays to go away within a few years.

You may call me Captain Obvious now. :)


I think it is a good point, and they could probably survive about 2.5 years of losses without too much pain - but Apple's margins back in the 1990s were pretty good also: what hurt them was that sales dropped off a cliff, so their good margins weren't enough to save them...

There could be a number of factors that would hurt them, like margin compression, reaching market saturation, etc. All the ailments that Business Schools will warn you about...

However, they (IMHO) have become arrogant and out of touch with their users. VISION is their problem.

They need the MacBooks and desktop Macs to be seen as very desirable, in order to project the "creative people buy Apple" halo over the rest of their products.


This time around, they have way more cash and maintain a larger share of much larger markets, and keep tighter control of spending and resource use. Their decline, if you are right about the writing being on the wall, will be an extremely long and slow one when it begins (we're talking about the first derivative here, right?). Too slow, probably, to reach "moribundity" rather than an equilibrium point of mediocrity.


well, all we need is the third coming of steven christ

come on steve, reinvent death please


The classic "Fuck Off" hubris from Apple.

Their B.S. is getting out of hand. List of gripes:

1. Heavy-handed regulation of the App Store: This needs intervention from regulatory authorities. Apple has successfully inserted itself between the consumer and the producer of apps. It plays kingmaker and very clearly promotes internal apps over the competition (the recent ban of the Steam Link app is a prime example). Its only legal cover stems from being ~17% of the mobile hardware market, but from a paid-app developer's perspective, not being on the App Store is a definitive death knell. It's a clear monopoly in the 'paid-app' space, and others are arguing for a monopoly classification on other grounds [1][2]

2. Unreasonable restrictions placed on approved apps, like disallowing non-WebKit browser engines, and restricting API access even though user security isn't compromised (dlopen, WebGL 2, OpenCL, Vulkan, etc.). Restrictions designed to choke off potential competitors and restrict user choice.

3. Forced licensing of hardware parts for accessory makers (look up the Apple MFi program).

4. Purposeful non-conformance with industry standards (OpenGL, Vulkan, WebGL, many, many holes in open-web compliance). This non-conformance is steeped in a monopolistic, unfair-trade-practices mindset to maintain absolute, unfair control over any potential competition.

I hope France fucks them for purposefully slowing down older iPhones/iPads. And I hope the FTC and EU regulatory authorities pay heed to app developers (and consumers) getting shafted by a beast of a corporation.

[1]: http://www.businessinsider.com/apple-monopoly-ubs-steven-mil...

[2]: https://www.pcworld.com/article/3157551/mobile/apple-must-fa...


1. If cable TV providers can get away with disintermediating TV viewers from TV stations, I don't see how Apple is any worse, especially since I can still buy third-party apps, but I can't set up or tune into an indie TV station either. Amazon is a far worse problem with respect to books (and without any demonstrable user benefit), so good luck waiting for regulation here.

2. Again, you can use Chrome on a Mac. The App Store is not stopping anyone from using anything; it's just facilitating certain cases. Microsoft installs its apps on your machine without asking you (and has an app store). Don't even get me started on Nintendo. Seriously, you're complaining about the least-worst (commercial) player in this space.

3. Clutching at straws here. No one cares.

4. You mean like OpenGL, and two other OpenGL things. Again, Microsoft DirectX is what?

5. Apple intentionally slows devices down to prevent random shutdowns from peak power draw when the battery has aged. This is a good thing, well implemented but poorly explained. Apple has paid a PR penalty for its lack of transparency, and having them suffer regulatory punishment for this seems harsh.

OK, I get it, you hate Apple. But unless you avoid all commercial vendors, I don't think you're going to find someone that satisfies your peculiarly quixotic requirements.


"Why are you arresting me for armed robbery, dont you know that guy is a Murderer??"



More like: why are you making harsh language a crime since murder is legal?


There's no need to turn this on the commenter.


To whom or what are you referring? If you mean my suggestion that the commenter's requirements seem quixotic: guilty as charged.


Everything on that list is relevant, but please keep in mind that banning custom browser engines isn't a particularly bad thing for everyone.

Of course Apple is doing it to force web developers to keep supporting Safari, since nobody can expect iOS users to simply install Chrome, but it also benefits the open web greatly, since it preserves the status quo of having more than just one browser engine. The day Apple stops doing it will be the day web developers abandon support for Firefox too.


I don't buy that a platform excluding rendering engines helps keep the web diverse and open.

Apple has crippled the mobile web since iOS reinvented the category. How many years did it take before the file upload widget worked?


I myself have had to fix weird Safari (and sometimes even iOS-only) bugs more than once, but that's just Apple crippling its own platform. Also, even if it's annoying for web developers, these things keep our eyes keen on how what we work on looks in something other than Chrome.

Unfortunately, Google has appalling power over the web right now, and I can't count how many times I have seen them cripple the experience for Firefox. Now they are even outright saying that the new AdWords interface only works in Chrome. And AMP only proves how little we can trust Google.


I would argue #2 has turned out for the better. Google's store is a dumpster fire of crap, malware, and knockoffs. On iOS there is less crap, fewer knockoffs, and generally no malware.

Steam Link is weird, but I thought I read somewhere that they would have had to pay Apple 30% of in-app sales.

But long story short, I am paying them to curate the apps.


> Steam Link is weird, but I thought I read somewhere that they would have had to pay Apple 30% of in-app sales.

Valve explicitly tried to disable purchases through the app, but Apple still refused it, even though it would only do streaming.


There is a story there that might be interesting to hear. I wonder if it has to do with it taking away the incentive to produce iOS-native games vs. streaming/controlling a game on a PC.


1. Not a monopoly. That is like saying Walmart has a monopoly on its own in-store selection.

2. What?

3. MFi? The greatest invention ever. Now I am assured every single MFi Lightning cable is of decent quality, instead of the crap USB-C has turned into.

4. Industry standards of what? Maybe PlayStation, Nintendo, and Xbox should all be forced to use OpenGL?

Yes, Apple should be fined for slowing down iPhones. Absolutely! But this calling of B.S. is just... wow.

And if you don't like Apple, or even hate them to bits, you can always buy an Android phone. OnePlus, Samsung, and Vivo are all fairly good in hardware specs. You have a choice.


You're right that this is ridiculous. It's time that we boycott Apple for their hostile policies!

Stop developing for Apple products. Stop buying Apple products. Tell everyone you know not to buy Apple products. If enough people protest, they will eventually be forced to change.


Boycott Apple... and then what? What's the viable alternative? Android? No, thank you. At least you have some leverage over Apple by virtue of being the customer. The competitor's business model makes it much harder to influence them, and it is hostile to the customer and their privacy.

If you think Android is really open, think again. Without Google Play Services and the Play Store you are basically hosed.


There's a lot of options. Easiest is just don't upgrade -- stick with your old phone as long as you can, and don't buy any more apps. If you need to replace your phone and don't want to use standard Android, you can use an Android fork that doesn't depend on Google. (I've been hearing good things about LineageOS, but haven't tried it personally.) Alternatively, you could try a less popular mobile OS like Tizen. Another alternative would be to use an older style non-smart phone. You could even ditch the phone entirely and go back to the way people lived before the last decade or two if you want.

Just don't give Apple any more money until they change.


Fair points. I don't think they are going to be effective, though; not enough average users will act for it to change Apple, and that only addresses the customer angle. As a developer you don't really have the option of boycotting Apple.

You have to pick your battles in life. There are a huge number of companies I have chosen not to do business with, and a long list that I'd boycott before Apple. I admire you if this is the cause you want to spend your life fighting for, like I admire RMS, but in that case one should really consider spending their energy ensuring a viable, good alternative exists, and maybe lobbying for regulation. Merely shouting 'boycott' in a niche community will not change Apple's behavior, I don't think. This of course is not a discouragement from trying, but a personal opinion on the practicality of achieving the desired outcome, especially when the competition is much, much more anti-consumer in a myriad of ways.


Android lets you install any app outside the store through sideloading. Imagine that, being able to install the software of your choice on the hardware you paid for!


FYI, for a while now you have been able to get a developer signing certificate for free and sign and deploy any app you want on your own device. You can't easily distribute it though; the complexity is equivalent to sideloading. It's a fair point that Android supports non-Google-Play app stores and "Unknown Sources" app downloads. Google Play Services is a very key tie-in though, one that's hard to avoid even if you're on LineageOS.


I use iOS partly because of its strict regulation of the App Store, and I'm definitely not the only one. Who will pay Apple for the lost revenue once someone implements your suggestion? How are companies going to innovate without those funds (I assume your answer to the previous question is something like "that's just business risk"; no, it isn't)? And how are customers such as me supposed to find the products we want if they're banned, arguably for no real reason (no one is getting hurt) at all?


Instead of looking for government punishment, make people change their minds by showing them this evidence of why Apple is sucking big time.


You do know you have choices in the operating systems and hardware you use. If that's not a viable option, engaging in dialogue with Apple might be the way to go.


Looking at the clearly sparse docs, Metal looks like some jackass's pet project. For people who get off on making everyone else use their system by systematizing the obvious and the already done, graphics APIs have an allure. However, most people, and even most companies, are not capable of replacing OpenGL and OpenCL.



say wuuuuuuuuuuuuuuuuuuuuuuuuuuuut


> Apps built using OpenGL and OpenCL will continue to run in macOS 10.14, but these legacy technologies are deprecated in macOS 10.14.

Er...
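
For what it's worth, "deprecated" here apparently means compiler warnings rather than removal. A minimal sketch of what that looks like in C++, assuming the GL_SILENCE_DEPRECATION macro that the 10.14 SDK headers are said to check:

    // Under the macOS 10.14 SDK, every GL call below trips
    // -Wdeprecated-declarations unless this macro is defined
    // before including the header. The functions still link and run.
    #define GL_SILENCE_DEPRECATION 1
    #include <OpenGL/gl.h>

    void clearFrame() {
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);  // still works, just "legacy" now
        glClear(GL_COLOR_BUFFER_BIT);
    }

So existing apps keep working; you just get nagged at build time until you move to Metal (or silence the nag).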


OpenGL, sure... it's time to die. Replaced with Vulkan it has been. Oh wait... Vulkan? Uhmmm... Metal, you mean to say.


Well, at least MoltenVK is completely open source now (Apache 2 license). https://github.com/KhronosGroup/MoltenVK

So yes, Vulkan is kind of the universal graphics API now.
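
And targeting MoltenVK is just writing ordinary Vulkan; there is nothing MoltenVK-specific in application code. A minimal smoke test (standard Vulkan C API, shown as C++; on a Mac the loader would resolve to MoltenVK, on Linux/Windows to a native driver):

    #include <vulkan/vulkan.h>
    #include <cstdio>

    int main() {
        // Describe the application and ask for a plain Vulkan 1.0 instance.
        VkApplicationInfo app{};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.pApplicationName = "moltenvk-smoke-test";
        app.apiVersion = VK_API_VERSION_1_0;

        VkInstanceCreateInfo info{};
        info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        info.pApplicationInfo = &app;

        VkInstance instance;
        if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) {
            std::fprintf(stderr, "no Vulkan implementation found\n");
            return 1;
        }

        // Count the GPUs the implementation exposes.
        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, nullptr);
        std::printf("physical devices: %u\n", count);

        vkDestroyInstance(instance, nullptr);
        return 0;
    }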


What the fuck?! I don't get it, why?! They want us to focus on Metal? I definitely need to buy a desktop now, plus I want CUDA! A Ryzen 5 2600X with a GTX 1070 and I'm done.


Got a caller on hold, says his name is The Future. He wants to talk to you about Metal Framework.


The future is Vulkan, not Metal.


On Linux and on some Android devices; everywhere else, not really.


Apple killed Flash and now they are killing OpenGL. Both are incredibly crappy technologies that survived for so long because of their popularity.

Nowadays Unreal Engine fills the role very well and can sit on top of any platform easily. Popularity-wise, together with Unity it has completely destroyed both Metal and OpenGL; Vulkan stands no chance either. Unreal Engine is also open source and very easy to extend because of Blueprints. Unlike OpenGL, it is not a C API trying to shoehorn itself into a C++-dominated world.

I have to use OpenGL 3.3 and I am amazed at how badly designed it is. A brief look at Vulkan made even less sense to me.

I don't agree with everything Apple does, but killing Flash and OpenGL are two of my favorite moves. My only complaint is that they have not done it as aggressively as they did with Flash. If they manage to kill the JavaScript and HTML/CSS abominations, I will become a hardcore Apple fan.


Unreal and Unity are 3D engines; OpenGL is an API on top of the graphics card driver, much like Direct3D... they are very different things!

The Unity and Unreal engines are both built on top of APIs such as OpenGL and Direct3D.
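
To make the layering concrete, here is a cartoon of the pattern in C++ (all names hypothetical, not any engine's actual code): the engine talks to one rendering interface, and each graphics API gets its own backend behind it. This is roughly how a multi-platform engine can target Metal, Vulkan and Direct3D from one codebase.

    #include <memory>

    // The only rendering surface the rest of the engine ever sees.
    class RenderBackend {
    public:
        virtual ~RenderBackend() = default;
        virtual void beginFrame() = 0;
        virtual void drawTriangles(int count) = 0;
        virtual void endFrame() = 0;
    };

    // One implementation per graphics API (real bodies elided).
    class MetalBackend : public RenderBackend {
        void beginFrame() override { /* MTLCommandBuffer setup, etc. */ }
        void drawTriangles(int) override {}
        void endFrame() override {}
    };

    class VulkanBackend : public RenderBackend {
        void beginFrame() override { /* vkBeginCommandBuffer, etc. */ }
        void drawTriangles(int) override {}
        void endFrame() override {}
    };

    // Chosen once at startup; game code never names the API again.
    std::unique_ptr<RenderBackend> makeBackend(bool onApple) {
        if (onApple) return std::make_unique<MetalBackend>();
        return std::make_unique<VulkanBackend>();
    }

Which is why "add a Metal backend" is real but bounded work for an engine vendor, while a game written directly against OpenGL has to be ported by hand.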


No they are not; they are both APIs. The fact that they run on top of drivers is nothing impressive; if anything, they make up a tiny fraction of the APIs Unreal depends on, because it's so much more than a 3D engine. Unreal runs on top of Vulkan and could also replace both Vulkan and the GPU drivers. API dependence is nothing special and hardly makes something so different. There is no limit on how low-level Unreal can go.



