Ugggggh. As if graphics support on macOS weren't middling enough already. It's like they're trying to become as irrelevant as possible in that area.
I could understand if they were deprecating it in favor of Vulkan. That would be in line with Apple's history of aggressively pushing new standards forward. But by no means do they have the clout to coerce developers into their own bespoke graphics API that doesn't work anywhere else. All they'll accomplish is further killing off the already-small presence they have in the gaming space.
Apple must truly hate gaming, or suffer from a serious case of Not Invented Here with their Metal stuff. As if any serious gaming studio would target Metal, which doesn't run on Windows.
In fact, they couldn't get their act together and keep up with current versions, and as a result titles like Elite Dangerous were being shut down on the platform anyway. The reason: macOS's OpenGL was stuck on an old version without support for compute shaders.
To be fair, most games today are built using Unity3D, Unreal Engine, etc., which all support Metal already. Hardly anyone writes their own game engine these days, and those who do probably have the resources to support Metal.
Overall still a bummer though.
The problem is still Apple forcing them to invest resources for no reason other than to advance their vendor lock-in. And if you're the developer of a small high-performance 3D graphics and GPU computing library like me, it's just a giant middle finger from Apple: I will either need to drop OpenGL/OpenCL or drop Apple. There is no way I can afford to offer both, especially since I'd need to buy Apple hardware to test things.
The Witcher 3 is using REDengine, GTA V is on RAGE, the Battlefields and SW: Battlefront {1,2} are using Frostbite IIRC, the two new Tomb Raiders are on Horizon, Rainbow Six Siege and the Assassin's Creeds are on Anvil, Overwatch & SC2 have their own engines too, same for League of Legends, the CoDs are on a heavily customized id engine, Minecraft is custom, Bethesda has their own engines for Skyrim & Fallout, Path of Exile is custom too; all taken from Steam's 100 most played.
That's a nice list, quite complete. Many console exclusives also use custom engines by the way, e.g. Decima for Horizon:ZD, KillZone and Death Stranding, Naughty Dog has their own engine (don't know the name), etc.
The OP's point was that the companies that make these engines can afford to invest in supporting an additional back-end API, though. I think it's hard to argue that any of the companies that develop these engines would not be able to add a Metal back-end as well. Many of them already work across a pretty wide range of back-ends anyway; Xbox One, PS4, and Switch all use completely different APIs, for example. I think most of the work is not in adding an additional back-end like Metal, but in tuning the back-end for some specific piece of hardware (NVidia vs. AMD vs. mobile GPU, etc.).
Whether companies are actually willing to invest in a Metal back-end remains to be seen, but considering many of them license their engine for commercial use, I would be surprised if the major players simply ignored Metal.
I tend to agree with Jonathan Blow's comments on Twitter, that the low-level graphics API should be just that: as low-level as possible, small, focused, and not actually intended to be used directly (though still allowing it!). Engines or higher-level APIs can be built on top of that, with the option to dive down to the lowest level when needed (which will probably be a rare occasion).
DirectX will definitely not be this API because it is Windows specific. Likewise for Metal because it is Apple-specific. Blow appears to be of the opinion that Vulkan is also not moving in the right direction, because it is becoming too complex, and trying to be too many things for too many applications at the same time.
If true, in a sense, it's not that surprising Apple is doubling down on their own API. I think they should consider making the API open-source though, and develop something like MoltenVK (but the other way around) for Windows/Linux.
The top 10 most played on Steam today are using UE4 (2), Source 2, Source (2), and custom engines (5: AnvilNext, RAGE, Evolution). That's a lot of variety; there's almost no reuse.
With a bit of luck, Godot Engine. Sort of a dark horse, but I like it and my very smart corporate-programmer brother likes it. He says it's designed like a programmer would design it: everything's a node. I know I did a game in Unity (which has become overcomplicated) and had a surprisingly easy time jumping into Godot.
Go back in time six years ago. What were Apple’s choices?
(1) continue to live with the deficiencies of OpenGL. Remember that, over time, it had come to fail at one of its primary purposes which was to provide efficient access to GPU hardware. Further, sticking with OpenGL would be to accept the leadership of a group that had allowed its flagship standard to falter.
(2) They could marshal their resources and create the better API that the Khronos Group wouldn’t/couldn’t.
They really had no choice. Note that Vulkan wasn’t announced until after Metal was released.
The gripes in this thread should really be leveled at the Khronos Group, which fumbled its stewardship of OpenGL and, with it, the chance to lead open GPU APIs.
That timeline is being pretty generous to Apple. Metal, Vulkan, and DX12 are all reworked versions of Mantle.
The entire point of Mantle was to be a proof of concept that could be reworked into a cross-platform API (which became Vulkan); there was plenty of work already being done by Khronos in 2014 (and Apple knew this). And they just went out and released Metal anyway.
I also blame Microsoft for the same thing; early parts of the DX12 docs were taken word for word out of the Mantle docs, that's how similar they are. But Microsoft at least had a couple of decades of history with a competing API, whereas Apple went out and created a new one for some reason.
Talk about rewriting history. Mantle was never supposed to become Vulkan; that only happened because AMD was generous, and Khronos would otherwise still be pondering what OpenGL Next should look like.
While I get the concern, everybody's history here is backwards. Apple released Metal 2 YEARS before Vulkan. Why? Because OpenGL wasn't hacking it anymore and had become too asymmetric. Vulkan copied Metal, not the other way around.
I'm not sure why they should have turned around and dropped Metal for Vulkan once it became available, or slowed the pace of progress until the rest of the market caught up. That doesn't make sense.
Also, Apple is perhaps the largest GPU manufacturer in the world, with 200-250M GPUs shipped in 2017. That is 4-5X what Nvidia ships! Apple is also investing heavily in AI, from tools to devices to GPUs, so being able to customize may have tremendous value.
It is highly possible that Apple sees owning their interface stack as a means to keep their software-hardware war chest a couple of years ahead of the competition. In mobile this has been paying off over the last 5 years, as they have consistently crushed all others by 2-3X.
Does it matter anymore? People are using less and less of the higher-level parts of OpenGL. Most of the graphics code is now in the engine. OpenGL is getting very outdated; who, starting a project today, would choose it over Vulkan, DirectX, or Metal? I would bet most small shops would prefer to use some sort of middle layer or engine from a third party. That pushes the problem of implementing the lower layers in Vulkan, DirectX, or Metal onto a small group of specialists.
No, games aren't going to target Metal to support the Mac, any more than printer manufacturers are going to go out of their way to support AirPrint to make printers Mac-compatible.
What developers will do is go out of their way to support iOS, and supporting the Mac is just a side benefit. Just like almost every printer company supports the Mac as a byproduct of wanting to support iOS.
Hypothesis: There are more machines in consumer hands which support Metal than DirectX.
This may sound crazy, but remember there are billions of iOS devices out there in the world, and I don't think Xboxes plus Windows game machines count in the billions.
It's true Apple hasn't won the hardcore gamer market, but they are no longer the niche player that had to cater to Windows users.
If you're counting only gaming PCs (i.e. device used mainly for demanding 3D games) you should also count only gaming Macs/iPads/iPhones. How many are there in the world?
>This may sound crazy, but remember there are billions of iOS devices out there in the world, and I don't think Xboxes plus Windows game machines count in the billions.
They should, albeit not for gaming: most office machines run Windows with DirectX support. You won't be gaming on them, though.
Are there more Android devices that actually have hardware capable of playing high-end games decently? The average Android phone is a low-end phone - with an average selling price of $225 across all Android phones, how can they not be?
OpenGL is part of the platform; they all support it. The stats page doesn't even include a 'not supported' category [1]
Being able to run anything slightly demanding is another thing, but you can't argue there's no support.
Also, the benchmark you linked measures application load, which is heavily influenced by storage speed and load method (Android sometimes has to JIT-compile) and is almost unaffected by graphics performance, other than the bus between CPU/memory and GPU.
Being able to run something suboptimally doesn't turn into sales. I'm sure that the owner of a $70 Blu R1 HD is not going to be spending money on high end games.
It really isn't. The fastest GPU available is a Vega 64 underclocked to basically the performance of a normal Vega 56. A 1080Ti is ~50% faster. Even if you connect an external 1080Ti it's constrained by TB3 bandwidth.
Sure, but the whole point of both Vulkan and Metal is to bring out more performance by being lower-level. I'd assume that at least part of that benefit is lost when you use something like this.
But you have to rewrite it for every major hardware vendor, or else you won't get the performance you want.
GL should always work for the simple case, but instead you need to rewrite things to avoid bugs in its many layers. And once you have Vulkan/Metal, industry-specific wrappers are better than the impossible-to-debug procedural GL junk.
I'm not sure I agree with the claim, but even if we take a full rewrite at face value, "every major graphics vendor" for desktop applications is NVIDIA and AMD/ATI. On mobiles, you're probably using Unity or similar middleware and therefore not thinking about bare metal (no pun intended)
Yes, but OpenGL is so outdated that the people who should use it the most (game and 3D application developers) were avoiding it due to the mismatch between the hardware and the API.
Vulkan was created to get that same portability, with an API that fits modern hardware.
All the target systems had latest Windows updates, and they all run Direct3D 11 software just fine (I mostly develop for D3D and I test on them). On some systems it works in 10.1 compatibility mode, MS calls that “feature levels”. Not a big deal in practice, the majority of D3D11 stuff still works OK.
No, you're getting stuck at 3.0 because you're hitting the deprecation strategy. You need to specifically request a post-3.0 context with wglCreateContextAttribsARB which you're not doing. Thus the system thinks you're an old legacy OpenGL app, and is giving you 3.0 as that was the last version before things were removed.
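For anyone hitting the same wall, a minimal sketch of what that request looks like (assuming hdc and a bootstrap legacy context from the usual WGL setup, plus windows.h, GL headers, and wglext.h; error handling omitted):

    // Fetch the extension entry point; this only works once a (legacy)
    // context is already current on hdc.
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");

    // Explicitly ask for a 4.0 core-profile context.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 4,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0
    };
    HGLRC modern = wglCreateContextAttribsARB(hdc, NULL, attribs);
    wglMakeCurrent(hdc, modern);
    wglDeleteContext(legacy);  // the bootstrap context is no longer needed

Without this, the driver hands back the old compatibility context, which is the 3.0-ish behavior described above.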
> No, you're getting stuck at 3.0 because you're hitting the deprecation strategy.
I think you’re wrong here. Two reasons.
1. If that were the case, I would be stuck with GL 3.0 regardless of the GPU. In reality, I'm only stuck with a GL version < 4.0 on HD 2000 and VMware. On my desktop PC (Maxwell at the time I wrote that demo), OpenGL 4.0 worked just fine in that very project. Even on an Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.
2. Please read Intel’s documentation: https://www.intel.com/content/www/us/en/support/articles/000... Specifically, please expand “2nd Generation Intel® Core™ Processors” section. As you see in that table, Intel says HD Graphics 3000/2000 only support OpenGL 3.1, which is exactly what I’m getting from the GLEW library I’m using in that project.
Also, you can see in that article that no Intel GPU supports the GL 4.6 mentioned by GP. Even the latest-generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they have supported the latest DirectX 12 for several years already.
> 1. If that were the case, I would be stuck with GL 3.0 regardless of the GPU. In reality, I'm only stuck with a GL version < 4.0 on HD 2000 and VMware. On my desktop PC (Maxwell at the time I wrote that demo), OpenGL 4.0 worked just fine in that very project. Even on an Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.
Behavior depends on whether the device supports 3.2+ compatibility mode, which is optional.
> Also, you can see in that article that no Intel GPU supports the GL 4.6 mentioned by GP. Even the latest-generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they have supported the latest DirectX 12 for several years already.
OK, so? 4.5 isn't really outdated either. It still supports all the modern good stuff. And, as we've established at this point, it's not Windows stopping you from leveraging the full extent of the hardware you have. By contrast, macOS does stop you from using the hardware you've got to the fullest, as it's stuck on 4.1.
> depends on whether the device supports 3.2+ compatibility mode, which is optional.
For the systems I have in this house it’s not required, i.e. I’m getting the same OpenGL version that’s advertised by the GPU vendors.
> You need to use the method I mentioned to get real post-3.0 OpenGL.
Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.
> OK, so? 4.5 isn't really outdated either.
Right, but 3.1 (Intel Sandy Bridge) is. And 4.0 is outdated too (Intel Ivy Bridge). Meanwhile, modern Direct3D works fine on these GPUs: 11.0 at feature level 10.1, and native 11.0, respectively.
> Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.
Go read the extension I linked, it explains the behavior you're seeing. Also go read the tutorial I linked, it's using GLEW and shows you how to create a context.
You have a bug if your intention is to get a post-3.0 OpenGL context. Whether or not you care is up to you. You may be perfectly happy being in the compatibility bucket. I don't know. But you're not in the explicit 3.1 or later path.
> Right, but 3.1 (Intel Sandy Bridge) is.
Sandy Bridge is a 7-year-old CPU. Of course it's outdated...? And D3D 10.1 is from 2007; it's also hugely outdated. You're not getting anything more modern out of the hardware with D3D than you are with OpenGL here. I don't even know what argument you're trying to make at this point.
No. Both ATI and Nvidia drivers include recent OpenGL versions, so OpenGL support problems are limited to hardware that's actually not capable.
In the old link you offer as example, Intel HD3000 and HD4000 are bad, with bad drivers that lie about OpenGL versions (hence the need to downgrade the client), and fortunately obsolete. Current Intel integrated graphics have improved. And VMware is a virtual machine, not hardware; it should be expected to be terrible.
> Intel HD3000 and HD4000 are bad, with bad drivers that lie about OpenGL versions
Technically that's probably true. However, if you drop support for Intel GPUs, your GL4+ software will no longer run on the huge number of older Windows laptops people are still using. For many kinds of software this is a bad tradeoff. That's exactly why all modern browsers implement WebGL on top of Direct3D, and why the overwhelming majority of multi-platform games and 3D apps use D3D when running on Windows.
> VMware is a virtual machine, not hardware; it should be expected to be terrible.
It's only terrible for OpenGL. The virtual GPU driver uses the host GPU to render, and it runs D3D11-based software just fine. I don't use it for gaming, but it's nice to be able to use a VM to reproduce and fix bugs in my software caused by outdated OSes, Windows localizations, and other environmental factors.
That's not why they do that at all. They don't need anything recent from OpenGL or Direct3D, which is why they target DX9. And DX9 is specifically targeted because it also works on XP, which D3D10 doesn't.
Intel GPUs' D3D drivers have historically been better than their OpenGL ones (which isn't saying much, since their D3D drivers are also trash), but now we're talking about the driver quality of one player, which has nothing to do with the API itself or with OpenGL somehow being outdated on Windows.
But ANGLE also targets desktop OpenGL (and Vulkan), and as OpenGL 4.3 adoption increases I'd expect more browsers to use it for WebGL 2.0, since you don't need translation there at all. OpenGL 4.3 provides full compatibility with OpenGL ES 3.0.
You seem to be pretty confused about how OpenGL versions line up with the D3D ones, too. For reference, OpenGL 3.1 is roughly equivalent to D3D 10.1. When you're complaining about only getting GL 3.1, you're also complaining about being stuck with D3D 10.1.
I can kind of understand iOS, but it’s not like there’s a thriving graphical computing market worth locking in on the mac side. All major titles already use game engines. They’d just be locking out smaller developers who can’t invest in porting all their shaders: it’s not gonna be worth the effort.
You notice that Apple supported OpenGL while they were making most of their money from desktop and laptop sales; but once iOS became so profitable, they decided to go their own way and start pushing Metal.
Lock in, or at least getting people to do more iOS first development (helped by lower profits on app store sales on Android, Android fragmentation, etc), helps Apple out a lot. You get the first version of apps and games, or more polished versions of apps and games, on iOS this way.
Maybe... why would you assume they'll continue developing for Macs at all? Small studios might not have the resources, and the market is tiny for many apps, e.g. indie games, modeling software, and ML (to be fair, Apple has repeatedly emphasized they don't care about ML on the desktop by not offering Nvidia cards...).
And again, I don't see the benefit for Apple over supporting cross-platform APIs to encourage development. It seems like a net loss for everyone but some line in their budget for driver maintenance.
They do make some money on Macs, and Mac software, but not nearly as much as on iOS.
Providing macOS gives developers and designers a platform for iOS. That is really important for them. So Metal being available on macOS is important for that reason. But it's also important in that the Mac platform still matters, just not nearly as much as iOS.
OpenGL doesn't really have much of a future. Everyone is moving towards the next-generation frameworks. It just happens that there was a lot of uncertainty about whether OpenGL could adapt or whether there would be a successor, and during that time Apple decided to invest in developing Metal. It wasn't until a couple of years later that Vulkan was released.
In the meantime, Apple has built up quite a lot of tooling around Metal.
And it's not like it's that difficult to write cross platform apps that target the Mac. If you write against major engines, they will already have support for the different backends. If you are writing your own software, you can still target OpenGL, or you can target Vulkan and use MoltenVK to run it on macOS.
And for the next several years, people writing portable software are going to have to either just target OpenGL, for compatibility with older graphics cards, or maintain at least two backends, OpenGL and Vulkan. Given that lots of games target DirectX first, and consider any of the cross-platform frameworks a porting effort, Apple probably doesn't consider it a big loss to add one more platform that needs to be ported to.
It's tailored to their hardware, more modern, and written in Objective-C, which makes it much easier for Mac developers to integrate into their projects, since Objective-C interfaces nicely with Swift and most scripting languages.
I've been using a cross-platform GUI framework/engine to do app development on all the platforms: Linux, MacOS, Windows, iOS and Android - and it has been a joy to deploy one app on all of these systems.
One of the reasons this has been so feasible has been the fact that the engine (MOAI) uses GL ES to manage the framebuffer itself - giving the same look and feel on all platforms the host runs. This has been, honestly, revolutionary in terms of building a single app that runs everywhere.
This now becomes more complicated because the engine has to be modified for Apple platforms to use Metal, and represents another fork/splinter in the unity of the host itself.
I wonder if their decision to use a non-standard graphics API is due to them wanting to make this style of development a lot more difficult in the future - i.e. are Apple passively antagonizing the cross-platform framework builders in order to establish an outlier condition for their platforms? The cynic in me says yes, of course this is what they are doing ..
>All they'll accomplish is further killing off the already-small presence they have in the gaming space.
In the AAA game space, you mean. In the casual gaming space, iOS is perhaps the most popular platform -- and the new integration effort means all those games will soon be able to run on macOS as well.
And those games are horrible. Almost all of them are built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted. The biggest difference between those games and gambling is that you don't carry a slot machine in your pocket. For the most part the only exceptions to that are the games that were ported from desktop.
They work fine for me -- both as implementation and as gameplay.
>Almost all of them are built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted.
I think you're confusing casual gaming with Zynga or something. I was referring to smaller titles, not AAA megatitles. Could be anything from a platform game, to Angry Birds, Monument Valley, Threes, Letterpress, racing games, RPGs, and so on...
I'm not saying there aren't decent games on iOS. You can find gems like Monument Valley, Florence, or, as I mentioned, the games ported from other platforms like Limbo, Terraria, and so on. But take a look at the top charts on the iOS app store and compare that to the top games on Steam. With few exceptions, the games on iOS are riddled with ads, microtransactions, and are designed to be as addictive as possible.
The point is that the kind of games that thrive on the app store tend to be exploitative and low quality. Desktop gaming isn't immune from that, but it's a dramatically better platform.
Not if Microsoft has any say. I can't count the number of Windows Updates that re-installed the previously uninstalled Candy Crush Saga, Bubble Witch 3 Saga, and March of Empires (among other titles).
I played the fuck out of Angry Birds - on Android. How exactly does forcing developers to adopt a platform specific API help anyone? That was a rhetorical question BTW, don't even try to answer it. Apple are being arrogant as fuck with this.
>How exactly does forcing developers to adopt a platform specific API help anyone?
Well, platform specific APIs aren't lowest-common-denominator affairs, and get support for native platform capabilities faster (plus can be more optimized).
You're talking about a subset. A lot of casual games on iOS are very good: Cut the Rope, Angry Birds, Bad Piggies, Simple Rockets. Civilization for iPad was very good. I can't remember all the stuff I've played, but a lot of games are not the Candy Crush kind.
Also, a lot went wrong when Apple opened up to ads and in-game purchases.
PUBG Mobile is one of the best games I have played in a long time. And it doesn't cost me a penny. Nor do I have to pay to win. (Actually, I may have to upgrade my phone to play better.)
But not every game is gambling. Fortnite seems to be doing great, and that shouldn't be a pay-to-win game.
PUBG was ported from desktop. It originally started out as an ARMA 2 mod, was turned into PUBG, and only much later ported over to mobile. It's a perfect example of the kind of game that can come out of the desktop gaming community. You don't get games like PUBG, Minecraft, Starcraft, Terraria, Civ, Kerbal, and so on without desktop gaming.
The games that grow out of the app store ecosystem are games like Candy Crush, Clash of Clans, Clash Royale, etc. I'm not saying good games don't exist on the platform, I'm saying the platform is conducive to low quality games. Almost all of the great games on the app store did not grow out of the platform.
With that I certainly agree. But I do think the future is games built on top of game engines. Unreal seems to have a massive improvement changelog every 6 months, and if Unity didn't exist I doubt anyone could compete within a reasonable budget.
The choice of game engine will become what OpenGL vs. DirectX was in the old days. No one sane would write their own engine from scratch.
I don't expect there will ever be an AAA game presence on macOS at this point, given so few of their machines offer dedicated GPUs anymore.
And even in cases where they are available, for example Macbook Pros, the cost difference involved in stepping up from an integrated GPU to an entry-level dedicated card is greater than the cost of buying an Xbox or PlayStation.
Additionally, I don't think indie developers have loads of time on their hands to port their niche games over to a new technology. I can see Unity supporting Metal, but smaller platforms (jMonkeyEngine) will have a slower adoption rate, and in that time hopefully open-source middleware will come out to handle legacy APIs.
Well, at least in the Rust ecosystem there is https://github.com/gfx-rs/gfx which provides backends for Vulkan, DirectX 12, Metal, and OpenGL. I'm not sure it's super relevant outside of the Rust ecosystem right now, but it's worth spreading the word about such solutions.
> All they'll accomplish is further killing off the already-small presence they have in the gaming space.
That of course depends on the definition of "gaming space".
Apple was never a player in the classical desktop gaming space. They simply don't care about it, hence the way they treated OpenGL on macOS.
But: Apple is arguably the biggest player in the mobile gaming space. That's what they care about. So instead of spending a large amount of money to attract a low number of AAA desktop titles to their OS they just tap into the vast (game-)developer base that they already have in iOS and make it easy for them to deploy and sell their games on macOS too [1].
The move to deprecate OpenGL and OpenCL in favor of Metal makes total sense in that regard.
I believe, as of now, there are more AAA games running on Metal than on Vulkan. Pretty much every new macOS game release runs Metal now; meanwhile, a game running Vulkan on PC is considered a rarity.
"But by no means to they have the clout to coerce developers into their own bespoke graphics API that doesn't work anywhere else"
To be fair, the overwhelming majority of game shops develop on engines, and leave the engines to deal with the platforms. Unreal Engine, Unity, etc, support Metal, among others.
Not everything that runs on OpenGL is a video game. There are tons of applications out there that just won't have the budget to do a rewrite (and even fewer were probably set up with the right architecture if they were depending on OpenGL in the first place).
Indeed, I immediately thought of 3D applications like Blender, Maya, etc., which use OpenGL.
It's a very weird move to me. Even if the software in question is kept compatible with Apple's legacy OpenGL, those versions will be worse than their counterparts on other platforms, which can use shiny new OpenGL features.
It's like Apple is saying 'we don't care' to the 3D professional market. Also, doesn't Photoshop rely on OpenGL these days as well?
After AutoCAD for Mac was discontinued in 1994, people begged for 18 years to get it back, and now Apple says "eh, we didn't want that anyway."
I heard they have a WebAssembly/WebGL version now, betting that'll get wrapped up in a WebView and we can all pretend it's a native program still.
Speaking of WebGL, that's basically OpenGL ES 2.0, but I assume the implementation in WebKit is backed by Metal? What about other browsers like Firefox?
AutoCAD is a dead technology. Architects/structural engineers/MEP engineers are moving to BIM platforms (Revit, ArchiCAD, etc.)! Product/automotive/industrial design and engineering use PLM tools (Catia, SolidWorks, etc.). Besides, AutoCAD didn't/doesn't need much graphics power at all. AFAIK it never really used OpenGL.
I'm not sure about that. Or maybe Adobe just doesn't care. My 2017 MacBook Pro has horseshit graphical bugs in both Illustrator and Photoshop. I'm exclusively doing all my graphics work on my Windows 10 machine now (even though Windows and my Wacom tablet do not play nice together).
I'm in the same boat; my so-called Pro machine, the USB-C 2017 MBP, has had glitchy, completely unusable rendering in the latest version of Illustrator since October 2017. Adobe blames Apple, and presumably Apple blames Adobe, because neither of them is fixing it.
As if the deteriorating keys on this machine weren't bad enough. This wasn't a good WWDC for me. My PC is working great despite being an obscure setup with mismatched GPUs; I can't say I understand why the graphic designer's workhorse machine, the MBP, is unusable with Illustrator while that works just fine.
Microsoft changed the pen behavior in one of the Creators Updates, and now the pen buttons behave strangely (randomly don't work in certain applications), and for a while the pen was registered as a finger in legacy applications... making Windows 7 the only really viable way to use Wacom as a professional (speaking as one).
Not to mention the inability to reconfigure things like N-trig pens to have hover right-click/middle-click functionality. It's been INCREDIBLY frustrating without any communication from Microsoft.
I've got the Intuos Pro from a couple of models back. Windows Ink randomly causes pressure sensitivity to drop out (especially since the Creators Update). On Windows 8 I never had trouble with the wireless adapter; now I have to run wired. Button clicks don't always register and sometimes send the wrong input.
Overall it's rough, there are days where it seems better than others - but I'll randomly lose sensitivity and multiple reboots appears to be the only pseudo-consistent means of getting it back.
That being said - It's still way more usable than Photoshop/Illustrator on my Mac.
I miss 15 years ago when I had CS2 + Intuos Pro 2 and everything just worked.
True, but they didn't remove OpenGL, they simply deprecated it (i.e., don't expect any updates to it, new tooling will not be built around it, etc.). That shouldn't affect legacy apps.
Yes, and deprecation doesn’t mean a lot on the Mac. Apple often deprecates stuff and still leaves it in. They remove it only when there’s something to be gained.
(eg. linking with the system-provided OpenSSL has been deprecated for years, but AFAIK they still ship it.)
Apple can get away with that on iOS, but they're a lot more conservative with macOS.
To expand on your example, I maintain a legacy app that is stuck in 32-bit land because it relies on the QuickTime framework. QuickTime has been deprecated for seven years, and the transition to 64-bit has been in progress for over a decade, and yet my legacy app runs just fine even under the Mojave beta. There are multiple time bombs lurking in that app, and one of these days I'm going to have to rewrite it from the ground up, but I've been astonished at how long it has lasted.
Apple knows it would be bad karma to make a large number of legacy apps and games suddenly break on the Mac. They're not idiots; they have a perfectly good idea of the scale of mutiny that would ensue. So I'll eat my hat if OpenGL doesn't continue to work for at least the better part of the next decade.
They said in the Platform State of the Union that Mojave will be the last macOS that runs 32bit apps, so QuickTime.framework and your app are running out of time!
True — 10.12 is my recollection as well — but I’ve been bitten so many times by compiling under a new SDK, especially with an older build target, that I do that as a matter of course anyway.
I find it hard to fathom that people think a huge software company like Apple doesn’t have awareness of the impact of its changes or people responsible for compatibility.
If there isn't one already, I'm sure someone will implement OpenGL on top of Metal when it's needed badly enough. At least they're going closer to the hardware, not further away.
Why use Libre Office on a Mac when Pages, Numbers and Keynote are free (as in beer)? I’m going to go out on a limb and make a baseless argument that the Libre Office install base on the Mac is very low. On iOS it’s non-existent.
There is no way that Pages, Numbers and Keynote can open as wide a range of file formats that LibreOffice can. And there are way more features in LibreOffice.
Having used Pages and Word, please don't tell people to use Pages for everything.
It doesn't have the features you need when you're creating more complex documents. The last time I used it, it didn't even let you have different sections, which allow you to switch the rotation of individual pages.
Every game developer I know turns off the Metal rendering pipeline and uses the much more stable and refined OpenGL one, unless every tiny bit of performance needs to be squeezed out.
I've witnessed plenty of last-minute builds saved by a Unity game dev on a Mac just flipping their renderer settings.
I don't think it's much of an incentive. According to Valve's hardware surveys, roughly 3% of Steam's market is macOS. The numbers are similar across the other distribution platforms a Unity game dev will target. It'll be hard to nudge the Mac away from low priority with that share.
The other figure to look at is the amount of money spent. If it's similar to the hardware percentage, then you're right; if, on the other hand, macOS users spend more on games, then a rethink is in order.
This is a problem that is entirely of Apple's own doing.
Microsoft could not care less about OpenGL on Windows. However, it works just fine.
You know why? As soon as you install your video card drivers, your OpenGL implementation is no longer from Microsoft. It comes from AMD, NVidia or Intel, with all needed optimizations for their hardware.
Apple insisted on not allowing this and doing the OpenGL implementation themselves (which was always crappy and outdated).
Had they allowed the GPU vendors to provide their own implementations, this would have been a non-issue.
OpenGL is very much a second class citizen on Windows. Mass-market OpenGL apps like browsers currently use ANGLE to emulate OpenGL on top of D3D. Native OpenGL is used in professional apps that can make demands on GPU and driver setups.
(Many toolkits, like Qt and Cocos2d, also use ANGLE on Windows for OpenGL functionality)
What makes you think that Apple refuses to allow GPU vendors to provide an OpenGL implementation?
The real question is why would a GPU vendor go through the expense of creating and supporting such an implementation when Apple doesn't even make a computer with slots that you can install their video cards into?
If producing an OpenGL implementation doesn't provide a competitive advantage for selling their products, why would they bother?
From 2012. Which means it is still accurate, given how out of date drivers are.
There are more references, you can look it up.
> The real question is why would a GPU vendor go through the expense of creating and supporting such an implementation when Apple doesn't even make a computer with slots that you can install their video cards into?
They still have GPUs, which can be Intel, AMD or NVidia depending on year and model. Just because they are soldered on, doesn't mean they don't need drivers.
EDIT: Some more research seems to indicate that there are drivers developed by NVidia for the NVidia Quadro GPU line.
> Also, when did you last download a driver update from NVidia for your Mac?
Last week.
Nvidia releases drivers for cards that the drivers shipping with macOS don't support. I would also guess that the Nvidia drivers that ship with macOS are written by Nvidia under some agreement with Apple; the same is likely true of AMD and Intel.
Yes, but you need to switch to an older version of Xcode/developer tools if you want to program CUDA on a Mac. Specifically, I have to switch back to last December's release when I want to do any CUDA development on my 2015 MacBook (I don't think there are any later Macs that even support Nvidia).
Yes, once upon a time both Microsoft and Apple provided an implementation of OpenGL with their OS.
When Microsoft abandoned OpenGL for DirectX, GPU vendors produced their own OpenGL implementations because doing so provided a competitive advantage that allowed them to sell more product.
The question is, why would those GPU vendors do the same thing now that Apple is following the same path?
Apple doesn't even produce a computer with slots you can install their products into.
>EDIT: Some more research seems to indicate that there are drivers developed by NVidia for the NVidia Quadro GPU line.
Keep doing research, because NVidia provides downloadable Pascal drivers even though the last time Apple produced a computer with a PCI slot was the Cheese Grater Mac Pro, which came out over a decade ago.
It just goes to show that NVidia thinks supporting CUDA everywhere is very much in their competitive interest, while creating and supporting an OpenGL implementation simply is not.
Next Office for Windows 10 is only available via the store.
Microsoft has taken the other approach: if apps don't come to the store, the store comes to the apps.
So thanks to the outcome of Project Centennial, they are now merging the UWP and Win32 worlds into Windows containers and making the store Win32 aware as well.
As a long-time professional game engine programmer, it is hard for me to see consternation over things like this, and avoid judging it as mainly ignorance. The amount of code in an engine that needs to touch the graphics API is tiny. A handful of methods for device init, filling buffers, uploading shaders, setting state, and saying "draw!" All of the graphics API code can easily fit in one medium-sized source file. As a proportion of the whole engine, it's very small. As a proportion of the whole game or app, it's negligible. It's also boilerplate. There are only so many ways to copy a stream of structured data into a buffer.
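To make that concrete, here's a hypothetical sketch of what that per-API surface can boil down to; the names are illustrative, not from any real engine:

    #include <stddef.h>
    #include <stdint.h>

    struct RenderState;  /* blend/depth/cull settings, defined elsewhere */

    /* One function table per graphics API; the rest of the engine only
       ever talks to this interface. */
    typedef struct GfxBackend {
        void     (*init_device)(void *native_window);
        uint32_t (*create_buffer)(const void *data, size_t bytes);
        uint32_t (*create_shader)(const char *source, int stage);
        void     (*set_state)(const struct RenderState *state);
        void     (*draw)(uint32_t vertex_buf, uint32_t shader,
                         uint32_t first, uint32_t count);
        void     (*present)(void);
    } GfxBackend;

    /* Supporting a new API means filling in one more of these: */
    extern const GfxBackend gfx_opengl, gfx_d3d, gfx_metal;

The real thing has more entry points than this, but the shape is the same: a thin table of boilerplate, selected once at startup.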
Legacy software, blah, blah, blah. No legacy software runs forever, and least of all on Apple platforms. Who cares.
Professional game engines are not the only application for OpenGL, though. Many people would like to build cross-platform software without major abstraction layers such as game engines. This could be research software, prototypes, small tools in general... I think there is a long list.
For these people life might get harder.
Gaming is all about legacy software, especially single player games. As a gamer I'm very happy that almost all older games still work on Windows today (either directly or using an emulator like DOSBox).
(Of course that does not mean that the OS needs built in OpenGL support. If you can convince an old game to use some kind of OpenGL-Metal compatibility wrapper without needing access to the game's source code or support from the original developer, that's fine with me as well.)
I'm one of the people happy about this move, because my competition uses OpenGL on Mac and I just use software rendering. It was easy to see this deprecation coming...
OpenGL isn't pretty, but it's at least cross-platform. And my impression was that OpenGL support is mostly handled by the GPU manufacturers, so I'm not sure how much Apple gains here by deprecating OpenGL.
Requiring developers to use an API locked to a particular platform feels pretty hostile to me. Doesn't matter if that API isn't perfect, or even far from it.
Although I agree it's a terrible decision for Apple only to have Apple-specific graphics APIs, please note that:
* Being deprecated does not mean that things will suddenly stop working. It will take a few more releases of macOS before this can be removed.
* Next to MoltenVK there is MoltenGL, which is an implementation of OpenGL ES 2.0 that runs on (edit) Metal [1]. That indicates it's at least feasible to wrap OpenGL applications in the future if necessary.
Furthermore, Apple will drop support for all Macs that don't support Vulkan in this release of macOS [2]. Ouch, what a waste.
Nah. The GPU on Intel chips is free, and the eGPU thing, to me, is official notification that Apple thinks GPUs should be on the outside. I bet this generation of MacBook Pros is the last to have discrete graphics...
I sure hate it when Microsoft does it, but at least they have market share. Who wants to support Metal just to target the Mac? And last I checked I have the choice of OpenGL and Vulkan on Windows because these days MS doesn't control the hardware stack from top to bottom on their software platform.
>I sure hate it when Microsoft does it, but at least they have market share. Who wants to support Metal just to target the Mac?
Plenty of big 3D/CAD/etc players? In lots of creative areas, the Mac dominates still (despite stories about people moving to Windows nobody's going anywhere, where nobody = quite few creatives overall).
Besides, with Metal they'll target iOS as well, and that's a huge platform, and where most of the profits are for mobile.
CAD on Mac is pretty much non-existent, as is any professional 3D market - the market share isn't there, the hardware support is terrible, so few major players bother with supporting Macs. All this stuff is either Windows (CAD) or Linux (3D simulation, visualization) these days.
And with this deprecation Mac is pretty much dead as a platform for professional 3D.
Creative Suite has run better on PC, and at a better price-performance ratio, for almost a decade.
Graphic Designers still like Macs for the most part I guess -- and I still see them in video production a lot, but that's starting to change pretty quickly.
That depends on who you ask. OpenGL is in the deprecated API section on MSDN[1]. Because of the ICD model, Microsoft can't prevent GPU vendors from adding OpenGL features, but they don't bother integrating it with modern Windows APIs. You can't create an OpenGL context on a DirectComposition surface or in a UWP app. It integrates poorly with the compositor. You can't get composition feedback, and most drivers will flicker or show artifacts when windows are resized. OpenGL apps don't get windowed-fullscreen optimizations and you can't control when they enter exclusive fullscreen mode. I don't think you can use windowed-stereoscopic or windowed-HDR either. All these issues push developers away from OpenGL and towards DirectX, which is what Microsoft wants.
It’s not deprecated because it’s not even there to begin with — Windows 10 doesn’t ship OpenGL by default; GPU vendors provide their own implementations.
Which AFAIK they’re free to do on MacOS as well, they just don’t seem to bother since Apple was doing that work for them
At least on Windows, the OpenGL implementation is part of the graphics driver. Why? Because by default Windows has only rudimentary OpenGL support (at least up to Vista, I think; I am not sure about Windows 7 and 8.1) or none at all (Windows 10) - the real implementation is what the GPU vendor provides as part of its graphics driver.
> Which AFAIK they're free to do on MacOS as well, they just don't seem to bother since Apple was doing that work for them
I'm not sure. NVIDIA provides updates for CUDA and an extremely limited amount of updates for their graphics stack (AFAIK none at all for integrated graphics, for example).
OpenGL is pretty. Much prettier than these Metal and Vulkan abominations.
The difference is that OpenGL is designed to be easy for humans. glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd(); you can't beat that. The issue is that it is hard for the driver to optimize.
That's where Metal and Vulkan come into play. These are low-level APIs, sacrificing user-friendliness for greater control over the hardware. They are designed for 3D engines, not for application developers.
Nope, glVertex3f was deprecated years ago by OpenGL itself. That is not the way the API works any more. [1]
Look into what it takes to write the minimum viable OpenGL program, written using non-deprecated routines, that puts a textured triangle on the screen. It sucks. On top of that, OpenGL is slow and gives you no way to create programs with smooth performance -- for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.
1990s-style OpenGL was good for the time. In 2018, OpenGL is a pile of poop.
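To illustrate, here's roughly what the non-deprecated path for a single untextured triangle looks like. Shader compilation, context creation, and error checks are all omitted, and 'program' is assumed to be a compiled-and-linked shader program:

    // One-time setup: vertex data lives in a buffer object, described
    // by a vertex array object. No more glVertex3f per vertex.
    GLuint vao, vbo;
    const float verts[] = {  0.0f,  0.5f, 0.0f,
                            -0.5f, -0.5f, 0.0f,
                             0.5f, -0.5f, 0.0f };

    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);

    // Every frame:
    glUseProgram(program);
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 3);

And that's before you write the two shaders, the texture upload, and the window/context plumbing that the old fixed-function path gave you for free.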
Maybe the games were not very complex? Professional game programmers building games with lots of shaders are very familiar with what I am talking about. See for example this thread:
> What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.
State-based recompilation is a known issue in many GL drivers, particularly on mobile. E.g. changing blending settings may cause shaders to get recompiled. This can take up to a second.
Some engines work around this by doing a dummy draw to an offscreen surface with all pipeline configurations that they use at init time. This (usually) guarantees that all the shaders are pre-compiled.
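A sketch of that warm-up pass (warmup_fbo is an assumed 1x1 offscreen target, a trivial shader and vertex array are assumed to be bound, and the real list of state combinations is game-specific):

    // For each blend state the game will ever use, issue one throwaway
    // draw so the driver compiles its internal variants now rather than
    // mid-game, when the state first appears.
    static const GLenum blend_modes[][2] = {
        { GL_ONE,       GL_ZERO                 },  /* opaque      */
        { GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA  },  /* alpha blend */
        { GL_ONE,       GL_ONE                  },  /* additive    */
    };

    glBindFramebuffer(GL_FRAMEBUFFER, warmup_fbo);
    glEnable(GL_BLEND);
    for (size_t i = 0; i < sizeof blend_modes / sizeof blend_modes[0]; ++i) {
        glBlendFunc(blend_modes[i][0], blend_modes[i][1]);
        glDrawArrays(GL_TRIANGLES, 0, 3);  /* dummy draw */
    }
    glBindFramebuffer(GL_FRAMEBUFFER, 0);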
I think the recompilations being talked about here are shaders generated by the OpenGL implementation behind your back. That is, your program never sees them as shader or program objects because they implement some permutation of blend mode, depth test, culling type, etc..
While Vulkan is a bit verbose, it's not an order-of-magnitude difference if you follow modern OpenGL best practices. If you rely on default state, use the default framebuffer, and rely on implicit synchronization, you can squeeze it down to a few hundred lines, but that's not a good foundation to build practical apps on.
To give a ballpark figure, my Vulkan "base code" is less than 2x what my OpenGL boilerplate is for the same functionality. The big difference: the Vulkan code is easy to understand, but the GL code is not.
Comparing "Hello World" doesn't make much sense, OpenGL gets really darn complicated once you get past the basics.
In my opinion a similar difference exists between CUDA and OpenCL. OpenCL takes more code to get something simple going. But at least it doesn't break if you upgrade your gcc or use a different GPU vendor.
To each their own, but over the last 6 months I've written a graphics engine in OpenGL + SDL. Once you truly understand modern OpenGL, you realise how beautiful it is.
You will think it's less beautiful when you ship that game on several platforms and find that it has different bugs on each platform, on each hardware version, and on each driver version. And most of these bugs you can't fix or work around, you just have to bug the vendor and hope they ship a fix in a few months, which they usually won't because your game is too small for them to care about.
This happens in other APIs too (we definitely had it happen with DX11), it's just that OpenGL is a lot more complicated than anything else due to its history, so it has proportionally more bugs.
> glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd();
That's fine for a "hello triangle" program, but quickly becomes ridiculous for anything approaching a serious engine. There's a reason that glDrawArrays() has been around since 1995 (and part of the core specification since 1997).
While this is startling, it seems pretty consistent with Apple's modus operandi in a lot of areas -- leap forward to where they think the industry is going and hope they're right. OpenGL is effectively being deprecated by its own developers in favor of Vulkan, which has an open source implementation for macOS and iOS, developed in part by Valve, built on top of Metal:
If game developers -- and game engine developers -- targeting OpenGL now are in the process of moving to target Vulkan, and if MoltenVK ends up offering better performance on macOS than Apple's legendarily anemic OpenGL stack, isn't this likely to be better in the long run despite the short-term pain?
OpenGL is "learnable" by someone in the process of learning. Vulkan and Metal are much less approachable. This will put a huge damper on low-level graphics programming as a hobby.
I would disagree with that, especially with regards to Metal. It's a very approachable and well-designed API. It might not have the volume of resources that OpenGL does, but the docs themselves are good, and I have seen plenty of intro-level tutorials that are decent enough. Debuggability is also much better than OpenGL, which I think is important for newcomers. Debugging OpenGL issues is very, very painful, especially with macOS's lack of debug extensions. Metal is described as "low-level", but it's not quite at the level of Vulkan -- things are simpler and more "streamlined".
There's also the problem that a large chunk of OpenGL learning materials out there are hopelessly outdated, and IMO actively detrimental to learning modern graphics techniques. Judging from the types of questions I see around various forums, it seems to be VERY hard for newcomers to distinguish between "bad" and "good" OpenGL tutorials. In general, there's too much cruft for learners to focus in on the stuff that is actually part of "good OpenGL".
> OpenGL is "learnable" by someone in the process of learning.
If this is the case, it's only because there is much more material and tutorials written. Not because OpenGL is simpler or better.
I know because I watch noobs stumble with OpenGL all the time over at ##opengl in freenode. It usually takes them a week or two to get a triangle on screen, and they're super confused about semi-opaque concepts such as "vertex array objects" (they're well documented in the OpenGL wiki, but reading documentation seems to be out of fashion).
It would certainly help them if they had good knowledge of 3D graphics in general before stumbling into OpenGL. But if they had, they'd be able to do it using Vulkan or Metal with no great difficulty. OpenGL isn't at all better here.
I believe (hope) that OpenGL continues to be that developer-friendly API. Building OpenGL on top of Vulkan shouldn't be too hard, and it means we don't have to pointlessly deprecate and recreate the huge number of OpenGL resources out there.
OpenGL isn't friendly to developers on either side. Tutorials are generally not trustworthy, there's no cross platform debugging tools, and errors just get you a "something happened vOv" code.
If you care about performance it will just do something slow at uncontrollable points in the background, like copy CPU-GPU-CPU memory, synchronize against the GPU, do format conversions, etc.
If you're the one implementing GL, it's gotten gigantic again since they simplified it. GL 4.3/4.6 core has compute shaders, which means you have to implement OpenCL twice, but differently this time.
Right, but it still involves several gigabytes of largely unused functionality to get a “hello world” and it pigeonholes you into a specific ecosystem. Unity is an entirely different offering from a graphics api.
I'm not sure why you say that, their official stance has always been that OpenGL will continue to evolve as GPUs evolve and need to expose new functionality.
I would be a lot more okay with this if Apple supported Vulkan, the more portable comparable API, rather than just the macOS/iOS-only Metal.
I also wonder what this means for WebGL and its future. Right now, WebGL works in browsers on macOS, Linux, Windows, iOS, and Android, which is incredible. There is no equivalent.
Sure, Apple has started working on WebGPU, but that's not yet ready, nor is it guaranteed to gain Linux, Windows, or Android support.
Apple has so little to gain over Vulkan by developing its own API but so much to lose by not adopting Vulkan (gaming companies may actually prefer developing games on the cross-platform Vulkan to target macOS/iOS devices, too, at the same time, instead of using DirectX).
Obsidian didn't manage to materialize a Khronos working group, so it's not moving forward. Apple instead went with the W3C to form the GPUWeb group, based on their work on WebGPU. The Obsidian folks at Mozilla have decided to follow this path instead, see here:
However, there's no real writeup of what the API will end up looking like right now, so it's too early to speculate. WebGPU's original prototype used Metal's shading language, for instance (since the prototype came from WebKit), but any real standard will probably change things up.
I believe the webgpu-servo folks have, in the mean time, begun working on lower level components/libraries to target Vulkan/DX12/Metal, for use by systems like WebGPU. Sort of like ANGLE by the Chrome team, but for newer GFX APIs.
TL;DR absolutely nothing is fleshed out at all yet and it seems plenty will probably change
Maybe Nadella will push for Vulkan support on Xbox, or maybe Xbox will die off, who knows. Unless one of those happens, DirectX is not going away. As soon as consoles come into the equation, you are stuck writing a PAL (or using an existing engine that already has one) because they use proprietary APIs and that's unlikely to change.
It's not ideal, but that's the reality. Apple is following the idiotic status quo, but it's not fair to single them out for it (that being said, at least Microsoft supports Vulkan and OGL on one of their platforms - but the 3rd-party driver developers are mostly responsible for the great support).
My bet is they spin the whole gaming division off or sell it to someone like Amazon. It’s not really a great fit anymore and IMO they need to be bolted to a company with a greater interest in the creative side of the business (i.e. running a movie/game studio) than Microsoft.
They don't have anything to lose by not adopting Vulkan, because all the game engines that matter to professional game studios have already added Metal support.
Same applies to Photoshop and other relevant 2D and video editing professional tooling.
Professional game studios have always favored hardware-specific APIs that allow them to extract all the juice, down to the last drop.
For example, OpenGL ES 1.0 + Cg on the PS3 was an adoption failure, with everyone adopting the PS3-specific APIs instead.
Seconded. This seems like a major step back for cross-platform GPGPU. I always just assumed a natural transition from GL/CL support to Vulkan would occur at some point, but this is just a shame.
Maybe there is no equivalent, but WebGL is not a mature technology. WebGL stuff still breaks or has performance bugs whenever some part of the OS/browser/GPU driver/GPU hardware sandwich changes. You can run the conformance tests yourself.
I don't see how a technology can get to a mature status if a major hardware company decides to not support it. The real question is WHY don't they support it? Is there a webMetal?
All we know is that Apple was bankrolling a portion of OpenGL development on OS X and now they feel otherwise. The OpenGL ARB is a committee, and Apple has only one vote. Maybe they were not satisfied with the direction the spec was going. It's certainly not an unfounded belief.
>I don't see how a technology can get to a mature status if a major hardware company decides to not support it.
Alternate reading - They gave it their time and money, and it didn't work out.
As someone who read this with an editor full of OpenCL kernels open, I think Apple must really have missed the point of these sorts of frameworks - heterogeneous computing.
If I wanted the best possible speed, the latest features, etc., I would write multiple back ends in things like CUDA.
I chose OpenCL because I can develop code on my MacBook Pro, and run that on a computer with a discrete GPU on a different operating system, and have a fair amount of confidence that it would work.
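For what it's worth, that write-once claim is real at the host-API level. The sketch below (untested on any particular stack, but using only standard OpenCL 1.x calls) lists every platform and device a machine exposes, and the code is identical whether it runs against the integrated GPU in a MacBook Pro or a discrete card under Linux or Windows:

    // Enumerate every OpenCL platform and device the machine exposes.
    // The point: this host code does not change across OS or GPU vendor.
    // Build: clang++ list_devices.cpp -framework OpenCL  (macOS)
    //        g++ list_devices.cpp -lOpenCL               (elsewhere)
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif
    #include <cstdio>

    int main() {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        clGetPlatformIDs(8, platforms, &nplat);
        for (cl_uint p = 0; p < nplat; ++p) {
            cl_device_id devs[8];
            cl_uint ndev = 0;
            clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev);
            for (cl_uint d = 0; d < ndev; ++d) {
                char name[256] = {0};
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, nullptr);
                std::printf("platform %u, device %u: %s\n", p, d, name);
            }
        }
    }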
> I chose OpenCL because I can develop code on my MacBook Pro, and run that on a computer with a discrete GPU on a different operating system, and have a fair amount of confidence that it would work.
That was the promise, but it never became reality. When writing kernels for real-world applications, OpenCL breaks down in numerous ways. The result is usually neither stable, nor portable, nor fast, and it's a pain to work with. There was never OpenCL support for third-party developers on iOS.
You say you are writing OpenCL kernels on an MBP and they are portable; maybe you got lucky? Lots of the comments I see on the OpenCL deprecation seem to come from people who like the idea of having OpenCL (and its promises, which are awesome), but never had the awful experience of actually working with it.
I remember the open letter from the Blender developers on the sad state of OpenCL support on the Mac (http://preta3d.com/os-x-users-unite/) from 2015. Some GPU vendors (AMD, Intel and Qualcomm) have continued to put resources into better OpenCL support over the last couple of years, but maybe too little, too late? It seems Apple had already given up on OpenCL by the time of this letter (and moved their resources completely to Metal), as nothing new has happened for OpenCL on the platform since then.
I'd prefer if we had a working OpenCL on many platforms. As we don't, especially not on Apple platforms, the step of deprecating it is regrettable, but at least honest.
I know that Apple is a commercial organisation and not a charity, but projects like Blender bring a lot to the platform.
It would be great to find out later that Apple had reached out to the Blender dev team with a strategy on how to move to either Metal or a Vulkan/Metal adapter.
Personally I was thinking about getting an eGPU just for Blender use. It would be a shame to have to leave macOS just to run Blender.
Agreed, I am in a similar situation. This is very sad. Also, while OpenCL is a bit verbose to interact with directly, Vulkan compute shaders are much, much worse. I realise that at some point I will have to start using them, but I'm not looking forward to it.
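To make the verbosity gap concrete, here is a complete OpenCL kernel for a trivial SAXPY, shown as the source string you'd pass to clCreateProgramWithSource. The Vulkan-compute equivalent needs a GLSL or SPIR-V shader plus descriptor set layouts, pipeline creation and command-buffer recording on the host side - easily an order of magnitude more ceremony:

    // A complete OpenCL kernel: y = a*x + y over a 1-D index space.
    // Host side: clCreateProgramWithSource + clBuildProgram, then
    // clSetKernelArg and clEnqueueNDRangeKernel - and that's it.
    const char* kSaxpySrc = R"CLC(
    __kernel void saxpy(const float a,
                        __global const float* x,
                        __global float* y)
    {
        size_t i = get_global_id(0);
        y[i] = a * x[i] + y[i];
    }
    )CLC";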
> because I can develop code on my MacBook Pro, and run that on a computer with a discrete GPU on a different operating system, and have a fair amount of confidence that it would work.
I'm not an OpenCL programmer by trade, but I have dabbled in it (I wrote an AES decrypter in OpenCL), and I have never found this proposition to be true.
All things considered, I think there are some companies that are worse to the FOSS community than Apple, but I can't think of one that matches Apple's degree of bald-faced cynicism in exploiting FOSS and open standards only to the degree that it benefits Apple, and then throwing them under the bus the instant they're no longer useful.
Apple loved HTML5 when they had to kill Flash and get web developers to support mobile, but then as soon as it became a threat to the App Store, Safari's compliance came to a screeching halt and now Safari is in last place, even behind Microsoft's browsers, in HTML5 support.
OpenGL was useful when it was a way to potentially lure people away from Windows, but as soon as Apple had the clout to not care about it and force developers onto its proprietary API, that's what happened.
I almost prefer old-Microsoft's honesty about wanting to kill FOSS, rather than this blatant acknowledgement of FOSS as a tool to be ripped off to improve one's ecosystem dominance and then promptly thrown aside. Makes you wonder what's going to happen if and when Apple no longer needs Clang/LLVM, or, hell, Unix.
> I almost prefer old-Microsoft's honesty about wanting to kill FOSS, rather than this blatant acknowledgement of FOSS as a tool to be ripped off to improve one's ecosystem dominance and then promptly thrown aside.
Well, to be honest, Apple has always been quite consistent here. They created their own ecosystem and made interoperability with other systems as difficult as possible, at both the software and hardware level. So this announcement is quite in line with that.
Mac OS X is based on the closed-source NeXTSTEP; in many respects, the Darwin kernel is the continuation of NeXTSTEP's kernel. A fair amount of the user-space code of OS X is from FreeBSD, which to the best of my knowledge continues along merrily open source as ever. Apple actually hired one of FreeBSD's lead developers to manage their BSD technology group.
And, I mean, c'mon. The move from NeXTSTEP to Darwin moved the kernel to open source. WebKit? Open source. CUPS? Still open source. Clang and LLVM and Swift? Open source, open source, open source. Apple maintains an open source page. They put stuff on GitHub.
I get some of the hostility toward Apple here; they're not always good at playing with others, there have been complaints about the way they do (or don't) contribute to projects they benefit from. But the narrative that Apple hates everything open under all circumstance and is all about proprietary everything all the time just isn't supported by reality.
Apple has a tiny market share when it comes to 3D applications - OpenGL is mostly the "pro" 3D world, be it CAD, 3D visualization, 3D simulation and the like. None of that runs on Macs; everything is Windows/Linux these days.
So there will be little "forcing" into their proprietary APIs - the few 3D developers that actually tried to support the Mac will simply drop the platform, because nobody is going to rewrite a major piece of software to use Mac-only Metal. Too much effort for little to no benefit.
Basically Apple just killed off any 3D support they may have hoped for on the Mac. Including any hopes for anything VR-related (so many Oculus/Vive fans were hoping to see Mac support - it is now even less likely than Linux support...).
There is a third-party port of Vulkan and I am sure there will be third-party OpenGL drivers (e.g. Mesa), but nobody is going to build a CAD system on top of that, IMO. Without official vendor support it is just too risky.
Okay, that's one explanation. The alternative explanation is that Apple supports mature and robust technologies because they want what's in their users' best interest. Neither OpenGL nor OpenCL is robust in its current form. Certainly, that is not to deny that Apple might have a vested interest, but it's naive to think that everything is just black or white.
RE: HTML5 - Apple simply made a mistake. Jobs famously said that they don't want to support native apps because bad apps could bring down cellphone towers.
OpenGL and OpenCL aren't "robust" on macOS because Apple stopped updating their drivers; OpenGL on macOS is stuck at version 4.1, released back in 2010, while the current version of the standard is 4.6, released in July 2017.
Robustness is orthogonal to versioning. They were funding OpenGL dev on OS X and decided to stop. You can insert your own reasoning, but I believe the more reasonable assumption here is that they were not happy with the direction the spec was going. Apple is strongly biased towards vertical integration. Owning the spec + OS + driver + hardware is the best way of achieving a high level of robustness (whether they actually do achieve that remains to be seen).
Aren't most of the games on MacOS running on OpenGL?
This is going to kill all the older titles that are not maintained anymore. A terrible move, just to push Metal down people's throats. As if MacOS gaming wasn't dead enough.
I would be surprised if it disappears entirely. More likely some third-party hero (or perhaps Apple themselves) will spin off their GL implementation as a separate package - see XQuartz.
At least you'll have the option with the games being part of steam. How many games are unplayable and never will be playable again on any phone that upgraded to iOS 11?
Nope, most of them use some sort of game engine; Unreal and Unity dominate.
MacOS gaming is not only not dead, it's flourishing rapidly. Take a trip to the Steam shop: you will find a massive number of games made for MacOS, and the numbers have kept growing even after Apple lost interest in OpenGL. MacOS has also reached 10% of the desktop market, making it much harder to ignore than in the past, when it barely reached 3%. And with iOS dominating mobile revenue, Apple has been the undisputed king of gaming for over a decade now.
The backend plays a minor role, as I repeatedly say and am repeatedly downvoted for by ignorant people. On MacOS, Unreal supports not one but three technologies - OpenGL, Vulkan and Metal - and that is just for graphics. There are also CUDA, OpenCL, PhysX and much more that go far beyond the scope of 3D graphics.
A well-designed graphics engine never ties itself to a platform, however popular that platform may be - whether that platform is an OS, a graphics API or any kind of SDK.
And Unreal is neither just a game engine nor a graphics engine; it's an entire ecosystem of tools and APIs. Unreal even extends C++ to provide GC and some rudimentary reflection.
So there are a ton of things going on, from low level to very high level.
Overall, MacOS developers have been very quick to embrace Metal for new projects, and so was Unreal. Which is no big surprise, because Metal gives access to both MacOS and iOS, which is a variant/fork of MacOS.
If you look at the most recent Steam hardware survey though 96.3% of players are on Windows and 3.07% are on macOS. I don't know if that's "flourishing rapidly". It's certainly going up slowly.
https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...
Well, I will just mention that I have been an iMac user since 2007, and I am replying to your message from Win 10 via Boot Camp. Mac gamers choose the Boot Camp route because it's an easy solution and gives access to more games and software. I am willing to bet that Mac gamers are at least 8% if not more. Boot Camp became popular way before Steam did, and way before we saw big game studios switch from pure Windows to Windows/MacOS. Windows also has a reputation for being far more stable under Boot Camp than on a regular PC, and my personal experience confirms it. Because I am a developer I have decided to stick with Win 10 and Boot Camp, and it has been a smooth ride so far, but from time to time I go back to MacOS. Windows 10 is also the first reliable OS that came from Microsoft.
Of course Boot Camp is not the only solution - there are also Wine-like solutions and VMs - but Boot Camp is by far the most popular for gaming.
Another reason to stick with Boot Camp is that games may offer MacOS support but are not quick to fix bugs and resolve issues, usually because they develop on Windows and port to MacOS via cross-platform APIs, using Windows rather than MacOS devs, which can cause all sorts of issues that take time to resolve.
Mac has twice the number of users on Steam that Linux has, and game studios mostly ignore Linux. Apple making themselves precious is not going to win them any hearts in the game industry.
To be fair, PC gaming is a small fraction of the overall gaming market. All the consoles have their own proprietary graphics APIs; it's OK if Apple's API is proprietary too. Consoles support OpenGL, but it's not optimal. If they all strictly adhered to standard graphics APIs, how would they differentiate themselves? Why make custom silicon like Apple's 'Bionic' chip? It's going to need an API to go with it.
Lol, Metal is Mac-only and relies heavily on CoreWhatever dependencies, so it can never be cross-platform, right? The only reason any game or CAD developer even supports the Mac at all is that OpenGL is a cross-platform API that works great on Windows, Mac, and Linux, so they only have to write one type of shader program, etc. No game developer in the world will write both an OpenGL/DirectX/Vulkan and a Metal renderer just to stay current with Apple's "deprecations".
If you're Pixelmator or Apple's own Final Cut team, sure, use Metal. For anyone else that wants to make a living, supporting multiple platforms is a given, so you won't pay the slightest attention to this deprecation notice.
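To illustrate the "one type of shader program" point: with OpenGL everywhere, the single GLSL string below covers Windows, Linux and Mac. Without it, a second Metal Shading Language copy of every shader has to be written and maintained (or generated by a cross-compiler). A deliberately trivial pair, not checked against any particular toolchain:

    // One flat-color fragment shader, twice. With OpenGL, the first string
    // covers Windows, Linux and Mac; without it, the MSL version below has
    // to be written and maintained as well.
    const char* kGlslFrag = R"GLSL(
    #version 330 core
    out vec4 color;
    void main() { color = vec4(1.0, 0.5, 0.2, 1.0); }
    )GLSL";

    const char* kMetalFrag = R"MSL(
    #include <metal_stdlib>
    using namespace metal;
    fragment float4 solid_color() {
        return float4(1.0, 0.5, 0.2, 1.0);
    }
    )MSL";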
> No game developer in the world will write both an OpenGL/DirectX/Vulkan and a Metal renderer just to stay current with Apple's "deprecations".
Actually, most game developers do exactly that. Pretty much every game (even ones with a custom, "non-AAA" engine) will have some kind of abstraction layer for dealing with graphics APIs. Writing an additional backend for Metal is not a monumental undertaking -- it's a tiny fraction of the overall code you will end up writing. Also, game consoles for the most part use their own graphics APIs, which are not portable to desktop PCs.
That's kind of true for games, but he's totally right about CAD and other software. Big 3D applications - Nuke, Houdini, Maya - all use OpenGL. Before OS X, Mac versions either didn't exist or were kept up about as well as IE or Word was. Over the past 5-10 years they've been reliably released for OS X at the same time as the Windows and Linux versions (they all originated on IRIX). They all have a lot of other development going on, and I don't see any of these companies making and maintaining a Metal version.
I'm sure the Mac versions of these tools aren't used by any studio with more than 5 people, but independent contractors, small studios, and individuals working at home really benefit (or else the vendors wouldn't have bothered to port and maintain them up until now).
Yeah, but it really sucks for those of us who use powerhouse tools that took 15 years to develop and who were able to do so on Macs; that's looking less and less likely going forward.
I really like the idea of creating more space for the little guy. I want them to do well, but honestly I haven't had the best experience. As someone who uses the professional tools during the day but has only occasional personal needs, I was happy to pay for a Mac-only tool. In practice, for a tool I only grab every few months, I often find there has been a regression in some moderately-used feature (selection boxes), or it doesn't yet support something (I can't remember exactly - it was something like HSV color space, more than 8bpp, or a file format like TGA or PNG; not features you'd add to v1, but nothing too obscure either), or there's another paid upgrade.
Even beloved Mac tools like Panic's Transmit, I feel like I can't 100% rely on. At a moment's notice I'll have to jump to something else to finish the task.
While that may be somewhat true (and I disagree with the "tiny fraction" assessment, unless you are measuring in some metric other than time investment), I feel like it would still greatly increase your QA budget. Each additional back-end now requires extensive testing throughout the game development process.
In my experience, trying to get OpenGL to behave the same way (and with the same performance) on a bunch of different systems is actually more work than just maintaining multiple backends. Testing OpenGL on Windows does not in any way guarantee that it's going to work on Linux, or macOS, or whatever else -- even if you don't add a crazy OpenGL extension support matrix into the mix. So your QA budget is already going to include testing OpenGL on all those platforms.
I think they're expecting developers to use existing engines such as Unity and Unreal. Most game developers probably already use those engines and will not be affected.
On the other hand, they definitely made it harder for developers to create a new game engine.
Game engine not so much - Apple has always been an "also ran" when it came to games. Moreover, these days nobody is writing new major game engines; it is just way too expensive and difficult to justify vs. downloading Unity and starting to build your game. Furthermore, games have always been something that is on the market for a year or two and then done, with the developers moving on.
However, where it will have a major effect is the availability of professional software, such as CAD. The Mac has always been a pain in the ... platform to support because of its weird "Unix but not quite" ways of doing things, and now there will be no justification to support it anymore, especially in the OSS sphere. E.g. I fully expect PCB CAD packages like KiCad (and also the commercial Eagle, which has Mac support) to disappear from the Mac as soon as OpenGL is removed. Nobody has the resources to rewrite such software to use Apple-specific Metal. Another such project is OpenSceneGraph, a large building block for 3D visualization and simulation applications.
If this means that macOS will lose opengl support even for X11 apps, a substantial part of academia will switch away from Apple. It's highly unlikely that software like ROOT or geant4 will ever get ported to something else.
I just recently bumped into John Carmack's stories about Steve Jobs, in one of which he convinces Jobs to adopt OpenGL [1] (HN commentary [2]). Thought this might be an appropriate historical reference here, as OpenGL now appears to be on its way out on MacOS.
I too thought about this as Carmack mentioned that he believes it was one of his most important achievements that Apple adopted OpenGL back in the day.
As one of the creators of Direct X at Microsoft commented when Metal was first announced, "Why help Android siphon off their game developers by propping up OpenGL?"
I'm getting a bit worried about Apple dropping out of the business of producing professional tools and rigs. I always liked that you could walk into an Apple retail shop and get a decent Unix notebook (though I opted for a different notebook for my last purchase because of the lack of display, keyboard, and port options). But with Apple pulling out of OpenGL, what little pro (or at least pro enough for me) F/OSS software for 3D (Blender) and other graphical stuff was running on Mac OS won't be for much longer. I can't imagine Blender has the resources and inclination to port their software over to Metal, especially when Apple deliberately torpedoes their efforts.
Disclaimer: I'm not trying to be the Devil's advocate here, but just wanted to share an observation.
Apple always removes stuff that looks untimely or just plain stupid (headphone jacks, optical drives, USB/FireWire ports, optical in & out, Rosetta, APIs, etc.).
The same outrage happens every time, but then things normalize. People and companies adapt, hell does not freeze over, the company doesn't go bankrupt.
I feel that maintaining OpenGL & OpenCL had become a burden to Apple. We all know that Apple likes to control everything from the hardware to the user interface, and GPU drivers are one of the most notoriously complex, overprotected parts of the software stack. In the OpenCL world, compilers and other machinery (I don't remember the terms clearly, sorry) also get involved, which makes everything much more complex.
Maybe this move will help them slim the drivers down to a basic hardware-software interface and build Metal and related technologies, on their own terms, on top of that relatively simple interface.
I have a feeling that Metal can be translated bidirectionally, and relatively cheaply, to OpenGL (and maybe Vulkan and OpenCL too), so in the end things may not become extremely complex for everyone.
Apple doesn't feel that backwards compatibility is strictly necessary unless things can be translated and made to work with relatively good performance.
As a Linux and Mac user for 10+ years, these are my observations. They may be wrong, technically incomplete or else. Feel free to discuss, debunk, or downvote.
> The same outrage happens every time, but then things normalize. People and companies adapt, hell does not freeze over, the company doesn't go bankrupt.
But that doesn't mean we took the best path. There is always an alternative future if x hadn't happened at all. If x hadn't happened, we might be in a better place, rather than accepting it and doing the best we can.
> But that doesn't mean we took the best path. There is always an alternative future if x hadn't happened at all. If x hadn't happened, we might be in a better place, rather than accepting it and doing the best we can.
Of course. I didn't mean to say that Apple is designing the best possible future or making the best possible decisions. I'd rather have a future where these technologies are supported as first class citizens by everyone, and I can just cross compile this stuff with virtually no porting or optimization effort, however unfortunately this is not the world we live in right now.
As I said, mine is an observation rather than taking one side.
Very much in line with Apple's way of doing things. God forbid they'd adopt some sort of open standard - even though they've hugely benefited from them.
Yes, and this was so much against their DNA that it required the chutzpah of a Carmack to get Apple's CEO to implement the decision.
Apple is and has always been a strict leech on open ecosystems.
They try to build closed proprietary stuff first to lock you into the platform, and when what they build is bested by open source, they embrace it and move on to the next opportunity.
They did the exact same thing with BSD: grab a beautiful piece of open-source tech, bolt a metric ton of proprietary closed-source tech on top of it, and call the whole thing open source to get love and applause from the OSS crowd.
Yes - they decided to drop all other ports AND their iPhones still don't have USB-C, so you can't use iOS headphones on your Mac, and you cannot charge or sync via the default cable on your Mac.
Nobody has USB-C everything. Every owner is using a collection of dongles and it is absolutely stupid.
The OLED bar is also stupid.
I just hope my 2015 MacBook never becomes obsolete, because I dislike the newer ones... and yes, I owned one, and I sold it after 2 weeks and got my 2015 back.
At this stage, if I'm coding a game engine and want hardware-accelerated graphics on all platforms, I'm better off using middleware/user-level frameworks like bgfx, gfx-rs, etc., which abstract away D3D, Metal and Vulkan. Best choice. All that time learning OpenGL will not go to waste completely.
I think it’s a great idea to alienate developers from every other operating system out there and ensure that great effort is needed to maintain and port apps from other platforms. This is what we expect from Apple anyway, right?
Who’s going to start working on an OpenGL to Metal wrapper?
I suppose it got flagged as a dupe due to the duplicated URL. However, I believe it shouldn't be - "what's new in macOS" is completely uninteresting to me, while the deprecation of OpenGL and OpenCL is big news.
This is bad: a hobbyist will be faced with a huge burden to bring anything 3D cross-platform to their audience. In the past, it was possible to use Qt or a nasty GLU/GLUT wrapper to write portable code.
Way back in 2006, in CS175, we implemented almost all of the core 1.5 pipeline in C++. Software OpenGL implementations may not be the fastest, but they're more or less trivial (quaternions, a trapezoid-based triangle engine, the painter's algorithm, z-buffering, texture mapping, bump mapping, lighting and various shading models), and therefore accelerate-able with CPU and GPGPU SIMD ops.
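For anyone curious what "trivial" means here: the z-buffer part, for instance, is a handful of lines. A sketch with hypothetical names, the surrounding rasterizer omitted:

    // Core of a software z-buffer: keep the nearest depth per pixel and
    // only write the color when the incoming fragment is closer.
    #include <cstddef>
    #include <cstdint>
    #include <limits>
    #include <vector>

    struct Framebuffer {
        int w, h;
        std::vector<float>    depth;
        std::vector<uint32_t> color;

        Framebuffer(int w_, int h_)
            : w(w_), h(h_),
              depth(std::size_t(w_) * h_, std::numeric_limits<float>::infinity()),
              color(std::size_t(w_) * h_, 0) {}

        // Called per fragment by the triangle rasterizer (not shown).
        void plot(int x, int y, float z, uint32_t rgba) {
            if (x < 0 || y < 0 || x >= w || y >= h) return;
            std::size_t i = std::size_t(y) * w + x;
            if (z < depth[i]) { depth[i] = z; color[i] = rgba; }
        }
    };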
Qt supports multiple backends for its own rendering, but the hobbyists will still have to rewrite their rendering code to support that new backend. For a lot of my programs, that's about equivalent to rewriting in another language.
Vulkan on Metal is already faster than OpenGL so this is a non-story. If you want cross-platform support use Vulkan, if you want ultimate performance target DirectX and Metal.
Would it be possible for this driver to add OpenGL/OpenCL and Vulkan support to macOS like it is done on Windows? (Or am I completely misunderstanding how this works)
Microsoft deprecated OpenGL back in Windows Vista, over a decade ago. They still have to support it, because many major packages never switched to DirectX.
I wonder at what point developers will leave. I don't think they will. Changes from Apple might seem to hurt, but people adapt. The changes are not big enough, nor damaging enough to those who have invested themselves in Apple to switch. Normal users won't see much difference.
If you have an Apple desktop, a laptop, a phone, connected accounts, apps and associated data, maybe a watch or a TV, you are fully within the Apple ecosystem. You cannot leave without major hassle and stress. This might cause surface inconvenience, but it's not going to be enough to push anyone out of that ecosystem. Apple has its users where it wants them, and the users are happy.
Related to the article (but not the subject here), as a user of "dark" themes: the precise wording they use seems to indicate a view of dark themes as being less colorful. High-contrast themes and visibility-aiding limitations on theme color use have their place; so too do themes based around darker, more night-friendly colors. Thinking of a system as having only one true theme, or of light/dark as being the full theme vs. the visually-impaired one, is a dangerously limiting misconception.
So Adobe, Blender, Maxon (Cinema 4D), Maya, CaptureOne, DaVinci Resolve and 99% of the film, image and photo industry have to rewrite all their OpenCL kernels.
Also, Metal and Accelerate are currently completely unsuitable replacements for OpenCL for deep learning... Not that deep learning on OpenCL was a thing yet, but I was adding support for it in my own framework.
Very disappointed about the deprecation of OpenCL. We use it to achieve cross-platform GPU compute usage (Windows, Linux, Mac). Dropping OpenCL is not going to encourage developers to target the Mac for such software as ours.
Why is anyone surprised by this? Apple has a decades-long history of taking outdated technology and making it obsolete by forcefully removing it (which they haven't actually done yet with OpenGL):
Parallel Ports
Floppy Drives
CD/DVD Drives
Older USB, Firewire ports
Network ports
They were the first to remove all this stuff, and everyone was shocked. Now they are doing it with an API. Both Apple and Microsoft long ago created much more modern, highly performant graphics technologies than what OpenGL offers, and serious vendors support platform-specific APIs most of the time. If it takes Apple to do this and say to the world "wake up, OpenGL sucks", I view that as progress.
That said, I don't think your OpenGL app will fail to run on a Mac any time soon. I suspect it's years before they actually remove it entirely.
Would it be correct to assume that swathes of old games will just stop working when 10.14 arrives? I'm not a MacOS user, but as a gamer that would frustrate the hell out of me.
How does this affect ANGLE, the WebGL implementation underneath Chrome? I believe ANGLE only has backends for OpenGL (with Vulkan in alpha); it has none for Metal.
I didn't know you could replace OpenCL with Metal. I thought for GPU computing there were just OpenCL and CUDA. Does Vulkan offer an alternative to OpenCL too?
you're all completely correct: Apple is doomed. Since no one plays any games on iOS (where Metal is the only real option), forcing Metal adoption on MacOS is doomed from the start. iOS devices will be left in their moribund state, used only for a few limited tasks like DTP and blogging or whatever.
They should take a page from Microsoft and adopt an open, cross-platform technology like DirectX.
To me this signals that the Second Moribundity of Apple is nigh.
The First Moribundity was the period in the 1990s when Apple was coasting on the DTP and Photoshop advantage it had over Windows, reducing the features of its desktops and not innovating. Spindler, a smart but ineffective leader, was replaced by Gil Amelio, a star in his prior field but unable to get Apple headed in the right direction. It took Jobs' return to right the ship that time.
The emphasis on thin but less functional and less serviceable laptops, the dropping of OpenGL, the cruft piling on top of OS X to no new net benefit for users, and their coasting on the desktop market all point to this IMHO.
Everyone is coasting in the desktop / laptop market. Refresh cycles are like 5-10 years on PCs now, and it’s not clear tablets even have a refresh cycle. You can’t get growth on PC hardware anymore, not even if you’re Apple. So it goes into maintenance mode. Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes.
Apple is all about mobile phones, the Apple Watch and AirPods right now. The MacBook is barely a blip on their product radar.
"Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes."
I believe you should embrace the novel concept called 'work' sometime. Then you may see some 'niche' applications of PCs.
Mobile devices can, yes, be sold in bigger numbers and may produce higher profit, and many trivial applications like browsing the news and sharing our newest and greatest experiences via tweets and photos luckily no longer require a desktop, so those paramount usages have shifted to the only platform that matters, mobile. Yet the second-rate activities of design, manufacturing, academia and so on still rely on the archaic concept of the desktop computer and keep this dying artefact alive. Even traitorous mobile developers (all of them, the fools!) still dare to use desktop computers instead of relying solely on mobile phones. But not for long: soon the last aircraft engineer or corporate accountant will sell their desk and throw away the last of the keyboards, moving to the only necessary platform, mobile, so the PC can finally go extinct!
> Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes.
My personal impression is that while pre-built PCs are mostly sold to businesses, there has been a strong growing trend over the past few years to build your own PC from parts. I really do observe that when mobility is not an issue, people now tend to move away from laptops towards self-built PCs (or often rather: have a good friend build one). I don't want to go into the details of what advantages these have over laptops, but I'll just mention customizability with respect to requirements (e.g. very silent, very high-power, ...) and repairability.
A small (but only small) contributing factor is that laptops do not have sufficient power for VR, so you really need a stationary PC for VR.
The strongest "dampening factors" for this trend are the high prices of GPUs, because lots of cryptocurrency miners (lately Ethereum miners in particular) hoard them (though my impression is that this will ebb away somewhat once attractive ASICs for Ethereum are released), and, recently, the growing price of RAM. On the other hand, the release of Ryzen made building PCs with either very high performance or a very good cost-benefit ratio feasible.
Probably the last couple decades saw custom PC building decline in market share as prebuilt solutions became mainstream, but perhaps it's seeing a resurgence now that most consumers have moved from focusing on computers to focusing on smartphones as their personal tech hub.
> Custom built PCs are a niche. The vast majority of people just want to buy a box that works.
That is why I wrote
"(or often rather: let a good friend build one)." :-)
Seriously: in my opinion (but others might disagree), an advantage of a self-built PC over one that some company produces is that you know exactly which components are inside (in particular, you can buy components you trust), which makes you far less dependent on driver support from the manufacturer (i.e. you can find drivers on the internet yourself if necessary). I have often had bad experiences with driver support from manufacturers (except for a few well-respected names).
Also lots of cheap PC manufacturers install lots of crapware by default, while your self-built one is a very clean install.
In summary a self-built PC often "just works" far better than a pre-built one.
You're forgetting the most important reason to DIY: money. Where manufacturers are happy to charge you 2x the price to double your RAM or put in a non-shitty SSD, DIYing can shave off a few hundred euros.
Just as an example, over at apple.com, the new iMac comes with 32GB of DDR4 2666MHz ECC RAM. To add 32GB (64GB total) they are charging 960€, or 2,880€ to add 96GB (128GB total).
Looking on Amazon or geizhals.eu (a price-comparison site), 16GB ECC DDR4 2666MHz sticks cost 200-220€ each; two of them come to ~450€ for 32GB. That means you can save 500€ on RAM alone.
Same applies to SSD storage. And graphics cards. And CPU. And basically anything you could want to upgrade.
> Just as an example, over at apple.com, the new iMac comes with 32GB of DDR4 2666MHz ECC RAM. To add 32GB (64GB total) they are charging 960€, or 2,880€ to add 96GB (128GB total).
Apple specifically is well known for expensive pricing on better optional components (which you often have no choice but to pay, since on many models you cannot simply replace the component (RAM, SSD, etc.) yourself).
PC gaming is growing, and that's where the trend for self-built PCs mostly comes from, along with most of the growth in the PC market. It is still dwarfed by laptop sales, though.
As an anecdote, today I know hardly anyone who owns a desktop PC, while 10-15 years ago almost everyone had one.
The trend around me is that the shops that used to sell PC parts are closing down, while consumer stores have 80% of their PC floor space filled with laptops, tablets and mobile devices.
It could, but I am willing to bet that is even more niche than going to a physical shop.
The pool of experts capable of telling whether the hardware they are ordering will actually work together, instead of blowing up in some form when assembled, is quite tiny.
I'm sure it's rarer than I'd like to think, but if I managed to figure it out as a teen in the mid 90's without much in the way of internet access or budget, it's still probably easier today.
Anyone with a modern internet connection and a bit more patience than money (or at least a willingness to learn) can hop on Reddit or PC Part Picker and get a pretty good idea of what is out there and works together.
Compared to the days of making sure you had the right number and type of ISA, PCI, and AGP slots, assembling a PC from parts today is a breeze. Shopping online keeps costs low and places like Microcenter are great for buying in person.
I only haven't built one in a while because my current 5-year-old workhorse media/editing/gaming/everything PC shows no sign of needing a full upgrade any time soon. Sure I bought a new GPU after a few years when I got bitten by the experimental VR bug but other than that, it was an afternoon buying parts and snapping them together, an evening installing software, and 5+ years of "just working".
The “enthusiast” segment of the market is small. Even then you don’t have nearly the choice in vendors you used to, so people mix and match parts from the same 5 or 6 vendors who’ve been there forever.
Cryptocurrency mining has kept the enthusiast hardware market afloat, I have a feeling...
GPU prices seem to be normalizing the past week, at least here in Europe using historical price data from Geizhals[0]. If things continue we will be back to "normal" 2017 prices very shortly.
I disagree, I think Apple is still rather committed to the Mac even if it is no longer their #1 priority. Even if it's a small part of their profits, it's a big source of the strength of their brand (visually speaking, a Mac stands out against a PC far more than an iPhone stands out against any other smartphone), and that's something that's crucial for all of their products. Bad press about the Mac damages the brand, and if they don't continue to put out a good product (and correct their recent mistakes) they are going to keep getting bad press. I would argue that the Watch and Air Pods are more in the category of the Apple TV than they are the iPhone, because they're just a means of pushing up the average value of each iPhone customer. The Mac both does this and serves as a "halo" which is crucial even if it's less profitable. Honda certainly isn't making much money on NSX sales but that doesn't mean it's not still vital for them to produce it just for the sake of the brand (although I admit that's a bit of an extreme example).
The big signal for me that Macs are becoming second class citizens is if they ever start supporting Xcode on something that isn’t MacOS. That is the only thing keeping app developers on their platform, and if/when that changes, you know the final nail is in.
I would be surprised if Apple doesn't throw in with Microsoft soon around mobile development. There's a lot of symbiosis there; Apple and Microsoft don't really compete in many markets against each other anymore.
Visual Studio is actually an awesome development platform (I’m including VSTS in this). There’s not much secret sauce in Xcode to threaten Apple’s App Store revenue — it’s just a bunch of signing keys that Apple controls and manages.
I do think Apple will make a strategic decision to diversify from core Mac OS soon. The ever-shrinking space allocated to traditional computers at the Apple Store has convinced me of that. They have more floor space allocated to the Apple Watch than they do the Mac these days.
Sure; but users stopped buying computer hardware on specs last decade (or at least the mass market Apple is chasing). Macs aren’t even any more expensive than top-end PC laptops. They’re just a brand choice.
They definitely aren’t usually buying them because of typical CPU/RAM specs or whatever, but I think things like battery life, trackpad, keyboard, display, general build quality are all still very relevant aspects of hardware that people pay attention to. The slow upgrade cycle is definitely hurting things and generally slowing everything down, but it’s not as terrible as it’s often made out to be. It’s been a few years since the 5K iMac launched and it’s still a great desktop. The Mac Pro was botched but at least they’re fixing it. We’ve only had 2 years of weak MacBook Pro releases, and it’s very possible that the next release later this year will address both major issues: a bad keyboard and no 32GB RAM option. The keyboard will matter more for most people, but the added RAM plus 2 more CPU cores and Vega graphics will be a big win for true pro users. I don’t think they’ve necessarily “coasted” on Macs just because the last couple years have been weak. It’s more accurate IMO to say they’ve just fucked up a couple times in the last 5 years — specifically with the Mac Pro and the butterfly keyboard devices.
Oh I agree; Apple’s brand carries a ton of heft in the laptop market because they generally get the “product” features (keyboards, battery life, screen quality, etc) right. It’s only so noticeable because their track record has been so good — we would still have plastic laptops and trackpads with mushy membrane keyboards without Apple’s industrial design incorporating metal and glass.
And honestly, their desktop line looks very similar to many PC companies. They make the Mac Mini (aka the Apple NUC), the all-in-one iMac, and the Mac Pro (ok, the Mac Pro sucks and there’s no excuse for it).
Anyone doing serious workstation tasks these days is likely using a cloud-based solution. There are a few specialized exceptions (ML development stands out to me), but none big enough to build a product around. Especially when doubling the entire PCs and Laptops category wouldn’t move the needle for Apple at all.
Approximate yearly sales of desktop PCs worldwide (all kinds) are 200 million per year... I can't find 5 year refresh cycles on any HP, Dell, etc. desktops or laptops.
They tend to refresh in line with corporate cycles, i.e. 18 month to 3 years, from what I see.
The main reason to buy a new phone these days is surely "the old one broke". Phones are with us all the time, and planned obsolescence - non-replaceable batteries, glued screens, glass backs, etc. - means they break all the time.
PCs stand still; even laptops tend to be mostly stationary. No wonder my work laptop is now 6 years old and I feel no urge to request an upgrade.
People drop their smartphones more often than their desktops.
But I expect cheaper devices to take more share, since the "high-end" won't give you many benefits over them. My 120 EUR Xiaomi already offers amazing cost-performance compared to most other Androids.
The difference is that in the 90s they were close to bankruptcy whereas today they are swimming in cash. That's not going to help them much in the long run if they have lost their vision, but they are definitely too large nowadays to go away within a few years.
I think it is a good point, and they could probably survive about 2.5 years of losses without too much pain - but Apple's margins back in the 1990s were pretty good also: what hurt them was that sales dropped off a cliff, so their good margins weren't enough to save them...
There could be a number of factors that would hurt them, like margin compression, reaching market saturation, etc. All the ailments that Business Schools will warn you about...
However, they (IMHO) have become arrogant and out of touch with their users. VISION is their problem.
They need the MacBooks and desktop Macs to be seen as very desirable, in order to project the "creative people buy Apple" halo over the rest of their products.
This time around, they have way more cash and maintain a larger share of much larger markets, and keep tighter control of spending and resource use. Their decline, if you are right about the writing being on the wall, will be an extremely long and slow one when it begins (we're talking about the first derivative here, right?). Too slow, probably, to reach "moribundity" rather than an equilibrium point of mediocrity.
Their B.S. is getting out of hand. List of gripes:
1. Heavy-handed regulation of the App Store: this needs intervention from regulatory authorities. Apple has successfully inserted itself between the consumer and the producer of apps. It plays kingmaker and very clearly promotes internal apps over the competition (the recent ban of the Steam app is a prime example). Its only legal cover stems from being ~17% of the mobile hardware market, but from a paid-app developer's perspective, not being on the App Store is a definitive death knell. It's a clear monopoly in the 'paid-app' space, and others are arguing for a monopoly classification on other grounds [1][2]
2. Unreasonable restrictions placed on approved apps - like disallowing non-WebKit browser engines, and restricting API access (dlopen, WebGL 2, OpenCL, Vulkan, etc.) even though user security isn't compromised. Restrictions designed to choke-hold potential competitors and restrict user choice.
3. Forced licensing of hardware parts for accessory makers (look up the Apple MFi program).
4. Purposeful non-conformance with industry standards (like OpenGL, Vulkan, WebGL, and many, many holes in open-web compliance). This non-conformance is steeped in monopolistic, unfair-trade-practices psychology, to maintain absolute, unfair control over any potential competition.
I hope France fucks them for purposefully slowing down older iPhones/iPads. And hope the FTC, EU regulatory authorities pay heed to app-developers (and consumers) getting shafted by a beast of a corporation.
1. If cable tv providers can get away with disintermediating TV viewers from TV stations I don’t see how Apple is any worse, especially since I can still buy third party apps, but I can’t either set up or tune into an indie TV station. Amazon is a far worse problem with respect to books (and without any demonstrable user benefit) so good luck waiting for regulation here.
2. Again, you can use Chrome on a Mac. The App Store is not stopping anyone from using anything, it’s just facilitating certain cases. Microsoft installs its apps on your machine without asking you (and has an App Store). Don’t even get me started on Nintendo. Seriously, you’re complaining about the least worst (commercial) player in this space.
3. Clutching at straws here. No one cares.
4. You mean like OpenGL, and two other OpenGL things. Again, Microsoft DirectX is what?
5. Apple intentionally slows devices down to prevent random shutdowns from peak power draw when the battery has aged. This is a good thing, well implemented, but poorly explained. Apple has paid a PR penalty for its lack of transparency, and having them suffer regulatory punishment for this seems harsh.
Ok, I get it, you hate Apple. But unless you avoid all commercial vendors I don’t think you’re going to find someone that satisfies your peculiarly quixotic requirements.
Everything on that list is relevant, but please keep in mind that banning custom browser engines isn't a particularly bad thing for everyone.
Of course Apple is doing it to force web developers to keep supporting Safari, since no one can expect iOS users to simply install Chrome, but it also benefits the open web greatly, since it preserves the status quo of having more than just one browser engine. The day Apple stops doing this will be the day web developers abandon support for Firefox too.
I myself have had to fix weird Safari and sometimes even iOS-only bugs more than once, but that's just Apple crippling its own platform. And even if these things are annoying for web developers, they keep our eyes keen on how what we work on looks in something other than Chrome.
Unfortunately, Google has appalling power over the web right now, and I can't count how many times I've seen them cripple the experience for Firefox. Now they even outright say that the new AdWords interface only works in Chrome. And AMP only proves how little we can trust Google.
I would argue #2 has turned out for the better. Google's store is a dumpster fire of crap, malware, and knockoffs. On iOS there is less crap, fewer knockoffs, and generally no malware.
Steam Link is weird, but I thought I read somewhere that they would have had to pay Apple 30% of in-app sales.
But long story short, I am paying them to curate the apps.
There is a story there that might be interesting to hear. I wonder if it has to do with it taking away from the incentive to produce iOS-native games vs. streaming/controlling a game running on a PC.
1. Not a monopoly. That is like saying Walmart has a monopoly over its own in-store selection.
2. What?
3. MFi? The greatest invention ever. Now I am assured every single MFi Lightning cable is decent quality, instead of the crap USB-C has turned into.
4. Industry standards of what? Maybe PlayStation, Nintendo, and Xbox should all be forced to use OpenGL?
Yes, Apple should be fined for slowing down iPhones. Absolutely! But this... calling of B.S. is just... wow.
And if you don't like or even hate Apple to bits, you can always buy an Android phone. OnePlus, Samsung, and Vivo all have fairly good hardware specs. You have a choice.
You're right that this is ridiculous. It's time that we boycott Apple for their hostile policies!
Stop developing for Apple products. Stop buying Apple products. Tell everyone you know not to buy Apple products. If enough people protest, they will eventually be forced to change.
Boycott Apple... and then what? What's the viable alternative? Android? No, thank you. At least you have some leverage over Apple by the virtue of being the customer. The competitor's business model makes it much harder to influence, which is hostile to the customer and their privacy.
If you think Android is really open, think again. Without Google Play Services and Play Store you are basically hosed.
There's a lot of options. Easiest is just don't upgrade -- stick with your old phone as long as you can, and don't buy any more apps. If you need to replace your phone and don't want to use standard Android, you can use an Android fork that doesn't depend on Google. (I've been hearing good things about LineageOS, but haven't tried it personally.) Alternatively, you could try a less popular mobile OS like Tizen. Another alternative would be to use an older style non-smart phone. You could even ditch the phone entirely and go back to the way people lived before the last decade or two if you want.
Just don't give Apple any more money until they change.
Fair points. I don't think they are going to be effective enough with average users to change Apple though, and that only addresses the customer angle. As a developer you don't really have the option of boycotting Apple.
You have to pick your battles in life. There are a huge number of companies I have chosen not to do business with, and a long list that I'd boycott before Apple. I admire you if this is the cause you want to spend your life fighting for, like I admire RMS, but in that case one should really consider spending their energy ensuring a viable alternative exists, and maybe lobbying for regulation. Merely shouting 'boycott' in a niche community will not change Apple's behavior, I don't think. This of course is not a discouragement from trying, but a personal opinion on the practicality of achieving the desired outcome. Especially so when the competition is much, much more anti-consumer in a myriad of ways.
Android lets you install any app outside the store through sideloading. Imagine that: being able to install the software of your choice on the hardware you paid for!
FYI, for a while now you have been able to get a developer signing certificate for free and sign and deploy any app you want on your own device. You can't easily distribute it, though; the friction is equivalent to sideloading. It's a fair point that Android supports non-Google Play app stores and "Unknown Sources" app downloads. Google Play Services is a very key tie-in, though, that's hard to avoid even if you're on LineageOS.
I use iOS partly because of its strict regulation of the App Store, and I'm definitely not the only one. Who will pay Apple for the lost revenue after someone implements your suggestion? How are companies going to innovate without those funds (I assume your answer to the previous question is something like "that's just business risk" - no, it isn't)? And how can customers such as me find their desired products if those products are banned, arguably for no real reason at all (no one is getting hurt)?
You do know you have choices in the operating systems and hardware you use. If that's not a viable option, engaging in dialogue with Apple might be the way to go.
Looking at the clearly sparse docs, Metal looks like some jackass's pet project. For people who get off on making everyone else use their system by systematizing the obvious and the already-done, graphics APIs have an allure. However, most people, let alone companies, are not capable of replacing OpenGL and OpenCL.
What the fuck?! I don't get it - why?! They want us to focus on Metal? I definitely need to buy a desktop now, plus I want CUDA! A Ryzen 5 2600X with a GTX 1070 and I'm done.
Apple killed Flash and now they kill OpenGL. Both incredibly crappy technologies that survived for so long because of their popularity.
Nowadays Unreal Engine fills that role very well and can sit on top of any platform easily. Popularity-wise, together with Unity it has completely eclipsed both Metal and OpenGL; Vulkan stands no chance either. Unreal Engine's source is also openly available, and it is very easy to extend because of Blueprints. Unlike OpenGL, it is not a C API trying to shoehorn itself into a C++-dominated world.
I have to use OpenGL 3.3 and I am amazed how badly designed it is. A brief look into Vulkan made even less sense.
I don't agree with everything Apple does, but killing Flash and OpenGL are two of my favorites. My only complaint is that they have not done it as aggressively as they did with Flash. If they manage to kill the JavaScript and HTML/CSS abominations, I will become a hardcore Apple fan.
No they are not; they are both APIs. The fact that they run on top of drivers is nothing impressive - if anything, drivers make up a tiny fraction of the APIs Unreal depends on, because it is so much more than a 3D engine. Unreal runs on top of Vulkan and could also replace both Vulkan and the GPU drivers. API dependence is nothing special and hardly makes something fundamentally different. There is no limit on how low-level Unreal can go.