Yeah, no. There's talk of Blizzard, for instance, dropping support for Mac. DICE has a blog post signaling concern. It looks like a bunch of engines are going to limp along on MoltenVK, which kind of imposes a weird impedance mismatch and gives weird perf issues sometimes that you probably wouldn't see with a native Metal backend.
And that's before getting into the release of Metal 2. There's a non-zero amount of work to support it, and it's not clear how long Apple is going to support Metal 1.
And all of that is before all sorts of other crazy stuff with Apple changing their app signing requirements, messaging that they're going to require all apps to be signed by Apple in some future macOS release (but won't tell you when that is).
I wasn't talking about in-house solutions, rather engines that many AAA studios buy in order to actually focus on the game itself.
As for the rest of your remark, it comes up in places like HN, but not at all when attending local game developer meetups, developer articles on Making Games, Gamasutra, Connection, IGDA, or many other professional publications.
I mean, my day job is supporting an application across Win/Mac/Linux. Even ignoring the graphics, Apple is easily the hardest to support. I don't really care if you haven't read a magazine article on it.
And to pretend like FrostBite doesn't matter is ridiculous.
So wait, the AAA developers not supporting Mac don't count against your argument? Even in the case of Blizzard, who has famously been one of the biggest Mac supporters? Isn't "all AAA support Mac... except all the ones that don't support Mac" a tautology?
Also, just noticed that you lumped in Unity with AAA, lolz. What's next libgdx?
> Because their focus is clearly PlayStation, Xbox and PC, not even Nintendo hardware.
> There are plenty of other AAA studios using Unreal, Unity, CryEngine.
Blizzard's focus had been on Mac in addition to Windows. With the switch to Metal, they're probably abandoning it. FrostBite means that EA AAA games probably won't ship on Mac either. Ubisoft didn't release Assassin's Creed Odyssey on Mac. And even looking at Unreal Engine 4 games, only Fortnite and a tower defense game have been released for Mac. Looking at CryEngine, no games have ever been released for Mac. So where is all this AAA support for Metal that you're talking about?
To lead you to water, Mac support is a nice to have so that their in house tools work with the artists' platforms they're used to. But they don't care enough to finish out the QA, or put in any work to make the game actually shippable on that platform. The switch to Metal means that you can't justify it with "well we can just support OpenGL and get Mac for free" like they used to.
> Maybe you should check again the names of some studios using Unity, ever heard of Nintendo and Microsoft?
AAA is about the games, not the studios. Name a single AAA game on Unity.
It was about three years ago that, through pseudo-public channels, Apple started messaging that OpenGL was on its way out. And look which Blizzard game came out around then (Overwatch), which has pretty flagrantly disregarded the idea of Mac support while even entertaining the idea of a possible Switch port.
> Yet OpenGL doesn't make them support Linux any better.
> So support or not for Metal is not the real reason why they don't want to focus on the Mac.
"But they don't care enough to finish out the QA, or put in any work to make the game actually shippable on that platform." Mac was a fixed platform, and you used to be able to justify the engineering because the work ultimately helped make your Windows port better ("the end user will have a way out if there's a bug in their DirectX drivers"), and let your artists do all their work on the tools they were used to. Then, if you're running your tooling on Mac, you've been supporting it the whole time and there's very little QA overhead for release, since it's a relatively fixed platform. That last part doesn't apply to Linux. This whole time I've been saying it's not just OpenGL->Metal, it's a nexus of several things all coming together to break the camel's back.
> As for games, Nascar Heat 3, for example.
You know that a game that's less than $50 at release isn't a AAA game, right?
Today I learned that games like Sea of Thieves, Fortnite, Hitman, GTA, and Assassin's Creed aren't AAA because they are too cheap according to your price table.
(not in the game industry, but a graphics programmer)
Are there really no games out there that program their own graphics anymore and don’t rely on “middleware” engines? This seems shocking to me. Then again I was shocked the first time I learned that most games don’t hand-code assembly anymore. Things move so fast.
AAA studios always use middleware; if it isn't bought, it's built in-house.
The actual 3D API layer is a tiny portion of everything a game engine requires: scene management, materials handling, a graphical editor, plugins, sound, physics, ....
So one always ends up with a pluggable rendering layer, where adding a new API is relatively simple.
Now, what has been happening is that with production costs skyrocketing, most studios are increasingly adopting external middleware that they adapt to their purposes, rather than writing everything from scratch.
For example, you can pick up Unreal and get support for NVIDIA's raytracing features out of the box, or invest the money to develop the same features from scratch in-house.
The culture in the games industry is that what matters is the story, gameplay, taking advantage of hardware features and getting the game out there, tech comes and goes.