Note that these Apple chips won't run x86 or x64 code. Unless they bother with a Rosetta 2.0, gaming is out of the question if Apple makes this jump.
The Catalina move already killed off a lot of games where the developers didn't provide an updated (x86_64) binary. A move to ARM would kill more, of course, but it's not like there's no precedent.
I'm half-convinced that Apple killed 32bit support so early precisely to see how developers coped; if it had been really bad they could have re-introduced it in a point release of Catalina. As it was, it wasn't very bad and most developers complied, which is an argument in favour of an ARM transition being feasible.
The only other reason I can think of to so aggressively move to 64bit is security, but most of the apps that were stuck on 32bit were not that big a security concern.
Having worked at big companies, I'm starting to suspect that many recent Apple deprecations are detached from technical reasons or customer scenarios. They are playing internal games and don't care whether it makes sense, nor will they have any interest in reversing or revisiting the wrong call later on.
I am not just talking about 32-bit support. It shows up in a lot of random libraries that wind up deprecated and replaced with something less capable. That's a pattern I have seen a lot elsewhere and it's usually a bad sign for overall product quality.
Yes, I think the true reason Catalina is so incompatible with old programs is that they wanted the big compatibility break to happen before they announce the ARM transition, which would then look like less of a big step.
This does not make sense. I can give a counter-example: the education market, especially higher education. This market has existed for so long and aged so well that, in practice, enterprise-level deployments of Macs most likely live mostly there now. The downside of this market: they are slow or reluctant to change. The people who make research or educational software are never quick to make the big jump to a new architecture, and they might not be able to hire more people to do it. The customers also hate these kinds of changes, both the IT department and the researchers; nobody wants to find that their code no longer runs properly on the new machines.
But that is already true. You cannot buy a Mac any more that runs 32-bit software. So Apple is obviously willing to make things miserable for a considerable part of their user base. Me included. I will avoid anything with Catalina, because I still have a few 32-bit programs I want to run and can't upgrade. As long as macOS runs on x86 hardware, there is not much justification for such a break. Yes, they clean up the software stack, but at a very high price.
The only good reason I can imagine is that when Tim Cook announces macOS on ARM, he will be able to claim it "runs everything that runs on Catalina".
Doesn’t that invalidate your earlier claim that “it wasn't very bad and most developers complied, which is an argument in favour of an ARM transition being feasible”? It doesn’t matter how nice the experience is for those who upgrade if a significant number of people avoid upgrading because the experience would be terrible. That’s selection bias.
Sure, Coke sales are down 50%, but the customers who are buying New Coke say they like it just as much as the old recipe!
Which earlier claim of mine? Are you mistaking me for another poster?
My point was that they already had the breaking change, so the ARM transition for Catalina users - who are certainly only part of the Mac user base; many stayed on Mojave because of 32-bit support - will be smooth.
Not sure what you mean by "diverse". There is a lot of legitimate macOS software that is 32-bit only and no longer updated - for example, because the company went out of business or couldn't justify the effort of a port. Cutting off support for these programs is a harsh step. While I can understand that Apple doesn't want infinite backwards compatibility, it hits a lot of users. One reason for it might be preparation for the bigger switch to a new CPU architecture.
Apple's market is: creatives, iOS developers, some other developers, and Mac enthusiasts.
Enterprise is Windows territory, most businesses are Windows territory, education is Windows territory, most home users / small businesses are also Windows territory.
So if Adobe apps get ARM builds, that will satisfy a huge part of their user base. The rest would use Apple tools, which will get ARM builds, and open source tools, which already have or will have ARM builds.
> I'm half-convinced that Apple killed 32bit support so early precisely to see how developers coped; if it had been really bad they could have re-introduced it in a point release of Catalina.
Is it really so easy to introduce 32-bit support back?
I mean, it depends on exactly what they did to get rid of it. For many Linux distros you add 32-bit support back with "apt-get install glibc-x86" or similar.
_If_ they were taking the approach of a deliberately early deprecation, which it seems like they were given the timing relative to the rest of the industry, it would only make sense to make it be an easily reversible decision.
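For comparison, here's a minimal sketch of what re-enabling a 32-bit userland looks like on a Debian/Ubuntu-style system via multiarch (the exact package names vary by distro; this assumes Debian's multiarch mechanism and requires root):

```shell
# Sketch: restoring 32-bit (i386) library support on a Debian/Ubuntu system.
# Assumes a Debian-style multiarch setup; package names differ on other distros.
sudo dpkg --add-architecture i386   # register i386 as a foreign architecture
sudo apt-get update                 # fetch the i386 package indexes
sudo apt-get install libc6:i386 libstdc++6:i386   # pull in the 32-bit runtime
```

The point is that on Linux this is a package-management decision, not an OS-level one, so it is trivially reversible; whether Apple kept the equivalent option open internally is anyone's guess.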
Have you looked at the iOS App Store lately? There are tens of thousands of games there. Apple Arcade has a small (100+) but well-curated selection of good games that run on both iOS and macOS.
Oh, you meant PC games? Sure, the Mac only has a small percentage of (for example) Steam games, but that percentage is steadily rising - it's now over 25%. Switching architecture is unlikely to present a major problem for most developers, especially given that they're probably using Unity or Unreal Engine.
>Switching architecture is unlikely to present a major problem for most developers, especially given that they're probably using Unity or Unreal Engine.
As a former game developer I can tell you that it is an issue. Apple is totally against cross-platform tools. They break compatibility as much as they can.
Framework changes, architecture changes, and so on.
Instead of going with OpenGL ES, Vulkan, or OpenCL, they made Metal.
If the user base is large enough, as is the case with iOS, there's an incentive to go through the pain of releasing for that platform. But that isn't the case with macOS. Maybe for Adobe it's worth spending the resources to build software for macOS, but for other companies that might not be the case.
Anyway, you make much more money by targeting PlayStation and Xbox for the same money and man-hours, so it makes sense to target macOS last, if ever.
And no, not everybody is using Unity or Unreal. That is true mostly for indies.
LLVM bitcode is still architecture-specific. That is, bitcode generated targeting x86 is not compatible with bitcode targeting 32-bit ARM, which in turn is not compatible with bitcode targeting AArch64. The reason binaries are distributed as bitcode is to enable additional optimisations based on the exact chip that will run the code (i.e. the -march=native equivalent).
Using LLVM IR merely solves the problem of compiling to a specific instruction set. You still need a compatibility layer for all the platform-specific APIs, like Wine or Windows-on-Windows (WOW64). If Apple puts in the effort, it might work out.