I wonder how long it will be before Apple releases a laptop with an A-X chip.
They’re already shipping their custom T2 chips in their laptops. The compiler toolchain can build great binaries for their A-series chips. They’ve swapped CPU architectures before, and the Mach-O binary format can hold "fat" (universal) builds of the same executable for different architectures.
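If you're curious what that looks like on disk, here's a minimal C sketch (macOS-only, purely illustrative) that lists the architecture slices in a fat/universal binary; the clang/lipo commands in the comment are just one assumed way to build something to inspect, and the 64-bit fat variant (FAT_MAGIC_64) is ignored for brevity:

    /* fatcheck.c - minimal sketch: list the slices in a Mach-O fat binary.
       Build something to inspect first, e.g. (assumed invocations):
         clang -arch x86_64 -o hello_x86 hello.c
         clang -arch arm64  -o hello_arm hello.c   (needs an arm64 SDK)
         lipo -create hello_x86 hello_arm -o hello_fat */
    #include <stdio.h>
    #include <stdint.h>
    #include <mach-o/fat.h>          /* struct fat_header, struct fat_arch */
    #include <libkern/OSByteOrder.h> /* OSSwapBigToHostInt32 */

    int main(int argc, char **argv) {
        if (argc != 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 1; }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        struct fat_header fh;
        if (fread(&fh, sizeof fh, 1, f) != 1) { fprintf(stderr, "short read\n"); return 1; }

        /* Fat headers are big-endian on disk regardless of host byte order. */
        if (OSSwapBigToHostInt32(fh.magic) != FAT_MAGIC) {
            printf("not a fat binary (thin Mach-O or something else)\n");
            return 0;
        }
        uint32_t n = OSSwapBigToHostInt32(fh.nfat_arch);
        printf("%u architecture slice(s):\n", n);
        for (uint32_t i = 0; i < n; i++) {
            struct fat_arch fa;
            if (fread(&fa, sizeof fa, 1, f) != 1) { fprintf(stderr, "short read\n"); return 1; }
            printf("  cputype=%d offset=%u size=%u\n",
                   (int)OSSwapBigToHostInt32((uint32_t)fa.cputype),
                   OSSwapBigToHostInt32(fa.offset),
                   OSSwapBigToHostInt32(fa.size));
        }
        fclose(f);
        return 0;
    }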
They will probably need a Rosetta equivalent to emulate x86 for all the applications that are slow to switch. That might be tricky because of the huge surface area of the x86 instruction set. But I think it's only a matter of time. It might also explain why they have kept both the MacBook and MacBook Air product lines - they might want one of them to stay on Intel's CPUs and the other to switch to their A-series chips going forward. Or maybe they'll just wait another generation or two and switch CPUs across their whole line in one go.
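To make the "surface area" point concrete, here's a toy C decoder for a tiny handful of x86-64 opcodes. This is purely illustrative and nothing like how Rosetta actually worked; a real translator has to cope with legacy prefixes, REX prefixes, multi-byte opcode maps, ModRM/SIB addressing forms, and thousands of instruction variants:

    /* decode_sketch.c - toy decoder for a tiny x86-64 subset, to illustrate
       the variable-length encoding problem. NOT a real disassembler. */
    #include <stdio.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Decode one instruction starting at code[i]; return its length in
       bytes, or 0 if this toy doesn't know the opcode. */
    static size_t decode_one(const uint8_t *code, size_t i) {
        size_t start = i;
        if ((code[i] & 0xF0) == 0x40) i++;  /* optional REX prefix (0x40-0x4F);
                                               note REX.W also changes operand
                                               sizes (e.g. REX.W+B8 takes an
                                               imm64), which this toy ignores */
        switch (code[i]) {
            case 0x90: return i - start + 1; /* NOP */
            case 0xC3: return i - start + 1; /* RET */
            case 0x89: return i - start + 2; /* MOV r/m32, r32: opcode + ModRM
                                                (assumes register-direct form,
                                                no SIB or displacement bytes) */
            case 0xB8: return i - start + 5; /* MOV eax, imm32 */
            default:   return 0;
        }
    }

    int main(void) {
        /* mov eax, 42 ; mov edi, eax ; nop ; ret */
        const uint8_t code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00,
                                 0x89, 0xC7, 0x90, 0xC3 };
        for (size_t i = 0; i < sizeof code; ) {
            size_t len = decode_one(code, i);
            if (!len) { printf("unknown opcode 0x%02X at %zu\n", code[i], i); break; }
            printf("instruction at offset %zu, length %zu\n", i, len);
            i += len;
        }
        return 0;
    }

Multiply that switch out to the full instruction set (plus self-modifying code, precise exceptions, and memory-ordering differences vs ARM) and you get a sense of the engineering effort involved.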
>They will probably need a Rosetta equivalent to emulate x86 for all the applications that are slow to switch.
Well, and all the applications that won't switch. Also, even on the Mac, virtualization/containerization is not nothing. The Mac is a different market and use profile than iOS, and while Apple, going by past history, won't support an old arch indefinitely, neither are they likely to completely blow off backwards compatibility. Compared to previous transitions, dropping x86 would have extra complexities as well, so previous experience may not be entirely applicable. In particular, Apple would be moving away from the full-fat computer standard rather than staying put or moving toward one, which may change the payoff for users despite Apple being much bigger. The absolute performance differences (immediate and future) also aren't likely to be as big.
I don't want to underestimate them, and huge disruption is inevitably coming down the pipe anyway and ARM may well emerge a winner there regardless, but it's also just a really big challenge.
>That might be tricky because of the huge surface area of the x86 instruction set.
Transmeta was able to do a decent job, and I think Novafora is still around and licensing their IP? Granted, a lot of instructions have been added since then, but Apple certainly has a lot of expertise there as well, and a great deal of capital to aim at the issue.
> To think that deep in the Apple labs they don't already have A-X laptops running - and haven't for a while now - is not thinking like Apple would.
Right. To me it's unthinkable given the amount of low-level code shared between iOS and macOS that macOS hasn't been running on ARM since day 1 of iOS.
Yeah; if nothing else, the first versions would have been "port Darwin to this SoC". Maybe they never ported the GUI components, but it would be unsurprising if they did.
>Apple's track record of being able to switch architectures is absolutely incredible. If they want to switch, they can switch.
Sure, but don't discount how every specific instance can be a bit different either. As I said, they've got expertise, they've got capital, and there are even previous paths to follow here. But at the same time, every transition has its own unique hurdles. Previously, with 68k -> PPC and then PPC -> x86 for example, they were going to something that was not merely an improvement in some important respect right off, but that also had a clear, very steep growth ramp ahead thanks to fabrication improvements if nothing else. In the PC world we were still very much in either a very steep or at least steeper part of the S-curve. But those days are just plain done; the issues presented by physics and the geometry sizes being worked with now are simply fundamentally harder. There is certainly more room for improvement year to year for a long while, and more chances to grow horizontally with valuable new features, but it's not like a system made now will be obsolete in 3 years either.
Additionally, those transitions came at points in Apple's life with a dramatically smaller installed base, and they were also moving away from something more proprietary (at least in principle; obviously CHRP never actually worked out that well). The move to x86, which brought Macs in line with everything else, meant a huge amount of software opened up to more trivial porting, huge amounts more opened up to trivial virtualization, a vast 3rd-party hardware market became more easily accessible, etc. That definitely helped offset some of the old Mac software that ultimately didn't make it, even more so because, thanks to the above, it is still quite possible to run old Mac software fine: Classic can be emulated, and 10.6 can still be run under virtualization, which in turn grants access to Rosetta even on new Macs, and the absolute performance advantages vs 12+ year old systems are significant enough that even with the overhead it's still fine.
Basically, there are a lot of subtle day-to-day advantages that come from everything running the same instruction set underneath, or at least being able to stick some sort of translation layer in there. Again, absolutely not saying it's something Apple can't tackle, just that it's a big challenge, and I think it's bigger now than it was at any time previously. Of course, Apple too is bigger now than at any time previously! They're not infallible though, and I hope they get the balance right here.
Would dual chips make sense? A bit like their dual graphics: run most of the OS and anything compatible on one, and have a separate low-power x86 core take over transparently on demand (like when running an x86-only program).
Aside from performance, a huge reason to abandon Intel is security. I don't see Apple keeping an Intel chip if they switch to the A-X series as this would defeat much of the point.
Anyways, Apple has never been one for smooth transitions. Their history is dotted with big, bold changes. If they kept x86 they would slow the adoption of their new architecture. Apple will likely take a "take it or leave it" attitude like they did with the CD drive and headphone jack.
I don't think they will. They have already expressed that the iPad is their vision of the future of computing. A completely locked-down platform allows Apple to have full control over the entire computing life cycle and achieve results unparalleled elsewhere, whilst also making their product incredibly sticky.
I think Apple can and will justify it. If you think back to Steve's analogy about cars and trucks, there's still (obviously) a very healthy market for trucks.
Despite their ongoing efforts to make the iPad more capable, I think and hope they'll recognize the value in keeping it simple enough for anyone to use, and thus having a separate macOS experience with more tools/flexibility in a laptop/desktop form factor.
Apple doesn't really care much about backwards compatibility. I'm sure they will have an x86 emulator, but it probably won't be very fast (read: useless for games) and will probably be dropped entirely a few years down the line.
>Apple doesn't really care much about backwards compatibility
Depends on what you mean by "much". They have made good backwards compatibility an important part of every single architecture transition so far, and on the Mac there were official transition periods of a good 3-4 years at least (the 68k emulator still ran under Blue Box/Classic Environment, so it lasted through 10.4 Tiger; Rosetta lasted through 10.6 Snow Leopard). And that's just official; in practice there have continued to be longer-lasting options.
It's certainly not to the degree that Microsoft has traditionally cared, but it hasn't been blown off either.
This is what I'm waiting for. I can't find a reason to upgrade to the current line of MacBooks (Pro, Air, etc.). Better performance and better battery life than Intel, and I would no longer be funding Intel's lazy ways.
Bloomberg had a report earlier this year saying they'd have ARM Macs in 2020, which makes sense given that lines up with the work they're doing in Marzipan now [1]. They're also making gains in the legal fight to crack the Qualcomm business model, so that means ARM Macs could have cellular at the same time [2][3].
I don't see the point of an A-X chip in a laptop. You aren't taking photos on your laptop, why does it need the image signal processor? There's no FaceID (yet, maybe it's coming) on laptops, so we don't need the ML chip.
When you strip away all the stuff a laptop doesn't need, you're left with... an x86 chip!
The ISP is also used for video, as in the front camera used for video meetings.
FaceID is not solely dependent on ML; it's also managed by the Secure Enclave co-processor, which is also used for Touch ID, available on Macs now. ML helps to reduce false positives.
Apple's T2 chip is an ARM-based processor that's already in almost all of the newest Macs. It is used as a storage controller (which allows Apple to encrypt the drive very fast and transparently), for Secure Enclave processing (Touch ID on the MacBook Air), Siri processing, and more. Every year, more and more of the processing is moved to Apple's T-series co-processor.
Apple's custom silicon allows them to integrate software and hardware on a deeper level. Intel develops CPUs for the mass market; Apple develops for their own customers only.
> There's no FaceID (yet, maybe it's coming) on laptops, so we don't need the ML chip.
With Apple's focus on on-device ML, I would guess this will be the first part of the A-series trifecta (CPU, GPU, Neural) to be included on a Mac and exposed to developers. I can imagine a bunch of possibilities for such a chip, not just FaceID.