Apple Said to Work on Mac Chip That Would Lessen Intel Role (bloomberg.com)
230 points by VeXocide on Feb 1, 2017 | 257 comments



It has been interesting to watch the ARM ecosystem unfold. Sort of like seeing a levee breach where just a trickle of water starts flowing over it and then more and more and more.

Apple has the resources and the motivation to build a "desktop" 64 bit ARM processor. The history here is also interesting. After debuting the Macintosh in 1984, and getting into fights with Motorola over what should be in the next generation chips, they did the unthinkable and adopted an entirely new instruction set architecture, PowerPC.

That relationship ended on a fairly rocky note, and Apple did not have the IP rights or the infrastructure access to continue PowerPC development. They switched to Intel's architecture with great fanfare and started using ARM in their mobile offering. Now ARM has reached the point where it is adopting "high end" features faster than Intel can invent new ones. And Apple has proven to itself that it can both design and fab a completely bespoke CPU without any input from the original owner.

So here we are looking down the barrel of "bespoke chip architecture round II."

It gives Apple some unique advantages that neither Microsoft nor Google can match.


ARM was bespoke round 1 -- Apple was an early investor and used ARM chips in the Newton family (also some laser printers, I believe). PowerPC was round 2. Apple acquihired a boutique chip design firm (PA Semi) that specialized in low-power PowerPC designs to design its ARM cores, so the current round is almost a hybrid of the earlier two.

The 65816 (used in the IIgs) was also pretty much bespoke.


Apple didn't adopt PowerPC after fighting with Motorola and it was no shock to Motorola, or the world. Motorola and Apple and IBM worked together on PowerPC in a consortium, after the Motorola 88000 RISC (which Apple also tried) failed for various reasons.

This was all in the exiled-Jobs years while Apple was thrashing about trying to build a real operating system (or systems) to replace classic Mac OS. (NeXT also played with the m88k.)

And ARM wasn't new to Apple with the iOS systems, either. The Newton was built around ARM back in the 90s.


Yes. In fact, the Newton was the reason ARM was formed as a separate company. The chip design was originally done by Acorn (using VTI tools). When the ARM operation was spun out of Acorn under joint ownership (Acorn, Apple and VTI), they changed the A from Acorn to Advanced.


It's really interesting how the historical strengths of companies can persist even after decades of evolution. If the people are coming and going, how is this institutionalized success being cultivated and maintained?


To be honest I don't know the answer to that. That said, Apple has always had a corporate value of being maximally responsible for its own destiny. It's Hallmark-card wisdom to know the things you can change and accept the things you can't, but Apple's history has been filled with finding "things we can't change" and replacing them with something we can. So where one company might say "Well, we have to work with whatever the microprocessor company will agree to," Apple wants to say "The microprocessor company will do whatever we require of them." And if they can't find such a company, they look at becoming a microprocessor company.

A lot of that thinking was laid out fairly extensively in the lawsuit they got into over manufacturing sapphire. The manufacturer (GT Advanced) complained (reasonably, I think) that Apple's contracts were so onerous as to make them employees of the company in everything but name. And that appears to be how Apple likes it: complete control of their destiny whenever possible.


So is the lesson that taking contracts from Apple is not a good idea?



At least part of it is that the people aren't coming and going.

When new, key players arrive they're attracted and vetted by the existing community. And that community is well populated with folks that have long tenure.

There are plenty of folks at Apple in key technical positions with 10+ years at the company, and not a few with more than 20.


Apple isn't going down this road again because it's "in their culture", they're doing it because it puts them at an advantage and because they can. Changing the architecture means changes from the silicon level to the application level; it's not so much about engineering talent as it is controlling the entire stack. Still, 30 years later, Apple is the only company that could pull it off.


There has to be at least a little bit of culture having an impact here, even if only in the sense of people being able to say they've done a wholesale change of architecture before so they should be able to do it again. Also the culture of being responsible for the entire stack including hardware, OS, and software gives them a lot more flexibility than somewhere like Dell where they'd have to go through extensive negotiation with Microsoft to make a change to CPU architecture fly.


My guess would be that such specific historical strengths are local manifestations of a more general instinct that is nurtured within a company culture. This instinct would result in similar behaviour when presented with a similar situation — for better or worse.

I also suspect that a company like Apple would have a portion of employees that stick around for the very long term, who have been there since early days and can act as the company's institutional memory.

Specifically in this instance though, it's pretty clear that Apple is going down this road because they've already started going down this road. This is all just straightforward technology and skill reuse: the touch bar is pretty much an Apple Watch in a different package.


> Apple did not have the IP rights to continue PowerPC development

Didn't they get those rights when they bought PA Semi? I think it didn't make sense for Apple to develop their own processor in 2005 because they were just in a much weaker position than they are now.

And it was said that in the old days Exponential got their PowerPC license from Apple.


I don't believe PA Semi ever had a transferable license to Power. They had their own architecture which was very power efficient and they were a CPU design house with ARM experience (they did StrongARM at DEC). What it looks like from the outside looking in, is that Apple bought out a full architecture license (with derivative ownership) from ARM and bought PA Semi to be the core team to start building CPUs that they had 100% of the rights to.


That's how I saw it. I recently discovered PA's PPC while looking into various implementations of the architecture that might be salvageable. It was quite impressive. Still better specs on 65nm than the Rocket RISC-V that's on 45nm:

https://en.wikipedia.org/wiki/PWRficient

EDIT: Funnier still, it was used in a desktop for Amigas (the AmigaOne X1000) before PA Semi was acquired by Apple. Had to hurt what little ego the Amiga people had left.


Linked in there was the EE Times article (http://www.eetimes.com/document.asp?doc_id=1168406) which refreshed my understanding of the licensing issue. It reads like the Power license they had would have to be re-negotiated with IBM as a transfer if they were to keep selling and making their parts.


Another source I had said they do sell EOL'd parts to legacy customers. The license must have gone through at least for that.


Windows RT and ChromeOS have both supported ARM for a while. In the case of Windows, adoption is low because not all applications work on ARM. Apple would hit the same problem initially and would have to wait for application developers to distribute ARM versions.


> would have to wait for application developers to distribute ARM versions.

Or they can switch to bitcode bundles for Mac apps, like iOS apps, and transcode for Intel / ARM on delivery. Plus they have previous form in this area with "run PPC apps on Intel" via Rhapsody (IIRC). It'll be a stumbling block, nothing more.


I'm mainly talking about unmanaged apps, not apps distributed through the Mac App Store. I don't think that x86 emulation is an option; that would be terribly slow.


If they still have the Rhapsody tech, it's not strict emulation; it's more like transliteration / JIT, much like what Transmeta were working on, IIRC.


Ah, Binary Translation at the silicon level. That would be very interesting. Apparently NVIDIA's Project Denver was supposed to do this for x86 but they were unable to get the proper licenses, so they shifted it to do it for ARM. https://en.wikipedia.org/wiki/Project_Denver


> It gives Apple some unique advantages that neither Microsoft nor Google can match.

Isn't Google working on some in-house chip for training neural networks? (That's only server-side as far as I know, but still.)


If I recall, Jobs also considered manufacturing the hardware to have been one of Apple's mistakes, because it wasn't their core value.


I think that in the long run you'll see a hybrid ARM/x64 Mac sharing the same case in the same way we have hybrid graphics.

If you look at most of the processes running on a Mac, how many of them need to be running on the power-hungrier Intel chip? You could easily offload much of the kernel, the Dock, the compositor, the Twitter desktop app, etc. to a lower-power ARM chip. In the case where you aren't running any x64 apps, you could then power the Intel chip down and run solely on ARM.


This sounds incredibly kludgey and at odds with Apple's de-investment in macOS. (As in "reduced investment", not "divestment".) Adding massive complexity to eke out another hour or two of battery life seems like a poor choice when there are so many things to be fixed in the OS.


Why so? A Mac is a very high-ASP item - why not, for example, allow iOS apps to run in macOS (using the ARM processor)?

Ultimately, the 2016 MBPs are a great example of why Apple needs to divest from Intel. The poor timing of Kaby Lake mobile processors and the lack of low-power RAM were both reasons Apple stated when the topic of "the MBP isn't keeping up with the competition" came up.

And who's to say that macOS.next will be anything similar to the current macOS?


Apple wouldn't need an ARM chip to allow iOS apps to run on Macs. I would think that all iOS apps are already compatible with x86. When a developer writes apps today using Xcode and runs the app in the simulator, they are running an x86-compiled version of their app linked against an x86 version of the iOS frameworks. It wouldn't be that much of a technical hurdle to polish the experience and officially support fat binaries that run on x86 and ARM.
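
As a rough sketch (hypothetical code, not anything Apple ships, though the #if arch conditions are standard Swift), the "fat binary" story is mostly about letting the same source carry both slices:

    // Hypothetical sketch: one Swift codebase, two architecture slices.
    // Arch-specific paths are guarded at compile time.
    import Foundation

    func currentSlice() -> String {
        #if arch(x86_64)
        return "x86_64"   // what runs in the iOS Simulator or on an Intel Mac
        #elseif arch(arm64)
        return "arm64"    // what runs on an iOS device (or a future ARM Mac)
        #else
        return "unknown"
        #endif
    }

    print("Running the \(currentSlice()) slice")

Build the target once per architecture and stitch the slices together, and the fat-binary part becomes a packaging problem rather than a porting one.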


Only in theory. In practice the existing base of software in the App Store would cause compatibility issues.

Intel based Android devices have an ARM emulator on board (libhoudini) and claim to be ARM when querying the store, in order to not have a software offering disadvantage.


I'm not saying it's a good idea, but we are talking about Apple allowing iOS apps to run natively on Macs. iOS apps are already running natively on macOS - but only if you have Xcode. Apple could really just include the x86-based iOS frameworks it already has in the next version of macOS and tell developers: if you want to run on Macs, just modify your UI to support a third target - you are already probably targeting iPhone and iPad - and let Xcode bundle the x86 build it's already doing while you're testing.

As for Intel-based phones, that's even easier: tell developers that an x86-based phone is being released in the next year, and either you bundle an x86 version - again, you're already testing an x86 build every time you run the simulator - or you'll lose compatibility. Apple has never been afraid to abandon apps when it transitioned processors.

The Android situation is different: if people have a choice between buying an Android device that is 70 percent compatible and one that is 100 percent compatible, the one that is 70 percent compatible is at a disadvantage. But if people want an iOS device and Apple switches to Intel and they lose some apps, what choice do they have?


I'd be amazed if Apple ever gave up ARM for x86. Has Intel improved that much in comparison to ARM for power efficiency?


> why not, for example, allow iOS apps to run in macOS

That's not what the article is talking about and it's a bad idea for UX reasons (no touchscreen).

> And who's to say that macOS.next will be anything similar to the current macOS?

Don't even go there; a vocal minority of Mac users including me do not want anything like that. And getting back to the article, if the OS is radically different then presumably it doesn't need both Intel and ARM processors.


> That's not what the article is talking about and it's a bad idea for UX reasons (no touchscreen).

Windows 10 apps seem to have bridged the divide between mobile and desktop, and run fine on both.

It wasn't instant; Microsoft has been trying to close this divide since Windows 8... and after so long they've pretty much succeeded with the Surface tablet/laptop.


> Don't even go there; a vocal minority of Mac users including me do not want anything like that.

I'm a Mac user too. Unfortunately, Apple is notorious for doing what they think is right, regardless of their user base. Need I say "Lightning headphones" or "no MagSafe, just USB-C"?

Both of these strike me as positioning for the future Apple wants, despite the concerns of the vocal parts of their user base.


Skate where the puck will be, not where it is.


No reason they couldn't put a touch screen on a MacBook or iMac, though I guess they can't guarantee one on a Mac Pro.


Well, one reason is that touch screens for desktop OSes suck. Holding your arm out for screen operations for any length of time is awful, and wiping the prints off your screen on your pant leg doesn't work very well on a laptop.

That's not to say Apple doesn't release terrible things at times (hockey puck mouse[1]?), but I have trouble imagining Apple releasing one of those. (Laplet? tabletop?)

[1] I actually think that may have been calculated - it is impossible that Jobs didn't know it was a shitty mouse, but alongside the iMac, it got a ton of attention at the time Apple really needed it.


> Well, one reason is that touch screens for desktop OSes suck

They work perfectly well on desktops like the Surface Studio, where you can bring the screen down to a drawing board angle. They also work well with a "desktop OS" on convertible laptops and 2-in-1s with multiple use modes including "tent" and "tablet".

They're actually not too bad on real desktops with vertical screens, especially if you're standing up. (Try putting an all-in-one in your kitchen.)

It would be foolish to assume that just because you have a touch screen that you have to touch it all the time. You don't. You can still use a mouse, a pen, an air-mouse, a games controller, and several other things. Having a touch screen doesn't make you do anything you don't want to do, it just gives you an extra option.

It's a bit like claiming that if you use a mouse you can't use keyboard shortcuts. Really, that's not how it works....


Except that a touch screen can't (reasonably) be optically coated to reduce reflected glare, because skin oils from fingerprints disable the effect, showing up as bright spots.

Those of us who are still bummed that we can't get a matte screen on a Mac laptop anymore will be triply upset if we can't even have an optical coating.

For the use cases of tablets/convertibles, it presumably doesn't matter so much, but for the more intensive creative work that laptops are better for, it does.

Okay -- Apple could have one touchscreen laptop model and leave the rest as they are, or make a touchscreen optional on the whole line. But I would never want to buy a touchscreen laptop.


How many have you bought? The last time I bought a non-touch laptop -- last year -- I quickly realised my mistake, took it back and paid more for one with a touch-screen.

Anything without a touch-screen just feels horribly broken and out of date now.

Otherwise I'm not sure what your complaint is. I see lots of people happily using touch screens for all sorts of purposes. That includes smartphones, tablets, convertibles, laptops and desktops.

I don't really see why something that works on screen sizes from 4-inches to 28 inches (Surface Studio) across multiple form factors should somehow not work on some particular device.

> the more intensive creative work that laptops are better for, it does.

Not sure what you mean, here. Laptops are less powerful than desktops and ergonomically inferior. They are not better for anything except moving around.


I have an HP Spectre with a touch screen and Windows 10, and it certainly doesn't suck. Great for sticking in tablet mode and browsing, organising photos, etc.


No touch screen at the moment. Apple could be looking at making a hybrid iPad/Mac.


That would cannibalise sales of the iPad. It's why they have yet to release a proper iPad Pro, just a gigantic iPod touch with a stylus.


Macs cost more, so why would that be a bad thing?

iPad sales were down by 22% in the last quarter and are roughly half what they were at the peak. As someone has pointed out, the iPad sales chart is looking like the iPod's, except it might be going down a little faster...

http://www.zdnet.com/article/the-ipad-is-on-life-support/

Tim Cook's "why on earth would you buy a PC?" routine isn't looking quite as hot now that iPads are the fourth-ranked revenue earner at Apple, behind the iPhone, Macs, and Services.


> The poor timing for Kaby Lake mobile processors [was one of] Apple's reasons stated when the topic of "the MBP isn't keeping up with the competition" arises.

I've heard similar sentiments quite a few times. However, there's little reason to believe that Apple would have been able to meet their needs better than Intel could have. It's not like Intel's development schedule was misaligned with Apple. They just couldn't get the processor out in time.


> why not, for example allow iOS apps to run in macOS

Never say never, but that is the sort of thing Apple would never do.


I believe the Touch Bar on new MacBook Pros is a modularized component running a version of iOS on ARM.


It's running a stripped-down watchOS, actually.


Hadn't heard that, but I can believe it. It is also entirely different than users running Infinity Blade or whatever.


Is Apple divesting its Mac offering? Sounds like FUD.


This seems... suspect. There are definitely serious concerns (see the Mac Pro), but given the continued yearly releases of macOS and the (much maligned, but company-spanning) effort of the Touch Bar, it hardly seems like they have plans to divest.


Divestment is an oversimplification of the direction Apple seems to be taking the Mac. Consolidation would be a better term: trying to prune the product line into a lineup of hits and nothing else, often to the exclusion of non-mainstream (professional) users.

So the test to apply is: will this enable a headline feature? In this case, if it lets Apple put a significantly bigger battery life number on the product page, I'd bet on it.


I mean, I'm not a fan of the new MBP myself, but I don't think they're ignoring professional users. Didn't their MBP sales go UP?


"often to the exclusion of non-mainstream (professional) users"

The allusion is to the Mac Pro, not the Macbook Pro.


There was pent-up demand from people who had been holding off buying the old model....


Because you can't buy something else from the PC industry?

If there is pent-up demand, it's because Apple has been treating their customers much better than every other OEM.


> Because you can't buy something else from the PC industry?

You can, but it doesn't run macOS or Final Cut Pro.

> If there is pent-up demand, it's because Apple has been treating their customers much better than every other OEM

Or rather, it was by treating their customers worse than other OEMs by not updating their machines.

I didn't notice any other PC manufacturers where the buyer's guide said "Do not buy" for all but one machine (and the exception was the slowest MacBook).


Proof?


Apple has done many, many things to eke out a couple more hours of battery life. And there has always been a market for those improvements. Same with making thinner cases. It doesn't seem like much, but it sells.


> Apple's de-investment in macOS

Citation needed.

"We love the Mac and are as committed to it, in both desktops and notebooks, as we ever have been."

http://www.independent.co.uk/life-style/gadgets-and-tech/fea...


Actions speak louder than words, and it's been 3 years and 1 month since the Mac Pro saw any action.


They upgrade macOS every year. New iMacs should be out in March or April. Sure, something is wrong with the Mac Pro, but the overall Mac investment is still strong, perhaps a little slow at the moment.


I think Ben Thompson had the best take on this. Something was fucked up bad in the product, and they aren't budgeted to make the necessary capital investment so soon to dig out a marginal product that nobody buys anyway.


> [...] to dig out a marginal product that nobody buys anyway

I wonder if they could expand the Pro desktop to be for serious gamers as well as devs, videographers, animators, etc. It'd be interesting to see what Razer and Alienware make on their systems - could it be enough to justify a presence in the high-end desktop market?

As for bringing gamers to the platform, Apple already has experience with this... from the losing side at least. Originally Halo was meant for the Mac; however, Microsoft purchased Bungie in 2000, and much like GoldenEye for the N64, Halo was the game that sold enough Xboxes to make the platform solid. Apple would similarly need some top-tier titles to encourage people to embrace an expensive desktop.

Apple's Anobit acquisition seems to be what's giving it the edge in the NVMe arena. They have experience with quiet cooling solutions. The Touch Bar would be good for key-mapping in games. Backlit keyboards are their jam. They have a serious investment in Imagination Technologies Group, who are behind the PowerVR architectures. Smaller, more reliable PSUs. Their austere industrial design is second to none and deeply contrasting with the garish Razer and Alienware designs.

It would come down to whether there's money to be made in the area, and I suspect most serious gamers build their own systems.


> It would come down to whether there's money to be made in the area, and I suspect most serious gamers build their own systems.

Spot on. The ultra-performance professionals and the big-spending gamers often do their own builds and then keep upgrading them.

Apple has always seen itself more of a sealed-box consumer appliance supplier, basically de-skilling computing.


> The ultra-performance professionals ... often do their own builds and then keep upgrading them.

I think this claim stands in pretty stark contrast to the history of Apple products in the professional graphic/video sector.

Even for people doing stuff like CAD that is less sensitive to image quality and more performance-oriented - nobody at a company of any real size (say >25 users) assembles their own hardware. I'm sitting two feet from an HP Z400 workstation that my (architect) father-in-law snagged for me when his company upgraded their CAD workstations.

The reality is that assembling PCs, debugging them when hardware breaks, etc is not free. Once you approach a certain scale (I would imagine this is roughly around the point of a single tech) it starts becoming worthwhile to externalize the cost instead of hiding it in salaries. For a price, Dell or HP will make sure that you don't have to worry about your hardware. If you hire a bunch of new people, a bunch of new boxes will show up tomorrow. If you have a PC crap out, they'll get it back up tomorrow.

Of course they do charge for this, and they make a profit doing it (i.e. they charge more than it actually costs). Once you are at a really huge scale, it might be worthwhile to bring it back in-house and have someone do it full-time. But like any managed service, it's a viable proposition at certain scales. You buy Amazon AWS instead of running your own servers, a lot of businesses buy HP or Dell workstations instead of worrying about it themselves.

Power-gamers are pretty much always individuals, who (like tiny startups) are price-sensitive and build their own hardware. When you don't have an income stream tied to the product, it's worth spending an evening building it yourself. Although this still assumes a power-gamer with some decent knowledge of hardware. First-time builders have a bit of a hump to get over, and beige boxes sold for family PCs drastically outnumber the power-gamers.

Still though, to the more general point, Apple still has some pretty serious mind-share in the professional market, although it fades year over year especially when they're pushing shit like the trash-can Mac. I think it's premature to say that Apple can never succeed in the professional market, especially given their historic inroads in that market.

But again, they do themselves no favors by not upgrading their hardware. The current base model Mac Pro GPU uses the workstation version of the Radeon HD 7870. The high-end model uses the workstation version of the Radeon HD 7970. That's pretty ancient in tech terms, especially now that NVIDIA is moving away from Kepler. It's three generations old; they should have moved on to Fiji-based chips a year ago.


Not entirely convinced. The cheesegrater Mac design let you upgrade on the fly...

I certainly agree about the scale problem, but Apple has never really competed in that market. You could easily buy 1,500 Dells but the graphics department only got 6 Mac Pros. And they still upgraded them in ways that weren't too onerous.

> Apple still has some pretty serious mind-share in the professional market, although it fades year over year

Yes. Sadly for Apple, that self-build or custom-built PC gives you double the performance for half the price. When time is money, it makes a big difference.

And if you're running Adobe software, it's not that big a shift.


But the comment was about macOS, not the Mac Pro.


I've read that there is no longer a separate macOS development team; it's been folded into the iOS team. I think that's a very bad idea.


I don't think we'll see an ARM/x86 hybrid; that wouldn't make sense, as they're completely different ISAs and you'd either need to cross-translate between them or double-compile everything.

But a heterogeneous CPU certainly does make sense: a super-efficient but not particularly fast core for specific tasks that are more or less always running, and the beefy full core for regular work. If it ekes out another 2 hours of battery life, I can see Apple doing it when they're redoing a CPU from the ground up.


Too late, Apple already invented that: https://en.wikipedia.org/wiki/Apple_A10

The A10 is the new chip in the iPhone 7. It has two fast cores and two energy-efficient cores.

EDIT: Sorry, "invented" was the wrong word. Apple is already doing that. My point was that the parent's idea isn't new.


> Apple already invented that

ARM already offered this (A7+A15|A17, A53+A57|A72|A73), and Android phones have been using the design for years.


That is just an argument for them doing it for their laptop chips as well. It makes perfect sense from a battery life point of view.


Actually, eking out another hour or two of battery life at great difficulty sounds exactly like the kind of thing Apple, but few others, would do, if you think about it.


On the other hand, they're commoditizing Intel further and differentiating from the competition.

HP and Dell have gotten smarter and are making better devices. But Apple has powerful resources in the ARM space that their competition cannot match -- the competition is 100% dependent on Intel, who is having issues shipping product.


Either you prove that assertion of "reduced investment" or you retract your comment.

Apple haters...


I see this as the first step in a multi-year transition away from Intel, rather than some sort of merger of iOS and macOS or a sudden architecture switch like PowerPC to Intel.

For now, the ARM will be a low-power co-processor handling a few tasks via specially compiled extensions. However, as Apple continues to iterate on the AX processors, and they, along with developers, figure out how to move more stuff onto ARM, the role of the Intel chips will decrease until they are the low-power co-processor running only legacy stuff which can't be easily ported or emulated. Until one day they just go away.

It makes sense that they'd want to own their core processor tech – they've got the talent and in a few years time will have the technology. It seems that each architecture change they've made has been driven by a supplier not being able to deliver, from Motorola with 68k to Motorola (again) and IBM with PowerPC and now Intel.


I mostly agree with this, but I still haven't figured out what the end-game for this might be. I believe that they'd just emulate low-power-requirement x86 and x64 code rather than keeping a low-end x64 chip in the laptop.

If a significant number of MBP customers are running VMWare or Parallels, however, this might seriously hurt the timeline for moving to 100% ARM. In that case they'd either have higher-end Macs with the x64 chips (potentially complicating the product line) or work on adding x64 emulation acceleration to their Ax chip line.


There already are ARM chips in the new MBPs, for example - the Touch Bar and Touch ID, with its Secure Enclave, run on ARM. I could easily foresee them putting in more powerful ARM chips and offloading some tasks to them.


Hope it's less crashy/buggy in the future than my current touchbar :/


Funny, because often my Touch Bar works, but neither my trackpad nor my keyboard does.


Why doesn't Intel include an Atom core on their CPUs as a next-level speed step, like ARM's big.LITTLE? It has to be much easier to keep both the high- and low-power parts on the same architecture.


That would probably be less power efficient than just running a single core alone at reduced clock.


Which is why the Atom was discontinued, IIRC: the mainline chips power efficiency had been reduced enough to make the Atom much less viable.


Atom hasn't been discontinued; low-end "Celeron" and "Pentium" chips are actually Atoms.


Both Pentium and Celeron are used for standard mainstream dual-core chips and for Atom chips (e.g. J4205, Celeron G3900, Pentium G4400)... so in a way they are Intel's trash-can brands.


I think you meant the mainline's efficiency increased. :)


The PlayStation 4 does this, as the southbridge is really just an ARM chip. So I think you are right.


And the PS3 before it. And the WiiU, and the Wii. And every AMD chip since about 2013 (AMD Platform Security Processor, their equivalent of Intel's ME).

Honestly, pretty much every complex electronic device (and quite a few not so!) out there has a bunch of ARMs inside of it doing system management. This is borderline a non-story IMO.

Hell, an ARM7TDMI was a pretty common core to use on SD cards a few years back. The damn things are everywhere.


> ARM7TDMI was a pretty common core to use on SD cards

Which is crazy, because that was the main CPU in the Game Boy Advance.


The PS4 southbridge has 256MB of RAM, runs FreeBSD, and downloads game patches and such. This seems far more ambitious than a BMC/ME.


Anyone interested in this should watch the CCC talk: https://www.youtube.com/watch?v=-AoHGJ1g9aM


Direct video link: https://media.ccc.de/v/33c3-7946-console_hacking_2016

The uploader seems to have recorded themselves playing the video; it's missing the start and has buffering issues.


Yeah dude that talk was awesome.


Not really; the ARM in the SB is much more like an ILOM system, not much different from the various processors scattered around hardware these days, be it ARM or some other more niche architecture, or even special-purpose archs.


They'll probably follow what they just did with the A10, at least in the near term. It has a low power mode and a high power mode, but doesn't run them simultaneously. Too complicated. It actually has 4 cores (2 low, 2 high) but can only run one pair at a time.[1]

So while that is similar on its face to hybrid graphics, I'm not sure it would be feasible to offload background tasks to the low power mode as it would be constantly switching modes every few milliseconds. More likely this architecture is better suited to Power Nap where the whole device enters a low-power state with restricted functions. That also sidesteps the issue of having to compile every third party app for two architectures. They can just implement the Power Nap stuff for ARM.

[1] https://en.wikipedia.org/wiki/Apple_A10


Doesn't that complicate the mainboard design vastly? Memory & caching, clocks & time, concurrency, etc.


Both CPUs are connected with a PCI bus, and as far as the main SoC is concerned it's a bunch of peripherals on PCI:

https://media.ccc.de/v/33c3-7946-console_hacking_2016


The CPUs run different OSes and there is no ability to move tasks between them; this is rather unlike a hybrid or "big.LITTLE" approach.


Thankfully Intel has already done a lot of new work in this area - allowing PCI devices (GPU, Xeon Phi, etc.) to share computing resources more evenly.


Definitely going to complicate things but it could easily end up worth it for a laptop if it can reduce power usage to almost that of a phone when not doing anything particularly demanding.


I know it's not really what you mean, but isn't the touchbar powered by an ARM chip already and also the secure enclave?


Yes, it runs on a "T1" SoC, but the software is a watchOS derivative.


Interesting. The article says that Apple is considering an ARM co-processor, not a replacement for the Intel CPU.

I have no doubts at all that Apple have at least had extensive internal discussion and testing about this. In fact, I believe they've probably done extensive internal discussion and testing about replacing the Intel one altogether, but can't get there yet.

It'd give Apple more control over their device design and release schedule, potentially lower power consumption, and straight up more profit via vertical integration.

Like the PowerPC transition, they could come up with a new Rosetta - which was the emulator that ran old PowerPC apps on x86. Microsoft has recently demoed x86 apps on ARM, and I'm sure Apple could do the same.

As far as I know there's no fundamental reason ARM processors can't be as fast as x86 ones; they just haven't been targeted at those kinds of devices yet - but I'd be happy to be corrected by any CPU experts. And with Apple's success in the mobile processor space, I have no doubt that if anyone could pull this off, it'd be them.

So what's holding them back? Thunderbolt. They've gone all-in on that already - touting it as the future of high-speed external devices, and giving the MacBook Pro nothing but four Thunderbolt-enabled USB-C ports and a headphone port. But Thunderbolt is an Intel property, only available on their CPUs.

The 12" MacBook already doesn't have Thunderbolt (the MacBook Pro does), but it feels like, with software compatibility, this would be an all-or-nothing thing. I don't see Apple continuing to sell both x86 and ARM machines on an ongoing basis. So how would they get Thunderbolt into the MacBook Pro?

Having a co-processor to handle Power Nap would allow them to take a small step in that direction without losing Thunderbolt or having to develop a slow x86 emulator. They could even offload other parts of the OS to the ARM CPU, freeing up the main CPU for other software, and letting it go into low-power mode more often.


That's actually an interesting point - I wonder if it's possible to license the Thunderbolt tech in some way and have it hooked up to the ARM chip, something like what ASUS does to give AMD Thunderbolt: http://www.eteknix.com/asus-give-am3-boards-thunderbolt-supp...

That said - and I hope someone corrects me here if I'm wrong - I was under the impression that Apple and Intel jointly created Thunderbolt (or perhaps it is an Intel-only tech), which means Apple may have some sway here.

Thinking about it more - Thunderbolt is just a protocol driven over USB-C now, so I'm fairly certain USB (3/C) might eventually be able to cover everything Thunderbolt does, or Apple takes their "Lightning" protocol to replace Thunderbolt and runs that over USB instead of Thunderbolt over USB.

Just musing, but it is a very underappreciated aspect of all of this as well!


I almost posted that link too, but according to this comment, that product was cancelled and never released because Intel refused: https://www.reddit.com/r/Amd/comments/539kvq/will_amd_cpusmo...

There's still a mention of an expansion card on their site:

https://www.asus.com/us/Motherboard-Accessory/ThunderboltEX-...

But it's only compatible with ASUS motherboards that already have Thunderbolt. So I'm not sure what the point of it is (just to upgrade from Thunderbolt 2 to 3, maybe?), and as far as I can tell... it still requires Intel.

So as far as I'm aware, no product has ever been released with Thunderbolt that doesn't use an Intel CPU (and even then, only higher-end ones).

Licensing aside, it may be a technical issue - Thunderbolt is an extension of PCI Express, which just isn't part of the ARM design. I'm not sure how feasible it would be to add it.


Plenty of ARMs have PCIe, maybe even the A9.


Any examples?


Nvidia Tegra, all server ARMs, all Wi-Fi APs, higher-end embedded SoCs (e.g. Marvell), etc.


Interesting. I didn't know that.

However, that still doesn't mean Intel will certify anything for Thunderbolt that doesn't use their processors. I still haven't seen any evidence of a certified Thunderbolt host device existing that doesn't have an Intel CPU.


The Freescale (now NXP) i.MX6 has four lanes of PCIe. All that's required is a peripheral on the main memory bus.


This is actually an argument in favor of USB-C, since it's likely that faster versions of the standard will overtake Thunderbolt in the future.


There will be a Thunderbolt 2 which stays a few steps ahead by then.


It always seems a bit like Bluetooth to me, as in "this next version is going to be great."


They're already on Thunderbolt 3.


Who says they won't ditch Thunderbolt in the 3-4 years it would take for ARM Macs to take off? Who says they won't use USB-C? They don't even make Thunderbolt peripherals anymore.


There is only one piece of "news" (possibly speculation) in this article: that they may offload some processing to the ARM coprocessor as a transitional step. Interesting idea!

But the idea that they have the Mac OS running on their own ARM chips -- Apple shareholders should be annoyed if the company weren't doing this.

OS X was running on Intel for a long time before anyone decided to make Intel-based Macs.


...or iOS will be graduating up to be the new OS across both mobile and desktop platforms?

Earlier today via HN....

http://www.macworld.com/article/3163248/ios/the-future-of-io...


People have been saying this since the inception of iOS; given the trajectory of Cocoa vs UIKit, I don't see much sign of this.


Cocoa over the last few years has seen a lot of cleanup/progress in its APIs. Making a Mac app in Swift 3 is almost as painless as making an iOS app nowadays. I feel like there has been a big difference in just the last few years. Swift is a big part of this, but also, I'm sure Federighi's group is the real reason.


This is one of the big reasons why I don't buy the whole narrative in these parts around Apple abandoning the Mac. They are spending far too much time on improving Cocoa APIs as you mention to be working towards abandoning or deprecating it.


I'm not familiar with either API, but why wouldn't it be possible to get Cocoa working on iOS? I imagine UIKit would be optimized for small, low-power devices, which will always exist in some ever-shrinking form.

It's all just a BSD variant, when you get down to it.


There's nothing stopping Cocoa from working on iOS (or UIKit on Mac OS), but they operate on different UI paradigms.


They have used a chimera though, UXKit: https://9to5mac.com/2015/02/05/uxkit-framework/


It depends on app and developer tool support. Raskell (Haskell IDE for iOS) and Pythonista (Python IDE for iOS) are fun enough to use. While I would really like IntelliJ and RubyMine on my iPad Pro, I really like macOS.

All this said, I like to see competition. There are some very interesting Windows 10 hybrid devices, and the new tiltable desktop Surface looks interesting. I hope that there is never a 'winner take all' for consumer devices.


The impression I got was that they were offloading background checks to the ARM chip while the computer was sleeping. Hopefully bringing iOS's low-power performance to the desktop.


> But the idea that they have the Mac OS running on their own ARM chips -- Apple shareholders should be annoyed if the company weren't doing this.

And if the company wasn't floating the idea of switching. Should at least give them leverage in negotiations with Intel.


There's always a hedge afoot. Apple had OS 7 running on Intel as early as 1992.

https://en.wikipedia.org/wiki/Star_Trek_project


System 7 on top of DR-DOS... If only this could be released somehow; it's an amazing bit of history, and a copy has to exist somewhere.


I would be shocked if Apple didn't have macOS running on some ARM development systems internally. It's not even that much of a stretch -- both systems use nearly the same kernel, and the iOS userspace runs on x86 as part of the iOS Simulator.


AFAIK there is no ACPI with ARM, so it's non-trivial to set up an ARM system that has modular hardware as we take for granted in the Intel world. Maybe they went through the trouble and are keeping it under wraps. Or maybe not.


NeXT had an Intel version as a hedge as well. Seems those two projects were destined to merge together.


That was actually their product rather than a 'hedge' for a while.


It started out as a hedge against their own hardware being an impediment to sales, then ended up being their entire business. I guess it was a good hedge to have!


Considering they had NeXTSTEP release versions running on x86, PA-RISC, SPARC, and its original home 68000, they did pretty well. I do wonder why the MIPS chips were ignored.


Maybe SGI didn't want to play ball...


Probably, but it just seems rather odd since MIPS had an open system spec and Windows NT had been ported to it.


Not just Intel but PA-RISC and Sparc as well, in addition to M68K.


Also Moto 88110 RISC processors (though those NeXT products never saw the light of day...)


More than that, NeXTSTEP 3.3 and later really was released to the public for Intel x86.


It's finally happening! The increase in competition in both the foundry and architectural spaces is exactly what we need to push us past this stagnation of Moore's Law.

Intel needs to sacrifice Microsoft and make x86 a legacy architecture. There are huge parts of their processor designs that no one at the company modifies (or even understands) from one generation to the next, but which they cannot remove for backwards compatibility purposes. It's time to start fresh with a new architecture if they want to maintain their reputation of having the most powerful technology.


> It's time to start fresh with a new architecture if they want to maintain their reputation of having the most powerful technology.

They tried that, though. It was called Itanium and it went nowhere. Now, Itanium had all kinds of other problems and I wouldn't call its failure incontrovertible evidence against "Intel should make a new architecture", but I can understand why the company isn't eager to try that again.

Backwards compatibility exerts a powerful gravitational pull that is extremely difficult to break away from if you're not starting a new platform ex nihilo (which Apple did for the iPhone).


Look at what ARM did with AArch64 though: it's backwards compatible via mode switching but is otherwise a fairly clean break at the ISA level.

AMD probably could have done something similar. They just didn't, likely out of risk aversion.


> They tried that, though. It was called Itanium

And the i860, and i960, and i432...


> It was called Itanium and it went nowhere.

It was popular in its intended niche (HPC) for a while, but the value proposition was not sufficient. It dropped some x86 baggage but it didn't add anything fundamentally superior on top of it -- the ILP could easily be attained by having more cores on x86. To be honest I actually really like what they did with Itanium, but it wasn't a good enough innovation to break off the family line of backwards compatibility.


It wasn't intended to be a niche product. That was just how it ended up....


Was the Itanium fiasco more HP or Intel's fault?

Either way, it's been 20+ years; they could try again.


I thought they learned that exposing the complexity was a big problem. So now the chips are internally complex but hide it behind the instruction set interface while the chip does its own stuff in the microcode.


Why does Intel need to do this?

x86 decode is a tiny, tiny part of the overhead of modern x86 CPUs. Architecture for the most part at this point doesn't matter unless you are talking about truly novel approaches like the Mill. ARM certainly is not a significant improvement.


It's more than just decode. Segmentation, x87 emulation, and the interaction with other features are all pain points.


None of that is 1/10th as complex as the virt extensions, IOMMU/ATS, etc., etc. - stuff that is common on high-end processors these days. If you think x86 is complex, I suggest you look at the AArch64, SMMU, GICv3, etc. docs.

Plus, x86-64 basically disables both the things you list. Not that it matters, because what is a flat fs/gs/etc. register when there is another whole level of page tables for the hypervisor? AKA, you do the translation and store it in a TLB. If you really want to compare this, time how long a modern x86 takes on TLB misses, or for that matter how fast its TLBs are. I think you will find that they are industry leading...

Same basic thing for the x87: it's likely mostly powered down, and when active is probably feeding micro-ops through an SSE functional unit....

So the original poster's comment is likely correct, and that has been known for a decade+. x86, if anything, has a few accidental advantages, and the idea that it's somehow "worse" than the alternatives is provably wrong.


None of that stuff matters in reality.

You should think about ISA implementations as verification problems instead of problems in building a silicon implementation of the ISA. From that perspective it should be obvious that Intel has the best, richest, deepest verification set that exists in the CPU space, and since it is tied to x86, that is an advantage for the two x86 vendors. Verification is incredibly hard for complex CPUs and building up verification is time-intensive. This is part of the reason that the brainiac end of the design spectrum has pretty much become just x86 and POWER, which has nearly the same history.


That's still part of the ISA, right? And 99% of the silicon is not spent on those pain points you talk about. Architectural techniques are largely the same for high-performance processors, be they ARM/PPC/x86. If you want high performance, you've got to put in the HW required: LSU, prediction, SMT hardware, branch prediction, etc. Single-thread speed was solved a long time ago. There's no point throwing away x86 in favor of a new ISA that has no guarantee of succeeding. In fact, having a new ISA that is incompatible with x86 would just be detrimental to Intel/AMD.


> Intel needs to sacrifice Microsoft and make x86 a legacy architecture.

The last time Intel did this it was called Itanium and it was horrible. Historically, almost all attempts to replace an enormously successful product line with something incompatible have failed.


> Historically, almost all attempts to replace an enormously successful product line with something incompatible have failed.

Except AArch64, which has a legacy 32 bit mode for compatibility but is otherwise a totally separate ISA, unlike x86-64 and i386.


AArch64 looks at first glance to be pretty dissimilar, but if you consider that it's really a follow-on to Thumb-2 it doesn't look as foreign. Also, a lot of things appear different at first glance but aren't, because they have basically been renamed.

One example: CPUID -> MIDR

http://infocenter.arm.com/help/topic/com.arm.doc.ddi0432c/Bh...

https://developer.arm.com/docs/ddi0500/f/4-system-control/45...

The memory model is the same... etc., etc.


If AMD hadn't come up with x64, they would have succeeded it.


6502 - 68000


Why can't they keep doing x86 stuff and start something else at the same time?


They could. Unfortunately, it requires a huge investment and needs a lot of highly skilled staff, who are currently working on its mainstream product. Also, when you've done your new chip design, you still have no hardware devices and no software to run on them.

It's taken roughly 40 years to get x86 to its current position, and almost as long to get ARM to its current position. For how many years would you have to invest in a new architecture (ie lose money) before you could compete with them?

AMD has just done a new chip design (Zen) but it's x86-compatible because that's where the software is....


I have read multiple articles, but I have yet to see anyone mention it.

Isn't this about security more than anything else?

If it is about power, Intel SoCs use very little power during Power Nap, and any savings from switching to ARM should be negligible when you consider the memory and I/O needed during Power Nap.

If it is about cost, Apple could have worked with AMD on the Mac. I am pretty sure Ryzen and Vega work better on dollars / performance / power, especially on the desktop Mac. Considering AMD has done special chips for consoles, I don't see why Apple couldn't use this model as well.

Since Intel's CPUs have the IME, which Apple has no control over, would that be a reason? Apple taking security into their own hands.


This has been rumored for some time. I believe it's true, but then again, Apple kills off most projects before they see the light of day. Whether this ever makes it to consumers is the important question.

That being said, I've seen a number of comments about running iOS apps on Macs, or "convertible" Macs/iPads, and so on. I think those are off base given the supposed purity that Apple always talks about.

However, there is one thing I've never seen mentioned - a combined Mac/iOS binary. Similar to the old Universal binary for PPC/Intel, this could be a single app that just has different UX depending on the device it's run on.

I'm not sure why nobody has talked about that option, as it seems most likely to me, and gives more weight to the ARM everywhere strategy.

One "app", and it would be native, with native UI on whichever device it's running on (Mac/iOS/TV/Watch/etc.). Given Apple's investment into a streamlined complier that they rolled out with Watch, many if not most of the pieces are already in place for this.

It would not surprise me if that is the big announcement for this summer (or next at the latest).


I feel like they've been doing this to OS X; starting with 10.9, it seems like a lot of the visual flair of iOS has filtered into the macOS environment.

I fear that this will dumb it down too much for power users, and conversely over-complicate it for the average iOS user. I'd love to be wrong...


Yes, but this is not specifically what I'm talking about.

Think of something like MS Outlook - it's a big, complex app with lots of features. Now imagine that you only write the app once, but design two interfaces for it - one set for the Mac, and the other for the iPad. You compile through Xcode and upload to the App Store. Apple notes that there are interfaces for both iPad and Mac, and posts it in both app stores.

Obviously the mechanics would not be exactly like this (I presume a manifest of some kind), but it should be pretty close.

For Apple, this would bring more developers onto their tooling (yes, even many of those that complain about Xcode), and allow more developers to target multiple platforms with little extra effort.

In fact, one of the more interesting things is that Apple could even create new platforms that just require some interface additions and an updated manifest.

Have a bug in your code? One fix. One upload. One code review.
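
To make that concrete, here's a hypothetical sketch in Swift (the type and function names are mine, not an actual Apple mechanism; the os() build conditions are standard): shared logic compiles everywhere, and only the thin UI-facing layer is selected per platform.

    // Hypothetical sketch: one codebase, per-platform UI glue.
    #if os(macOS)
    import AppKit
    typealias PlatformColor = NSColor
    #elseif os(iOS)
    import UIKit
    typealias PlatformColor = UIColor
    #endif

    // Shared, platform-independent logic.
    struct MessageFormatter {
        func subjectLine(from sender: String) -> String {
            return "New message from \(sender)"
        }
    }

    // Per-platform presentation detail; each UI styles itself.
    func unreadHighlight() -> PlatformColor {
        #if os(macOS)
        return NSColor.blue
        #else
        return UIColor.blue
        #endif
    }

The App Store / manifest plumbing is the part Apple would have to invent; the compiler side of "one app, two interfaces" is largely in place already.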


One basic problem - macOS has many, many more features than iOS.

There would be a lot more involved than redesigning the UI. The two platforms have fundamentally different class trees all the way from the view and user event classes down to core OS hooks.

There's some limited overlap, but not as much as you might expect.

Merging the two platforms and creating a UI-level split wouldn't be impossible, but it would be a huge job and would effectively mean a rewrite of both OSes - and probably of watchOS and tvOS too.

It's not obvious this would be a good thing to do. MS tried it, and that was pretty much a disaster.


> MS tried it, and that was pretty much a disaster.

Do you mean the Windows Runtime subsystem? How is it a disaster?


> However, there is one thing I've never seen mentioned - a combined Mac/iOS binary.

Because they're already doing something better. All iOS apps compile to Bitcode, an intermediate representation from which an optimized binary can then be compiled for the target device architecture.[1]

So if there ever were a desire to run iOS apps on Mac, they'd just compile from Bitcode. Having said that, there's explicitly zero desire to do this.

[1] https://developer.apple.com/library/content/documentation/ID...


Bitcode allows you to optimize between chip versions with different opcodes, but not different architectures, word sizes, or endianness.

Source: the recent Accidental Tech Podcast that interviewed Chris Lattner.


OK, fine, but the broader App Thinning feature set of the App Store still means they wouldn't do a universal binary. They'd deliver just what the target device needs.


True, but AArch64 and AMD64 are not so different (64 bits, little-endian, probably quite similar ABIs), so it is not unthinkable that the same bitcode could target both architectures.


iOS apps already run on Intel via the simulator. There's no need to have a dual binary.


Good news for Apple and for overall progress, but sad news for those of us in the Hackintosh community. Apple's support for Intel CPUs is really what enabled the whole movement to get traction. I'd really prefer to see Apple moving toward standardized rather than proprietary hardware options.


And conversely, I wonder what this news means for Linux on Mac hardware....


Sounds similar to big.LITTLE, whereby the "big" cores are Intel cores for performance, and Power Nap (and maybe, in the future, less power-intensive tasks) is handled by the "LITTLE" (ARM) part.


I think the problem is that while ARM is good at low power, the moment you need sustained performance the power requirements shoot up and throttling begins.

The flagship ARM A72 and A73 are not adequate for true desktop-class performance, only somewhat catching up to Intel's low-end Core U laptop chips. It's increasingly looking like, to get x86-level performance, ARM would lose its low-power advantages.

AMD's Zen, on the other hand, appears to be extremely efficient. I fiddled with ARM boards on and off for over three years, excited by the possibilities of tiny-form-factor desktop-class computing at laughably minimal cost with ARM SoCs. The reality is that the extremely poor to non-existent driver support and the 'effectively closed' ARM ecosystem are a deal-breaker that left me disillusioned and strangely relieved to be back in the open PC ecosystem.


Oddly, ARM is eating the PC architecture from the inside. What's on your RAID card? An ARM. What's in your USB devices? An ARM. What's in your SSD? An ARM. A PC is beginning to look like a network of ARMs with an Intel driving. How long will the Intel core last?


Intel's margins on that single chip are probably a few orders of magnitude larger than the ARM royalties on all those embedded CPUs combined. Intel has little to fear from those chips.

On the other hand, Apple does seem to have the capability to design competitive desktop class CPUs, so who knows.


The absurd margins are in fact the reason why Intel needs to fear ARM. They give an incentive to lower costs by designing Intel out of systems, even if it means sacrificing peak performance.


Apple doesn't use ARM's core designs. Their own A10 leaves ARM's designs far behind. They just license the architecture.


As long as it's still x86/AMD64 we're good. I really don't want to see PPC/Intel fragmentation again.


I thought it was mostly painless. The vast majority of applications provided an Intel binary right away without many issues. Rosetta wasn't too terrible for simple tasks for the rest of them.


Rosetta was mostly painless because the raw performance of the Intel chips was a multiple of the G4 chips in the Macs (especially considering most systems doubled the available core count). Even so, Creative Suite was slower, as high-performance plugins used AltiVec, which was not emulated. This wasn't an issue for compatibility, as the G3s never had it, so almost every app had non-AltiVec code paths.

The overhead was mostly acceptable because the performance boost from the chip change hid a lot of it. Not to mention the CISC vs RISC thing worked in Apple's favour: there were few PPC instructions that had to be supported that couldn't be covered with a small number of x86 instructions.

There is no publicly available ARM chip with single-thread performance anywhere near the i5/i7 chips being used. Add to this the complexity of converting SSE to equivalent instructions and the overhead of dealing with edge cases, and I am not sure there is much chance of equivalent performance.

The one thing going for Apple with an Intel -> ARM switch is that almost all major apps are built using Xcode. When PPC -> x86 happened, the apps that lagged the most were the ones stuck on CodeWarrior and other 3rd-party IDEs that Apple never gave sufficient notice to port in advance of the general announcement (though I will concede that CodeWarrior was owned by Motorola, so it may not have made the jump anyway).

I think if Apple does release ARM-based Macs, it will be with apps specifically recompiled, not using an emulation layer.


Performance per watt is pretty good on ARM. Most performance these days is about IO. How fast is the memory subsystem?

There are Intel chips with 400 GB/s memory subsystems, but the cores run at around 1.4GHz. Even a single thread with that much memory bandwidth can be much faster. Likewise with cache: if you have 32MB of cache, you can do quite a lot more processing in cache, and be lots faster for many things.

Considering that MacBook Pro laptops haven't really gotten better single-thread performance in years, the things that really matter are IO and performance within the power/heat budget.

ARM is competitive now within the MacBook power requirements. The A10 chip is 75% as fast as the i7-6600U in some single-core benchmarks, at a much lower power budget.

Within the Chromebook space, where ARM and Intel laptops are currently competing, Intel often produces faster laptops. However, they are also often more expensive. Lots of people in many reviews say their ARM-based Chromebooks are too slow.

Having been through the Raspberry Pi 1, 2, and 3 performance leaps, I can say that the latest one is quite usable as a desktop machine for word processing, web browsing and media playing.

The Samsung Chromebook Plus with ARM is shipping, and the one with Intel isn't yet. The ARM one is cheaper, and the Intel one is apparently more performant. Both have similar battery life. The ARM one has better support for Android apps. GUI app development is now 10x more active on ARM than on Intel, and the apps are optimized more for ARM.

GPU performance is improving faster on mobile chipsets than on Intel's, because of the demand from mobile VR applications. Some are predicting performance faster than laptops within a mobile power budget this year (see ARM Mali-G71). It's even faster than some Nvidia discrete mobile chipsets. Since many apps are performance-limited by the GPU, not the CPU, I'd say ARM chipsets will win for many low- to mid-range users this year.

ARM is already taking over the low-end laptop market. I won't be at all surprised if they take the mid-range market by the end of the year, especially as the market for Chromebooks and low-end laptops picks up, because then more laptop-class ARM-based chips will go into production.

I think the demand will be there, as Android users also want laptops that sync better with their apps.

High-performance ARM chips are also coming to supercomputers. The Post-K supercomputer by Fujitsu is expected to be one of the fastest computers in the world, if not the fastest. Will ARM take the supercomputer performance crown? I think so.


I have no doubt ARM will be in mid- to high-end laptops. I doubt, though, that the transition will be made by emulating x86 apps for most users like it was with Rosetta. 75% in single-core benchmarks is likely to become 40% even with a highly efficient emulation layer.

It makes more sense to simply make vendors rebuild the app for the new platform than to use an emulation layer.

An emulator may be used for legacy apps, where performance doesn't matter, but Apple has never worried too much about legacy in the past, so the Mac community won't expect it. All the laptop needs is some other killer feature, and the community will put enough pressure on vendors to produce an OS X/ARM build of their products.


There will certainly be ARM-based laptops running Windows 10. Have you looked into that?

They are different from the Surface RT (Windows on ARM) machines because they also run traditional x86 apps (not just a port of Office).


Yes, good point. It will be interesting to see how popular they are this year. I also wonder if Office performance will be good enough for most people. I'm pretty sure MS know what they are doing with that, and the answer will be yes.


I'll buy one if they are cheap and include a sim slot. It would basically be a smartphone with Windows desktop software (mostly Office in my case) running on an 11.6-inch screen...


It wasn't that great for Adobe product users.


Remember the time when software X on Wintel ran circles around software X on Mac OS/PPC and people pretended it was not the case?

Guess we are going to see more of that soon.


Speculating about what Apple's going to do is a great way to be wrong but I don't really see it. In both of their CPU migrations the switch was to substantially more powerful CPUs. I can see them making more 'pro' iOS devices, Macs with ARMs as coprocessors, etc. Ditching Intel altogether? I can't think of ways the advantages would outweigh the disadvantages.


My guess is that the CPU "leap" will be in power efficiency rather than raw performance. The returns from each jump in CPU power these days aren't helping the vast majority of users, but they certainly care about battery life.


The Core M in the MacBook is already down to 4.5W. How much lower would the ARM go, while delivering the same performance? Speculation welcome ;-)


Licensing x86/AMD64 requires licenses from both Intel and AMD, which would probably be quite expensive, even if Apple could threaten to go ARM otherwise. And engineers with the relevant low-level experience are probably much harder to find than for ARM (where quite a few companies hold various levels of licensing, and Apple likely already has a lot of talent in-house).


> engineers with the low-level experience probably are way harder to find than for ARM

Eh, the instruction set doesn't matter that much for modern CPU designers (within reason). Apple's in-house expertise came (in part) from their acquisition of P.A. Semi and their PowerPC engineers; Apple, it seems, immediately redirected them to working on ARM designs. And the Transmeta guys proved that a third party can put out an interesting x86 chip if you can get access to the licenses.


The Transmeta guys actually ended up releasing Project Denver (Tegra) for NVIDIA.


Probably ARM this time around, and if you look at the Linux ecosystem you'll see that there's nothing to fear.


No, he means that running legacy x86 Mac programs on an ARM Mac with an emulator as a compatibility shim would be slow. Linux is totally different, because you can take an entire distro, cross-compile it, and have a native ARM environment with no compatibility shims slowing things down.


What?


Apple does not have an x86 license.


I often wonder why Apple doesn't purchase something like AMD. They could design their own x64 chips and GPUs.


AMD's license for the parts of x64 owned by Intel is conditioned on AMD not getting sold. The license wouldn't be transferrable to the buyer.

(So, yeah, AMD could do their own slightly-different chip, using just the bits and bobs that are strictly theirs.)


I thought that deal goes both ways, AMD also licensing important parts of AMD64/x86_64 to Intel, so both sides would have a strong interest in keeping it alive in some form.


Is x64 owned by Intel? How much of the x86 stuff would Apple need?


Remember when Apple was tied to the PowerPC chips? This would likely end up the same.


This sounds weird. Are they going to have two versions of every executable, one ARM and one Intel? Or would this low-power/Power Nap mode only support a tiny subset of functionality that Apple chooses to include in their OS release?


You make it sound like they've never done that before (Universal binaries).
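For what it's worth, here's a minimal, purely illustrative sketch (assuming a Swift toolchain that can build both slices of a fat binary): the same source compiles into each architecture slice, the loader picks whichever slice matches the CPU, and only genuinely architecture-specific paths need a conditional.

    // Hypothetical example: one source file, compiled into both slices
    // of a universal (fat) binary; the OS loads the slice matching the CPU.
    #if arch(x86_64)
        let slice = "Intel (x86_64)"
    #elseif arch(arm64)
        let slice = "ARM (arm64)"
    #else
        let slice = "some other architecture"
    #endif

    print("Running the \(slice) slice of this binary")

The old PPC/Intel universal binaries worked the same way: multiple complete executables packed into one file.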


But those weren't both loaded into memory at the same time.

Thinking about this, I've convinced myself that what will happen is the ARM core will periodically check various statuses (e.g. connect to your IMAP/POP server and see if there is new mail) and turn on an indicator or perhaps wake the Intel cores. Which is to say that the Power Nap/low-power functionality will be restricted either to things Apple delivers out of the box or to a special API, not something that automatically migrates whatever workload is running on the powerful Intel cores into a slow-motion version of the same thing running on the ARM cores.


Yes, "power nap" is an existing feature that only does specific OS/apple specific operations (syncing, notifications, mail, calendar, indexing).


They sort of do this already with iOS apps during development. The iOS Simulator is a simulator, not an emulator, so the app under development is compiled for and runs as native x86.


From the article it looks like the ARM chip will have tasks offloaded onto it, like a GPU has graphics tasks offloaded onto it from the CPU.


Apple should just buy AMD. It would only cost them about $15 billion or so and they would get an x86 license. If they really want to control their own destiny this would be the way to go.


IIRC AMD's cross-licensing deals with Intel over certain key patents are null and void if AMD is bought, which puts a real damper on the main reason anyone would consider purchasing the company.


I believe this is the agreement you're referring to and it looks like it applies even to a change in ownership of 50% of the company:

https://www.sec.gov/Archives/edgar/data/2488/000119312509236...


Does Apple even need majority ownership to control the company? Couldn't they just purchase 49.9% of it and set up the board of directors in their favor? With that much of a stake, getting the company to do what you want wouldn't be that difficult, it seems.


Please. If I buy AMD, then Intel loses the x86-64 licensing rights, which AMD owns.

That really sounds like a winning proposition for Apple, given that Intel's own 64-bit arches historically sucked.


> That really sounds like a winning proposition for Apple, given that Intel's own 64-bit arches historically sucked.

Yeah, but for various reasons any usable amd64 CPU needs backwards compatibility with x86 - which, if Apple doesn't want to go the fat binary + emulation approach again, requires that an Apple-bought AMD reach a licensing deal with Intel to maintain said compatibility.

Then again, Apple these days doesn't give a ... about BC anymore anyway, so they might as well go down that track and use only amd64.


That's true, but Intel has as much to lose as AMD if that cross-licensing deal is terminated. So I'd imagine that Intel would want to re-negotiate it with the new party, which is obviously a possibility.


You do realize that AMD owns a lot of the x64 patents, not Intel, right?


Yes, I do realize that. Which ought to have been clear from the use of the term 'cross-licensing', frankly.


From my understanding the x86 license is not transferable upon purchase.


I wonder about the other x86 licenses that have been out there in the past. Via still offers x86 apparently. It looks like Cyrix ended up with Texas Instruments. Transmeta looks messy. IBM used to make x86 chips as well.

And I wonder if any of this is relevant to making modern x64-compatible CPUs.


I believe the licensing deal would terminate if AMD is sold


You're correct. Here's AMD's statement:

>Advanced Micro Devices has clarified terms of the cross-license agreement with Intel Corp. on Thursday. As it appears, if either AMD or Intel change their control (i.e., gets acquired), the cross-license agreement between the two companies is automatically terminated for both parties.

Since AMD invented x64 this would leave Intel in quite a bind and the only solution would be to work with the new owner to reach an agreement.


Yeah, that would be a really, really dangerous agreement for Intel given the market cap of AMD and how it would gut Intel's CPU lineup. I would assume there would be one damn quick agreement after the fact.


Apple would have them by the plums at least for a little while.


IBM would also be interesting given POWER.


Totally expected, because they need to increase performance and lower battery consumption. The extent to which it takes work from the main Intel CPU, and the spectrum of possibilities for ARM coprocessors, are yet to be seen. I can see it doing some hardware-related tasks that don't interfere with the higher layers of macOS and x86 userspace. But this is all pretty common. The title looks a little clickbaity, imho.


I expect the same. Prior approaches to mixing architectures in one system and actually moving applications at runtime dynamically, as in big.LITTLE, were either far too complex to target or far too inefficient (1). I also don't see a technological enabler here that would change anything about either.

(1) - I can only imagine it with a virtualized ISA à la IBM. Still - even they didn't do that. The capabilities only exist separately: e.g. IBM's TIMI allows moving applications across physical processor ISAs, while various clustered virtualization implementations can move live VMs across distinct hosts.


big.LITTLE isn't a mixing of architectures, it's a mixing of micro-architectures. And the fact that big.LITTLE designs are commonly used now, with even Apple adopting its own version in its latest SoC, is a validation of its benefits.


Prior approaches to mixing architectures in one system (and actually moving applications at runtime dynamically, as in big.LITTLE,) were ...


I consider this a mostly tactical move to get better prices out of Intel.

Sure, the plan as presented might work, but the migration will take years, be messy, and lead to lots of unhappy customers due to poor emulation performance, dropping legacy app support, bugs in a new architecture etc.


This is interesting given Microsoft's move to full Windows-on-ARM (and not that travesty that was Windows RT). Something I'm certainly looking forward to, not only for the devices it creates, but also as a chance for more open mobile computing.

(Sh|C)ould this be a concern for Intel?


Can you just imagine how long the battery would last on an arm based macbook? Days?!


It is actually quite difficult to beat Intel on power efficiency once you start talking about laptop class CPUs. The combination of Intel's advanced fabrication tech and their highly efficient core family of microarchitectures gives them a significant advantage outside of the ultra-low-power space.


I don't think that's true. ARM chips just don't clock as high as Intel's can, and no one* cares about clock speed any more. That ship has sailed. Multiprocess web browsers will drive the sales of massively multicore ARMs (with tons of RAM!), and Intel will be sunk.


It's not just about clock speed. While Intel Core CPUs don't scale down to power budgets as low as ARM cores, they are impressively efficient on a perf/watt basis. Smart phones achieve high battery life mostly by being very aggressive about reducing CPU load in software (see iOS killing apps the moment they go into the background). General purpose operating systems don't work like that so there's more reliance on the hardware to keep power draw down.


> they are impressively efficient on a perf/watt basis

Is that true? I thought people were trying out ARM rack servers because of the power efficiency vs perf.


The key word there is "trying". ARM servers have had a tough time gaining traction in no small part because the hoped for power efficiency gains have not been realized.


Sounds like Intel hit a pretty sweet spot on work/watt by increasing work while keeping a lid on wattage. ARM delivers battery life by lowering work.


No that's not how it works at all. ARM has always been power efficiency focused.


There are ARM Chromebooks and I don't think they last for days. I'm pretty sure most of the power draw on a laptop is the monitor.


Mine goes 8 hours with a mainline kernel. I usually charge it once a week. Apple could/would/should do better.


It would definitely be advertised as such.


No, but I can imagine how slim Apple would make an ARM-based Macbook.


about 5 hours, just imagine how thin new macs could get!!1

look at the iPhone 6 for clues - Apple replaced a 0.1 mm thick metal RF shield with a sticker to make it thinner. That removed metal can was reinforcing the PCB and all the BGA chips; without it the iPhone 6/6+ bends, cracks solder balls under important ICs, and develops touch disease.


It would last 12 hours... but the battery would be half the size.


Does anyone know how far Apple is from getting the following desktop system as one SoC?

- 4GHz 12-core 64-bit ARM OoOE with 256KiB L2 cache per core

- 12MiB of L3 cache

- Integrated graphics with 128MB eDRAM

- PCI-Express buses

- 35 GB/s DDR4 memory controller

- 5-65W power usage


4GHz 12-core in 65W isn't going to happen. (Intel is currently at 2.1GHz: http://ark.intel.com/products/93356/Intel-Xeon-Processor-D-1... )

Hurricane could probably run at 3GHz but to get to 4GHz might require serious work. Apple hasn't used eDRAM but in theory they could adopt it. The rest looks like no big deal.


I'd always be open to Apple putting a supercharged ARM chip in a prosumer laptop. We could always run a variant of Linux on the laptop if OS X were too buggy.


One can hope, but judging by how Linux handles MacBooks right now, I'm not so sure they're the ones to bet on to provide a bug-free experience.


Assuming the bootloader isn't locked down


There's always that risk: they'll do some sort of jail like they do with iOS.


About two weeks ago I commented on a post here (a concept design for a new Mac Pro) joking that the next Mac Pro would probably run an ARM CPU and have the RAM soldered on.

Looks like I was on to something


>Is it IBM compatible though?

Man, I hated hearing that question. It was so utterly retarded, but trying to explain to the person who asked why it was a retarded thing to say always came off as damage control.

>Yep, not IBM compatible. Not interested.

Maybe Apple can get away with their own chips now. It sounds like they just want to take more R&D away from Mac though. "Just stick an A10 in it."


iOS applications are now compiled to an intermediate representation, which is then translated to the target CPU's instruction set at install time.

macOS applications could similarly be built to run on multiple instruction sets with basically no effort from the developers.

Perhaps initially it might be restricted to first-party apps, or maybe App Store apps, but having fat binaries is something macOS developers are used to doing ...

The difficult bit would be the runtime migration from x86 to ARM and back. For processes that can be restarted, it's easy. For running, stateful applications it would be more difficult.

But I think things like Mail.app already use a backend daemon to manage their datastore: you could suspend the UI process and restart the backend daemon on the low-power CPU fairly simply ...
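As a rough sketch of that split (all names hypothetical), the UI would talk to its datastore daemon over XPC, so the daemon could in principle be stopped and relaunched wherever the system prefers without the window ever noticing:

    import Foundation

    // Hypothetical protocol shared between a Mail-style UI and its backend daemon.
    @objc protocol MailStoreProtocol {
        func unreadCount(reply: @escaping (Int) -> Void)
    }

    // UI side: connect to the (hypothetical) daemon by its Mach service name.
    // If the daemon is relaunched elsewhere, the connection can simply be re-made.
    let connection = NSXPCConnection(machServiceName: "com.example.mailstored", options: [])
    connection.remoteObjectInterface = NSXPCInterface(with: MailStoreProtocol.self)
    connection.resume()

    let store = connection.remoteObjectProxy as? MailStoreProtocol
    store?.unreadCount { count in
        print("Unread messages: \(count)")
    }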


From what I understand bitcode is still somewhat architecture dependent and wouldn't work well for ARM<=>x86 translation.


The need for IBM PC compatibility has been dead longer than software sold in boxes. Longer than Compaq has been gone. Longer than USB has been a thing. Longer than current CS students have been alive, even!



