Apple walks Ars through the iPad Pro’s A12X system on a chip (arstechnica.com)
170 points by lkurusa on Nov 7, 2018 | 170 comments



Apple are paving the way for A-chip laptops, not only by building excellent tech, but also through PR.

Their last keynote was clearly a gymnastics exercise: ignore Intel CPUs and dismiss laptop performance, then praise their own chips which power a tablet that has no software ready to use that much speed.

The fact that one of the most secretive companies executed a PR stunt by giving an exclusive interview to one of the most respected tech outlets only confirms this strategy. Expect similar moves in the coming months.

Now, it is only a matter of "when", not "if", Apple will start selling laptops with their chips.

As an aside, this strategy is extremely similar to the one they used when dropping the headphone jack on the iPhone: "leak" the news to a respected outlet, perform damage control before the keynote, and test the market's reaction. When they introduced the jackless iPhone, the topic was so beaten up that it got much less attention than a surprise revelation would have.


Why does it have to be a laptop? Why not an iPhone with a desktop mode you can pair with a monitor, keyboard, and trackpad?

I’d love the ability to carry all my software and data around in a phone without lugging around a laptop or having to buy a desktop computer. And I’d love to never concern myself with transferring or syncing data again.

And why not let us connect an iPhone to an eGPU for desktop gaming?


Look up what Tim Cook thinks about convergence. That is not what an iPhone is for. Period. This gimmick is just a pipe dream of us nerds. It won't have the slightest chance in the market and it would be a subpar product. Both Huawei and Samsung have had such a solution on their flagship smartphones for at least two years. They work as advertised, but nobody could give half a shit about them.


You can't say "Period" like that. Apple's endgame (in the far future) is for you to wear an Apple device that allows you to sit at any Apple computer/display/device anywhere and access your information from iCloud. Right now, that doesn't make sense because we can't make phones or watches that are as powerful as a desktop PC. Tim's insistence (and Steve's before him) is that people don't want to do advanced graphic work on a tiny screen. If you could wirelessly and seamlessly use the same device to do that work on a larger screen, powered by that phone, they would absolutely want to push that. They just want people in the ecosystem. It's not that those things aren't what an iPhone or iPad are for; it's that they're not what those devices are for yet.


Ambient computing doesn't mean that all the computing devices can do the same thing, i.e. that a phone or a watch or a tablet will do everything a laptop can do. It means that distinct devices are integrated into a single experience in a complementary way.

> If you could wirelessly and seamlessly use the same device to do that work on a larger screen, powered by that phone, they would absolutely want to push that.

This vision works just as well if the larger screen is powered by its own computer instead of the phone.


Sure, but the vision is that all the information is localized to one device that you own that is on your person at all times. Look at how big Apple is about privacy and security. The goal is to have as few attack vectors as possible while offering a seamless experience to the user.


I don't think it's just nerds. I think the idea has quite a bit of mass appeal [1]; it’s just that existing versions are in some way unsatisfactory. The phone is already on its way to replacing our wallet, and I think it's just a matter of time before it replaces our laptop bag. I'd very much welcome that if it could do all the things I'm doing on my ThinkPad; it's just that the human interface and OS are not there yet (meanwhile, the chip, RAM and storage are not a problem anymore). Give me a character-input solution that's >60% of a ThinkPad and maybe a fold-out screen that doubles the area of a 6-inch device and you've got it; for more, I'm gonna sit at a desk with docking.

[1] see for example the movie “her”.


"it’s just that existing versions are in some way unsatisfactory"

You say this as if it is an easy problem to solve.


"It’s just that existing versions are in some way unsatisfactory," was exactly the problem that plagued smartphones before the iPhone.


Not easy, but something Apple is good at.


”Look up what Tim Cook thinks about convergence”

You can’t look up what he thinks, only what he says he thinks.

Apple would be stupid if they weren't researching this option, just in case. If they ever manage to make it work well, you can be sure that what Cook says he thinks about convergence will change overnight.

For those who need more convincing: look up what Steve Jobs said he thought about products Apple didn't ship, and compare it with what he said he thought about them later, when Apple _did_ ship them.


Chromebooks are not a gimmick; they are great laptop replacements for older folks who spend the majority of their time on the web. As Apple keeps improving its own chips, I have no doubt an iPhone could run a fast, stable "ChromeOS-like" environment. How Apple would feel about cannibalizing its iPad lineup by doing that is a different story, but a Chromebook/phone combo would do great in the market IMO, especially with how many older people already know Apple's OS interface.


>Chromebooks are not a gimmick; they are great laptop replacements for older folks who spend the majority of their time on the web.

A friend of mine is a master real estate broker. He's a good example of someone who can move almost entirely to a Chromebook or an iPad, because nearly everything is now web-based. So it's not just old folks or people just surfing, but regular people doing their jobs. In particular, this guy is someone who has half his browser eaten up by toolbars and buys a new computer every few years "because the old one is too slow." A Chromebook or iPad is a great place to park those people, since they're curated environments and not the free-for-all virus-delivery and identity-theft machines PCs are for the unwary.


Apple executives have a long-standing habit of saying something sucks for whatever reason only to come out a few years later with the same concept only slightly tweaked.

And now that Apple's on board with USB-C, which can do video/power/peripherals, it's not inconceivable that a Monitor+USB hub would be all you need for a plug and play iOS productivity station.

It would be a trivial firmware change for the next release of iMacs to support, and an extension of their existing target-display mode.


Your comment reminded me of Steve Jobs mocking the idea of a pen for iPads. You can't trust that such statements are true, or that even if true today they will be true in a few years. Ideas, tech, and beliefs change, and sometimes companies publicly proclaim things that are just plain false.


This is a pet peeve of mine: Steve didn't mock the idea of styluses in the abstract; he didn't want one to be the default and only way people interact with their devices. So Apple made the operation of the iPad all about the touch experience. Later on, Apple released a much-improved stylus as an optional, secondary, and not at all necessary input method. That has nothing to do with the truth or untruth of Jobs's statement, nor is it inconsistent with what he said.


My recollection is different. The only thing I could find on Youtube substantiates your claim. I was wrong. Thank you for pointing this out. I think my overall point still stands though.


Q: How do you close applications when multitasking?

A: (Scott Forstall) You don't have to. The user just uses things and doesn't ever have to worry about it.

A: (Steve Jobs) It's like we said on the iPad, if you see a stylus, they blew it. In multitasking, if you see a task manager... they blew it. Users shouldn't ever have to think about it.

The point was that using a stylus as an intermediary for basic interaction with a touchscreen is indirect, awkward, and unnatural (the mouse is too, frankly), not that nobody should ever use a stylus for drawing.

Styluses clearly have a big precision (and visibility) advantage vs. tracking a whole fingertip touching/sliding around the screen (i.e. if we compare inherent human capabilities, not specific hardware), but relying on a stylus is also much more prescriptive about acceptable hand movements, and all of the stylus-first mobile devices pretty much suck compared to finger-based multitouch, in practice.


“No video on the iPod” was my thought.


They don't work as advertised - they're terrible.

Besides Microsoft, Google has come the closest, but they're not pushing it as convergence.

I think it is the future, but it's going to be a difficult one that requires an excellent launch - we're at least 3 years away from the next major inroads.


> Both Huawei and Samsung have had such a solution on their flagship smartphones for at least two years. They work as advertised, but nobody could give half a shit about them.

That's more a reflection of the fact that Samsung and Huawei suck at software--Bixby, anyone?

Everybody said the same thing about WiFi.

I used WiFi when it first came out--PCMCIA cards, external stick-on antennas, etc. It worked as advertised, but nobody gave a shit--until you sat in front of somebody and used it. It was almost a virus and spread like one.

Then everybody gave a shit. And look where we are now.

Everything on your phone--everywhere--is the endgame.


> Everything on your phone--everywhere--is the endgame.

The endgame is everything on every device you own. Sometimes you'll use your watch. Sometimes your phone. Sometimes your tablet or laptop and sometimes your big screen TV. It's just differently sized screens that all connect to your data in the cloud.

That is the endgame here. Nobody wants to go swimming with their phone, watch a feature film on their phone, or do 8 hours of office work on their phone.


“No wireless. Less space than a Nomad. Lame.”


I am on the other side of this bet. I believe it is just a matter of time and the Surface Pro / iPad Pro are paving the way for just that.


Agreed. I think it makes perfect sense to have a phone that can plug into a tablet or any other form factor. The thermals might not let you run an iPhone-in-tablet as fast as a dedicated iPad, but given how fast their chips are getting, it's probably just a matter of time.


But Microsoft have no phone offering. And if they release a new phone, as an ex-hobbyist Windows Phone developer who experienced all the consumer- and developer-hostile moves (no upgrade from 7 to 8, numerous lies, etc.), I'll be very reluctant to target that platform again. Other devs with experience on the previous phone platforms will probably have a similar view on the matter.


Actually, we don't know that. Apple and Microsoft are the only companies which can really deliver on this promise, and I can't see Microsoft being in a position to move their dev base to a convertible future, but Apple... if they try, they can deliver (it's not a given, but if they execute and the stars align, it could happen).


They don't give 'half a shit' because they are crappy. And that's not necessarily their fault.

Given the current performance numbers and the fact that we are switching to USB-C, which allows high-speed connections to multiple types of devices, we may be reaching a point where this is actually feasible.


You might be right, but I'd like to unpack that a little. Why would that be the case?

Obviously, the laptop form factor is missing, but it would be...interesting if the iPad, iPhone, and MacBook were all pretty much the same compute device with different form factors and different battery sizes and possibly different storage sizes. Honestly, the form factor and user experience are going to be the biggest user-visible differences, and those are also the things Apple seems to care the most about (even if they don't consistently get it right!)

Also, why would you want a single physical device anyway? Because hardware is expensive? Sure, maybe. Because you want to keep all your data in one place? That's the purpose of iCloud. Because you don't trust the cloud and want to keep all of your data physically on the same device and still access it from multiple form factors? That's a small fringe of the market that probably wouldn't buy Apple products anyway.


> why would you want a single physical device anyway?

All my files and apps set up the way I want them, on a single device that's with me 24/7. Complete privacy and security, because my data never leaves my device (except for backups to a Time Machine or iCloud).

I don't have to own multiple computers. I own one computer: my iPhone. There's only one device to set up, update, and maintain. If I want a larger screen, VR headset, eGPU, mouse, keyboard, speakers, headphones, etc., I just pair my phone to one. I can buy different peripherals for my home, my office, etc., but I don't have to buy redundant computers.

Think of the Nintendo Switch.


All my files and apps set up the way I want them, on a single device that's with me 24/7. Complete privacy and security, because my data never leaves my device (except for backups to a Time Machine or iCloud).

So you don't want your data to leave your device for privacy reasons, but you back it up to iCloud?


Encrypt your backups.


I think a lot of people (except maybe Apple shareholders) would be happy with a single physical device that handles a variety of use cases rather than paying Apple $1500 a pop, several times over, for the exact same hardware in different physical cases.


Hardware is expensive, but how much of that expense is the CPU and how much of it is the variety of screens and cases that you would still need to have to switch form factors?


That depends on if you already own a TV or monitor, both of which are available for a small fraction of the price Apple charges.


If you have a monitor, keyboard, and mouse, it's (theoretically) possible to use a smartphone as a desktop PC. That's also the least popular form factor for a computing device these days. Laptops or tablets would, even on the physical level, be much harder to pull off this way.


For a laptop, all you need is a keyboard/screen clamshell unit with a docking port for the phone where a drive would be on a typical laptop.

A tablet is pretty similar, but the dock needs to be behind the screen, which probably makes for a thicker-than-average tablet. Or you just make a foldable phone which folds out into a tablet.


Even the laptop would be thicker than average, unless you're measuring by early 2000s standards.


A single unified accessory market to end all accessory markets would probably be worth it.


Mobile OSes haven't been ready to accommodate desktop-level use cases. Apple is moving in that direction with iOS on iPad. Once they get there, convergence makes a lot more sense.


Samsung didn't implement convergence; they implemented Android-with-a-keyboard-on-a-big-screen. True convergence would be something like what Ubuntu Touch wanted to do.


I worked on Ubuntu Touch (specifically convergence) for a few years.

I think it's really interesting what Apple are trying to do (with Marzipan, pro apps for iPad, desktop-level SoCs), but I think they still have a couple of years' work ahead of them. It's not an easy task, especially with so much legacy.

Getting Adobe and Autodesk to rewrite their flagship software for iOS was quite a good win, though I still wonder how those compare to the desktop versions


"Getting Adobe and Autodesk to rewrite their flagship software for iOS was quite a good win"

Now if they could only get Apple to rewrite their flagship Pro software for iOS.


Touché. I think perf is becoming less of an issue; it's more about design challenges around touch/pointer UI density and screen real estate (depending on what the range of supported screen sizes should be).


Is there any reason Apple couldn't just implement it so that the iPhone runs iOS normally (always) and boots into full macOS when in desktop mode? They share a kernel, I believe, and the latest iPhones have plenty of disk space...


I think CarPlay is a test of this use. It's really similar, in that you connect your device to your car, and apps render a different view on a different form factor. You have a certification process so you know you have at least a chance of a good user experience. It's a subset of device functionality, instead of supporting more functionality.

If Apple invests in enhancing CarPlay, it might be a sign that they are scaling to a wider convergence market. If they don't invest or abandon it, then maybe convergence won't happen. They're famous for saying they're not working on projects that they are actually working on, so we have to read between the lines.

After upgrading from an iPhone 6 to an iPhone XR, I've been thinking about how close the latest-gen devices are to traditional computers anyway. The configuration/settings and features are so far beyond the first-gen devices that I think convergence will happen; nobody knows what it will look like yet.


I don't disagree with you, but I'd like to point out that iOS apps have long been able to render a view on a television (via an AppleTV) while also rendering a different view on the device screen. This has been possible for at least 4 or 5 years.
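For anyone curious, the mechanism is UIKit's UIScreen API. A minimal sketch (the TV-facing view controller is left as a stub; this is the pre-window-scene style that was current at the time):

  import UIKit

  // Minimal sketch: drive a second UIWindow on an external screen
  // (e.g. a TV via an Apple TV/AirPlay, or a wired adapter).
  final class ExternalDisplayController {
      private var externalWindow: UIWindow?

      func startObserving() {
          let center = NotificationCenter.default
          center.addObserver(forName: UIScreen.didConnectNotification,
                             object: nil, queue: .main) { [weak self] note in
              guard let screen = note.object as? UIScreen else { return }
              let window = UIWindow(frame: screen.bounds)
              window.screen = screen                          // bind the window to the external display
              window.rootViewController = UIViewController()  // the TV-facing UI goes here
              window.isHidden = false
              self?.externalWindow = window
          }
          center.addObserver(forName: UIScreen.didDisconnectNotification,
                             object: nil, queue: .main) { [weak self] _ in
              self?.externalWindow = nil                      // tear down when the screen goes away
          }
      }
  }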

There have been a few party game apps that allow a person to drive the game on their device while the other participants see what is on the TV. I don't think it ever really caught on; perhaps partly because AppleTV was a niche product way back when and partly because it didn't get much advertising.


I think the difference in what the PP was saying is that with CarPlay you can interact with the program, while with external video you can only watch.


>>Why not an iPhone with a desktop mode you can pair with a monitor, keyboard, and trackpad?

That sounds like the most un-Apple thing I can possibly imagine. Their ethos was always to build a device that does something incredibly well - a phone that can be a phone but also a desktop computer is anything but that.


ISTR seeing illustrations from an Apple patent showing a computer display with a slot in the back into which an iPhone can be docked, turning the phone into a full-size desktop workstation.

And this was during the Jobs era.


Apple pays their employees to file patents for pretty much anything.

That doesn't mean they will ever manifest in real products.


How do you figure? The iPhone was unveiled as a 3-in-1 device during its first keynote.


The iPad is already this but with a bigger screen.


No, I'd argue it isn't. Even if you connect an external keyboard and display to the iPad, you are just getting the iPad experience on a larger screen. There is never any transformation of the interface when used this way. Apple has never allowed mouse/trackpad input on iOS (with some very, very narrow exceptions), so it's not transformative - you are still getting essentially the same experience.

For it to work on the iPhone, they would have to add support for some kind of mouse/trackpad, because the iPhone itself is sitting in the dock - and I don't see Apple doing that in a million years.


iOS still has a long, long way to go. It seems like they're preparing all the hardware, though.


Personally, I'd rather have a "2-in-1" style MacBook + iPad Pro. I'm almost always going to want to bring the screen and keyboard with me anyway, but it would be nice to have something I can use both for "real" work and for casual use on the couch, etc.


Why sell us one device when they can sell us two or three? I own an iPhone and a MacBook Pro. Why would Apple want to leave all that money on the table by converging these two devices?


>Why not an iPhone with a desktop mode you can pair with a monitor, keyboard, and trackpad?

Because that would mean Apple doesn’t get to sell you two devices.


iOS is too restrictive for a development/productivity machine, since it doesn't allow installing software from outside the App Store (something like Homebrew would be a security nightmare). The UI model of iOS is also still too touch-centric (and touch input provides zero advantages for a command-line interface).

But attach a proper(!) keyboard with a touchpad to an iPad Pro, put macOS on it instead of iOS, and there's your next MBP ;)


Because they want you to buy a phone, a tablet, a laptop, a PC.

Sure - one device could do everything, but that doesn't drive sales or profit.


I can see the iPad being used like a laptop, but the iPhone... the screen is a little small, no? Even on the largest iPhones.


You just described the Samsung DeX.


And the Motorola Atrix before it: https://en.wikipedia.org/wiki/Motorola_Atrix_4G


That sounds like a truthful explanation of their motives. On the one hand, I definitely welcome challengers to the chip industry. On the other, I remember the Power days. Having non-Mac users "talk down" to me about using a different processor was beyond annoying. It adds a quill to the Mac haters' quiver. Here's hoping Apple produces CPUs that outperform their competitors' the way their 2 GB/s SSDs do.

It also makes me not really happy to buy any of their current hardware.


This is all hidden tech. You and OP have very valid points, but who is the intended audience of this message?

In the Jobs days they'd promote the practical value of a product. For example, they might say the new iPod can hold 50k songs, not how big the storage is, let alone the storage type. They'd mention a smaller form factor or improved battery life, not an HDD-to-SSD switch.

So when they talk about SoCs, cores, GPUs, Intel, or anything else hidden, who are they signaling to? Maybe it's signaling to investors that Apple is innovating and that ought to translate to profits; maybe it's for the inner geek in all of us?

The dirty trick with hidden tech's performance figures is that they don't directly translate to customer value. As you mentioned, you're dissuaded from buying because the new stuff will be so much better. Maybe it will be, but the old Apple would tell you that the new Macs can process video in Final Cut 10x faster, or that you don't have to buy a separate gaming rig - i.e., you can do more stuff better.

The good part about focusing on what the products can do is that you can't fake it. You can't fake it to [geeks] like me, [non-geeks like] my mom, my kids, investors, etc.


Got any data on the SSD? I believe consumer NVMe SSDs are all 2,000-3,000 MB/s or more these days, at least for sequential reads, which I believe is the "standard" benchmark people use for speed (ignoring IOPS).

E.g. this from 2015 (!): https://www.anandtech.com/show/9396/samsung-sm951-nvme-256gb... This from this year: https://www.anandtech.com/show/13438/the-corsair-force-mp510...


The speed of SSDs is vastly overstated, as they are generally rated at the speed of the fastest component, and the real world slows them down. For a comparison of SSDs in actual devices, check this out:

https://www.laptopmag.com/articles/2018-macbook-pro-benchmar...



That's a comparison between macOS and Windows, not the SSDs themselves. I would not have believed how terrible Windows is at small file I/O unless I had seen it myself dozens of times over the years. If they had put Linux on one of those laptops for comparison, I would consider a chart like that fair.

If you actually read technical deep dives like AnandTech posts on SSDs and MacBooks alike, you would see that they really do perform identically to what Apple ships... because Apple is shipping industry-standard SSDs. Apple chooses expensive SSD chips, sure, but it's the same NAND everyone else has. Only the most recent generation of MacBooks (since the T2) has actually integrated their own SSD controller, inside the T2, but it's still the same physical industry-standard NAND chips underneath.


And that's exactly my point - the SSDs' spec-sheet numbers are based on the NAND chips, but the controllers slow them down in real-world performance. I've still yet to see a benchmark that shows MacBook Air/Pro/iMac Pro-level SSD performance from any other SSD-equipped computer.


No, I'm saying Apple has been using industry-standard controllers until literally the MacBook Air refresh that just launched.

https://www.anandtech.com/show/12670/the-samsung-970-evo-ssd...

You can buy this SSD today. There are others like it, and some, like Intel's Optane SSDs, actually substantially outperform anything Apple has put in their computers in terms of IOPS, even though the Optane line hasn't focused on raw sequential throughput yet. Off-the-shelf SSDs get the much-vaunted performance of "Apple's" SSDs (they're just normal SSDs...) being discussed in this thread.

In fact, the SSDs in those Windows computers are as fast as the one I've linked to. The benchmark you were linking to was benchmarking NTFS vs APFS under a small-file I/O workload, which NTFS sucks at. It was not benchmarking the SSDs in any effective manner. I am certain that I pointed this out in my comment above! If those laptops were copying a few large files, the performance would have been identical. If those laptops were running Linux, the performance would have been identical.

Look here! https://www.anandtech.com/show/12167/the-microsoft-surface-b...

Scroll to the bottom and tell me what you see! Yes, that SSD is performing as well as your vaunted MacBook!

Apple's iPad Pros are technological marvels. Their laptops' storage systems are not, and you're just deceiving yourself if you think otherwise.


> On the other, I remember the Power days. Having non-Mac users "talk down" to me about using a different processor was beyond annoying.

Sample size of one and all that - but for me it was a selling point. The PowerPC G5 (PPC 970) was intriguing when it first came out. Having owned Intel, Cyrix, and AMD systems since the 286, and having used them since the XT (Intel 8088) days, it was nice to muck around with something new, and paired with OS X Tiger it was such a fun world to explore. The move to Intel felt like a bit of a letdown, but by that time I loved OS X, and Snow Leopard cemented it as my OS of choice.

Apple desktops/laptops moving to custom silicon would excite the nerd in me. I want competition.


Moving to Intel chips did two or three things for Apple:

1. It helped them jump from a failing CPU platform to a non-failing one. PPC was not keeping up with x86 anymore, and Apple's two PPC vendors were going in opposite directions because there wasn't enough of a market for CPUs for Macs. (There were even rumors of future Power Macs migrating to a full POWER CPU rather than PPC.)

2. It meant you could run Windows, and hence Windows apps, on your Mac if you wanted to, without CPU emulation. This is still a fairly important use case.

3. It may also have simplified Mac application development, since you didn't have to switch ISAs in addition to switching operating systems. Making matters worse, PowerPC defaults to big-endian while x86 is little-endian.
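(To make the endianness point concrete, a quick Swift sketch:)

  // The same UInt32, viewed byte by byte in memory:
  let value: UInt32 = 0x0A0B0C0D
  withUnsafeBytes(of: value) { print(Array($0)) }
  // little-endian (x86, and ARM as Apple configures it): [13, 12, 11, 10]
  // big-endian (classic PowerPC): [10, 11, 12, 13]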

How does this apply to a potential ARM switch?

1. You can't really say x86 is "failing" if it's still the industry standard, but Apple might believe (rightly, given the market size of iOS) that they finally have the ability to sustainably outperform x86 with their A-series chips.

Most of the PowerPC bet was that a newer and more elegant architecture would outperform x86 and provide Apple a competitive advantage, and while that may occasionally have been true, it was never a huge deciding factor. Intel and AMD kept up because they were able to make investments in keeping x86 afloat. Ironically, Intel themselves also bet that a newer, more elegant architecture would make x86 obsolete, namely Itanium, only for AMD to invent x86-64. Not even Intel themselves could stop the x86 train.

With the rise of mobile devices, ARM now has the same market power as x86, if not more, simply because there are many more ARM-based devices manufactured and sold than PCs. Apple in particular has been able to invest heavily in their A-series chips and has full control of their CPU roadmap and destiny. Perhaps this time, x86 may finally be rendered obsolete. Don't count on it, though.

2. This is really mostly dependent on Apple's strategic priorities. With more and more application functionality moving to mobile and the web, being able to run Windows is less and less important. At the same time, being able to run Linux is more important; for many developers, running a Linux VM in Vagrant or Docker lets us develop in a similar environment to the servers our code will eventually run on. Sure, you can run Linux itself on ARM, and perhaps there will be more Linux distros that support ARM when and if Apple switches the Mac, but it won't actually be the same as the server unless ARM makes serious inroads in the server market.

Maybe they're betting they can surpass x86 by enough that they could emulate it at respectable speeds. Since they would be migrating CPUs again, they will probably provide a CPU emulation layer again, like they did when migrating from 68k to PowerPC and then from PowerPC to x86. Keeping this emulation layer around would have more of a benefit this time: after a while, nobody needed to run 68k or PowerPC code anymore, but that has never been true for x86 code, and it won't be for a long time; so look for Macs to continue to run x86 code even if Apple switches.

3. I think the A-series is also little-endian by default, and for x86, see above. Maybe Apple is banking on getting more value from cross-platform iOS/Mac apps than from cross-platform Windows/Mac apps. This will probably impact Mac gaming the most, but that's never been a priority for Apple.


Note: there's also an ARM version of Windows, although it lacks full backwards compatibility with the x86 version.


Apple is already blocking Linux installations on new hardware via the T2 chip, so I'm getting the feeling Apple doesn't care much about Linux support on its laptops.


The T2 stops you from booting Linux, unless you specifically disable that functionality, but if you read closely you’ll notice I was mostly talking about running Linux in a VM, which actually is a common use case.


The near-total absence of the word "Intel" at the keynote was certainly quite noticeable. The question is how much of that was a negotiating ploy for better pricing/stock with Intel and how much was positioning for ARM Macs. I think a bit of both.

I went back to remind myself of the details of the switch from PowerPC to Intel. It was announced at WWDC 2005 (June), when they released a developer transition kit. The announcement included a commitment to ship computers running x86 by WWDC 2006, so there was pretty much 12 months of lead time even for outside developers. Apple also committed to moving fully to Intel by the end of 2007, a 30-month total process.

I think Apple is further ahead of the game this time around in how quickly they can go from announcement to shipping product. OS X had been running internally on x86 for years, but this time tons of Apple software has been publicly running on aarch64 for many years. I do think they need more than 30 months to complete a transition this time around, as the user base is much larger, and Apple may never want to invest the serious dollars it will take to build the giant chips they get from Intel. It's one thing to swap out the MacBook processor. It's a whole other world to do 130-watt dies.

I'm expecting an announcement at either WWDC 2019 or 2020, with products shipping after the OS release in the fall of that year.

Apple's laptop naming has been getting steadily worse since the introduction of the Retina MacBook Pro. The MacBook modifier words have lost all meaning when systems labeled Pro have major expansion and repair limitations, the device named Air isn't the smallest or lightest, and the model without a modifier isn't the cheapest. They missed an opportunity to restore some naming sanity this year, but a switch to ARM could present it again.

As great as the A12X is, it's not touching discrete GPUs and CPUs allowed to burn wattage approaching triple digits. If I were Apple, I'd lean into this and restore "Pro" as a designation that means something. Pro devices would stay x86 and be marketed as supporting more software. Non-Pro devices would make the jump to ARM on their regular update cadence. That would give Apple tons of time to get their custom chips to Xeon-level scale for core count and interconnects. It even gives Apple the option to continue using x86 indefinitely, as investing in 100+ watt chips may not have the returns to make it worthwhile. Then the iPad Pro becomes poorly named, but I can't solve all of Apple's self-created problems.

If I were Apple, this is the Mac product matrix I'd have when the dust settles:

MacBook: First to move, as it's perfect for the A12X: it already has only one USB-C port and is due for a refresh. Drop the Intel tax and now it's around $1099. Apple could even use the exact panel from the 12.9-inch iPad, minus the touch gear.

MacBook Air: Would need the next-generation A chip to support more USB-C and more RAM. New sub-$1000 price for the 128 GB model and outrageously long battery life for web browsing or note taking. No need to make it any thinner or lighter.

MacBook Pro: Kill the weird non-Touch Bar model that was clearly supposed to be the new Air but was priced way, way too high. Spec bump the 13- and 15-inch, especially the discrete GPUs.

iMac: Switch to ARM, or kill it and replace it with a giant beautiful screen that the iPad and iPhone dock with.

iMac Pro: Spec bump, but this is as close to a perfect device as Apple has released in a long time.

Mac mini: Use the Apple TV case to build an ARM Mac with an A12X or higher that can be sold quite cheap. Use it as great PR to give away Xcode development systems to schools and developing nations, and get Swift into the hands of people learning to develop applications.

Mac mini: Relabel as Mac mini Pro and pretty much keep as is.

Mac Pro: Make it unbelievably expensive but also user-repairable: multiple GPUs on their own standard cards, tons of RAM, and big Xeon chips.


"The macbook modifying words have lost all meaning when systems labeled pro have major expansion and repair limitations, the device named air isn't the smallest or lightest and the model without a modifier isn't the cheapest."

This eloquently summarizes much of the disconnect customers have been feeling about the Mac product lineup. Jobs never would have allowed this to happen.


Playing the "Steve would never have done this" game is always dangerous, but in the past apple has been much more focused on having obvious product differentiators.

I think Tim tries to extract every possible penny from the supply chain. In a vacuum that's a responsible way to run a business, but it has left Apple selling some products they should be ashamed to feature, and customers unsure which product is built for their needs. When Apple sells old iPhones more cheaply, the number in the name gives people a clear indication they are trading price for newness. With the Mac line, exactly what you get when you put down hard-earned money is a confusing mess.

It's a weird combination of which devices have Touch ID, how many USB-C ports you get, and whether the USB-C ports are also TB3 or not. Those really out of the loop don't even know what the hell a Touch Bar is. The fact that every laptop Apple makes is thin, light, and Retina doesn't help.

For something as core to the experience as Touch ID, it really should have triggered across-the-board refreshes, but Apple seems interested only in substantial updates (though I would certainly consider the addition of Touch ID very substantial).

There is a lot to be said for shipping incremental updates on a regular basis so customers know the expensive product they spend their money on is being given attention. I don't need each revision to blow me away, and every customer might not buy every refresh, but every time a product is refreshed it's a strong indicator that if I buy this product it will continue to see investment. When it is time for me to put down my hard-earned money, I can be confident I'm getting good value and not missing some key improvement that is available but not on this particular line yet.

I'm experiencing this problem with the iPad mini right now. I adore my iPad mini. Adored the 1st one. The 2nd one was a huge upgrade, so I got that. The 3rd and 4th didn't justify the cost, but now that my mini 2 is really showing its age, buying a 4 doesn't offer very good value when it's been unchanged for 3 years. There are finally rumours a new one might be coming, but when? Do I move to a different size class (that I don't really like)? Continue to stick it out for a product I have no reason to believe Apple intends to continue developing?


I think one thing you're missing is how good Apple's GPUs have become (and will be in 1-2 years).

I'm thinking you're going to have a hybrid architecture for the MBP. The T2 will expand to support all of Apple's own software, plus all upgraded software sold through a revamped Mac App Store (plus, hopefully, your own compiled stuff when security settings are off). This can power down (all but ~2 cores) and instead power up an x86 coprocessor that supports everything else. If Intel doesn't deliver that, AMD will. For the MacBook I think you're on the right track, just that the Air will be replaced with a 13-inch MacBook model with the same architecture.


> I think one thing you're missing is how good Apple's GPUs have become (and will be in 1-2 years).

Not compared to other discrete GPUs they haven't. They are still 5 years behind consoles. The only time mobile ever "catches up" and achieves "console-class" is near the end of a console's ~5-8 year lifetime.

The performance is superb for mobile, and certainly good enough for integrated graphics (it'd be perfect in something like a MacBook Air), but it's still getting destroyed by the relatively crappy Radeon Pro 560X in the 2018 MBP.


Right, but I figure this could be scaled up with larger dies. Sure, Apple is not yet on par with Nvidia & AMD, but given their massive R&D budgets I don't think that will last long. Also, AR & VR drive demand for GPU FLOPS/watt in a tight envelope, which has proven to scale quite well to larger chips on the CPU side (see mobile ARM -> Cavium & co.).


Scaling up is one of those easier said than done things. Until Apple actually does it there's no reason to assume they could do it.

Cavium is sort of your proof that scaling is hard. The 32-core ThunderX2 @ 2.2ghz with 56 PCI-E lanes has the same TDP as the 32-core AMD Epyc with 128 PCI-E lanes. And it's slower than the competition from AMD & Intel at comparable power budgets.


Sure, Cavium is not yet competitive for compute-heavy workloads that make use of vectorization. For everything else, e.g. memory-bandwidth-bound algorithms, they are quite competitive from what I've gathered in benchmarks. And those kinds of workloads are actually quite common in HPC, from my POV.


I'm a little scared to consider the hybrid approach, since I think it's highly likely Apple would only let Mac App Store code execute on the ARM side of the house. They can tout the security benefit (which is real), but they also get to collect their 30%. If Apple ever gets to the point of ditching x86, macOS is no longer a general-purpose OS, and being one is pretty much its whole appeal.

I'm not personally that interested in a hybrid Mac, as getting rid of the power draw of an Intel CPU and all the weight and technical baggage it comes with is the real draw of an ARM Mac for me.

What I really want is to hand Apple a little over $1000 and walk away with a laptop form factor with nothing but one of their fantastic A-series SoCs inside, running full macOS where I make all the decisions. The GPU being so much better than the anemic Intel integrated stuff in my current Air is a big part of the attraction of an A-series chip running macOS.

I don't think the Air branding will go away. Apple spoke so much during this last event about the MacBook Air being people's favourite Mac that I can't see them giving up on such positive branding. It's far and away the best-selling Mac and has been for close to a decade.


>The question is how much of that was a negotiating ploy for better pricing/stock with Intel.

From an Apple perspective, you could easily justify the price for all the old Intel chips. 1. Intel had a 2+ year node advantage, which you couldn't get anywhere else even if you paid. 2. Intel had the best performance/watt CPUs on the market, and the highest-performance cores on the market; you couldn't get that anywhere else even if you paid. 3. x86 compatibility, which is more like an x86 tax, although you could get it from AMD.

Now the first two are gone. TSMC has edged out Intel on node, and Apple themselves have the best perf/watt work in the A12X. AMD has proved to be very competitive at the high end. And Intel is still charging Apple the same when it delivers far less value.

Apple is now being held up by Intel, but I don't think Apple can dump Intel just yet. There are two things holding Apple back.

Thunderbolt - TB is currently still an Intel-only technology. There are no host controllers on the market other than Intel's, and they cost a fortune (relatively speaking). Apple has invested a lot in TB, but Intel is making all the same FireWire mistakes. Maybe Apple is working on a USB 4.0 solution and will dump TB once and for all in 2020. Intel promised to make TB an open standard in 2018 but has yet to do so.

Modem - Apple relies on Intel modems for the iPhone, which is Apple's bread and butter. Before any move on the Mac side, Apple will need to think about the consequences for modems. In the worst case, Apple switches away from x86 and Intel decides to hike the price of its modems. I think modem revenue is roughly the same as x86 revenue from the Mac. Intel's 10nm isn't performing, and Apple is not happy with Intel on either front.


The modem business from Apple represents a huge amount of leverage over Intel. It doesn't really matter what the part is: if you sell 50 million of something per quarter, you're going to do whatever it takes to keep that business. Even if Apple stops buying x86 processors, modems alone make them one of Intel's most important customers, especially when Intel is trying to grow its modem business.

Thunderbolt is a hurdle for ARM Macs, but I don't think an insurmountable one. As you mentioned, Intel already committed to opening the spec, and if Apple can't use its modem purchases to negotiate getting Thunderbolt host controllers tossed in super cheap and to force Intel to honour its own commitments, they don't deserve to be valued at $1T.

TB is one of the big reasons I think the MacBook would be the obvious first Mac to switch. It currently has only USB-C (no TB), and it's due for a refresh, since the lack of Touch ID makes it an outlier.

When Apple switched from PPC it was a multi-year effort, and I don't see any reason a switch to ARM, whether it's a full move or just particular products, would be any different.


I really like and share your vision for the Mac line, but I doubt keeping two instruction sets simultaneously is the Apple way.

I hope I’m wrong though.


They have been fully supporting two instruction sets (at times three, with arm32 and aarch64) since the iPhone came out in 2007.

I agree it would be odd for the Mac line to bifurcate like that, but even in the PPC -> x86 days they supported PPC Macs for quite a long time.

Apple has been getting great returns from its chip investments because it has been able to reuse the blocks in so many devices. If they do go ARM for the Mac, there are diminishing returns as they move up the product line in how much of the silicon they can reuse. They are right at the point where power-hungry multiple memory controllers, complex core interconnects, and other things that will likely never make it into an iPhone become necessary. I suppose they can use the highest-end Macs as test beds to see if ideas work out, but is Apple really going to have chips fabbed with 20+ big cores and complex inter-core transport for products they will never sell in huge numbers?

I think for the highest end Apple is better off continuing to piggyback on Intel's server investments and focusing its chip team where it has been absolutely dominating: sub-10W, incredible performance per watt.

macOS on two instruction sets would require some extra developer time, but volunteers keep Debian running on 18 arches. After the initial port there is some care and feeding, but it's manageable. The biggest issues are cross-compiling, something Apple is already really good at, and device drivers. Apple has that covered with the T2 chip: move more and more of the peripheral connectivity into something like the T2, and then you only have to write/maintain aarch64 drivers.

The more I think about it the more it makes really good sense.


I reckon there'd be room in the market for a battery-life-focused laptop. If they could hit 24+ hours they'd be laughing. Two archs would be a pain, but quite a lot of software already supports aarch64, so it probably wouldn't be too bad.


How often do you need assembler in device drivers?


To everyone citing past transitions and the Mac & iPhone: I meant two architectures for the same product, not as a stopgap, but fully supported going forward.

And it's not that they can't technically; it's that they are all about focus and unambiguous messages to developers and customers.

But after reading it all, I guess they could fork the Mac into two categories: Air, light, or whatever, and Pro.

I actually like that idea a lot.


They've had dual instruction sets almost continuously since the company began:

MOS/68k, 68k/PowerPC, PowerPC/Intel, Intel/ARM


When I worked there, we had to support ppc32/ppc64/x86/x86-64/arm32.

Now I'm guessing the dwarf format is: x86-64/arm32/arm64 with arm32 being legacy.

Back then we were also switching from GCC to LLVM, which at the time I thought was ludicrous because we'd be losing all of the flexibility in architectures GCC gave us. But I guess my worries were without merit.


With the iPad line there is no reason to move the computer line to ARM. They would alienate a lot of the current user base, who have software that has no ARM equivalent and likely never will.

While I have seen many laud the iPad Pro for its power, I haven't seen any mention of heat or how long it can sustain a workload. Personally, I do not want to see the Mac line change processors again, for two reasons: first, as I mentioned, many of us have a lot invested in software that runs on OS X as well as Windows, which these machines can run; second, I don't need that wall to go any higher.


The iPad, especially with the limitations artificially imposed by iOS, is a content consumption device first. You can trick it into other tasks, but it's not a primary computer.

If I were Apple I wouldn't go all-in on ARM Macs. I'd build ARM thin-and-light laptops to get incredible battery life and leave the Pro line on x86 for at least the next 3+ years. Even if I did decide to stop building new x86 devices, I'd support macOS on x86 for at least 5 additional years.


From an OS standpoint, I see only two real limitations keeping iOS from being a desktop replacement for most people -- support for mice/trackpads, and support for USB mass storage devices in the Files app, treating them the way it treats Dropbox/iCloud/OneDrive, etc.


iOS is getting close to being all the computer my grandparents could ever need. My mother as well. My father and I are both in tech and would run into too many hard (and fairly artificial) limitations pretty quickly.

I won't ever switch my primary computer to something that needs an optional accessory to hold the screen up. The iPad may be magical, but there is plenty of wonder left in a device you can open up and immediately start typing on with a real keyboard. It's been many years since my primary computer was a desktop, but I'm not willing to compromise on the permanently attached keyboard and the freedom to decide what executes.


An iPad is already the only computer my parents own. They don't even have smart phones.


And an iPad + keyboard is significantly cheaper than a MacBook.


Cheaper, yes. Significantly? Not so sure. Unlike the iPhone, Apple hasn't been selling the old iPad Pro as a cheaper, more entry-level product. The base price of the iPad Pro has gone up each time we've seen an updated one, until now we're firmly in MacBook price land once you get functional storage. The modifier-free MacBook is a similar size to the 12.9-inch iPad Pro, but the iPad actually costs more once you get 256 GB of storage and the optional keyboard.


Besides compiling code, what can't an iPad do in terms of "creation" that a similarly priced laptop can? You can edit movies on it; heck, you can even film movies with it. You can write, edit photos, and do drafting and drawing (something that laptops can't easily do). You can make music on it, or use it as a Logic Pro controller. The only thing you can't easily "create" on an iPad is code. You can do spreadsheets and word processing, create Keynotes, take notes, and manage a calendar. You can manage your HomeKit stuff.

Seems a far cry from being just a consumption device. That canard is getting old. I guess since you can't run Linux on it or run a server with it, somehow it isn't creative?


It's more than just code. A "real" computer lets you do so much by piecing together work from different applications. For example, you can use a text editor like vim to transform a CSV file with powerful tools like regular expressions and shell commands, then load the CSV into your spreadsheet to work on. Yes, you can use the Files app on iOS to manage your files, but all of the other applications tend to stick to keeping all their files in one place.

On iOS it's very hard to build your own custom workflow with a bunch of applications and other tools, the way you can on a desktop OS.
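For concreteness, here's the kind of one-off munging that's trivial on a desktop, sketched in Swift (on a real computer you'd normally just reach for vim or sed; the file names here are hypothetical):

  import Foundation

  // Normalize the separators in a CSV, then hand the result to a spreadsheet.
  // "contacts.csv" and "contacts-clean.csv" are hypothetical file names.
  let input = try String(contentsOfFile: "contacts.csv", encoding: .utf8)
  let cleaned = input
      .split(separator: "\n")
      .map { $0.replacingOccurrences(of: "\\s*,\\s*",
                                     with: ",",
                                     options: .regularExpression) } // trim space around commas
      .joined(separator: "\n")
  try cleaned.write(toFile: "contacts-clean.csv", atomically: true, encoding: .utf8)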


It could probably be done, if you were so inclined, with the Workflow app introduced in iOS 12, which can stitch together different apps.


What workflow app? I'm on iOS 12.1 and I don't see a workflow app.


Oh yeah. It's called "Shortcuts" now. It was originally a third-party app; Apple bought it a little over a year ago and gave it deeper hooks into the system.

https://workflow.is/



It might be randomly killed if the app isn't in the foreground with the screen unlocked, though.


Just play a silent audio stream and it will keep running.
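Roughly like this, for the record - a sketch, not a recommendation: it assumes the "audio" entry under UIBackgroundModes in Info.plist, and "silence.wav" is a hypothetical bundled asset (App Review may also frown on the trick):

  import AVFoundation

  // Keep the app alive in the background by looping silent audio.
  // Requires the "audio" background mode; "silence.wav" is hypothetical.
  func startSilentKeepAlive() throws -> AVAudioPlayer {
      let session = AVAudioSession.sharedInstance()
      try session.setCategory(.playback, mode: .default, options: .mixWithOthers)
      try session.setActive(true)
      let url = Bundle.main.url(forResource: "silence", withExtension: "wav")!
      let player = try AVAudioPlayer(contentsOf: url)
      player.numberOfLoops = -1   // loop indefinitely
      player.volume = 0
      player.play()
      return player
  }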


"Just a consumption device" is hyperbole, for sure. I certainly don't mean to imply the iPad is useless or that it can't meet the needs of tons of people.

For my particular needs, it takes more effort and time to complete the same set of creation tasks on my iPad compared to my Mac. That includes working with media files.

I love casually browsing the web on my iPad, but the moment I want to write more than a few sentences I reach for my Mac.


Just curious, but how would custom silicon work with something like gaming engines? I know there are engines out there that support an ARM architecture, but as it is currently, you can run a lot of games fairly easily on Macs. And with eGPUs being officially supported now, gaming on a Mac is not a terrible experience.

Would moving off of traditional desktop CPUs harm that? Is there a way to do compatibility at the OS level without sacrificing half of the performance gains?


At one point Intel committed to releasing the Thunderbolt spec to the industry, so at least in theory ARM Macs could also support eGPUs. Games would have to be compiled to support macOS on ARM, but the Nintendo Switch is also ARM, so that isn't necessarily taking game companies somewhere they don't already have to go.

https://newsroom.intel.com/editorials/envision-world-thunder...


> chips which power a tablet that has no software ready to use that much speed

I think this comment underestimates the importance of performance to the iPad and Apple's long-term vision for it. While "real Photoshop" won't be ready until next year, Apple is clearly aiming to make the iPad a real solution for compute-heavy graphics tasks. More will undoubtedly follow. Why not edit video on a film set on the iPad? I can see it happening. Your comment sort of makes it sound like Apple just threw these chips into the iPad as a PR strategy and the "real plan" is to transition the Mac. But Apple's plan is to make both of these pro product lines as beefy and power-efficient as possible and target real professional creative workflows.


So for the past few years we've been hearing time and time again how amazing these Apple chips are, yet where is the software that actually pushes them?

Where's the example of software that truly shines on these chips? Where's the software like After Effects, Houdini, or Octane Render, where you can truly see the power of the machine rev up on the right hardware?

They're wheeling Photoshop out as proof of this device's power, yet as a designer I certainly don't consider Photoshop a heavy piece of software anymore, and the only reason it ever chugs is that it doesn't use the machine's power effectively: it's mostly single-core and disk-speed constrained.

This power has been available to iPad developers for a few years now, so shouldn't we be seeing truly powerful pro apps that take advantage of it emerging? Are these chips actually powerful for real-world pro tasks, or are they just talented at producing Geekbench scores?


> ...which power a tablet that has no software ready to use that much speed.

I guess it depends on what you mean by "ready" and how long the list is, but I'd suggest that Photoshop, console-quality games, AutoCAD, and video editing tools like iMovie can all take advantage of the speed.


> console-quality games

What console-quality games actually ship on the iPad?

The problem with games is that to actually match an Xbox One S's graphics, you don't just need to match its 5-year-old hardware in performance. You also need the capacity to actually fit the game and its textures. All 40-80 GB of it.

Who is going to ship an actual console-quality game at console-quality sizes on a device whose base model can't even hold it?

And Apple is notoriously stingy with RAM. This new one bumps it up to 6 GB, which is nice, but that's still less than the 8 GB in an Xbox One S. How much of that does iOS reserve, and how much do games actually get?


Yeah, Civilization VI is well known to respond well to additional cores, so I expect the lag I see in late-stage games on large maps won't be as bad as it is on my 1st-generation iPad Pro.


Man, do you do PR as your job? A very nuanced understanding of the matter you have there.


He's an Apple employee massaging our brains to be ready for the big message in 2019. It's all part of their master plan, the post you replied to has been crafted and trimmed to perfection by Jony Ive, Tim Cook and 26 lawyers.


Kind of ironic that, had Apple not hired Anand away from running Anandtech, they would probably still be the most respected tech outlet.


I have a feeling that most of the people who frequent this site would be more interested in AnandTech's A12 deep dive, which was part of their iPhone XS review, than in a PR interview on the A12X that is light on performance or power-use metrics.

CPU:

https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-re...

GPU:

https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-re...


This is a fine article for what it is, but isn't it really "Apple marketing walks Ars through the iPad Pro’s A12X system on a chip"? (at least I assume Anand wasn't hired as a SoC architect)

There's little technical detail here that wasn't in the iPad review's benchmarks [1] or in previous speculation on using Apple chips in desktop machines [2].

Everything new, they gave no details on, and the only in-depth answers were to hard-hitting questions like "you could have made a slow chip, why did you decide to make a fast one instead?" and "why is Apple so good at teamwork?" (maybe not the questions that were asked, but they were the questions that were answered :)

[1] https://arstechnica.com/gadgets/2018/11/2018-ipad-pro-review...

[2] https://arstechnica.com/gadgets/2018/04/apple-is-exploring-m...


I haven't had a chance to read the article yet, but Anand Shimpi is hardly "Apple marketing." He was almost certainly hired because of his technical expertise, not for any marketing ability.

If I were to speculate, due to his decades of industry experience evaluating hardware platforms, his role in Apple is to provide strategic direction and guidance on how to build the best hardware platforms. I can't think of any other role they would have wanted him for... which could be a failure of imagination on my part.

So... no, it's not "Apple marketing". Anand Shimpi is involved!


> his role in Apple is to provide strategic direction and guidance

That's the feeling I got, as well as perhaps being a human abstraction layer between the engineers and the executives. If you can describe in relatively understandable terms something that is technically difficult to thousands of laypeople (well, that's unfair - Anandtech was for nerds but being a nerd doesn't make you an IC/EE engineer), you'd be an asset to both the corporate and engineering teams.

And that translation works both ways - understanding the direction of the company with regards to future products vs what you want to get out of your silicon teams (e.g. when Anand mentions thermal envelopes, that might include an understanding of the limitations stemming from the potential form/design and material of a future product).


> This is a fine article for what it is, but isn't it really "Apple marketing walks Ars through the iPad Pro’s A12X system on a chip"? (at least I assume Anand wasn't hired as a SoC architect)

I wonder how that colors the reviews - if the outlet is too critical of a device, they can quickly lose these special privileges.


I was briefed personally by Anand after the introduction event in New York. I guess I can say that now, as it's out that Anand is taking care of this side of the business. It was the most compelling, interesting, clear, and well-informed briefing on silicon in my entire career as a tech reporter. Anand is an A+ player, and Apple has made an incredible hire in him.


I wonder how long it will be before Apple releases a laptop with an A-X chip.

They’re already shipping their custom T2 chips in their laptops. The compiler toolchain can build great binaries for their A chips. They’ve swapped CPU architecture before, and the Mach-O binary format can hold versions of the executable built for different architectures (a "fat" or universal binary).

They will probably need a Rosetta equivalent to emulate x86 for all the applications that are slow to switch. That might be tricky because of the huge surface area of the x86 instruction set. But I think it will only be a matter of time. It might also explain why they have kept the MacBook and MacBook Air product lines - they might want one of them to stay with Intel's CPUs and the other to switch to their A* chips going forward. Or maybe they'll just wait another generation or two and switch CPUs across their whole line in one go.
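
As a toy illustration of the fat-binary idea (my sketch, not Apple's documented toolchain flow): Swift exposes compile-time architecture conditions, so one source tree can carry arch-specific paths, and the build can fold a slice per architecture into a single Mach-O executable:

    // Hypothetical sketch: arch-specific paths in one source file.
    // Compiled once per architecture and merged into a fat binary,
    // the same executable would run natively on Intel or ARM Macs.
    func nativeSlice() -> String {
        #if arch(arm64)
        return "arm64 (A-series)"
        #elseif arch(x86_64)
        return "x86_64 (Intel)"
        #else
        return "unknown"
        #endif
    }

    print("Running the \(nativeSlice()) slice")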


>They will probably need a Rosetta equivalent to emulate x86 for all the applications that are slow to switch.

Well, and all the applications that won't switch. Also, even on the Mac, virtualization/containerization is not nothing. The Mac is a different market and use profile than iOS, and while Apple, based on past history, won't support an old arch indefinitely, neither are they likely to completely blow off backwards compatibility. Compared to previous transitions, dropping x86 would have extra complexities as well, so previous experience may not be entirely applicable. In particular, Apple would be moving away from the full-fat computer standard rather than toward one (or staying put), which may change the payoff for users despite Apple being much bigger. The absolute performance differences (immediate and future) also aren't likely to be as big.

I don't want to underestimate them, and huge disruption is inevitably coming down the pipe anyway and Arm may well emerge a winner there regardless, but it's also just a really big challenge.

>That might be tricky because of the huge surface area of the x86 instruction set.

Transmeta was able to do a decent job, and I think Novafora is still around and licensing their IP? Granted a lot of instructions have been added since then, but Apple certainly has a lot of expertise there as well and a great deal of capital to aim at the issue.


Apple's track record of being able to switch architectures is absolutely incredible. If they want to switch, they can switch.

To think that deep in Apple's labs they don't already have A-X laptops running, and have for a while, is not thinking like Apple would.


> To think that deep in Apple's labs they don't already have A-X laptops running, and have for a while, is not thinking like Apple would.

Right. To me it's unthinkable given the amount of low-level code shared between iOS and macOS that macOS hasn't been running on ARM since day 1 of iOS.


Yeah; if nothing else, the first versions would have been "port Darwin to this SoC". Maybe they never ported the GUI components, but it would be unsurprising if they did.


>Apple's track record of being able to switch architectures is absolutely incredible. If they want to switch, they can switch.

Sure, but don't discount how every specific instance can be a bit different either. As I said, they've got expertise, they've got capital, and there are even previous paths to follow here. But every transition has its own unique hurdles. With 68k -> PPC and then PPC -> x86, they were moving to something that was not merely an immediate improvement in some important respect but also had a long, very steep growth ramp ahead, thanks to fabrication improvements if nothing else. In the PC world we were still very much in a steep part of the S-curve. But those days are just plain done; the issues presented by physics and the geometry sizes being worked with now are fundamentally harder. There is certainly room for improvement year to year for a long while, and more chances to grow horizontally with valuable new features, but it's not like a system made now will be obsolete in 3 years either.

Additionally, those transitions came at points in Apple's life with a dramatically smaller installed base, and they were moving away from something more proprietary (at least in principle; obviously CHRP never actually worked out that well). The move to x86, which brought Macs in line with everything else, meant a huge amount of software opened up to trivial porting, huge amounts more to trivial virtualization, and a vast third-party hardware market became more easily accessible. That definitely helped offset the old Mac software that ultimately didn't make it, even more so because it's still quite possible to run old Mac software fine: Classic can be emulated, and 10.6 can still be run under virtualization, which in turn grants access to Rosetta even on new Macs; the absolute performance advantage over 12+-year-old systems is significant enough that even with the overhead it's still fine.

Basically, there are a lot of subtle day-to-day advantages that come from everything running the same instruction set underneath, or at least being able to stick some sort of translation layer in there. Again, absolutely not saying it's something Apple can't tackle, just that it's a big challenge, and I think it's bigger now than it was any time previously. Of course, Apple too is bigger now than any time previously! They're not infallible though, and I hope they get the balance right here.


Would dual chips make sense? A bit like their dual graphics: run most of the OS and anything compatible on one, and have another, low-power x86 core take over transparently on demand (like when running an x86-only program)?


Aside from performance, a huge reason to abandon Intel is security. I don't see Apple keeping an Intel chip if they switch to the A-X series as this would defeat much of the point.

Anyways, Apple has never been one for smooth transitions. Their history is dotted with big, bold changes. If they kept x86 they would slow the adoption of their new architecture. Apple will likely take a "take it or leave it" attitude like they did with the CD drive and headphone jack.


The company that didn’t even keep a USB-A port on its first laptop with USB-C ports will absolutely never keep an x86 CPU in its first ARM laptop.


Even if dual chips were possible, they would probably slow down developer adoption.


I don't think they will. They have already expressed that the iPad is their vision of the future of computing. A completely locked-down platform allows Apple to control the entire computing life cycle and achieve results unparalleled elsewhere, whilst also making their product incredibly sticky.


I think Apple can and will justify it. If you think back to Steve's analogy about cars and trucks, there's still (obviously) a very healthy market for trucks.

Despite their ongoing efforts to make the iPad more capable, I think and hope they'll recognize the value in keeping it simple enough for anyone to use, and thus having a separate macOS experience with more tools/flexibility in a laptop/desktop form factor.


Apple doesn't really care much about backwards compatibility. I'm sure they will have an x86 emulator, but it probably won't be very fast (read: useless for games) and will probably be dropped entirely a few years down the line.


>Apple doesn't really care much about backwards compatibility

Depends on what you mean by "much". They have made good backwards compatibility an important part of every single architecture transition so far, and on the Mac there were good 3-4 year official transitions at least (the 68k emulator still ran under Blue Box/Classic Environment, so it lasted through 10.4 Tiger; Rosetta lasted through 10.6 Snow Leopard). And that's just official; in practice there have continued to be longer-lasting options.

It's certainly not the degree that Microsoft has traditionally cared, but it's not at all been blown off either.


This is what I'm waiting for. I can't find a reason to upgrade to the current line of MacBooks (Pro, Air, etc.). Better performance and better battery life than Intel, and I would no longer be funding Intel's lazy ways.


Bloomberg had a report earlier this year saying they'd have ARM Macs in 2020, which makes sense given that it lines up with the work they're doing on Marzipan now [1]. They're also making gains in the legal fight to crack the Qualcomm business model, so ARM Macs could get cellular at the same time [2][3].

[1] https://www.bloomberg.com/news/articles/2018-04-02/apple-is-...

[2] https://www.wsj.com/articles/qualcomm-suffers-setback-in-ant...

[3] https://www.reuters.com/article/us-apple-qualcomm/apple-not-...


I don't see the point of an A-X chip in a laptop. You aren't taking photos on your laptop, so why does it need the image signal processor? There's no FaceID (yet, maybe it's coming) on laptops, so we don't need the ML chip.

When you strip away all the stuff a laptop doesn't need, you're left with... an x86 chip!


> When you strip away all the stuff a laptop doesn't need, you're left with... an x86 chip!

You're left with an ARM chip


The ISP is also used for video, as in the front camera used for video meetings.

FaceID is not solely dependent on ML; it's also managed by the Secure Enclave co-processor, which is also used for Touch ID, already available on Macs. ML helps to reduce false positives.

Apple's T2 chip is an ARM-based processor that's already in almost all of the newest Macs. It's used as a storage controller (which allows Apple to encrypt the drive very fast and transparently), for Secure Enclave processing (Touch ID on the MBA), for Siri processing, and more. Every year, more and more of the processing moves to Apple's T-series co-processor.

Apple's custom silicon lets them integrate software and hardware at a deeper level. Intel develops CPUs for the mass market; Apple develops for their own customers only.
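
To make "Secure Enclave processing" concrete, here's a minimal sketch using the Security framework (error handling trimmed; the tag string is made up) that creates a key which never leaves the enclave; the app only ever holds a handle to it:

    import Foundation
    import Security

    // Minimal sketch: generate a P-256 private key inside the Secure
    // Enclave. The key material stays in the enclave; the app gets an
    // opaque reference it can use for signing or decryption.
    var acError: Unmanaged<CFError>?
    let access = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        .privateKeyUsage,
        &acError)!

    let attributes: [String: Any] = [
        kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
        kSecAttrKeySizeInBits as String: 256,
        kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave,
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String: true,
            kSecAttrApplicationTag as String: Data("com.example.key".utf8),
            kSecAttrAccessControl as String: access
        ]
    ]

    var keyError: Unmanaged<CFError>?
    let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &keyError)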


> There's no FaceID (yet, maybe it's coming) on laptops, so we don't need the ML chip.

With Apple's focus on on-device ML, I would guess this will be the first part of the A-series trifecta (CPU, GPU, Neural) to be included on a Mac and exposed to developers. I can imagine a bunch of possibilities for such a chip, not just FaceID.
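
If they do expose it, the plumbing already exists on iOS in the form of Core ML and Vision, which can dispatch models to the Neural Engine. A rough sketch ("MyClassifier" here is a hypothetical Xcode-generated model class, not a real one):

    import CoreML
    import Vision

    // Sketch: classify an image with Core ML + Vision. On supported
    // hardware, Core ML can run the model on the Neural Engine
    // automatically; no special-casing needed in app code.
    func classify(_ image: CGImage) throws {
        let vnModel = try VNCoreMLModel(for: MyClassifier().model)
        let request = VNCoreMLRequest(model: vnModel) { request, _ in
            if let top = (request.results as? [VNClassificationObservation])?.first {
                print("\(top.identifier): \(top.confidence)")
            }
        }
        try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    }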


Exactly. Why couldn't developers/hackers start using the ML chip via Python etc. to do machine learning, facial recognition from remote cameras, and so on?



I found Rob Beschizza's thought [1] about the new iPad Pro insightful: "a very powerful machine, handicapped by iOS having no workflow".

Because of the sandboxed nature of iOS and the current immaturity of the Files app, exchanging files between apps on the iPad is hard and inconsistent. As a result, you're still mostly stuck doing everything within a single app. The workflow is still fragmented.

[1] https://boingboing.net/2018/11/06/ipad-pro-deemed-amazing-fo...
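
For what it's worth, the sanctioned way around the sandbox today is the document picker, which copies a file across the boundary rather than letting two apps share one in place. A rough sketch of the UIKit flow (the class name is made up; the APIs are the real iOS 11-era ones):

    import UIKit
    import MobileCoreServices

    // Sketch: importing a PDF from another app's storage through the
    // Files infrastructure. The system hands back a copy inside this
    // app's own sandbox, which is exactly the fragmentation complained
    // about above.
    class NotesViewController: UIViewController, UIDocumentPickerDelegate {
        func importDocument() {
            let picker = UIDocumentPickerViewController(
                documentTypes: [kUTTypePDF as String], in: .import)
            picker.delegate = self
            present(picker, animated: true)
        }

        func documentPicker(_ controller: UIDocumentPickerViewController,
                            didPickDocumentsAt urls: [URL]) {
            print("Imported copies at:", urls)
        }
    }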


That doesn't matter if Apple's goal here is to get "desktop class" A-series CPUs manufactured and shipped at scale... as a precursor to an A-series MacBook.


So basically the same kind of argument that has plagued all high-end Android tablets: good hardware held back by lacking software.

In other words: if you want a PC, get a PC.


Exactly. Except all the discussion about this iPad's hardware power is about how it's comparable to a PC laptop.

But from a software usability standpoint, it's not.


Is it really true that this A12X is comparable to last year's MacBook Pro?

On that MacBook Pro I could run several VMware sessions running Windows and Linux (I've run 3 at the same time in the past). I can run HandBrake encoding videos across cores while still browsing in Chrome with 4 windows open, each on a different profile, each with 5 to 20 tabs; have 4 terminal windows open, at least one of them serving a dev webpage; and run VS Code, Unity, Visual Studio, and other stuff all at once. I've also done things like compile Chrome from source, run Xcode, and run 2-3 iOS simulators.

I get that an A12X can't do those exact tasks, as it's not the same instruction set, but could it do the equivalent and get similar perf?

That's amazing if true. An iPad Pro weighs 1/4 as much as my MBP (2014). My MBP's fans spin like crazy when running a high-intensity app and the case gets too hot to touch.

I'd love to believe a machine that has no fans, doesn't get hot, and weighs 1/4 as much could actually have the same or more perf, but when I actually use an iPad it rarely feels as fast, and given that it doesn't multitask well there's no way for me to check that perf is really comparable in real-world use cases.

Anyone have any insight? Is it just because the chip was designed to be more efficient that it can match or exceed the i7 in my MBP? Should Amazon be filling their AWS racks with A12X-based machines that get the same perf at much less heat and power? (Yeah, I know they can't buy A12X chips, but still.) Don't iPhones and, say, top Samsung phones generally show similar perf?


It is comparable at running Geekbench 4, yes. And some web benchmarks look pretty comparable too.

But your workload doesn't seem to include those things, so it's hard to say. Critically, the A12X is unlikely to have hardware virtualization support, so your VMware use case would be slow even if it weren't doing any binary translation.

Also, your MBP's fans spin because it's trying to achieve higher sustained performance. Typically mobile devices will instead just thermal throttle hard. Like, lose half their performance hard. How well can the iPad Pro sustain its performance? That's a real big question.

Re AWS racks: No, they can't. The A12X in the server world would be a joke. It'd be competing against things like AMD's Rome, which is 64 cores / 128 threads with (and this part is critical) up to 4TB of RAM and 128 PCIe lanes. Even if the A12X could compete on raw CPU throughput, it can't compete on I/O, virtualization, etc. The A12X is also going to be pulling a lot more power than you might expect; it's not that much more power efficient.


I really wish Apple would stick an A-series chip in a MacBook Pro with an Intel co-processor.

Best of both worlds.

Don't need x86? Don't spin up the Intel chip. Doing something that requires x86? Spin it up.

This way you get software compatibility along with the power sipping of the ARM CPU.


Been there, done that.

I had a PowerMac 6100/60 with the DOS Compatibility card that had its own sound chip, video controller and optionally RAM.

My 6100/60 had 24MB of RAM and the card had 32MB of RAM.

Before that, I had an LC II with an Apple //e card.

I doubt that modern Apple would ship a hybrid x86/Arm laptop though.


This is nuts. The iPad Pro is so massively ahead of the game, it's even hard to believe. It is faster than the MBP 2017 and ALMOST as fast as the MBP 2018. We're talking about a device that is 5.9mm thick and battery-powered!

Now here's my prediction: Apple does not really want to build an ARM-powered MBP. Instead, they will eventually allow iPads to dual-boot into iOS and/or macOS.

Call me crazy, but this would be huge. Of course, Apple would still build traditional laptops, maybe even with ARM processors in them, but only as a byproduct of their iPhone / iPad product line.


This will never, ever happen.

You can't use macOS with your finger. There are millions of places across the OS and applications where the hit target is too small for a finger. Just compare the keys on the iOS keyboard to the traffic-light buttons on macOS.


Mouse support :)


> This is nuts. The iPad Pro is so massively ahead of the game, it's even hard to believe. It is faster than the MBP 2017 and ALMOST as fast as the MBP 2018. We're talking a device that is 5.9mm thick and is battery-powered!

The MBPs are also battery-powered and thin?

But you seem to be taking geekbench 4 here as gospel. I'd take that with a grain of salt. A really, really big grain of salt.

Even with that said, I'm not seeing any MBP 2017 results in the article...?


Tim Cook himself flat out said they have no intention of ever merging macOS and iOS or converging the two in any way.


Yeah, and Apple never turns around in a couple years and does what the CEO said they had no intention of doing. (Big phones, tablet stylus, ...)


Merging does not equal dual boot. They'd still be separate pieces of software, but the ability to run macOS on iPad hardware would be huge.


I just don't see Apple ever adding a feature as convoluted as dual-boot. That's just about the least Apple software feature they could add.


I would have figured Anand would have given this interview to AnandTech.


I guess his involvement may have made that look like a conflict of interest? AnandTech already has a reputation for Apple favoritism (deserved or not, I can't say).


It would have been interesting for Ars to ask them how their GPU differs from the IMGTec/PowerVR IP they licensed. Besides adding more cores, what did they change?

By calling it a custom GPU they give the impression that it's a from-scratch in-house design, but that's unlikely to be the case.


The GPU they were licensing from Imagination was already very tailored to their needs. Apple already had a strong say in the architecture, the driver, the compiler, and the implementation details of the GPU they were buying.

Moving to a "custom GPU" was basically Apple saying "Ok, thanks, we are taking over from here".


I've been considering getting an iPad w/ Pencil for over a year now as a note-taking and doodling apparatus (too often I'll scribble down a system diagram, realise I've missed a bit, and have to start again thanks to the immutability of pen and paper). But beyond that, I don't think I'd have a use for one.

All this power sounds great, but I really don't know what else I'd do with it, beyond surfing an ad-riddled internet on my couch.

If any developer has a life-changing daily use case for their iPad, I'd love to hear it.


Content blockers allow for ad-free(ish) browsing on iOS. I have found the iPad Pro with Pencil an excellent tool for research (reading and annotating papers), and with Pythonista, Python-based programming can be done, although more as a hobby.



