Ever since I read The Innovator's Dilemma around 2006 or so, I've tried to watch for other examples of disruption happening in the tech industry, including laptops (disrupting PCs), iPhone/Android phones (disrupting Nokia/RIM smartphones), iOS/Android (disrupting Windows/Mac OS), and a few others.
But while you could still find something to argue about in some of those cases, especially where the "fall off a cliff" hasn't happened yet for those companies (disruption takes a few years before it's obvious to everyone, including the company being disrupted), I think the ARM vs Intel/x86 one has been by far the most obvious, and what I'd consider a "by-the-book" disruption. If Clayton Christensen decides to rewrite the book in 2020, he'll probably include the ARM vs Intel case study.
What will kill Intel is probably not any technical advantage that ARM has or will have, but the pricing advantage. It's irrelevant whether Intel can make a $20 chip that is just as good as an ARM one. Intel made good ARM chips a decade ago, too; the problem is they couldn't live off that, and they wouldn't be able to survive off $20 Atom chips either. The "cost structure" of the company is built to support much higher-margin chips.
They sell 120 mm2 Core chips for $200. But as the article says, very soon any type of "Core" chip will overshoot most consumers. It has already overshot plenty of them: look at how many people are using iPads, Android tablets, or smartphones and think the performance is more than enough. In fact, as we've seen in some of the comments about Tegra 4 here, they think even these ARM chips are "more than enough" performance-wise.
That means Intel is destined to compete more and more not against other $200 chips, but against other $20 chips, in the consumer market. So even if they are actually able to compete at that level from a technical point of view, they are fighting a game they can't win. They are fighting by ARM's rules.
Just like The Innovator's Dilemma says, they will predictably move "up-market" into servers and supercomputers, chasing higher profits as ARM forces them to fight with cheaper chips in the consumer market. But as we know, ARM is already very serious about the server market, and we'll eventually see what Nvidia intends to do in the supercomputer market with ARM (Project Denver/Boulder).
As for Microsoft, which is directly affected by Intel/x86's fate: Apple and Google would be smart to accelerate ARM's takeover of Intel's markets. If Microsoft can't use its legacy apps as an advantage against iOS and Android, it will have to start from scratch in the ARM ecosystem, way behind both of them. Apple could do it by using future generations of its own custom-designed ARM CPUs in MacBooks, and Google by focusing more on ARM-based Chromebooks and Google TVs and by ignoring Intel in the mobile market. Linux could take advantage of this too, because most of its legacy apps already work on ARM by default.
The biggest difference is that Intel has consulted closely with Christensen, and is not afraid to cannibalize its own market to retain dominance. The Celeron came directly from those consultations with Christensen. It significantly dented Intel's profits temporarily, but it was the beginning of the end for AMD.
And certainly price is a very significant factor. But remember that ARM licensees ship an order of magnitude more chips than Intel does. So if Intel is successful there, they can make it up on volume, at least to a degree.
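How far volume can actually stretch is worth a back-of-envelope check. Here's a minimal sketch, with purely illustrative numbers (not Intel's or ARM's real financials), of how many $20 chips it would take to replace the gross profit from high-margin Core parts:

```python
# Back-of-envelope: gross profit = units * (ASP - unit cost).
# All numbers are illustrative assumptions, not real financials.

def gross_profit(units, asp, unit_cost):
    return units * (asp - unit_cost)

# Hypothetical "Core-class" business: ~300M units/year at a $200 ASP, ~$60 unit cost.
core = gross_profit(units=300e6, asp=200, unit_cost=60)

# Hypothetical $20 mobile SoC with an assumed $12 unit cost -> $8 gross per chip.
units_needed = core / (20 - 12)

print(f"Core-class gross profit: ${core / 1e9:.0f}B/year")
print(f"$20 chips needed to match it: {units_needed / 1e9:.1f}B/year")
```

Under these assumptions, "making it up on volume" means shipping several billion chips a year, which is why it only works "to a degree."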
I don't recall the Celeron being a major problem for AMD. The things that did hurt were:
a) Intel's effectiveness at preventing AMD SKUs from hitting the market
b) The Core 2 family from Intel
c) AMD insisting on shipping "native" dual/quad-core dies - there wasn't any advantage to the end user, and I would imagine the yields were worse
I know, but that was back in Andy Grove's time. I don't think Paul Otellini ever understood the innovator's dilemma that well. In a sense, it does seem like they get it now and are trying to compete with ARM, but I'm not so sure this came from within the company. I think they were pressured into it by stakeholders and the media a few years ago.
But again, even if they succeed in making chips competitive with ARM's, that doesn't equal market success in mobile, and it doesn't mean they will survive unless they take serious steps to adapt to a world where they are just one of several companies making chips for devices, where they might not even hold a big share of that market, and where they make low-margin chips. The bottom line is they need to start firing people soon, restructuring salaries, and so on. I think this is why Paul Otellini left: he didn't want to be the one to do that and be blamed for it.
Intel's 120 mm2 Core chip is built on a smaller process than the ARM-based competition, meaning there are far more transistors on that $200 chip than on the one you are comparing it to. Also keep in mind that Intel doesn't need to charge $200 to make a profit. The current Atom chips are price-competitive with ARM-based alternatives. The Atom parts aren't quite up to par with an A15-based part yet, but I think it'd be foolish to count Intel out.
The primary cost of an SoC is manufacturing. A process advantage means access to cheaper transistors with better performance and power characteristics. The easiest way to improve the ratio of performance to anything else in microprocessors has always been to make it smaller. Far too many words have been wasted on the role of instruction sets and architectures; those things matter, but they're the easy part. The hard part is getting a meaningful advantage on the manufacturing side, which is what Intel has. This is precisely why AMD is dying: they can't even undercut Intel, because GlobalFoundries is so far behind Intel that they physically can't produce an equivalent product for less, despite Intel's ~60% margins.
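To make the "cheaper transistors" point concrete, here's a minimal sketch of a standard die-cost estimate. All the numbers (wafer prices, defect density) are illustrative assumptions, since real foundry pricing and yields aren't public; the point is only how strongly die area drives cost:

```python
import math

# Rough die-cost model: cost per die = wafer cost / (dies per wafer * yield).
# Wafer costs and defect density below are illustrative assumptions.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Standard approximation: wafer area / die area, minus edge losses.
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_die(die_area_mm2, wafer_cost, defects_per_cm2=0.2):
    # Simple Poisson yield model: yield = exp(-defect density * die area).
    yield_ = math.exp(-defects_per_cm2 * die_area_mm2 / 100)
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_)

# Same design on a trailing process vs. a full-node shrink (~half the area),
# even assuming the leading-edge wafer costs more:
print(cost_per_die(die_area_mm2=120, wafer_cost=5000))  # ~$12 per die
print(cost_per_die(die_area_mm2=60, wafer_cost=6000))   # ~$6 per die
```

Even with the pricier leading-edge wafer, the shrunk die comes out at roughly half the cost, and the yield advantage compounds as dies get smaller; that gap is the manufacturing lead being described here, and it exists independently of the instruction set.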
I think what you will see is Intel getting more aggressive in the mobile space in the next couple of years, because they are going to want to ensure that TSMC doesn't get a chance to catch up. TSMC, not ARM, is the real key to anyone threatening Intel.
Great analysis. Yes, cost structure; yes, ARM is the incumbent with all those forces in their favour, such as being tuned to the upgrade rhythm of the phone market.
A way out for Intel is their world-leading foundries, with a process shrink a generation ahead of everyone else. It's been suggested they manufacture ARM SoCs and sell them at a premium. But there isn't really a premium market... except for Apple, with its massive margins. And Apple is feeling the heat from competitors hot on its heels. Therefore: Intel fabs Apple's chips. Intel gets a premium. Apple gets a truly unmatchable lead. It's sad for Intel, but Andy Grove has a quote on the cover of The Innovator's Dilemma. They know the stakes.
The nice thing for consumers would be a 2x-faster, or 2x-battery-life, or half-weight iPhone/iPad next March, instead of in 1.5 years.
BTW, re: Tegra 4/overshoot - in the next generation, when silicon is cheap enough for octa-core, the surplus won't go into more performance (we don't need the power and can't utilise multicore anyway); it will instead lead to the next smaller form factor. But smaller than a phone is hard to use, for both I and O. A candidate solution is VR glasses - because of size.
> But smaller than a phone is hard to use, for both I and O. A candidate solution is VR glasses - because of size.
If we could get wireless HDMI nailed down and standardized (or wireless DisplayPort, or some display standard over some wireless frequency band), I can easily see computers going so far as to become solar-powered interconnected nodes: instead of lugging hardware around, you just link up to the nearest node and utilize a network of tiny quarter-sized chips, each as capable as a modern dual-core A9 or some such, running off solar.
I wonder if Ubuntu has a chance in this setup. In a real productivity-driven environment, a single-app-in-fullscreen OS is just absolutely insufficient for the purpose. But Ubuntu already runs on the Nexus 7 (albeit really slowly, because Compiz and Unity are... lackluster).
I don't think the Unity desktop will make it, but I definitely see some windowed environment inserting itself into the gap between Android tablets and Windows laptops on high-end ARM chips. And unlike Windows, the GNU core already has a ton of software written for it, and thanks to being open source and compiled with GCC, that software rarely has issues beyond performance when running on ARM.
I only say Ubuntu because Canonical seems to be the only market force trying to push a GNU OS commercially. Red Hat seems content to let Fedora act as the playground for Enterprise, and Arch/Mint/Gentoo/Slackware etc. don't have the design goals (ease of use for a completely new user) or the infrastructure (Debian and its ultra-slow release schedule wouldn't fly).
(The "cost structure" of the company is built to support much higher margin chips.")
That is where you got it completely wrong. Intel can produce ARM-based SoCs that earn nearly the same margin as the CPUs they are currently selling.
Not to mention you keep referring to x86 and Intel as if they were the same thing, and as if that's the way they'll stay for the foreseeable future. I could literally bet anything that Intel won't die, simply because Intel could still be the best fab player you could ever imagine. In terms of state-of-the-art fabs, they beat TSMC, UMC, GF, and Samsung combined! And Intel isn't dumb either: they have the best of the best fab engineers, and the resources and R&D being invested now for the coming 3-5 years.
So Intel won't die.
x86? That depends. If you look at the die shot of an SoC, you will notice the CPU is taking up less and less of the die area. It used to be 50+%; now it is less than 30%. The CPU, or the ISA, is becoming less important. You will need a combination of CPU, GPU, wireless, I/O, and other things to succeed.
> But as the article says, very soon any type of "Core" chip will overshoot most consumers.
Your claim is essentially "Most people will never need today's high-end chips, let alone anything more powerful." This could have been equally well said in 1998. How do you know you're not as wrong making that claim now, as you would have been making that claim then? What's different?
Today's computers are powerful enough to comfortably run today's software. Tomorrow's computers will have a lot more power than today's software needs; but that's irrelevant, because they'll be running tomorrow's software instead.
To lay out my case in a little more detail:
"As hardware resources increase, software will bloat until it consumes them all." This is probably somebody's law, but I don't know who off the top of my head.
You don't really need more than ~300 MHz and 128 MB of RAM to do what the vast majority of users do: word processing, email, and displaying web pages.
Usage patterns also change as the amount of computing power available to you increases. For example, I usually have a large number of tabs open in my web browser; I probably wouldn't use the browser this way if I had much less memory.
Some software is just bloated. My Windows box has dozens of different auto-updaters that run on every boot. Steam does literally hundreds of megabytes of I/O doing -- something, Turing only knows what.
Of course, all the latest UIs have all kinds of resource-intensive 3D effects, run at unnecessarily high resolutions, use antialiased path rendering for fonts instead of good old-fashioned bit blitting, et cetera.
The point is that, as the standard hardware improves, OSes and applications will add marginally useful features to take advantage of it. Users will learn new usage patterns that are marginally more productive but require much better hardware. As standard software's minimum requirements rise, people buy new hardware to keep up.
This is not a novel idea; it's been the story of computing for decades, and a trend that anyone who's been paying any attention at all to this industry has surely noticed.
Except that's exactly what's not going to happen - the shift is decidedly away from heavy desktop apps toward lightweight clients, with more of the work done on the server backend. That causes a dramatic split in processing needs: clients get thin (their processing requirements flatline) and servers get fat (all the work is now server-side). What used to be a "unified" market suddenly isn't any more.
> "Just like Innovator's Dilemma says, they will predictably move "up-market" in servers and supercomputers, trying to chase higher-profits as ARM is forcing them to fight with cheaper chips in the consumer market."
Which way is up? Intel's been moving down in terms of per-core wattage since 2005, putting them closer to direct competition with ARM. Anybody can glue together a bunch of cores to get high theoretical performance, but it's Intel's single-threaded performance lead that is their biggest architectural advantage.
In this case "up" is defined by potential profit/chip. There are structural and easy to miss business reasons why it is very hard for any company to successfully move into markets where the profit margins are below what they are structured to expect, and it is very easy for a company to move into markets where the profit margins are above what they are structured to expect.
Intel has to improve their power efficiency because a major market for them - server chips - is full of people who want to spend less on electricity for the same computation. Despite this push, they have made essentially no inroads into the (for them) unprofitable mobile market. By contrast, ARM, which already has the required power ratios, has every economic incentive in the world to move into the server market. Unless Intel can offer a power advantage good enough to offset the higher cost of their chips, ARM will eventually succeed.
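To see why the electricity bill can dominate the chip price in this market, here's a minimal lifetime-cost sketch. The prices, wattages, and the crucial assumption that both parts deliver the same throughput are all illustrative, not measurements:

```python
# Lifetime cost of a server CPU ~= purchase price + electricity over its service life.
# Illustrative assumptions: 4-year life, $0.10/kWh, PUE of 1.5 for cooling overhead.

HOURS = 4 * 365 * 24   # service life in hours
KWH_PRICE = 0.10       # dollars per kWh
PUE = 1.5              # datacenter overhead multiplier

def lifetime_cost(chip_price, avg_watts):
    energy_kwh = avg_watts * HOURS / 1000 * PUE
    return chip_price + energy_kwh * KWH_PRICE

# Hypothetical high-priced x86 part vs. a cheaper, lower-power ARM part,
# assuming (the big "if") they deliver equivalent throughput.
print(lifetime_cost(chip_price=600, avg_watts=90))  # ~ $1073
print(lifetime_cost(chip_price=150, avg_watts=30))  # ~ $308
```

Over a deployment like this, the power draw is worth roughly as much as the sticker price, so whichever side offers better performance per watt at equivalent throughput wins the total-cost argument; that's exactly the race between Intel pushing its wattage down and ARM pushing its per-core performance up.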