How Intel missed the smartphone market (mondaynote.com)
292 points by JeremyMorgan on May 16, 2016 | 195 comments



Expect the same fate for many more tech companies.

Although not many people realize it, tech is now a _mature_ sector of the U.S. economy - it grows at about 3% a year, pretty much in line with GDP. But the perception is different - just ask your friends - and the reason is that there are highly visible _pockets of growth_, like Uber/SpaceX/name-your-favorite-unicorn, and people extrapolate their growth onto the rest of the industry.

Now, what happens with many tech companies is that they have a product with exceptional margins and a double-digit growth rate, and it makes sense to invest all available resources into it - better from an ROIC perspective - while ignoring all the alternatives, which lack either the volume or the margins to look attractive. This inevitably leads to problems once your exceptional product stops growing and you realize you have barely invested in anything else.

Much like Intel with x86 vs. ARM chips, or Qualcomm, or EMC, or RIM... - the list goes on and on.

Even with Google, most of their resources are invested in the search/ads business, so when that stops growing - or, rather, once they've taken all the share they can from TV and start growing at GDP rate - they will be in the same boat.

Edit: typos.


What counts as tech?

Is AirBnB tech, or hospitality? Is Uber tech, or transportation? Is the Apple Watch tech, or a watch? Is SpaceX tech, or aerospace? Is Tesla tech or automotive? Is Instacart tech or groceries? Are Twitter/Medium/Wordpress tech or media?

Not a rhetorical question, I'd actually like to know what's included in the 3% growth figure. I could believe it if tech is defined as "companies that sell computer hardware or software", but it seems very low if tech is defined as "companies that provide a service using software or hardware that wasn't possible a decade ago."


That's a GREAT question. Information Technology is a $5.6T market, and telecommunications services are another $1.8T [1]. Uber and AirBnB, which I believe ARE tech companies, are worth under $90B combined, so ~2% of the IT sector.

SpaceX is aerospace: ~$12B valuation vs ~$600B Aerospace & Defense industry size, and Tesla is automotive, $28B vs ~$750B Auto + components market.
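A quick back-of-the-envelope check on those ratios (a rough sketch in Python, using only the approximate sector sizes and valuations quoted above - none of these are precise figures):

    # Rough shares, using the approximate figures quoted in this comment.
    sectors = {
        "IT (ex-telecom)":   {"size_bn": 5600, "unicorns_bn": 90},  # Uber + AirBnB
        "Aerospace/Defense": {"size_bn": 600,  "unicorns_bn": 12},  # SpaceX
        "Auto + components": {"size_bn": 750,  "unicorns_bn": 28},  # Tesla
    }
    for name, s in sectors.items():
        print(f"{name}: ~{100.0 * s['unicorns_bn'] / s['size_bn']:.1f}% of the sector")
    # -> roughly 1.6%, 2.0%, 3.7%: big companies, but small slices of their industries.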

Note that Uber, AirBnB, SpaceX and Tesla are probably the biggest unicorns out there.

But you are raising an important point - many industries will be redefined as new competitors use IT much more heavily. Still, that growth mostly accrues to those existing industries, and I believe the growth of the IT sector itself is now limited.

[1] https://eresearch.fidelity.com/eresearch/markets_sectors/sec...

edit: typos, minor clarifications.


Uber, Airbnb, SpaceX, Tesla are not unicorns any more than Google, Facebook, Netflix, Priceline and Amazon are.

Uber is seven years old, Airbnb is coming up on eight years old, SpaceX is 14 years old, Tesla is 13 years old. And they're all very substantial enterprises at this point.

I'd suggest that companies of that size and operational maturity should be considered to be well beyond the semi-early stage billion dollar valuation companies that have tended to get the unicorn tag. They were unicorns, now they're just like Netflix or any other established company.


"Unicorn" generally refers to pre-IPO billion+ dollar companies. Tesla is not a unicorn, since it has already IPOed; Uber and Airbnb are.


Here is a fairly comprehensive list (it's surprisingly long). https://www.cbinsights.com/research-unicorn-companies


> tech is a _mature_ sector of the U.S. economy

I disagree. What I see is a boom/bust cycle as major technological shifts lead to growth and then consolidation:

Mainframes are invented, an ecosystem of small players flourishes, the market grows an order of magnitude or two, the technology finds product/market fit, and the market consolidates around a single major player (IBM). Defeatism settles in amongst entrepreneurs as more and more opportunities are swallowed up by Big Blue.

The PC is invented, an ecosystem of small companies flourishes, the market grows an order of magnitude or two, and then consolidates around a single player (Microsoft). Defeatism settles in amongst entrepreneurs as more and more opportunities are swallowed up by Embrace and Extend.

The internet is invented, an ecosystem of small companies flourishes, the market grows an order of magnitude or two, and then consolidates around... well, now there are more players. Now there's Google and Facebook and Apple. Defeatism settles in among device inventors as the iPhone/Android hegemony seems unassailable.

Some people think VR (and AI) will be another land grab that fuels an order-of-magnitude jump in the tech market. I don't think that's crazy. Personally, I think there's a massive economic explosion waiting to happen in making programming tools more accessible to average people (Wordpress/Squarespace/Salesforce/etc. are beating on that door, but no one knows how to get through it yet).

So I guess what I'm saying is I agree with you, but I think this is a trough. There are more peaks ahead.


> Personally, I think there's a massive economic explosion waiting to happen in making programming tools more accessible to average people (Wordpress/Squarespace/Salesforce/etc are beating on that door, but no one knows how to get through it yet).

If you look back, that has been the hope ever since personal computers were introduced. It never seems to pan out: BASIC on micros didn't do it, Excel/FoxPro/Access didn't do it, HTML didn't do it (RIP Geocities).


Gordon Bell analyzed [0] computing markets decades ago and found that a market forms around a new computing class roughly every 10 years. If we mark today's smartphone as the most recent market, we should expect another major platform market to open up very soon. My guess is that it will be a hybrid platform, combining resources over the network.

[0] https://en.m.wikipedia.org/wiki/Bell%27s_law_of_computer_cla...

Here is a keynote Gordon gave discussing some of his analyses: https://youtu.be/Z-4X6dkESEk


> Personally, I think there's a massive economic explosion waiting to happen in making programming tools more accessible to average people

It's called Excel.


A purely functional reactive programming environment with a built-in, schemaless, eventually consistent (with OneDrive!) data store. Excel ain't bad.
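For anyone who hasn't looked at a spreadsheet that way before, here's a toy sketch of the same model in Python (a made-up Cell class, purely for illustration, not anything Excel actually exposes): cells are either constants or pure functions of other cells, and reads always reflect the current inputs, the way editing A1 ripples through dependent formulas.

    # Toy "spreadsheet" cells: a cell is either a constant or a pure function
    # of other cells; reading a formula cell recomputes it from current inputs,
    # roughly the way Excel recalculates dependents when you edit a cell.
    class Cell:
        def __init__(self, value=None, formula=None):
            self.value = value        # constant cell
            self.formula = formula    # formula cell: zero-arg callable over other cells

        def get(self):
            return self.formula() if self.formula else self.value

        def set(self, value):
            self.value = value

    # A1 = 2, A2 = 3, A3 = A1 + A2, A4 = A3 * 10
    a1, a2 = Cell(2), Cell(3)
    a3 = Cell(formula=lambda: a1.get() + a2.get())
    a4 = Cell(formula=lambda: a3.get() * 10)

    print(a4.get())   # 50
    a1.set(7)         # edit one input...
    print(a4.get())   # ...and everything downstream reflects it: 100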


Yes, sadly the state of the art in End User Programming hasn't advanced much since the invention of the spreadsheet. Spreadsheets will stay with us, but I don't believe they are the final form of programming.

http://www.miramontes.com/writing/spreadsheet-eup/ (1990)


That's not entirely true.

Intel has invested in plenty of things that haven't worked out, for instance, Itanic, Ultrabooks, Realsense, etc.

Also companies like Intel and Microsoft have bought into the Clayton Christensen theory of "disruption" to the point where they really are throwing their existing customers under the bus. That is, Windows 8 was perceived as a tablet operating system, Intel only sells high-end chips under the Xeon brand, etc.

Had Intel stuck more piggishly to making better PC's, we might have mainstream water-cooled desktops running at 5Ghz+ and more reasons for people to upgrade.


According to Wikipedia[1], Intel set aside "$300m over 3-4 years" to invest in Ultrabooks, so roughly $100m/year. Contrast that with the ~$8B they spend on R&D each year[2]. So it was a relatively small bet for them.
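To put that in proportion (trivial arithmetic on the figures above, as a quick Python sketch):

    ultrabook_per_year_m = 100    # ~$300m spread over 3-4 years, in $M/year
    total_rnd_per_year_m = 8000   # ~$8B annual R&D, in $M/year
    print(f"Ultrabook bet: ~{100 * ultrabook_per_year_m / total_rnd_per_year_m:.1f}% of annual R&D")
    # -> ~1.2%, a rounding error next to the core roadmap.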

I'm not saying tech companies never invest in anything outside of their core products. It's just that those projects are often given low internal priority precisely because their (presumed) impact on the bottom line will be small.

[1] https://en.m.wikipedia.org/wiki/Ultrabook [2] https://ycharts.com/companies/INTC/r_and_d_expense

edit: typos


I'll just note that a startup of 10 - 100 people that raised $300M over 3-4 years would be a big deal.

Innovation isn't about ideas, it's about finding ways to execute against those ideas in a way that others haven't or ideally can't.


Sure, but the important difference is that a 10-100 person startup that raised $300m behaves very differently from an internal department of a major tech firm that has decided to invest in a side project: authority to make decisions, willingness to take on risk, speed of innovation - everything is different between the two.


Cisco and its spin-out approach come to mind.

They probably got this right.

Amazon is another company that is innovating disruptively despite its size.


I think your definition of innovation makes sense only in the context of corporate or SV innovation.

In a more general context, innovation takes on the other meaning.


And small bets rarely materialize into anything useful. They end up being great marketing, and worth the cost in the marketing spend so others think you are being innovative while you continue to do the same stuff you always have.


> Also companies like Intel and Microsoft have bought into the Clayton Christensen theory of "disruption" to the point where they really are throwing their existing customers under the bus.

They could've fooled me. Clayton's theory wouldn't recommend changing Windows to adapt to the mobile form factor. It also wouldn't recommend Intel investing what are now probably over $10 billion in Atom and in making x86 work for mobile.

Quite the opposite. It would tell Microsoft to create a new operating system dedicated to those form factors, and it would tell Intel to adopt ARM on its own. Both should also be done in an isolated division, so there are no "conflicts of interest" with the incumbent products, whether that's about product features (like, say, Intel refusing to make Atom too strong so it doesn't encroach on Core territory) or about setting up expensive (and unsustainable) infrastructure to compete against leaner competitors.

I would also say both Microsoft and Intel failed in these markets because they were late to even try to become players in them, which is what disruption theory always warns incumbents about. It may be difficult to remember now, but even Windows 8 was considered "late" for the mobile market. So was Windows Phone 7 compared to iOS and Android. The same goes for Atom "catching up" to ARM chips in performance/low power (though not in price - a typical failure of a disrupted incumbent). If the "Wintel" monopoly weren't so strong, you'd already see Celerons and Pentiums being replaced by ARM chips in notebooks and hybrids and whatnot, because those are the same chips that failed against ARM in mobile.


This is in line with what Stanford professor John Ousterhout says about the evolution (and necessary death) of software:

"The most important component of evolution is death Or, said another way, it's easier to create a new organism than to change an existing one. Most organisms are highly resistant to change, but when they die it becomes possible for new and improved organisms to take their place. This rule applies to social structures such as corporations as well as biological organisms: very few companies are capable of making significant changes in their culture or business model, so it is good for companies eventually to go out of business, thereby opening space for better companies in the future. Computer software is a counter-example to this rule, with ironic results. Software is incredibly malleable: it can be updated with new versions relatively easily to fix problems and add new features. It is easier to change existing software than to build new software, so software tends to live a long time. To a first approximation, software doesn't die (compare this to the hardware in a computer, which is completely repaced every few years). At the same time, it is difficult to make major structural improvements to software once it has been shipped, so mistakes in early versions of the program often live forever. As a result, software tends to live too long: just good enough to discourage replacement, but slowly rotting away with more and more problems that are hard to fix. I wonder if the overall quality of computer software would improve if there were a way of forcing all software to be replaced after some period of time."

https://www.quora.com/What-are-the-most-profound-life-lesson...


> Had Intel stuck more piggishly to making better PC's, we might have mainstream water-cooled desktops running at 5Ghz+ and more reasons for people to upgrade.

Can you think of what those reasons might be? I can't. My office computer is an i5-3570 and it sits idle most of the time because it's just not that hard to have Outlook and Firefox open. A/V work and AAA gaming drive upgrades now like they always have; what other reasons would there be for Intel to push beyond their current efforts like you suggest?


I think that higher-resolution displays are probably the leading reason behind gamer upgrades today. Pushing 3-4x the pixels is a lot of overhead compared to the decade before UHD. Another real reason is power consumption. There's been a lot of effort put into reducing the power envelope of CPUs, and this investment would have happened even without the gamer segment.

Servers are probably Intel's single biggest market today, and the companies running these miles of racks of servers are demanding more cores and lower per-core power. That goes hand in hand with mobile improvements too. Intel is a perfectly reasonable CPU option for tablets today; within 3-4 years they'll probably have something suitable for phones.

I've always liked the concept of a phone that holds everything and docks into a workstation or a laptop form factor, but is still great as a phone. There's a lot of possibility there. More than that, with services that go beyond Echo, we may see the watch form factor take over from phones, relying on larger systems in your home/office/car to fill the void. Who knows where things will go.


>> we might have mainstream water-cooled desktops running at 5Ghz+ and more reasons for people to upgrade.

> Can you think of what those reasons might be?

Some applications I would want that might need to wait for high-CPU (and some high-memory and/or high-disk) machines to become more mainstream, since I want them all to run locally until FTTH reaches my part of the boonies:

A scanner I can literally wave a piece of paper at, have it process all of the ultra-high speed video frame grabs into a high-dpi scan, then reconstitute the scan into an appropriate LibreOffice document or PDF, with better OCR and better vectorization of raster pictures. Then I auto-file all my invoices, bills, statements, receipts, etc.

A personal digital assistant that transcribes, annotates, indexes and categorizes all of my conversations across all mediums (voice, text, chat, social, etc.). This doesn't need to be ML-fancy at first (except for the voice recognition piece), just lots of memory+cpu+disk to drive intensive, continuous indexing.

A millimeter-accurate 3D positioning system, using differential GPS off of a geodetic mark on my property, with relays of signals to indoors. This would drive my robotic chicken tractor, curb edger, and vacuum cleaner. I could keep a detailed inventory of household items with this, then query the house computer over a voice link the next time I forgot where I set down $thing (some RFID and 3D positioning integration needed here). Outside, it keeps track of exactly where various pipes, outlets, irrigation heads, etc. are located.

A software-defined radio that scans for chatter on interesting bands like police and fire, performs voice recognition on them, and pings me when it hears keywords about my neighborhood, street, nearby streets, or region when combined with other keywords (like "tornado").

A house computer that can tell whether or not I'm in the house or not, if I'm asleep, getting ready to sleep, or still undergoing my morning routine, at my computer, at my tablet, on my phone, etc. And do this for all occupants and visitors. Lots of visual recognition from lots of camera input. For myself, I want this information to drive what information is presented to me, when, and in what context. Bloomberg Radio after I've been up for 10 minutes, and after I've put on my headset, before my first call (incoming or outgoing).

A process that continuously combs through my Emacs Org agenda, and compares against my current "state". If I'm driving away from the house, and confirm a query from the computer that I'm going out to buy groceries and run errands for the day (the query is launched by continuously-computed probability cones narrowing possible agenda matches against my GPS position until a threshold is met), and no one else is in the house, then automatically put the house HVAC system into energy-conserving mode.

Dumb robotic mechanics driven by updatable software in my desktop computer to fold my laundry, turn over my compost bed, rotate stop valves monthly/quarterly (otherwise they freeze in place - the primary challenge the automatic flood sensors face today), change HVAC filters, open and file incoming snail mail, etc.

Intensive AR to turn nearly all parts of my house into smart surfaces. "What's inside that drawer?" "What's behind that wall?" "What's inside that freezer?"

If we are talking only about pure desktop applications constrained to the physical desktop/laptop computer itself, then yes, you may have a point. However, when we include extending the reach of those systems to our environment, and there is an explosion of data to crunch, then today's typical consumer desktop computer (4-8 GB RAM, 2-3 GHz x86_64, 2-4 TB at the top end) doesn't have the capacity to manage all of those demands simultaneously.


I love this. This is all great. But with the exception of the scanner, all of this can be done across an internet connection, and therefore all of this will be done across an internet connection. For entrepreneurs building an ANI (or AGI), it makes little sense to let anyone else get their hands on the code, even in binary form, and then to have to support that code running on arbitrary PC configurations.


I hear you, and I hope to see these applications arrive in both cloud and on-prem versions, perhaps from different origins/vendors. That signals the software ecosystem is strong, vigorous and robustly healthy enough to support many different solution angles. Let users choose what best fits their needs, and let the diversity seed further innovation in a virtuous spiral.

I sometimes muse if on-prem will get a second look as containerization, self-hosted AWS Lambda-like microservices, and similar advances become more mainstream on servers, and we'll see more hybrid solution ecosystems evolve. I strongly suspect we will look back in decades hence and lament that governments are Why We Can't Have Nice Things In The Cloud; while the US has been particularly egregiously public, we shouldn't be surprised at other nation state actor reveals in the future, either. If I'm on point about that, then we'll probably see hybrid cloud+on-prem (COP?) instead of an overwhelming dominance of one or the other for the near future at least.


I don't know about you, but I'm not going to willingly send all my personal conversations and a mass of data about my living space over the internet to be analyzed by a third party...

So this will never happen in my household unless I control the data, and it is stored/analyzed locally.


I would argue that what's needed is not a trivial upgrade of hardware capacity (8 GB RAM, 2-3 GHz x86_64) but optimization and work on the underlying algorithms and hardware. Furthermore, the cost of introducing such new products might not currently be sustainable.

An ASIC can perform some of the computational tasks you mentioned (such as image recognition) with much better thermal/performance levels; it might be hard to make it a commercial success.


These applications absolutely depend upon more algorithmic and software engineering work, and it definitely is not a matter of just adding more hardware; if it was, I'd have the rack or two (or more?) of hardware it could take today. It's hard to predict markets, but in the Open Source world there are at least primordial forms of every application I mentioned, or at least the elements of them. It's fun to tinker with them at the very least, while the hardware keeps improving.

It's like back in the 80's it was neat to imagine GB-scale disks with ubiquitous, continuously-indexed search across all files on the system. We could even implement it, and tinker with it on a toy scale for practical speeds, or on a system-wide scale at slow demo speeds. It took better hardware before that feature was really practical.

I'm wondering if ASIC-based products will come out for "settled" niches that large swathes of the workstation market agree upon, as a kind of standardized co-processing set of sockets/slots in workstations, with workstation-embedded FPGAs becoming available as software-reprogrammable options for niches that are not as settled but not quite amenable to GPU/CPU processing. I don't know enough about hardware at that level yet to discern what mainstream application would need those devices at such a low level, though, and couldn't be addressed with PCIe cards today, if not through CUDA/OpenCL for accelerators/co-processors and general-purpose CPUs.


- voice recognition on the computer
- better search (Apple's OS X is impressive... PC... not so much)
- AI-based autocorrect of entered commands that learns who you are
- better data visualization
- better web visualization
- gesture recognition without a mouse
- eye tracking without a mouse


Here's Intel's problem (and my apologies, I'm regurgitating a study I read a couple years ago and I can't find a link.)

Basically, Intel needs to sell huge numbers of chips in order to make their numbers - i.e., giant investments in fabs - work. While Xeon may generate all the profits, they still need to sell millions of low-end PC chips to pool costs. If that stops happening, even assuming they sell the same Xeons at the same prices, the R&D and fab costs eat them.
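A stylized illustration of that cost-pooling argument, with invented numbers (these are not Intel's actual economics): spread a fixed fab/process bill over fewer units and the all-in cost per chip - including per Xeon - climbs fast.

    # Invented numbers, purely to illustrate fixed-cost pooling across volume.
    fixed_costs_bn = 10.0          # hypothetical annual fab + process R&D bill, $B
    variable_cost_per_chip = 20.0  # hypothetical marginal cost per chip, $

    for units_m in (400, 300, 200, 100):  # total chips shipped per year, millions
        per_chip = variable_cost_per_chip + fixed_costs_bn * 1e9 / (units_m * 1e6)
        print(f"{units_m}M units/year -> ~${per_chip:.0f} all-in cost per chip")
    # Halve the volume and each remaining chip carries twice the fixed-cost share;
    # the Xeons still sell for the same price, but they now shoulder more of the fab bill.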


"Investment in (new) fabs" only makes sense if volumes are increasing or manufacturing processes are changing. Volumes are projected as flat for the foreseeable future, and as far as anyone can tell process tech is near stagnating as well.


Investments are also needed to research and build out fab tech to stay ahead of Samsung, TSMC, et al.


They could have been a major ARM vendor if they had gotten over their NIH attitude, seen the writing on the wall, and kept on with StrongARM/XScale.


In what way did Itanium not work out? It isn't in every PC sold, but it was never intended to be. It was intended to replace SPARC/POWER/Alpha. While they didn't get Sun or IBM on board, they did get HP, and the chip is profitable. It's not going to drive the future of the company, but it's not like they didn't make money on the investment... it turned its first profit 7 years ago and has been making money for them since.



No, it was intended to be the "Big Iron" CPU replacement. They never had any intention of bringing it to the desktop, which is why they delayed as long as they did on x64. They wanted 64-bit to be a long-term differentiator between server-class and desktop-class CPUs. Hence the forecast being based on servers sold per year...


"No, it was intended to be the "Big Iron" CPU replacement"

All the computer press was pretty sure it would replace x86. Microsoft released a Windows port for Itanium, which was unheard of for them.


They planned to keep x86 32-bit only, which meant a gradual phase-out (complemented by x86 emulation on Itanium).


>and it makes sense to invest all the available resources into it - better from ROIC perspective

Also, not every tech company can compete in other realms of tech. How could Intel really break into mobile? A non-ARM chip means lots of testing and cross-compiling for 0.1% market share, which no one will do. Or worse, battery- and performance-draining hacks to get ARM-compiled binaries to run on x86. There's an ASUS (?) phone series that does this. It's terrible and gets panned by cell phone reviewers.

I think this is a lot more deterministic than we care to admit. I don't care how much more "leadership" and "grit" and "listening to your customers" Intel could have shown; it's clear that they couldn't break the ARM mobile monopoly because no one can. Becoming a 3rd- or 4th-tier ARM producer amongst many would have probably gone to shit as well.

It's easy to armchair-quarterback companies, but the reality is that there aren't a lot of "win-win" paths in business. Sometimes you just can't enter a new market or beat the hot new startup regardless of what you do.

Lastly, no one questions mass hirings that don't seem sustainable, but we all freak out when mass layoffs are announced. It's incredible how we think jobs only make sense as permanent fixtures, when in reality they're subject to the same market forces as everything else. If anything, Intel's biggest screw-up was hiring too many people too quickly.


What if Intel had replaced Samsung as the fab for Apple's A series chips, getting to produce ~75 million units a quarter and starving Samsung of that revenue? Seems like it would have aligned well for Apple and Intel.


You're partially citing some of Clayton Christensen's observations in the Innovator's Dilemma (which Andy Grove, Intel's former CEO, used to refer to):

https://en.wikipedia.org/wiki/The_Innovator%27s_Dilemma


By all means, most people would do well to pick up the Innovator's Dilemma! Wikipedia also has pretty good coverage of Clayton's Disruptive Innovation theory (what most of the book discusses) directly as well: https://en.wikipedia.org/wiki/Disruptive_innovation#Theory

'Letting your company get chased upmarket,' is a pretty good synopsis: yielding ground rather than trying to get competitive.

Clayton uses the steel industry as a common example: the big "integrated steel" companies didn't adopt mini-mill tech, which allowed for lower capital investment costs and a big price reduction - at first with some quality limitations, which protected the high-margin integrated business, but with the low-end entrants steadily taking more and more of the market until they "won".

I am curious, though - Intel is still charging ~$281 for your run-of-the-mill, very boring laptop chip. In some ways, it seems like their obstinacy is finally paying off, even as they abandon market segments. Which is both how disruptive innovation continues to happen - with the low end getting eaten out from under them by Rockchip and MediaTek Chromebooks, and perhaps gaining more ground - but it also seems to prove, to some degree, how effective their protectionism has been: they have not had to get competitive on laptop pricing.

Making chips Apple would have wanted would have put Intel at a remarkably different price/performance point than the one they've stood on.


Of course - most of what I wrote is an opinion shared by many people, I don't want to claim originality here :).

A related anecdote - at my previous job, I was at an annual sales conference where Mr. Christensen was an invited speaker. Both before and after his speech, the management kept bragging about "how unique we are", "how fat our margins are", and "how none of our low-cost competitors could match our offering", despite real mid-to-long-term danger to their core business - completely ignoring the innovator's dilemma, or hoping it doesn't apply to them. It was almost painful to watch. My prediction is that my old company will get disrupted and cease to play such an important role within just 5-7 years.


Do you mean "information technology" instead of "technology"? Because I don't know what you could mean by "technology is mature." Jets are technology, they were new, now they're mature. Same for any technology (including CRISPR, looms, routers, and rockets). Parts of IT are mature, but a lot of smart people clearly think that parts of IT still have the potential for big returns.


Since he said "tech", rather than "technology", I'd have thought that clear.


CRISPR isn't tech?


If you don't cannibalize yourself, someone else will.

Intentionally limiting Atom performance, selling off their ARM division, etc. was all done in order not to harm their main cash cow. By the time they woke up and really tried to push for x86 on Android, it was too little, too late.

Just from an engineering perspective it was always going to be a monumental task. Because guess what, that small dev shop with the hit-of-the-month mobile game is not going to bother cross-compiling or testing on the 1% of non-ARM devices. And if "Ridiculous Fishing" or whatever doesn't work flawlessly, your device is broken from the consumer perspective.

But what should really have Intel pissing their pants is the recent AMD x86 license deal with Chinese manufacturers to pump out x86 server-class chips. I'd love to hear whether they're taking it seriously at all, or dismissing it as usual.


Paul Otellini came up on the x86 marketing side. He fully believed that Intel Architecture was the ultimate, therefore everything else was no better than 2nd place.

BK came up through the fabs, and therefore is more open to fabbing others' designs (Intel Custom Foundry) because it drives volume. However, he has overseen one hell of an awkward reduction in force (I'm ex-Intel, but still have many friends there). Small offices were told they were closing, and a few weeks later they were told where they should relocate to keep their jobs, and a few weeks later they might learn about the relocation package. It's almost as if they drew a line on a spreadsheet filled with numbers and now they're struggling to figure out how it affects the stuff they still want to keep. Odd.


I wish it were odd; if anything, it sounds pretty normal when companies change top-line strategy on a broad scale and then have to spend the next year figuring out what the hell that even means, and how to do it.


I think this is more symptomatic of larger, more conglomerated organizations. I doubt this would apply to say, Apple or Google which have, respectively, 95% & 50%+ of their revenues coming from one product line.


Looks like the article just got deleted? Here's a cache http://webcache.googleusercontent.com/search?strip=1&q=cache... (It's not available in the Internet Archive because of robots.txt, which makes me wonder how Google justifies caching it but whatever)


Deletion was accidental, according to Jean-Louis Gassée's (the author's) Twitter feed...


link for context: https://twitter.com/gassee/status/732280308049936384

> @sebbrochet @fdevillamil Embarrassing mistake, don’t know what to do other than grovel and hope Medium can undo.


Happy followup: https://twitter.com/gassee/status/732368830190620673

> Big Thanks to @Medium. My fat fingers killed 9 yrs of Monday Notes this am. Awful feeling. They fixed it quickly. Much appreciated.


Let's look at QCT, which I think is one of the biggest, if not the biggest, ARM companies out there.

http://marketrealist.com/2016/04/qualcomms-chipset-margin-hi...

QCT’s operating margin fell from 16.9% in fiscal 1Q16 to 5% in fiscal 2Q16. The margin was on the higher end of the low- to mid-single-digit guidance as the ASP (average selling price) of 3G and 4G handsets equipped with Qualcomm chipsets rose by 6% YoY to $205–$211. The price rose due to a favorable product mix and higher content per device.

The margins on cell phone chips are terrible. QCT made $2.5B on $17B in revenue.

http://investor.qualcomm.com/secfiling.cfm?filingID=1234452-...

Would it really make sense to invest in the cellphone business when every dollar you put in gets you less ROI than what you have now? From a finance perspective it would make more sense to return it to the shareholders and let them invest in QCOM if they want to.
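For concreteness, the comparison behind that argument, using only the figures quoted above plus a placeholder margin for the existing core business (the 60% number is a made-up stand-in, not a real figure):

    # QCT figures quoted above; the "core" margin is a hypothetical stand-in.
    qct_profit_bn, qct_revenue_bn = 2.5, 17.0
    mobile_margin = qct_profit_bn / qct_revenue_bn
    core_margin = 0.60  # placeholder for a higher-margin incumbent product line

    print(f"Mobile chip margin: ~{mobile_margin:.0%}")  # ~15%
    print(f"Each $1 of mobile revenue returns ~${mobile_margin:.2f}, "
          f"vs ~${core_margin:.2f} at core-business margins")
    # If an incremental dollar earns several times more in the core business,
    # the pure-ROI answer is to keep it there (or return it to shareholders).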


Actually 16.9% isn't much below INTC's 18.2% operating margin for 1Q16. The 5% number is more just an outlier due to revenue dropping much faster than expenses - the article mentions losing Samsung and possibly Apple as customers in a pretty short period of time. That certainly does suggest trouble ahead for QCT but I would argue that it doesn't necessarily say much about the industry as a whole.


The reason their overall margin is 18% is mobile: their mobile margin is -30%, while the rest of their business is closer to 30%.

https://www.sec.gov/Archives/edgar/data/50863/00011931251306...

As it stands today you can't really say it was a mistake to not get into mobile. They don't have the competitive advantages they had with x86. They had a monopoly there, but in the ARM space they would be just one of many.


The existing ARM fabrication market is very balkanized. Presumably if Intel had dived in with both feet, they would have been able to dominate it using their incredible aptitude for iterating process technology.

There is certainly that third path though, where Intel embraced fabrication, then regretted it.


I'm not sure if Intel's process lead is what it once was. For the last year or so Samsung have been shipping phones that use their own 14nm FinFET process, very similar on paper to Intel's best. It's entirely possible that Samsung will actually beat Intel to 10nm.


A point that's been made is that Samsung's 14nm process isn't really equivalent to Intel's - many gate features on Samsung's 14nm process are much larger than 14nm.

Still, just the fact that anyone is in a race with Intel at all would have been unthinkable 10 years ago.


On the one hand that's true, but on the other hand, Intel opened its fabs to others and nobody accepted, so Intel's process might be more expensive.


   > The Margins on cell phone chips are terrible. 
   > QCT made 2.5 bil on 17bil in revenue.
And in many ways ARM is to blame for that: if you have a lot of chip suppliers you get downward price pressure. But the more interesting question is this: what is the marginal cost of that extra margin? Intel is going to run its fabs, it's going to buy silicon ingots, it's going to test and package chips anyway, so the marginal R&D cost is the additional architecture (not the chip layout) and a channel for moving those chips (which is easy, since there are only a half dozen or so companies that actually make the phones).

Gassée's point, as I understood it, was that Intel couldn't see past its self-imposed margin requirements on chips to see that this business could be additive to its revenue and margin, and keep it in the game when the world switched its primary IT gadget from the laptop to the phone.
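A sketch of that marginal-economics argument with made-up numbers (none of these are real Intel or phone-SoC figures): once the fabs, test, and packaging lines are running anyway, even a low-margin chip is additive as long as it earns more than its own marginal cost.

    # Made-up numbers to illustrate contribution margin on top of existing fab capacity.
    price_per_soc = 25.0          # hypothetical selling price of a phone SoC, $
    marginal_cost_per_soc = 15.0  # wafer, test, packaging attributable to that SoC, $
    incremental_rnd_m = 500.0     # hypothetical extra architecture + channel cost, $M/year
    units_m = 200                 # hypothetical phone SoCs shipped per year, millions

    contribution_m = (price_per_soc - marginal_cost_per_soc) * units_m - incremental_rnd_m
    print(f"Incremental contribution: ~${contribution_m:,.0f}M/year")  # ~$1,500M
    # A "terrible" 40% gross-margin product can still add profit when the fixed
    # costs it shares (fabs, process R&D) were going to be paid regardless.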


"Gasse's point, as I understood it, was that Intel couldn't see past it's self imposed margin requirements"

And I think Gassée is dangerously ignorant of the basic fact that new Apple chips, even with less advanced fab tech, are quite competitive with chips Intel sells for $281. Core M is somewhere in the same field performance-wise, and the M's are frequently priced the same as the vastly larger ultrabook Cores, which themselves only offer incremental gains. And the A9X's vast GPU likely applied much of the pressure for Intel beefing up its Iris, just to stay competitive (with a hat tip to Apple's colossal drivers/software advantage that no one else has). http://ark.intel.com/products/88198/Intel-Core-m3-6Y30-Proce... http://ark.intel.com/products/90617/Intel-Core-i5-6442EQ-Pro...


I see it a bit differently. There isn't any ARM vendor or user building an ARM CPU that has the memory and I/O connectivity of even the "low end" x86 chips. Every ARM vendor (with the exception of AMD's server core and HP's use) is focused on SoCs like the A9X, which are high-integration chips with limited memory capacity and no I/O expansion beyond what is built in. Great for tablets and phones, and not great for servers/desktops/laptops. Consider that there isn't an ARM chip available that you could connect a GPU to via 16 lanes of PCIe.

I think the point was that phones/tablets are a differentiable enough space that you could go there with a lower margin SoC, especially if you went tit for tat and made it an ARM architecture.


According to Google, the A9X has a memory bandwidth of 51.2GB/sec, whereas the Core i5 mentioned above, according to Intel, has a bandwidth of 34.1 GB/s. So what gives?


No generally available access port to that memory. In the x86 world (for some chips) there is a proprietary bus called the "frontside bus" which connects the CPU to something called the north bridge. Other chips export the memory controller directly with some number of "physical" bits which will constrain the total number of DIMMS that can be connected to the system.

ARM SoC's typically have a memory port that supports exactly one memory chip. So the most memory you will see on them is typically 2GB (32 bits x 512K) although from the Anandtech article on the A9x (http://www.anandtech.com/show/9824/more-on-apples-a9x-soc) it seems to support 4GB. Looking at the iFixit teardown (https://www.ifixit.com/Teardown/iPad+Pro+12.9-Inch+Teardown/...) on the iPad Pro it has two LPDDR4 chips (SK Hynix H9HCNNNBTUMLNR-NLH).

So that makes it one of the first I've seen that can actually take two chips. The issues generally are that they burn up a lot of pins, and at the speeds they run, signal integrity is really hard (for example, all of the data and address PCB traces have to be the same length so the bits all arrive at the same time on the CPU side and the memory side!)
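On the bandwidth question above: peak DRAM bandwidth is just bus width times transfer rate, so both headline numbers fall out directly if you assume the commonly cited configurations (a 128-bit LPDDR4-3200 interface on the A9X and dual-channel DDR4-2133 on that i5 - my assumption, not something stated in this thread):

    # Peak theoretical bandwidth = (bus width in bytes) x (transfer rate).
    def peak_bw_gbs(bus_bits, transfers_mts):
        return (bus_bits / 8) * transfers_mts / 1000  # GB/s

    # Assumed configurations (see above); both reproduce the quoted figures.
    print(f"A9X, 128-bit LPDDR4-3200:          {peak_bw_gbs(128, 3200):.1f} GB/s")   # 51.2
    print(f"i5, 2 x 64-bit DDR4-2133 channels: {peak_bw_gbs(2 * 64, 2133):.1f} GB/s")  # ~34.1
    # The A9X gets there with a wide bus to a couple of soldered LPDDR4 dies; the
    # i5 has less peak bandwidth but can hang far more capacity (DIMMs) and I/O
    # (PCIe lanes) off its memory subsystem, which is the point being made above.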


> I think Gassée is dangerously ignorant of the basic fact

Unlikely.


I have to agree. Not only was it the right answer not to get into the ARM business, it is also the right answer to get out of mobile phone and tablet chips if they are deeply cutting into your numbers.

The ARM ecosystem and economy run on different margins, and are put together in an entirely different way.

Ax makes sense because Apple is vertically integrated and they get more value out of outspending their rivals on ARM R&D to be in a unique position than they do from whatever effect vertical integration in mobile CPUs has on their margins.

That means Intel is squeezed between a low-margin ecosystem and strategic vertical integration. Doubly ugly. There is no good answer short of somehow coming up with a significantly better proprietary technology. That somehow appears to be elusive.


I agree. Qualcomm and the rest can own the frontend and get half a nickel in profits per handset shipped, while Intel owns the backend and sells server CPUs by the millions with gross margins of hundreds of dollars per unit and next to nothing in the way of competition.


Having worked for 1.5 of the companies you mentioned, I think there is something to be said for and against each side of that. Intel's margins may be higher, but the volume becomes lower, and in semiconductors scale is everything. The cost of plants does not scale well and the cost of developing technology nodes is VAST. That is why you see the majority (all? ...and I include Samsung in this) of the ARM design shops going the fab-customer route.

The issue for Intel is that they seem to have found an unintentional local optimization point where margins and volume are effectively in sync in a way that prevents them from effectively growing into new sectors. I used to joke that every ARM chip you buy includes the purchase of an Intel Xeon in a data center somewhere, but that is now something less than 10:1 and probably more like 100:1.

Now that IBM has divested their fabrication capacity almost entirely, selling it to a company owned by Abu Dhabi, it will be interesting to see what happens with Intel's government revenue share. I could see them working together very intentionally with Amazon and the NSA on projects. I think the purchase and quasi-customer fabbing of Xilinx may be an attempt to go in that direction.


Intel has purchased Altera, not Xilinx.


Ah crap, my bad. Thank you for the correction.


Take any company that once rode atop a huge market and then suffered through declines as that market changed and you could filter back through the history of their decision making and say the exact same thing Gassee is saying here about Intel. So I'm not sure what point he is trying to make when he says their "culture ate 12,000 jobs." Does he mean their culture of focusing on the market they were currently very successful in, and their culture of not seeing huge paradigm shifts happening just under the surface of it? I suspect he just had a dull axe, and decided the time was finally ripe to sharpen it.


I'm not sure they even made the wrong decision; take a look at Kodak, who were pioneers in early digital photography. You could argue that it was a distraction for them and they should have concentrated on milking film as long as possible.


I would argue that Kodak was right to jump into digital photography, but made some really bad decisions about how we'd take pictures. Sony made the correct call by concentrating on the cellphone market with the experience from their pro offerings. Kodak got the digital leap but didn't quite get the form factor leap.


People also seem to forget that Kodak owned the entire digital still photography market for its first decade. They had a 1 megapixel sensor during the Reagan administration, for crying out loud.


Amazing isn't it. I get the feeling a lot of people don't look beyond film. Too bad their management seemed to have Xerox disease.


No. Any shareholder would have wanted (in retrospect) Kodak to lead any technology that had the potential of killing their primary business.


Kodak to Shareholder: We believe digital technology will be the future of photography and result in great profits, but it will take billions of R&D and probably a few decades to get there.

Shareholder to Kodak: Screw that. We need to make earnings next quarter or I'm selling. Don't miss your earnings by even $0.01/share or you're toast.


The problem is that when new tech is going to be a brutal slugfest like digital photography, it's really hard for bloated companies to compete. Intel and Kodak both had decades of brand recognition, but that only goes so far when many competitors keep entering the market.

Consider: Google paid $1 billion for YouTube - relatively speaking, chump change. Microsoft could have just bought Nintendo instead of dumping tens of billions on the Xbox. Remember, buying means you get revenue from day one to offset the initial investment, whereas R&D takes years and might not pay off. Of course, you need to time this while the competitors are still affordable.


Kodak is the perfect example. They managed to collect for every single picture you took, between the film, processing, and printing. There's no revenue stream in digital that had the potential to replace it, so they were doomed from the start. They certainly did see it coming.


With that kind of money, they could've been investing in consumer storage (e.g., NAS and even flash for phones) and point & shoot digital camera components. With perfect hindsight, of course.


>they could've been investing in consumer storage

Which has also been a razor-thin margin commodity business. Fujifilm does provide some insights into how Kodak could have moved forward but they struggled too and were a smaller company.

The high-end camera makers have done OK in digital photography (although there's been churn in that space as well) but Kodak had long ceded that market to the Japanese by the time digital photography was much more than a glimmer.

The print consumables market was also pretty good for a while but that ended up being relatively ephemeral.

Even with the advantage of 20-20 hindsight, Kodak was in a tough position to directly leverage either their existing tech or their channels.


It's really hard for huge companies to disrupt themselves. You need a strong visionary leader like Reed Hastings or Steve Jobs to make it happen. When the decision making power is distributed and consensus is the operative selection metric the chances of radical change become almost nonexistent.


Did Apple under Jobs ever really "disrupt themselves?" I suppose the iPhone disrupted the iPod business, and the iPad cannibalizes Macs somewhat, and the original Mac was a semi-successful attempt to supplant the Apple II, but all of those, while innovative, were fairly logical extensions of their existing product lines.

The iPod and going from Apple-the-PC-company to Apple-the-gizmo-company could be considered one, but I think it's giving Jobs too much credit to claim he started development on an MP3 player because he knew in advance how wildly successful it and the products it spawned would be.


You shouldn't have been downvoted. That is exactly what happened. Kodak dominated film (especially motion picture film) then in the way Windows dominates desktop operating systems now. The profits were just too great and they thought digital was just a distraction.


I found this quote to be one of particular interest.

“I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”

…and, perhaps more importantly:

“The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut. My gut told me to say yes.”

I personally love data, facts, and hard science, but I find that too often people ignore gut feelings. I almost always follow my gut instinct; it has proven itself to me over and over again, even when taking what are often perceived as long shots, because somehow my gut tells me "you got this".

In particular, I would say gather your own data on how often your gut instinct is (or isn't) correct, and use that as a data point in addition to the hard science, facts, etc.

Instinct evolved to keep you alive, it is often wise to not ignore it.


I also thought this was a very important quote in today's world. Compare these two quotes:

> The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut. My gut told me to say yes.

vs. this famous quote from Jim Barksdale:

> If we have data, let’s look at data. If all we have are opinions, let’s go with mine.

Now that everyone is praying at the altar of big data, I find it important to keep both of these quotes in mind. In my mind, it's not that the data is wrong, but too often I've seen attempts to interpret data in ways that undercut long-term planning and vision. For example, Steve Jobs had a famous saying about why Apple never put a ton of "xyz inside" stickers on the bottom of their laptop keyboards. It would have been easy to look at the data and say "Company pays us X million dollars to add this sticker, but only 0.001% of people decide not to buy from us because of the sticker, so we should add the sticker." Judging the long-term damage to your brand from covering up your machines with crap is harder to put numbers to.


The data said: This market isn't worth it.

The average consumer probably thinks something along the lines of "I have paid a lot of money for this phone. If Intel had joined this market then they could have made a lot of money", when in reality Intel would have gotten only a few dollars for every iPhone sold. Even if they produced ARM chips themselves and completely replaced Qualcomm, they would only make $2.5 billion in profit from $18 billion in revenue. This is the best-case scenario. In reality they probably wouldn't even make $1 billion in profit because of how competitive the market is.


A lot of people read "truth" when they see the word data, but this data was forecasts. Aka speculation.


The problem with data is that it applies only to the past, and it always has a very strict point of view.

Data is great, but it has this one problem. The best results normally occur when one uses data to keep the gut feeling honest.


Thinking that your gut makes more optimal decisions than a random process might just be feeding one's ego.

https://en.wikipedia.org/wiki/Confirmation_bias


And regardless, while instincts are useful, I would rather analyze them and extract the underlying reasons for my instinctual reactions, rather than blindly acting on them at the emotional level. And at that point, I'm back to acting on data (though potentially drawing it from more sources, now).

In any case, to believe that your [survival] instincts are equally applicable to making sound business decisions is a bit silly, as that's pretty far removed from what they've been optimized to do.


I would agree. I find there are some who love to just run around with a version of "I follow my instincts and things turned out great, damn it's good to be me!"

That's why I encourage people to actually keep track of such things and treat it as a single data point.

I would also say there is a difference between listening to your experience/confidence and listening to your ego. A lot of good things can go wrong because of Ego.


Beware survivor bias! Few people bother telling the stories where their gut was boringly wrong. That's going to make it hard even to gather data about yourself. I would say "listen to your gut" is a dangerous lesson to take from this anecdote.


Your gut also produces shit. Should you still give it importance?


Not all that matters is measurable. Beware the data-driven decision approach.


I'm convinced the new paradigm in a few years will be anti-data.


Many are focusing on the 12,000 number and forget that Intel has acquired a whole bunch of companies over the past few years, including McAfee, Altera, etc. Now mix in a change in strategy, and you find yourself with a bunch of people you no longer need: redundant services like HR, IT, test, design, management, etc., along with another class of engineers whose skill set you no longer need in your new direction.

Organizations continue to evolve and change direction. I am not saying leadership has not failed spectacularly in mobile, but some of the 12,000 jobs are a natural progression of acquisitions.


>Many forget Intel has acquired a whole bunch of companies over the past few years including McAfee,

They should just fire everyone from that acquisition and close down the whole division. It's nothing more than a huge stain on the rest of the company. It's really sad that the world's largest chipmaker reduced itself to being a malware purveyor.


Don't know, but I'd assume Intel's McAfee idea was more for the enterprise market vs selling junk to consumers. At the enterprise level it's somewhat more defensible, plus it opens up paths to desktop management capabilities.


If that's true though, then the failure is not making these changes on a constant basis (100s at a time) and instead saving it all up for a massive blowout.


No, not really. Look at the timelines for the Altera & McAfee acquisitions and how many employees & sites were added to Intel's books - pretty much just last year.

Now add a change in strategy; better to do it all at once and move on.


Moreover, the companies that produce ARM processors have probably created a lot of jobs. If Intel had owned the ARM front, fewer people would be working elsewhere.


Intel's problems have been a long time coming. They lost track of real customer needs and have been operating in their own x86-centric world. It was always "let me build it first and then try to find a good use for it". I am ex-Intel and used to do software things. None of the leadership had a real sense of what customers actually wanted, and the hardware architects just built the things they are really good at. And they justified it in unnatural ways ("You know why you need such a fast processor? You can run this Flash game at 10K fps").

Things have been getting very efficient on both the client and the server side. With the cloud, they will have some momentum behind them - but long term, I think the glory days where they could just produce chips and someone would take them are gone.


This goes back farther than 2005!

At one point at the height of the bubble, Intel was involved in mobile devices.

I worked for a company that developed secure, efficient wireless communication middleware for that space. Our client hardware was mainly Pocket PC and Windows CE at that time.

We partnered with Intel to port our stack to the Linux device they were developing (codenamed "PAWS"). This was around 2000-2001, if I recall.

Things were going very well when, practically overnight, Intel decided to pull out of this market entirely. They shut down the project and that was that.

It didn't bode very well for that little company; we had gambled on the Intel partnership and put a lot of resources into that project in hopes that there would be revenue, of course. Oops!


The post was deleted.

Here is the google cache link for quick access : http://webcache.googleusercontent.com/search?q=cache:https:/...

And here is the raw text(without any links) : http://pastebin.com/e10Yw0zi


I hate this piece. I hate that in 2016, people still believe in Magical ARM ISA Performance Elves. I hate that this is like my sixth or seventh post-Ars comment on HN in as many years debunking the ARM Performance Elves theory.

Back when the PPro was launched there was a low double-digits percentage of the die taken up with the x86 translation hardware (for turning x86 instructions into internal, RISC-like micro-ops). Now that number is some small fraction of a percent. It kills me that I still read this nonsense.

The reason that Intel didn't get into cell phone chips is because margins were (and are) crap, and Intel is addicted to margins. The reason everyone else went ARM is because a) licenses are dirt cheap, and b) everyone knows that Intel is addicted to margins, and they know about all of Intel's dirty tricks and the way Intel leveraged Windows to keep its margins up in the PC space and screw other component vendors, so everyone has seen what happens when you tie yourself to x86 and was like "no thanks".

Of course Intel didn't want to make ARM chips for Apple (or anyone else), because even if they had correctly forecast the volume they'd have no control over the margins because ARM gives them no leverage the way x86 does. If Intel decides it wants to make more money per chip and it starts trying to squeeze Apple, Apple can just bail and go to another ARM vendor. But if Apple went x86 in the iPhone, then Intel could start ratcheting up the unit cost per CPU and all of the other ICs in that phone would have to get cheaper (i.e. shrink their margins) for Apple to keep its own overall unit cost the same (either that or give up its own margin to Intel). Again, this is what Intel did with the PC and how they screwed Nvidia, ATI, etc. -- they just ask for a bigger share of the overall PC unit cost by jacking up their CPU price, and they get it because while you can always pit GPU vendors against each other to see who will take less money, you're stuck with Intel.

(What about AMD? Hahaha... Intel would actually share some of that money back with you via a little kickback scheme called the "Intel Inside" program. So Intel gives up a little margin back to the PC vendor, but they still get to starve their competitors anyway while keeping AMD locked out. So the game was, "jack up the cost of the CPU, all the other PC components take a hit, and then share some of the loot back with the PC vendor in exchange for exclusivity.")

Anyway, the mobile guys had seen this whole movie before, and weren't eager to see the sequel play out in the phone space. So, ARM it was.

The only mystery to me was why Intel never just sucked it up and made a low-margin x86 chip to compete directly with ARM. I assure you there was no technical or ISA-related reason that this didn't happen, because as I said above that is flat-earth nonsense. More likely, Intel just didn't want to get into a low-margin business at all (Wall St. would clobber them, and low-margin x86 would cannibalize high-margin x86), and by the time it was clear that they had missed the smartphone boat, ARM was so entrenched that there was no clear route to enough volume to make it worthwhile - again, especially given that any phone maker that Intel approaches with an x86 phone chip is going to run the other way because they don't want x86 lock-in.


Yeah, you're wrong. :) I mean, sure, people exaggerate how important ISA is, but it is important. Especially if your instruction set is heavily intertwined with a bunch of characteristics of various subsystems (e.g. memory). Size of the decoder, your ability to hide latency (which you pay for with caches and yet more logic), bandwidth cost for most of the crappy user-mode code - that all drains power. It's not "OMG 10x costlier" but it adds up in terms of power and space.

Today it probably matters little. But 5-10 years ago it did matter. And x86 was magically worse than ARM simply by not being ARM. Sure, ARM proper without Thumb wouldn't have done much, but ultimately the savings that simplicity brought to the table paid off. Today power is less of an issue and manufacturers (esp. low- and mid-end ones) care about space the most.

And sure, if Intel weren't addicted to margins they'd demolish ARM. And this, not architecture, causes their pains today. But as much as I love you man (signed copy of Inside... is one of my most treasured possessions), your "no ARM performance elves" is just way too simplistic a view.


I said that it mattered years ago, and doesn't matter now. Also, I wasn't talking about backwards compatibility, at least not directly. ISA matters a great deal for compatibility and software, but not at all for performance as of the past decade or so. None of the x86 lock-in tactics I describe above would be possible if ISA didn't matter at all -- it just doesn't matter for performance/watt.


I agree with you on all of this, except that I don't think this article really relies on Magical ARM ISA Performance Elves. All it's saying is that the counter-theory ("Magical Intel Process Mastery Performance Elves") kept getting trotted out for years as a way that Intel would be able to get past ARM dominance, and never came true.


> The only mystery to me was why Intel never just sucked it up and made a low-margin x86 chip to compete directly with ARM.

Culture. Some companies are willing to sacrifice today's revenue for tomorrow's growth. Intel isn't.

A low-end, low-power, low-margin x86 system would cut a huge swath thru Intel's margins. The trend of "upgrade to newer! better! faster!" CPUs would be hard-hit.

Even Apple is seeing that now with slowing iPhone sales. The phones are powerful enough that upgrading doesn't get a lot of benefit. So people aren't.


>> The only mystery to me was why Intel never just sucked it up and made a low-margin x86 chip to compete directly with ARM.

> Culture. Some companies are willing to sacrifice today's revenue for tomorrow's growth. Intel isn't.

Exactly. A tale as old as time; IBM is addicted to high-margin mainframes, ignores low-margin minicomputers, gets their lunch eaten by Digital. Digital is addicted to minicomputers, ignores PCs, lunch eaten by Apple, Commodore and, quelle ironie, IBM. Microsoft is addicted to the desktop platform, ignores web and mobile, left in the dust by Google and (irony again) Apple.

Sometimes it's not even so much a margin thing as simply being unable to visualize alternative ways of making revenue. Missing link in the previous paragraph is Microsoft eating IBM/Apple/Commodore's lunch when they failed to see that software was an even greater profit opportunity than hardware. Xerox famously ignored the steady stream of insanely lucrative guaranteed mondo profit cash cows produced at PARC because management didn't understand any business besides selling copiers. Excite turned down a chance to acquire Google for $700,000.


> Culture. Some companies are willing to sacrifice today's revenue for tomorrow's growth. Intel isn't.

At least historically, Intel was. Remember what happened to Intel's SRAM and DRAM business in 1985?


> A low-end, low-power, low-margin x86 system

AMD A-Series?


Intel was complicit for a decade or longer in essentially undercutting AMD by bribing manufacturers like Dell. There were quarters when Intel's "marketing budget" was all that kept Dell and other manufacturers in the black. [1]

AMD has no impact on the overall market because it's not a serious option for the big box manufacturers.

[1] http://www.theatlantic.com/technology/archive/2010/07/dells-...


I think you're right on both counts.


Overall agree with the rest of the comment. Just wanted to comment on this: "The only mystery to me was why Intel never just sucked it up and made a low-margin x86 chip to compete directly with ARM. "

The problem is that any low-margin x86 chip that would work in phones could also be used for cheap servers, low-end desktops and laptops. Intel wouldn't want that, and there is only so much you can do with crazy contract restrictions on manufacturers. It is a dangerous gamble, but it seems to have worked for now.

Intel did feel the possibility of an attack on low end servers from ARM, and did release the Xeon-D which, at least for now, crushed any hope of ARM breaking into the datacenter.


> The only mystery to me was why Intel never just sucked it up and made a low-margin x86 chip to compete directly with ARM.

High-margin addiction is typically deeply embedded in a company's culture. Every manager who is accountable for revenue is keenly aware of the incentives of toeing the high-margin-culture line. As a result, you have low-margin products and ideas killed off by mid-level managers. For big companies, there is no choice: you either get lucky with high-margin addiction, or you die.


This all sounds very accurate, though it seems to me that it was the interaction between the x86 and Windows monopolies that allowed this to happen. An Apple phone seems like it could have moved off x86 with greater ease than the more open Mac OS X moved to x86 previously. Android seems like it was built from the ground up to avoid this particular issue by keeping everything a level above the bare metal, so I'm guessing Apple would have planned for it too, even if not by shipping a VM. But they'd still have leverage.

So that's yet another economic reason that Intel wouldn't have wanted to sell chips to Apple in this space.


Everyone does it in the mobile space. At least the big three do.

Last year Apple introduced distribution via LLVM bitcode, and Microsoft has been doing it for a while as well, with MSIL being compiled to native at the store servers.

So going forward, iOS, Android and WP can be processor agnostic. Of course there are always some app developers that would rather use Assembly or native code directly, but that is an app specific issue.
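
To make that concrete, here is a minimal sketch of the bitcode flow (hello.c and the target triples are purely illustrative, and Apple's actual store-side recompilation pipeline isn't public): compile once to LLVM bitcode, and only pick the ISA when the bitcode is lowered.

  /* hello.c -- compiled once to bitcode, lowered to a concrete ISA later:
   *   clang -O2 -emit-llvm -c hello.c -o hello.bc
   *   llc -mtriple=arm64-apple-ios   hello.bc -o hello_arm64.s
   *   llc -mtriple=x86_64-linux-gnu  hello.bc -o hello_x86_64.s
   * Caveat: bitcode still bakes in ABI details (type sizes, calling
   * convention), so it is late-bound rather than fully ISA-neutral
   * the way MSIL is. */
  #include <stdio.h>

  int main(void) {
      printf("same source, ISA chosen at lowering time\n");
      return 0;
  }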


Indeed, you can consider x86 to be an instruction compression scheme, in which case it is a net win for caching (though minor).

What Intel had for years was a process advantage. Apple would have paid a premium for that. Not desktop CPU premium, but a premium nonetheless. And Intel wouldn't be a distant also-ran in the mobile SoC space.

If you read the Oral History of x86 documents, the same arguments almost killed the first few Intel CPUs. The memory guys were worried they'd offend their memory customers, and CPUs were not a high-volume or high-margin business.


Yeah, code bloat (vs. CISC) was a common knock against RISC early on, but as cache sizes have ballooned there's pretty much no effective difference at this point.
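
For anyone who wants to eyeball the density gap themselves, here is a rough sketch (density.c is a made-up stand-in workload, and aarch64-linux-gnu-gcc is the Debian/Ubuntu cross-compiler name -- adjust for your toolchain): build the same function for x86-64 and AArch64 at -Os and compare the .text sizes reported by size. Whichever way the gap goes, it tends to be small, which is the "no effective difference" point.

  /* density.c -- same integer-heavy function, built for two ISAs:
   *   gcc -Os -c density.c -o density_x86.o
   *   aarch64-linux-gnu-gcc -Os -c density.c -o density_arm64.o
   *   size density_x86.o density_arm64.o    # compare the .text columns
   */
  #include <stddef.h>
  #include <stdint.h>

  /* FNV-1a hash loop as a stand-in for ordinary integer code. */
  uint32_t checksum(const uint8_t *buf, size_t len) {
      uint32_t h = 2166136261u;            /* FNV offset basis */
      for (size_t i = 0; i < len; i++) {
          h ^= buf[i];
          h *= 16777619u;                  /* FNV prime */
      }
      return h;
  }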


So what I hear you saying is, Intel was never going to play in the mobile market, not because of ISA but because of business practices. By being the evil overlord instead of the benevolent one, they made it so no one would choose them if they had any choice -- and the mobile market was a whole new ball game where x86 was no advantage.

So even matching the low-power ARM chips early on wouldn't have helped.


Yeah?

I would like to see a scientific comparison between Apple, Qualcomm and Intel...

I bet that if Apple or Qualcomm wanted, they could create a desktop ARM that would be a generation ahead of Intel.


Well I for one like your rare HN comments post-Ars, because they're much less constrained. Though... I do miss your write ups on Ars too.


So this makes sense - this margin-addiction theory. But if so, how come Intel rules the Chromebook market?


>> low-margin x86 would cannibalize high-margin x86

What happens when low-margin ARM cannibalizes high-margin x86?


Low-margin x86 would not require recompilation of, re-certification of, and possibly changes to existing software. It has a much lower barrier than low-margin ARM.


On the other hand, Microsoft's new Universal Apps compile to ARM, Linux can compile to ARM already, and Apple can and has switched. It's difficult to one degree or another, but it can happen.


It's funny how you compare MS, Apple and ... Linux, which isn't a company. I like Linux but can't understand how it's able to compete without high-level strategic management.


Low-margin ARM is already cannibalizing high-margin x86 as more and more compute use cases move from the PC to the smartphone (social media, casual gaming, web browsing, video, camera and so on). Photoshop, spreadsheets and the like are still on PCs - for now. The platform itself has shifted from Windows to Android/iOS.


The main issue here is: Intel doesn't want to work on stuff with lower margins (even though it can make money and it's strategically valuable). That's common to many businesses.

Isn't there any financial engineering trick to enable companies to solve this dilemma that so often leads to their death?


Every innovation eventually becomes commoditized. So if you want to remain high-margin you have to continually innovate.

(But to be fair to Intel they did, and do - they keep producing better and better CPUs. It's just that demand for that whole category has shifted)


Exactly. Intel invests ginormous amounts into R&D. If they had adopted ARM earlier, they could have captured the mobile market and .... then who knows what? Innovation is unpredictable, especially in tech.

They started out making DRAMs, then moved to making microprocessors... I wonder why they didn't realize they couldn't rely on the same business model forever.


They did adopt ARM. Did you forget about StrongARM? (I believe they picked it up when they bought DEC's semiconductor operations as part of a settlement.) They ran with that for a while, made an updated version called "XScale", made a bunch of mobile processors with it, and then sold it all off to Marvell when they decided to refocus and shed a lot of other things that weren't core to the business.


History is not the strong suit of this site's readers. Yes, Intel was selling ARM CPUs into the mobile market (Compaq IPAQ, Sharp Zaurus, etc) when Apple was still making pink iMacs.


They did adopt ARM earlier than most, and had a license that allowed them to modify the ARM core itself, like many of the current successful chip developers. But they were not able to create great comms chips the way they had with PC chips. I believe this was due to their culture more than anything.


They produced incremental improvements of those CPUs. Sure, they got faster and had more cores, but there's not much in the past 20 years I'd call "innovation" on a fundamental level.


The problem with low margins for an international company is currency rates. If you have a 2-3% margin, 5% inflation or an FX swing can put you at a loss.


Some sort of solution... to the dilemma caused by innovation...


I think the advice is to create a fully isolated subsidiary and just enjoy its increase in value and the control that would provide, with no decrease in margin. It can work when you're talking about a Blockbuster adding a Netflix.

But if Intel wanted to do that, it would need to sell a lot of fab services to said company at a lower margin than it's used to, so it's not a solution.


Intel also bet big on WiMAX, which (mostly) ended up going nowhere. That was another multi-billion-dollar mistake, and one which cost thousands of jobs at Intel.


FWIW, AMD also struck out HARD in the mobile market. When I interviewed there (just before the release of the iPhone 3G), I was jazzed about the "imminent" opportunity for mobile CPUs. I expected AMD to scale right down and into the phone and tablet CPU market. We watched the world tumble around us as if on a ferry boat.

When I voiced the words, "mobile CPU" to anyone there, people were oblivious and silent. The company was just out of touch, thinking everyone was going to keep buying tower PCs and laptops. It seemed the only variable they thought customers cared about was performance. People would buy AMD if it was just a little faster. They didn't realize it was simply the wrong product/market. Performance wasn't nearly as important, sigh.


One manufacturer that did anticipate the 'death of the desktop' was Nvidia, whose Tegra chips power the "iPad killer", the Pixel C.


It is sad, because in comparison with the sickening ARM scene, Intel actually put effort into making open GPU drivers.


I was shocked the other day that Intel discontinued the Atom. So if one needs a cheap SoC board, there are now only ARM boards like the Raspberry Pi, and no cheap ($50) x86/x64-compatible boards with Windows drivers. The Intel Atom 230 (1.6GHz dual core with HT) of 2008 was cheap ($50) and fast (Pentium M based). And on the high-end side there is still no 5GHz, and there has been little single-core improvement since 2004 and the Pentium 4 3GHz. We need more competition in x86/x64 to get lower prices and faster cores.


What do you mean discontinued? What about the Atom that was just introduced earlier in 2016? e.g. http://www.newegg.com/Product/Product.aspx?Item=N82E16813138... ?


It's not an Intel Atom, but a low-spec, low-power Celeron. The J3060 has no hyperthreading, and without the boost it's as slow as the Atom 330 (which has HT). So after 8 years, there is no Atom brand anymore, and the next best option is as slow as back then and costs about the same.

http://www.cpu-world.com/Compare/979/Intel_Atom_330_vs_Intel...


> on the high-end side there is still no 5GHz and there is little single-core improvement since 2004, since the Pentium 4 3GHz.

This isn't true. There's easily a single thread win, clock-for-clock, of 2-3x over this period.


Single-core performance comparison: https://www.cpubenchmark.net/singleThread.html

   Intel Core i3-4370 @ 3.80GHz     2,215 pts 
   Intel Pentium 4    @ 3.80GHz       822 pts
But to make a fair comparison, one would have to throttle the memory to the speed of a 2004 memory chip; then the single-core performance increase would be a lot less. Also, the Pentium 4 architecture was a dead end, and Intel switched back to the Pentium 3/M lineage for Core. So a better comparison is the fastest Pentium M against a modern Core CPU.

  Intel Core i7-4760HQ @ 2.10GHz    1,922 pts
  Intel Pentium M      @ 2.10GHz      660 pts
Again, memory speed has increased a lot since then, so for a fair comparison one would have to throttle the memory.

The Atom 330 vs. a current-gen Celeron (I haven't found the J3060, but here is its bigger brother, the J1900):

  Intel Celeron J1900  @ 1.99GHz      530 pts   (10 W)
  Intel Celeron J3060  @ 1.60GHz        ? pts   ( 6 W)
  Intel Atom 330       @ 1.60GHz      251 pts   ( 8 W)
The performance of the cheapest Intel CPU hasn't increased that much since 2008, as I said.


I'm not sure the "Just You Wait" argument is without merit; it has just taken longer than expected, and I'm not sure why. Given the process advancements of the past 6 years or so, we now have CPUs that burst to higher speeds very transparently, and at really low power compared to a decade ago. While not suitable for phones, Intel is a pretty great choice for laptops and tablets (OS concerns aside).

I do think that Chrome OS (or whatever it's called) feels distinctly different from Windows to most users. That changes things up a lot.

I feel that within 4 years, Intel will have some competitive x86 offerings compared to ARM... On the flip side, by then, ARM will be much more competitive in the server/desktop space than it is today. It's kind of weird, but it will be another round of competition between different vendors all around. I'm not sure what other competitors will come around again.

That's not even mentioning AMD's work on their hybrid CPUs... also, some more competition in the GPU space would be nice.

It really reminds me of the mid-to-late 90s, when you had half a dozen choices for server/workstation architectures to target. These days, thanks to Linux's dominance and flexibility on the server, and even MS's refactoring of Windows, it's easy to target different platforms from higher-level languages.

There will be some very interesting times in the next few years.


I do wonder what would have happened if Intel had concentrated on being a custom chip maker. With continued advancement of a low-powered x86 core plus custom options, it would have been interesting. I guess when both game console manufacturers skip Intel for AMD, you really have to wonder where Intel thought things were going. A company so against customization that it closed its bus to third parties probably isn't going to do well in a custom world.


The mobile space was already hotting up. Nokia alone was selling over 250M devices in 2005, at around 35-40% market share, and they were already using ARM chips. The forecast for smartphones was already high, since the whole mobile industry was positioning itself to upgrade; it was always a question of getting the telcos to upgrade their 2G networks to 3G before the expected smartphone explosion. Intel should have known this.


Article error: "In 1985, fresh from moving the Macintosh to the x86 processor family, Steve Jobs asked Intel to fabricate the processor that would inspirit the future iPhone. The catch: This wouldn’t be an Intel design, but a smaller, less expensive variant of the emerging ARM architecture, with the moderate power appetite required for a phone."

That should be 2005.


Intel didn't miss the smartphone market. It simply made no sense (and still makes no sense) to go after a less profitable market.

When the margins on x86 cross below that of ARM chips, Intel will come in and destroy all the ARM manufacturers.


It did. Even Otellini said so:

http://www.theatlantic.com/technology/archive/2013/05/paul-o...

And I'm not so sure they'd be able to go into ARM fabbing that easily. Margins on x86 are high, but they are selling fewer of them, so expanding to a new market would have been a wise move.

To their credit, they tried twice and couldn't make it work. Not sure if things could have been different considering ARM is a licensable architecture other people can use.


> When the margins on x86 cross below that of ARM chips, Intel will come in and destroy all the ARM manufacturers.

Unless the world has moved on. Profitability happens over time, so even declaring a market unprofitable today may shut you out of it forever.


x86 was already on lock down. They could have ridden into the ARM ecosystem, guns blazing, and wouldn't have lost a penny from their existing x86 platform. ARM doesn't compete against x86 in mobile. That's obvious now, but it (apparently) wasn't obvious at the time.

Intel turned down a boat load of free money.


They made several attempts (http://www.cnet.com/news/intel-sells-off-communications-chip...), but I think Intel has a huge cultural problem that does not allow for true innovation outside of its core competencies. I worked for the Marvell group, and while Intel did not know how to be nimble, Marvell had their own issues, which is in part (IMHO) why BlackBerry lost its leadership position. But the biggest problem with Intel is the culture, as you are truly just a cog in the wheel there. Just my $.02.


I'd love help understanding how Intel's failure in the smartphone market didn't also doom them in the IoT market. Their x86 chips never took off because they were expensive and drew too much power. If IoT is to be ubiquitous, those barriers seem even more important.

Intel's comments about IoT make me wonder: do they think they just lost the first-mover advantage for mobile and the industry got hooked on ARM ISAs? Do they still believe the story that their chips will become cheaper and more powerful than ARM if they change nothing?


I reckon it already has doomed Intel in the IoT market, investors just haven't realized it yet. Their answer to the IoT involved buggy shrunk-down 486s with poor performance and power consumption and very limited onboard peripherals compared to their ARM competitors. (I can't even find power consumption or instruction set information for the newest ones, which is worrying.) They've put a lot of money into high-profile publicity stunts like an Intel-funded TV series ("America's Greatest Makers"), BMX bikers, Arduino boards, etc but apparently if anyone wants to integrate their chips into the kind of IoT product they're being promoted for, they can't actually get them unless they're a big-name brand. They mostly seem to be relying on the press regurgitating their PR and not looking into the details.


Article gone?

Well, I'd guess fundamentally it was about tying yourself to the Microsoft sociopath mothership.

Intel could have taken Linux by the reins and made an OS X equivalent (certainly Windows-beating) decades ago.

But they didn't.

So they missed the boat.


> Intel could have taken Linux by the reins and made an OS X equivalent (certainly Windows-beating) decades ago.

I disagree. Hardware companies have a long history of trying and failing to make good software. It's not in their core business, so the company culture and talent pool isn't right. The same is true the other way around, just look at how rough Microsoft's entry into the console market was.

On top of that, the "last mile" of usability between the current state of Linux and something that can realistically compete with Windows requires experienced UX designers and an infusion of design knowledge into the software engineering team. This isn't realistic for most companies.

Just to be clear, not saying it would have been impossible, but I would bet against it.


I'd be really interested to know how big the original OS X team was. Granted, they were probably all NeXTSTEP people who really knew the OS X core inside and out to begin with. But still, Apple at the time had 1/100th the resources of Microsoft and Intel. Intel could have bought BeOS after Apple passed.

Think about that: BeOS with real backing.


Intel has some amazing research. I was at an internal fair back in 1999 when I was doing an internship there in the Product Dev group that did the Itanium chipset.

They had this handheld touch-screen device, driven by voice recognition, that resembled a modern smartphone. This was 1999, way before the iPhone came around.

In my opinion, it is the internal politics and the leadership that has caused them to lose out.

Chip design books in academia were already moving in the direction of low power designs during the late 90s. They just did not take any action.


The iPhone didn't represent any sort of technical advance. The only thing truly innovative about it was the (almost) all-screen form factor.


Just saying, working touch screens have been around on devices for 30 years or more. Most PDAs, including sub-$100 electronic phonebooks, had decent resistive touch screens as far back as the mid-90s. Likewise, speech recognition has been around for at least 25 years. What's new in today's smartphones is the high accuracy driven by the cloud and deep learning. You could probably have easily found a handheld with a touchscreen and speech recognition at Circuit City in 1999.


Working touch screens are useless without usable touch UIs.


I guess the point of the article is that this is a failure of some sort, but it never really makes a strong enough case. The closest it comes is asserting that the executives at Intel should have been able to predict the future. If I were inclined to hate the company for some reason, I would find this emotionally satisfying, but I'm not, so I just find it to be a recap of things everyone already knew, expressed with an unjustified tone of smug finger-wagging.


I wonder if Intel saying "no" to Apple re: ARM will go down as the same scale decision as Digital Research saying "no" to IBM for an operating system.


Time to get out my Maemo (RIP Nokia) and Meego (too bad, Intel) merchandise and go cry in a corner.

  Maemo/Meego
  WebOS
  Firefox OS
All of these were interesting; all failed. I'm still convinced that there's a market for a non-Android/iOS solution.


Article is back up. He accidentally deleted his 9 years' worth of posts. https://twitter.com/gassee/status/732368830190620673


“The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut. My gut told me to say yes.”

Killer, for us fans of rational strategy.


This case is a real problem for the "data-driven thinking" set. Or rather, the problem with that idea is you have no way to tell which data you should be looking at. Just because you have numbers doesn't make them the right ones.


A friend of mine at Intel told me that Intel doesn't care that much about not owning this market because the size of the profit pool around server chips is orders of magnitude greater and they have monopolistic control of that market.


That is the same thing that the mainframe vendors said about minicomputers, and the same thing that the minicomputer vendors said about microcomputers. Intel's failure to recognize itself as the next link in the cycle was a colossal mistake. They should have gone to Steve Jobs and offered to rectify their mistake the moment they realized how popular the iPhone was. Instead, they tried the Itanium strategy of promising things they could not deliver. What made them think that would work after the first failure?


they just forgot that only the paranoid survive.


OT but somehow related:

The article contains the sentence "Jobs asked Intel to fabricate the processor" and "Intel styled itself as a designer of microprocessors, not mere fabricator".

Question: according to your own experience, is it common in English to use "fabricate" and "fabricator" as synonyms of "manufacture" and "maker"?

I am much more familiar with the negative meaning (e.g. "fabricated lies") but since I'm not a native speaker my vocabulary might be limited.


Specifically in the case of chips, "fabricate" seems to be fairly common.


Why was the title editorialized by mods?


Unable to read the story. It seems deleted from Medium.


> The author deleted this Medium story


The author deleted this Medium story.


...what?

Intel didn't miss anything, they sell the hardware that powers the infrastructure behind these new, always-connected devices.


They were in that business even before smartphones were the norm. They had to compete a lot harder in the server space (SPARC comes to mind), and winning that market doesn't mean they won the smartphone market.

AFAIK, Paul Otellini also made the same argument about servers; however, the point is really about processors that run inside consumer products, not server infrastructure. "Intel Inside" just isn't the case there.


He just deleted the post :/



Why would he do that?


ALL of his posts are gone from MondayNote now strangely.


And his entire Medium account is gone, too.


Hey, it's the BeOS guy!


... and also the GPU market.




