The Intel Enigma (mondaynote.com)
108 points by chmars on Dec 14, 2014 | 63 comments



A few things to note:

- Intel benefits from mobile growth even if it's not selling the phone hardware. Each of those phones needs to connect to back-end servers for almost everything, and those servers run x86. Every time they connect to a server, it generates data, and that data is analyzed and monetized by servers that, again, run x86.

- Fabs, and by proxy fab time, are super expensive (and growing superlinearly as process size shrinks). They may have made a strategic decision that they'd rather spend that capital pumping out Xeons for back-end servers than Atoms for phone clients, except to the extent that they design and produce enough to "stay in the game" should they wish to change direction.

- Judging from the rate of improvement of the Intel low-power CPUs and GPUs, they are on a relatively short path to being extremely competitive with ARM and the associated graphics chipsets.

- Intel has a lot of experience writing "sufficiently smart compilers", and doing CPU / GPU integration. Both of those are extremely handy for low-power graphics-heavy environments.

I think this is just one of those things where it doesn't look like a trend until it's inevitable.


Sufficiently smart compilers never showed up where it mattered most for Intel:

Turning branchy C/C++ system and apps code into explicitly parallel instructions.

And arguably, they can't.

Sufficiently smart compilers for GPUs aren't sufficiently smart. They work by imposing sufficiently dumb limits on programmers instead.
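
A minimal illustration of the gap (my own sketch, nothing from the article): an auto-vectorizer handles the first loop easily, but the data-dependent branch and pointer chase in the second defeat it, and the second shape is what most system and app code looks like.

  /* Sketch in C: the first loop maps cleanly onto SIMD lanes;
     most auto-vectorizers give up on the second because each
     iteration depends on data loaded in the previous one. */
  #include <stddef.h>

  void saxpy(float *y, const float *x, float a, size_t n) {
      for (size_t i = 0; i < n; i++)
          y[i] += a * x[i];             /* vectorizes */
  }

  struct node { struct node *next; float v; };

  float sum_positive(const struct node *p) {
      float s = 0.0f;
      while (p) {                       /* serial pointer chase */
          if (p->v > 0.0f)              /* data-dependent branch */
              s += p->v;
          p = p->next;
      }
      return s;
  }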


Interestingly, Intel spends a lot of die budget on on-chip instruction-level parallelism, and their major instruction-set enhancements have been increasing vector operation lengths. They've also invested somewhat heavily in GPGPU-style vector coprocessors in their Xeon Phis.
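
To make "increasing vector operation lengths" concrete, here's a rough sketch using real SSE/AVX intrinsics (my example, not from the thread): the same element-wise add processes 4 floats per instruction with 128-bit SSE and 8 with 256-bit AVX.

  /* Sketch: one operation, two vector widths. Compile with -mavx. */
  #include <immintrin.h>

  void add4(float *dst, const float *a, const float *b) {
      __m128 va = _mm_loadu_ps(a);      /* 128-bit load: 4 floats */
      __m128 vb = _mm_loadu_ps(b);
      _mm_storeu_ps(dst, _mm_add_ps(va, vb));
  }

  void add8(float *dst, const float *a, const float *b) {
      __m256 va = _mm256_loadu_ps(a);   /* 256-bit load: 8 floats */
      __m256 vb = _mm256_loadu_ps(b);
      _mm256_storeu_ps(dst, _mm256_add_ps(va, vb));
  }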


Right. But the whole point of EPIC was to do away with that part of the die budget entirely, and a lot more besides.

GPGPU programming has worked because nobody proposed to run general code on it from the beginning. It just so happens that there are some workloads that fit the vector processing model.

But lots still don't. Lots and lots and lots.


Jean-Louis raises some good questions about Intel. They aren't new questions, and their shape has changed, but they are good ones. When I worked there in the 80's I was validating a graphics chip (the 82786), which was a really cool video chip. I tried to get Intel to consider these "Unix workstation" things, and they couldn't see past the risk of harming their Microsoft relationship.

To be fair, Intel has made a ton of money and they are widely regarded as the premier chip manufacturing company, standing strong while the likes of Motorola (the Freescale part of it), TI, NEC, Fairchild, and National faded. And by that measure they are an unqualified gorilla in the marketplace.

So it is interesting to see things like ARM effectively encroaching on their markets. I'm typing this on a desktop with an x86 processor, but I have far more machines in my house interacting with me running some OS on an ARM processor than I do x86 machines. That bodes ill for the future.

I've been playing with a Lenovo x86-based Android tablet. It's nice, but it is no better nor worse than an ARM-based tablet. And apparently Lenovo was being paid $51 per unit to make these. You cannot retake market share by being 'just as good as' plus a bribe. You really need a compelling reason to get people to switch to your processor. I wonder what that will be for Intel.


> And tried to get Intel to consider these "Unix workstation" things and they couldn't see past the risk of harming their Microsoft relationship.

and by not listening they made how many billions?


That is kind of the point. They have made billions, an unqualified success, and they have triumphed over all of their rivals of the day. The only problem is the one Jean-Louis points out: they are running out of road.

I was at Sun in 1988 when they decided to "go enterprise" in a big way. They focused on killing DEC and all the other workstation vendors, and partnered with AT&T to make UNIX a real enterprise OS, combining System V and BSD 4.2 (aka SunOS) into Solaris. It made them many billions of dollars; heck, it paid for most of my house! But that road did not lead to them being a force in the Linux/*BSD world, nor to supplying computers for that world. That was lost to the other server manufacturers, and Sun Microsystems ran out of road. Long before Sun was effectively dead, they could have moved into a new space. They did not. Now they exist only as assets in the Oracle ledger.

Jean-Louis points out that Intel could be headed the same way. They say big things, but they seem unable to face the reality that the microprocessor world is changing in fundamental ways that make it hard to have giant gross margins on a single chip.

Intel has done great things, it's a wonderful company, and it will die if it cannot figure out how to compete with ARM.


Do you have any thoughts on the future (or lack thereof) of AMD?

I thought it was very interesting that they won both the Sony PlayStation 4 and the Xbox One - in my eyes the first win for their long transition towards "real" on-die GPUs.

I'm thinking that they'll now have the whole game industry making graphics engines that work great on their stack (both major consoles, and on PCs). If they can leverage some of that towards tablet/mobile -- that could be interesting.

So far it seems that Nvidia is winning there, though.

(Obviously, while I have a bit of a crush on AMD as the underdog, and love their push for better open source drivers -- I don't really want to see a new monopoly emerge. But anyway, I'd be curious to hear what you think?)


I like AMD. I told the CEO of NetApp he must fire me if going with an AMD processor (Opteron) in the high-end (aka most profitable, flagship) filer turned out to be the wrong decision. That said, I hate AMD; they can't execute for crap.

It wasn't for lack of vision. Fred Weber, their CTO at the time, had solid ideas about moving AMD into the lead for x86-scale chips. AMD kept thinking they were a chip-making company and didn't unload what became Global Foundries fast enough. Not to mention that everyone who has to play with x86 is playing with the deck stacked against them.

Massive-memory machines are coming; all the cool kids are over at HP building one on "The Machine" project. AMD could have built that machine 10 years ago, or at least laid the groundwork for it. AMD owned a huge chunk of the FLASH market; they could have spun silicon in a way that allowed for mixed FLASH and DRAM on the "north bridge", and it would have changed a lot of the economics of things. But when you're constantly about to die, it is really, really hard to make strategic moves. So unless AMD stops perceiving itself as being on life support, it won't be able to muster the courage to move forward on things where it could make a huge difference. And it's not clear to me whether it has fallen under the power curve or not.

Fred took a huge risk when they did the Sledgehammer (aka Opteron) architecture. I watched how much pain people went through trying to use it without pissing off Intel (including NetApp), and got to see how it totally stomped the living crap out of Intel's NetBurst designs. (Intel was pushing Itanium as their 64-bit / large-memory answer, which also underperformed Opteron machines.)

But the reverberations of that triumph were not good; AMD's inability to execute meant they stumbled on the next few bits. They got the jitters, canned Fred, and went for something "safe". Not a good plan for world domination.


"AMD kept thinking they were a chip making company and didn't unload what became Global Foundries fast enough"

Ah, but they were only capable of doing that after their settlement with Intel (I mean, the Intel anti-trust settlement).

Before that, if you wanted to make x86 chips, the condition was that you manufactured the chips yourself, or something like that.


Very valid point. The albatross of x86 licensing/patents is a hard one to get rid of.


So, there are obviously two parts to AMD. There's the part that can't make a decent CPU, and there's the ATI part. The ATI part is surviving on gaming and other consumer graphics applications, and starting to look at a real opportunity in datacenter-scale computation if they can market the S9150 as a realistic competitor to the NVIDIA K40. This is mostly a matter of the device actually having the performance and power consumption that AMD says it has, and of them being able to actually deliver parts. This is basically AMD's last chance to get into the datacenter market, since they haven't made a competitive CPU in more than five years.


The Tyrannosaurus Rex had an incredible run and made mincemeat of uncountable prey animals. But evolutionary dead ends are permanent no matter how successful past performance may have been.

I just got a new tablet a couple weeks ago that, on certain tasks, is actually more powerful than my last full desktop computer. From loading off disk, to compute, to final rendering, it's giving me more FPS than my quad-core 2.4GHz Intel system with a $350 graphics card was giving me a couple years ago. It's not only kind of crazy that something that fits in my coat pocket can do that, but it's doing it at some tiny fraction of the power consumption. If I hook it up with an HDMI cable it can drive a nice monitor, and I know it supports a USB keyboard and mouse (and a wireless controller); it can basically replace my desktop for most uses.

Between iOS and Android, there are 3 or 4 million applications, from productivity to music authoring. Heck, I have a terminal emulator on my tablet that comes complete with ssh, so I can effectively do what I do most of my work day anyway.

At this point, I'm counting the days till my OS X and Windows desktops are mere emulated apps, or running off a spare Intel-compatible core on the SoC.

Intel is going to be a classic case of not realizing they're on the way out, and then suddenly they're out of business -- patent portfolio scattered to the winds (and AMD on life support, keeping alive legacy corporate systems that haven't transitioned to something else).

About the only way they can "fix" this is for Wintel tablets to find the right feature/price mix and suddenly take off in sales, and then to translate that somehow into smartphones. But it's unclear whether the new Microsoft is as willing as the old one to toss money at that problem to make it happen; without such a partner, Intel is in trouble.


That's what's happened.

The interesting what-if question is if they would have made more if they had listened.


Intel has repeatedly blown a fortune on new chip architectures, often chasing what the world was assuring everyone would be the Next Big Thing: the iAPX 432 (hardware garbage collection, targeted at the likes of Ada), the i960 (RISC will own the world!), the i860, and so on.

The market has consistently told Intel that it wants more and more out of the x86 chip set. When Intel has ignored that, it has been punished, and punished harshly - most especially so when AMD gave the market a 64-bit x86 architecture and Intel didn't. Why should Intel ignore decades of experience now?


I wouldn't say they have blown "fortunes" (with the exception of Itanium). For a long, long time Intel was the dominant force in embedded microcontrollers as well. You may recall the 8051 architecture, still in use today in a bazillion products. That line flourished even as Intel was punished for its first 8086 "SoC", the 80186.

Intel has never learned the skill of serving multiple markets. Even when I was actively engaged with them on new designs, they couldn't have the '64 bits in x86' discussion, because that market was "owned" by the Itanium guys. There didn't seem to be any way to talk about x86 with those new features. I'm sure the same issue applied to SoCs. If I could advise them, that is the skill I would really want them to work on: actionable listening, to get ahead of their customers' needs.


Ideally, assuming that Intel's chips will be as good as others' without a bribe, and that tablet manufacturers already have some successful designs with x86, then all it takes is a small nudge.

It could be in the form of slightly lower prices, or some nice bundle with other technology (Intel and Micron make flash together, or desktop chips, etc.), or some unique technology that Intel will develop or acquire.

But first, let's get to the point where Intel is equal, including in price.


> Apple might feel that Intel’s process needs to mature before it can deliver 300M units.

We don't have to wonder about that at all. Apple will either use Samsung's 14nm or TSMC's 16nm (or both) for its chips next year.

It's funny that Intel keeps bragging about its process advantage, yet even with that advantage they have no real edge over 28nm planar ARM chips in terms of performance/power consumption. (Sorry, Anand, it seems the "x86 myth" hasn't been "busted" after all. Otherwise Atom on 22nm FinFET should've wiped the floor with any 28nm planar ARM chip. But it doesn't. Not even close. In fact, many current high-end ARM chips on an older process beat Atom on the newer one.)

Instead they have to beg tablet makers to take their chips for free (or even pay them to take the chips).

Also, Intel's actions speak volumes. They've started licensing out their Atom micro-architecture to Rockchip and Spreadtrum. Think about what that means for a second - it means Intel thinks it can't succeed in the mobile market making its own chips. Instead it has to give its designs out as IP, just like ARM, so other companies can make Atom chips. Even if Intel is "successful" with this strategy, they'll be making pennies on the dollar in mobile, just like ARM Holdings does (ARM is totally fine with that, given the company's structure - I doubt Intel would be).

Oh - and those companies aren't going to use Intel's "huge process advantage" either. They're going to use TSMC's 28nm planar process - late next year. If Intel is doing this, then Intel must think that having others make Atom chips on an obsolete 28nm process (while ARM chip makers move to 20nm and 14nm FinFET next year) is going to be "more successful" than making them itself with that "huge process advantage". Let that hopeless Intel strategy sink in for a moment.


While I personally am confused about this part of Intel's strategy, I can't avoid the feeling that Intel (and Rockchip) know something I don't.

Maybe it's some kind of "white-label" conspiracy - Intel wanting to sell Atoms cheaply through a third party so they don't need to dilute the Intel brand (and the Wall Street-observed profit margin).

Maybe they are planning subsidies for the next 20 years (Intel was always playing the long game), and this is their way to avoid anti-trust, which would surely hit if they are actually successful.

They are at a huge disadvantage in the market, and they seem to have no technical advantage in mobile and tablets so far. But this does not seem like a desperate strategy - it seems the goal is different from "sell more now". It might just be Red Queen-style "keep running in order to stay in place until we can figure out something else".

HP's RISC CPUs are gone. Alpha AXP is gone. AMD is mostly playing catch-up. IBM's POWER CPUs are sort of alive, but they are not an Intel competitor at this point. I hope Intel's dominance wanes and competition arrives again - but I'm not holding my breath.


Making ARM chips for low/medium-end tablets is a cutthroat market. Rockchip did well initially, but now Allwinner has passed them, and they seem to be in a challenging situation. Just look at Allwinner vs. Rockchip growth. Not to mention MediaTek, which will eventually come with integrated cellular (including 4G). So for Rockchip, it may simply be a defensive move: life is too hard and competitive on the ARM side, so try to find a niche on the x86 side. I'm very doubtful it'll work out: the volume is in the low/mid tiers, and the average consumer doesn't care about the CPU architecture. It's just a basic Android tablet, period. I don't see a worthwhile differentiation on the Intel side, except the current crazy subsidies, and those can't last forever. And this market is not faithful: the chip makers need to provide mostly everything to the ODMs (MediaTek is quite famous there, providing even the production and test process and tooling ready to deploy), and this makes changing chip provider rather easy. There is no loyalty, only a focus on low prices.


Itanium was not "an adoption" of PA-RISC. It was a completely novel design co-developed with HP after HP researchers argued that VLIW was the next step forward.

It did, however, manage to kill several of Intel's potential high end competitors with nothing more than hype.


> nothing more than hype.

At least $10 billion (still bleeding, though at a much lower rate these days), some goodwill from customers, and letting AMD become the premier 64-bit platform for a couple of years while Intel had to play catch-up -- Intel was practically forced to adopt the AMD64 architecture or they would have lost the lucrative market for PCs and servers.

All things considered, I'm not sure that the price Intel paid for killing PA-RISC was good value for money - it was a lot more than hype, even if in the grand scheme of things it wasn't much.


They spent billions of dollars on a dream that didn't pan out. And, in retrospect, was a bad idea.

And maybe a few hundred thousand on press junkets to hype the dream. Millions, tops.

Given that Alpha and MIPS dropped out before there was anything at all concrete, I'd say that hype was the one that mattered.

But I guess it's easy for me to say that in hindsight.


Oh, it was a bad idea from the get-go. That was the sentiment I had myself, and heard from everyone who had actually looked at the details -- long before actual hardware was available.

The market did NOT want Itanium, and Intel knew that - but Intel believed that they were strong enough to dictate. They weren't, and their internal culture made it hard for them to accept that for a long time.

I actually find it fascinating - Intel had an OO processor (the name escapes me now) back around the 286/386 era that has a story surprisingly similar to the Itanium's: a radical departure towards an unproven instruction set which, when it arrived, delivered too little, too late, causing the company to scramble to retrofit the older cash cow for the future before someone else managed to eat their lunch.

History does not repeat, but it often rhymes.


It was the iAPX 432. IIRC, its main problem was that it was ahead of its time: it was complex, and the manufacturing process was not developed enough, so it had to be split across three separate chips. This made motherboards more complex, and performance also suffered.


I don't think "ahead of its time" is a good description (or, alternatively, it was so far ahead that its time had not yet arrived).

There has been one CPU architecture designed since, that I'm aware of, with OO baked in as well as a stack-machine model (the defining features of the software side of the 432, which would remain even if the hardware had been perfect). The Java CPUs fared about as well as the 432, which is "not well at all". I think "bad architecture" is a better description than "ahead of its time", and it is just as true for the Itanium.


The iAPX 432, I think, after some googling to refresh my memory. The "mainframe on a chip". It was definitely a bold attempt.


The Pentium Pro was what killed Alpha and MIPS, not Itanium.


The Pentium Pro seriously maimed Alpha and MIPS, but they still limped along bleeding profusely. Itanium is what delivered the killing blow, but I agree that even without Itanium MIPS and Alpha would have died soon enough.


A bit surprising that Jean-Louis confused Itanium with PA-RISC. Itanium was started at HP as a successor to PA-RISC and provided an emulation mode for PA-RISC legacy users, but its EPIC (VLIW) architecture was pretty much a clean slate.


I noticed this too, and thought it was especially funny considering both names are linked to their respective Wikipedia entries.


I wonder why Intel doesn't just design their own custom ARM core like Apple and Qualcomm do. Why so focused on moving manufacturers to x86?


They had it (XScale) and sold the unit, maybe due to internal politics.


I'm surprised that so many people don't know this. Back in the day, StrongARM and XScale were used in most (if not all?) of the PDAs. Intel could've easily had a monopoly today if they'd continued developing the technology.


And then they'd face regulator-imposed breakup because they'd be the only game in town for CPUs, from phones to datacenters.

All that money they throw at mobile suggests they're not intentionally staying out now, but you have to wonder if they did at some point.


Except that they didn't own the technology like they own x86.


Well, they don't own x86-64, either - and that worked out incredibly well for them :-)


Actually they do. AMD licenses x86 from Intel on which x86-64 is based.


It's all about margin.

Intel isn't going to allocate fab time to a chip that gives them 1/10 the price margin.

Once Intel starts having idle time on its factories because of lack of x86 demand, then they'll start producing lower margin chips, but not before.


The article above states (or assumes?) that Intel sold off XScale to Marvell when the iPhone ship sailed without them.


The cost of developing an ARM core would need to be paid for solely by sales of ARM chips.

By sharing x86 and graphics cores with all their other chips, that R&D is amortized across everything (and the financials for the Mobile division don't look nearly as bad).


Cores are designed to hit a particular performance and power envelope. Even when implementing the same instruction set, cores will be very different for low-power sensor platforms compared to tablets and to servers.


ARM is interested in a strong custom-core ecosystem, not a single strong manufacturer - especially one that at some point in the future might become a competitor.


It doesn't have anything to do with ARM. Intel used to have an ARM license but they sold it to Marvell just as the iPhone came out.


Isn't it the case that Intel's Atom has been just that little bit too power-hungry compared to ARM chips? They can't quite convince anyone to use them without large subsidies.

I know they've been making some headway in this regard, but the perception is still there that Atom is too heavyweight to stick in a phone.


The danger Intel faces is the point at which ARM chips become "good enough" to encroach on the PC and server space. At some point, the billions Apple and Samsung are spending on manufacturing and R&D will likely make ARM chips fast enough to handle all kinds of general usage in laptops and servers.

If Apple took their PC business away from Intel, they might not care, but if Samsung decided to start selling $10 desktop ARM chips while the equivalent PC chip was $100, that would potentially put a huge dent in Intel's core business.

I don't know if this will happen soon, but the economics of it are not in Intel's favor. If I were Intel, I'd be more worried about ARM encroaching on desktop and server than getting a foothold in mobile. That is the risk Intel faces.


> I see three possible answers.

How about a fourth option? Intel's organizational structure doesn't really allow it to build profitable mobile chips that are competitive with the ARM chips, $51 subsidy notwithstanding.

This is a company that has gotten used to having literally thousands of engineers work on each processor. And note this is a single processor, not an SoC, and going from a processor to an SoC is a ton more work. I think they simply don't have the organizational dexterity to effectively compete with the lean and mean ARM shops.


I'm surprised there was no mention of POWER8 and IBM's adoption of a licensing model similar to ARM's. For the first time in at least 5 years, Intel has a Xeon competitor.


If Intel thinks a 65% margin is achievable on IoT silicon, they'll probably lose again.


The new strategy in IoT isn't to sell silicon - they're trying to create higher-value goods: an IoT gateway with secure software, wearables, smart glasses together with famous optical brands, etc.

They're probably hoping to create some value through hardware/software integration, à la Apple. But hardware integration (from the hardware side) played only a small part in Apple's success, at least as far as I understand it.

So we'll see about Intel.


Intel might want to enter the IoT market this way, but all the other chipmakers I've seen aren't going this route. They're trying to drop the power consumption and cost as fast as they can and let someone else design the fancy glasses.

Intel's record in consumer electronics to date is, well, pretty awful. So the expectations are pretty low here. And, again, nobody gets 65% margin on consumer products.


And to extend that remark with fiatmoney's comment, it may turn out that selling one proc to a cloud provider at $1000 and 65% margin is somewhat more profitable OVERALL than selling a hundred $5 procs to a tablet mfgr at a -1000% margin due to their famous subsidy, or maybe just zilch margin due to competitiveness.

Also, it's not merely a matter of marketing focus. Imagine the next iPhone, now with an Intel processor, having a Pentium 4-sized heatsink and fan hanging off the back. Extreme low power is a checkbox in hardware, much like security is merely a checkbox in software.

It's a crisis that Ford isn't expanding into the wheelbarrow market, etc.


Or Intel is still betting on x86 and thinking ARM is "just a fad".

A lot of companies have gone down because of similar thinking.


"Essentially no revenue for Mobile and Communications"

While technically correct, this misses the fact that Intel has been making some huge inroads into mobile (particularly tablets). This isn't reflected in revenue because Intel has quite literally been giving the chips away, lubricating their adoption to make up for the industry being heavily ARM-centric.

http://www.zdnet.com/article/intel-to-hit-40-million-mobile-...

There are tens of millions of devices out there running Intel chips, and it is lubricating the world for the platform. Right now there are people running Android on Intel devices who have absolutely no idea that they are -- the various MeMO Pads, for instance. The Dell Venues.

The MeMO Pad ME572C runs an Atom Z3560, a 64-bit x86-64 processor with four cores, SSE4.2, and a powerful onboard graphics solution.

Intel is laying groundwork. Anyone who looks at the revenue number and counts Intel out is being fooled. In a few years I suspect that many of the same people will be crying foul about Intel having bought themselves into a market they missed.


> Right now there are people who are running Android on Intel devices and they have absolutely no idea that they are.

This is good for consumers, but terrible for Intel. Once Intel stops wrapping a $50 bill around every Android tablet chip they sell, why would device makers or consumers choose Intel-powered tablets over anything else?

Android tablet SoCs are a commodity market with many players, and the prices will reflect that. It's a market that might bring revenue for Intel, but will never be able to help support their current fat profit margins.


> Once Intel stops wrapping a $50 bill around every Android tablet chip they sell, why would device makers or consumers choose Intel-powered tablets over anything else?

Why wouldn't they is the real question. Previously they wouldn't, simply because platform support was mediocre to non-existent, and third-party apps didn't support it or ran miserably.

It was an ARM party. x86 was a non-starter. Intel would have zero buyers if they sold at normal prices for what was destined to be a second-rate experience.

Now Android fully supports x86 (and very soon x86-64), and while many third-party apps explicitly support x86 (the NDK makes it a simple extra flag, and of course fully managed apps already fully support it), those that don't still run superbly.
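
For reference, the "simple extra flag" amounts to roughly this in an ndk-build Application.mk (a sketch; the ABI names are the standard ones):

  # Application.mk: list the ABIs to build native code for.
  # Adding the x86 entries is the whole "port" for most NDK apps.
  APP_ABI := armeabi-v7a x86
  # with 64-bit targets as well:
  # APP_ABI := armeabi-v7a arm64-v8a x86 x86_64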

Regularly here on HN we see people who are actually in this industry who don't even realize that Intel and x86 have become serious competitors - that devices they might have used are running it.

> Android tablet SoCs are a commodity market with many players

Qualcomm has seen exploding revenues, now pushing $30 billion per annum with great profit margins. It is anything but a "commodity" market, and a few makers are going gangbusters.


> Qualcomm has seen exploding revenues, now pushing $30 billion per annum with great profit margins. It is anything but a "commodity" market, and a few makers are going gangbusters.

That is almost entirely from sales for Android smartphones. Qualcomm's leading-edge baseband tech (and monopoly on 3GPP2/EV-DO networks) has made them the clear market leader for phones. This is why both Intel and NVIDIA purchased cellular baseband technology (Infineon/Icera), tried to compete in phones for a year, and then pulled back to focus on tablets while they continue to develop basebands.

Without a requirement for cellular, the tablet market is wide open. Like you said, these chips all run the exact same software and users can't tell the difference. It's a commodity market, and is Intel going to compete on price with NVIDIA, Samsung, Qualcomm, Marvell, MediaTek, Allwinner, etc.? Not while making 65% gross margins.


You're saying that Android on Intel works, but that isn't saying much. It isn't better, it isn't cheaper (once subsidies expire), and it doesn't have the diversity of dozens of different SoCs to choose from.


> It isn't better, it isn't cheaper (once subsidies expire), and it doesn't have the diversity of dozens of different SoCs to choose from.

Are you basing this assessment on actual facts? Further, as to diversity, the Qualcomm Snapdragon 80x line absolutely dominates the Android sphere. Vendors aren't making designs with some abstract SoC that they plug in on delivery day -- they are specifically designing devices around a given SoC. Meaning they choose an 805, or a K1, or an Atom Z3560, and design and release the product around it.


How can you "make inroads" into a market, when you're giving away your product in a non-sustainable way. Sure, tablet makers will take Intel's "just as good" 0$ Atom chip. But will they take it when it's "just as good" for $30 or $40?


They might be trying to get some "lock-in" back, though I have no idea how.

Ballmer's Microsoft apparently missed the mobile/tablet revolution by assuming that the desktop lock-in they still have could be leveraged. That didn't work for years, but seems to have been ignored until Nadella came in.

Intel might be trying to gain lock-in by somehow putting the familiar x86 into mobiles/tablets, in the hope that developers would release x86 binaries that would make ARM undesirable. But that's an 80-degree-incline uphill battle. Either they have a crazy card up their sleeve, or they are unable to respond properly, and the $0 cost + $50 subsidy is the best they can do at present while trying to craft that crazy card.


Quite the opposite: It's eliminating lock-out, which is where Intel was in an ARM-only mobile world.

Intel has the most advanced fabs in the world, and what has primarily limited their own devices has been their concern about competing with themselves (which I'm sure is still the case; with each new chip they probably have internal negotiations about how to ensure it doesn't threaten their desktop and server chips). If Intel isn't disadvantaged in mobile, only a fool would count them out.


To reiterate, given that I apparently failed to communicate this well:

- The mobile space was essentially locked down by ARM. All of the OSes, tools, and apps targeted ARM.

- Intel was on the outside looking in, so they started giving away chips, offering their own engineering to bring Android (and other OSes) to x86, and making it a compelling target for the makers.

- Intel x86 OS images for Android became available and production quality (originally only deployed in developing markets as essentially a beta). Intel x86 toolsets were available. Still, though, it didn't matter much because there was no market.

- So Intel gave away their chips. By the tens of millions. These chips are actually quite excellent (great performance, 64-bit, great graphics, low power consumption), but knowing that they were on the outside looking in, they gave them away.

The Android ecosystem now is one where x86 (and soon x86-64) is as supported as ARM is. There are tens of millions of x86 devices, and it's only accelerating. No one can ignore it.

Intel gave away chips because they were at a disadvantage, but they're quickly getting to a position where, if they make a very compelling chip, they are no longer sabotaged by being an outsider.



