The x86 Power Myth Busted: In-Depth Clover Trail Power Analysis (anandtech.com)
95 points by sciwiz on Dec 24, 2012 | 47 comments



From the article: "We've demonstrated this in our battery life tests already. Samsung's ATIV Smart PC uses an Atom Z2760 and features a 30Wh battery with an 11.6-inch 1366x768 display. Microsoft's Surface RT uses NVIDIA's Tegra 3 powered by a 31Wh battery with a 10.6-inch, 1366x768 display. In our 2013 wireless web browsing battery life test we showed Samsung with a 17% battery life advantage, despite the 3% smaller battery. Our video playback battery life test showed a smaller advantage of 3%." It is also worth noting that they are comparing this year's Intel with last year's ARM, and that Intel has a smaller process node, around 25% smaller, in its favour.
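
To make those numbers concrete, here is a minimal back-of-the-envelope sketch in Python. Only the battery capacities and the 17% delta come from the quoted test; the absolute Surface RT runtime is a made-up baseline. It shows that 17% longer runtime on a 3% smaller battery works out to roughly a 21% platform efficiency advantage per watt-hour:

    # Rough efficiency comparison using the quoted AnandTech figures.
    # The baseline runtime is hypothetical; only the 17% delta and the
    # battery capacities come from the article.
    surface_rt_hours = 8.0                  # hypothetical baseline runtime (hours)
    ativ_hours = surface_rt_hours * 1.17    # 17% longer, per the web browsing test

    surface_rt_wh = 31.0                    # Surface RT battery capacity (Wh)
    ativ_wh = 30.0                          # ATIV Smart PC battery capacity (Wh)

    # Runtime per watt-hour: higher means the whole platform draws less power.
    surface_eff = surface_rt_hours / surface_rt_wh
    ativ_eff = ativ_hours / ativ_wh

    print(f"Surface RT:    {surface_eff:.3f} h/Wh")
    print(f"ATIV Smart PC: {ativ_eff:.3f} h/Wh")
    print(f"Efficiency advantage: {ativ_eff / surface_eff - 1:.1%}")   # ~20.9%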

Now, there's nothing stopping Intel from doing ARM chips down the line, or indeed some hybrid chip. They have many cards they can play if they don't get the sales they want.

I would love the day when you could get a motherboard where the current Intel and ARM chips are pin compatible, so you could drop in whichever you like. Wouldn't that be a wonderful dream?


I think the trend is definitely away from user-replaceable CPUs. Some enthusiasts say it's "sad", but it's just part of a general trend towards integration on a larger scale. You can't replace a broken transistor on your Intel 4004 either.


It's more a question of upgrades than trying to fix anything. CPUs are quite reliable. The only important things to replace when they break are power supplies, batteries, and storage drives.


I would love the day when you could get a motherboard where the current Intel and ARM chips are pin compatible, so you could drop in whichever you like. Wouldn't that be a wonderful dream?

I'm already crossing my fingers that AMD will offer that, as they have announced they are developing an ARM chip and historically are friendlier about socket compatibility.

Not really sure whether it's actually possible, though. It also might not be quite as dreamy as you and I are imagining: are drivers CPU-architecture dependent?


Drivers, no; same with the BIOS. But it's not as big an issue as it could be.

I admit the AMD/ARM possibilities you mention are certainly very interesting prospects.


I don't think this is a particularly fair test of ARM and x86 as architectures, though it is of Tegra 3 and Cloverview as specific systems. Cloverview just came out, but Tegra 3 is nearly 2 years old. Cloverview is on Intel's 32nm process node, whereas Tegra 3 is on TSMC's 40nm node (and remember that Intel's 45nm node actually had better performance than TSMC's 40nm).

And I've never quite understood NVidia's design choices with Tegra 3. All the other ARM SOC vendors used a low power version of the process node they were on for their ARM cores, but NVidia used the standard process version for 4 of their cores and the low power version only for the 5th core. And yet those four cores are clocked only slightly faster than the cores of their competitors, while their 5th core is clocked way lower.

Now, NVidia's GPUs are top notch, and while Tegra hasn't been that successful in recent generations of phones, I assume NVidia's existing Windows software compatibility is why it was selected for the Surface.

EDIT: I seem to remember that there were some big issues on the GPU front a couple of years ago with NVidia's design tools and TSMC's 40nm process having something of an impedance mismatch. The big improvements between NVidia's GPUs from that time and their current generation attest to those issues being sorted out.


Tegra 3 has been a huge failure and ended up with very few design wins. I imagine Surface got a good deal.


It got Nvidia into the best-selling tablet of the year, the Nexus 7. Technical merits aside, it was a win for Nvidia's bottom line.

I still don't get how embedded chipset manufacturers can keep competing so easily with the largest graphics company in the world in the GPU space, though. Tegra 4 is supposed to jump from 12 cores up to 72 in the highest-end model, using Kepler cores instead of the old 8000-series cores. If they manage to get some super-efficient, high-performance graphics in mobile, I could see a games revolution around it.

Especially if they put that in the Nexus 7 #2 next year.


"I still don't get how embedded chipset manufacturers can keep competing with the largest graphics company in the world in the gpu space so easily though."

Just for some perspective:

Samsung total revenue: $247 billion

Apple total revenue: $157 billion

Qualcomm total revenue: $19.1 billion

Texas Instruments total revenue: $13.7 billion

NVidia total revenue: $4.0 billion

NVidia is the relatively small fish in the pond here, and while they have relatively more graphics experience, the others have relatively more experience with everything else in this space.


The Tegra 3 cores are actually older 7000-series cores (prior to the whole CUDA revolution), much the same as you have in a PlayStation 3. Yes, Tegra 4 will make a huge jump to the Kepler cores, so it will be hard to compare core counts. It will certainly afford a very powerful platform. And Tegra 3, old as it is now, still works really well.


The Nexus 7 outsold the iPad?


Does Windows RT offer anything other than ARM compatibility for better size/battery life? Is the whole confusing Windows RT / Windows 8 split something that could've been avoided if MS had known Intel would have these chips ready?


I think it's a political move more than a technical one. WinRT lets MS say to Intel that Intel has to care about the tablet market and low-power processors or otherwise ARM will get a foothold in Windows (giving them easy access to the low-end of the laptop market in the next few years).

Of course, this would have worked a lot better if people cared about WinRT at all, but I'm sure that was the idea.


> WinRT lets MS say to Intel that Intel has to care about the tablet market and low-power processors

I don't think Intel needs MS to say that. The money being left on the table in the smartphone/tablet ecosystem is enough incentive!


Margins on CPUs for tablets and phones are not all that high. Intel enjoys pretty amazing margins on the desktop market, so they aren't missing too much there. They just need to keep ARM away from their bread-and-butter profits, which probably explains their current push to keep ARM out.


No, but tablets and smartphones are starting to affect the PC market. Desktops are dying, and laptops are stagnating.

For a lot of people, a smartphone or tablet is sufficient to meet their computing needs, especially with the ability to connect a TV and keyboard, and with a number of small ARM-based "mini PCs" starting to arrive.


Leverage aside, it also protects MS a bit in case x86 crashes and burns.


Precisely my thoughts. The way things are looking, by the time MS gets sizable Windows RT device sales, Intel will be beating ARM on performance/efficiency.


Intel would be wise to brand Clover Trail as something other than Atom (or Celeron). Clover Trail's perf and efficiency are very good, and the Atom name typically suggests otherwise.

If I could get a Surface Pro with a Haswell at the listed Surface Pro prices, it would be a no brainer. But with a 3rd-gen Core i5 -- not so enticing.


I just found that Intel is about to release a server-version of the Atom, codenamed Centerton, based on a 32nm process. These will be 64-bit processors and support Intel's virtualization extensions.

Next year there will be a 22nm version of Centerton, Avoton, with a 14nm processor following the year after. Centerton, and possibly Avoton, will be out before the first 64-bit ARM chips.

http://www.anandtech.com/show/6509/intel-launches-centerton-...

It looks like Intel is taking the Atom's intended market seriously enough to extend their tick-tock cadence to it.


This is one area Intel is trying to cover: the low-power server market, to stop people from looking at and playing with ARM as much as possible.

Either way, it will certainly wake Intel up. It took AMD with the XP-series chips to get Intel into gear many years ago, and they need another kick for momentum.

But on a like-for-like CPU generation and process size, ARM certainly has more headroom. Intel, though, is in a crazy no-lose situation: they have a full ARM licence, so they can always make and sell ARM chips.

Intel also has its multi-core Knights Corner CPUs, which will only get better, plus many areas of research into fields that have yet to emerge. I find their whole on-chip radio research very tantalising for future chip-to-chip communications, with regard to the production costs of servers.

Fun times; as a consumer I'm happy we now have a race on our hands.


They say they proved the Atom performance advantage, but the linked article says they had nothing but a JavaScript test, which probably has significant code-quality differences between the Intel and ARM JITs.


Does anyone know any more about this, and/or can explain what the relation is between frame rates and survey data?

"User experience (quantified through high speed cameras mapping frame rates to user survey data)"


My interpretation was that they would use cameras to measure frame rates and then look for the level at which lower frame rates reduced user happiness as measured through surveys. Pushing frame rates beyond that would be much less valuable.


I don't know anything about this test specifically but I cannot imagine that people were happier with lower frame rates.


They are just given as examples of ways Intel is getting quantifiable UX data. It would probably be clearer written as "quantified using tools ranging from high speed cameras mapping frame rates to user survey data".


The problem is Intel has already lost on the performance side. Intel has always needed 3 things to beat ARM, and they need all 3 to be competitive: performance, energy efficiency, price.

With Atom, they've always had the performance advantage. In fact they had a big advantage over ARM at the time Atom came out; it was several times more powerful than the ARM chip inside the iPhone 3G.

But to be competitive, Intel needed to compete on the other two levels as well. So they kept Atom's performance mostly the same, while they tried over the years to push Atom down from a 10W TDP. But while Intel tried to lower the power consumption, ARM's chips were gaining in performance, faster than Intel was improving. So now that Intel has finally reached parity with ARM in energy efficiency, ARM has surpassed it in performance with Cortex A15, and from what I've seen so far, by a lot.

Then there's also the pricing issue. I think they've improved there quite a bit. An Atom chip used to be close to $100. Now it's probably more like $40-$50 (this dual core version). But that's still like at least 50% more than an ARM chip.

So Intel still has a lot of catching up to do in "overall competitiveness", because right now it looks like they've only traded their much bigger performance leadership for "energy efficiency parity", and are losing on the other two. I'd like to see how high they are willing to go with Atom on performance before they decide it would cannibalize their Core i3 market.


Check out Intel's ULV processors. Intel haven't bothered to do anything with Atom because they don't want to cannibalize their own products.

Long before Atom even came out, a dual-core Core Duo ULV processor with a 5W TDP was usually twice as fast as the Atom even if you disabled one of the cores on the Core Duo... In other words, about four times the performance at much lower power consumption. I'm writing this on a 7-year-old laptop sporting such a processor. Yes, it is slow, but compared to an Atom the performance is just hysterical.

The Atom is, and has always been, a pathetic joke in all regards except price. And the performance is so abysmal that the only reason it ever sold was shortsighted consumers who can't look past the price tag. Which of course took everyone by surprise, and Intel obviously haven't dared to improve Atom too much since its launch for this very reason.


ULV chips are just the lowest-power top bin of the normal CPUs. To get chips that can be called ULV, Intel has to bin hundreds of non-ULV chips. As such, ULV chips cannot be mass-produced in the kind of volumes needed for tablets.


Maybe so, but I have a hard time believing that you can find a worse comparison than Atom vs. ARM (in this context).

It just isn't that simple.


> Intel has always needed 3 things to beat ARM, and they need all 3 to be competitive: performance, energy efficiency, price.

It's a nitpick, but surely you don't mean this literally? What if Intel's offerings beat ARM chips on, say, efficiency and price, or efficiency and performance... would you still believe ARM to be in a superior position?


There will always be room for ARM if Intel doesn't compete on price.


Shareholders will be very happy if Intel remains dominant at the high-margin, high-performance end. And competitive everywhere else.

AMD has been competing with Intel on price for the past two decades.

I know which of these two stocks I'd have rather owned over the past ten years:

http://www.google.com/finance?chdnp=1&chdd=1&chds=1&...

The data from this article is the first I've seen that indicates Intel could dominate the high-end mobile chip market in the near future.


>Shareholders will be very happy if Intel remains dominant at the high-margin, high-performance end. And competitive everywhere else.

Not with mobile trends and ARM uptake trends persisting. ARM speeds are getting more and more competitive. (I would argue, please forgive me, that signs are pointing towards Linux uptake as well.) Intel's high margins are on consumer processors too, surely a large percentage of them. It will hurt as mobile continues to displace desktop and as desktop increasingly targets ARM.


> What if Intel's offerings beat ARM chips on, say, efficiency and price

Very unlikely. First, because ARM already has chips that are much more efficient, like the Cortex A7. Intel can't bring Atom to that level right now; it would take a few years, just like it took a few years for Atom to get from 10W to 2W. By that time the A7's successor would probably again be more powerful.

I also doubt it will be cheaper. So far no indication that Intel has managed that compared to ARM - ever.

>or efficiency and performance

That may already exist, in the form of their high-end chips, in terms of performance/Watt. But their prices are 10x higher, so it's pretty irrelevant when it comes to competing with ARM in the same markets.
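
As a back-of-the-envelope illustration of that point (all figures below are invented purely to show the shape of the argument; none come from the article or any datasheet), a part can lead on performance per watt and still lose on performance per dollar once the price gap reaches 10x:

    # Hypothetical numbers only: "perf" is an arbitrary benchmark score.
    big_core = {"perf": 15.0, "watts": 12.0, "price": 200.0}  # high-end x86-style part
    arm_soc  = {"perf": 2.0,  "watts": 2.0,  "price": 20.0}   # mobile ARM-style SoC

    for name, chip in (("big core", big_core), ("ARM SoC", arm_soc)):
        perf_per_watt = chip["perf"] / chip["watts"]      # wins: big core (1.25 vs 1.00)
        perf_per_dollar = chip["perf"] / chip["price"]    # wins: ARM SoC (0.100 vs 0.075)
        print(f"{name}: {perf_per_watt:.2f} perf/W, {perf_per_dollar:.3f} perf/$")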


This is a faulty analysis, I think. The Atom wasn't originally directly in competition with ARM, they were in different market segments. And it wasn't a case of the atom merely being iterated in design until it matched ARM efficiency levels, that's not what's happened. Instead, the Atom was originally just a low-power stand-alone processor and now Intel has developed a first generation Atom based system-on-a-chip. The fact that Intel's first SoC designs are so closely competitive to ARM's n-th generation designs should send alarm bells ringing throughout the ARM industry. Intel is good at improving design and they are even better at economies of scale.

It's easy to dismiss Intel's offerings as not being competitive with state-of-the-art ARM designs, but only a sucker would dismiss Intel and rest on their laurels. Intel's 2nd and 3rd generation chips are just going to be that much better, and ARM could be in serious trouble in as little as a year or two from now. Imagine, for example, an Intel based android powered tablet that can run stock Windows in a VM or emulate windows and can run most windows applications and perhaps even games, while also being able to run all android applications. And do so at a cost/performance/efficiency level equivalent to any ARM based system. That's the sort of thing that should send shivers down the back of Samsung or even Apple.


Indeed. It would also be a serious error to assume that anyone will be on Intel's leading-edge process any time soon. At some point there is a process node that only Intel can afford to develop, and the ARM makers are going to just sit there on their old process, which might be cheaper but isn't going to be as efficient.


ARM CPUs outsold Intel CPUs in volume well before tablets and smartphones started using them.

If I remember correctly, the projected volume of ARM CPUs for this year is in the region of 3 billion units.

The smartphone and tablet market is not even near being the majority of ARM CPU sales. So I don't think ARM will despair over Intel's designs just yet.


Volume is not profit. Intel is an order of magnitude more profitable. However, ARM's model seems to be to suck the profit out of the market. It's a textbook example of a paradigm shift, with an initially inferior product offered at a much lower price point gradually catching up with the market leader. Intel has no choice but to follow ARM down the price curve and settle at a lower profit margin. They seem to have realized this with their current chip strategy. I would be looking for Intel to start diversifying into the device market, because that's where the profit is moving.


A lot more: From Wikipedia: "In 2011, ARM's customers reported 7.9 billion ARM processors shipped, representing 95% of smartphones, 90% of hard disk drives, 40% of digital televisions and set-top boxes, 15% of microcontrollers and 20% of mobile computers."

That roughly matches my memory of previous recent years' reported sales.


>Imagine, for example, an Intel based android powered tablet that can run stock Windows in a VM or emulate windows and can run most windows applications and perhaps even games, while also being able to run all android applications.

If we're going that far, we should imagine an iPad that is also a Mac.


That's not far at all. If they can manage to get their shit together and provide a reasonable user experience, the day when a Mac (not necessarily all Macs) run the same OS and the same apps as iPads and iPhones is not far off.

Also, it is not a far-off prospect at all, since the Surface Pro tablet will run Windows 8 Pro (on a Core i5). It'll be able to run almost any Windows app, even Photoshop or Visual Studio (or MS SQL, for that matter).


It's not really a SoC in any meaningful sense, other than the integrated GPU and memory controller (which most CPUs have now), is it? Compare that to ARM SoCs, which are so highly integrated that you more or less just have to wire up power and RAM.


The new Atom processors really only cost 50 bucks? So does that mean there is pretty much no reason at all for the 150-300 dollar premium on Atom-based devices that we are currently seeing?


The Atom SoC never cost close to $100, and it doesn't cost $40 now either. What you are talking about is the desktop Atom, which is priced differently from the mobile/tablet parts.

I don't think anyone should doubt Intel's engineering power to bring power and performance levels unseen before to the SoC market. The only problem is margin and price; how these will work out remains to be seen.


>The problem is Intel has already lost on the performance side.

Really? From TFA:

"""For us, the power advantage made a lot of sense. We've already proven that Intel's Atom core is faster than ARM's Cortex A9 (even four of them under Windows RT). Combine that with the fact that NVIDIA's Tegra 3 features four Cortex A9s on TSMC's 40nm G process and you get a recipe for worse battery life, all else being equal."""


That's some bad copywriting right from the start:

>The untold story of Intel's desktop (and notebook) CPU dominance after 2006 has nothing to do with novel new approaches to chip design or spending billions on keeping its army of fabs up to date. While both of those are critical components to the formula

If those components are "critical to the formula" then you cannot say that "it has nothing to do" with them.

Critical means: it has absolutely something to do with them too.



