Why Apple saddled the MacBook Air with "gimped" CPUs (arstechnica.com)
101 points by pietrofmaggi on Oct 21, 2010 | hide | past | favorite | 93 comments



Apple has an amusingly-worded bit about their CPU on their features page:

> MacBook Air weighs less than three pounds, but it’s a heavyweight where it counts. Intel Core 2 Duo processors get the work done fast. So you can be every bit as productive on MacBook Air — but in more places. Live-blog the event of the year straight from the convention floor. Perfect your sales-winning presentation from the airport terminal. Cite references down to a T from the library stacks. MacBook Air lets you do everything you need to do whenever and wherever it needs to be done.

Everything you need to do, such as writing, writing, and writing. Those examples seem like they would be more appropriate if Apple had chosen an Atom processor. They don't exactly fill me with 'heavyweight where it counts' confidence.


I used to do image deconvolution of the original (flawed) Hubble images on a Sun Sparc5. Exactly what are you people using your MacBook Air for while sitting in Starbucks?

Is there a big underground CFD or N-Body scene among the trendy teens that I'm not aware of? Or is everybody into computational chem and protein folding to develop the next party drug?


Right. Folks nowadays carry around an embarrassment of riches in terms of CPU power. People feel poor if they can only sit with a phenomenal amount of computing power in their local cafe, as opposed to phenomenal times 3.


While we do have a phenomenal amount of power in our hands, the fact is that the rest of the industry has grown in step with the hardware in a lot of ways. Browsing the web (admittedly as a power user) is one of the most intensive things I do aside from compiling software.


the fact is that the rest of the industry has grown in step with the hardware in a lot of ways

This tells me that there is a tremendous opportunity in the form of huge inefficiencies in our computing infrastructure. I think Steve Jobs and others at Apple went down this mental path when formulating the iPhone/iPad. Actually, Opera Mini is a powerful demonstration of this idea!


>This tells me that there is a tremendous opportunity in the form of huge inefficiencies in our computing infrastructure.

VMware has benefited from the CPU race. ESX is an easy sell for Fortune 1000 datacenters that have huge amounts of idle CPU horsepower.


heh. I did an internship at NASA that basically involved image alignment of IUE starfields compared to another dataset. All done on a Sun 4. I would surf with Netscape while the conversions were running, which took anywhere from 10 seconds to 5 minutes each.


Well it is a bit of a bummer when I have to repeat my voice commands and dictation to my phone because it lost its connection to the big server farm that does the speech recognition.


You know, with all the talk of portability and 'Mac meets iPad,' why isn't the Air available with built-in 3G and iPad-like data plans?


I've been wondering that for years. Many netbooks and ultra-portable laptops have this at least as an option these days. I'm going to have to get one of those battery-powered 3G-to-WiFi routers; my USB modem sticking out the side scares me (people walking past it are a liability). Annoying - another device to drag around - but I guess it'll probably come in handy in other ways.


But reading, browsing, and writing is probably what most people spend their time doing on a computer.

"Transcode video formats while sitting in a cafe" doesn't exactly have the same ring.


Sure, but video chatting is getting to be pretty popular as well (which, at least on the previous Air, tended to bring the processor to its knees).


And browsing can be a pretty CPU heavy thing, these days. Not on the scale of video encoding, but enough to bring an ARM or Atom processor to its knees.


Not when you stop shipping flash with the OS.


Pages with heavy Javascript can cause issues too. For example, open up a large reddit thread on a netbook and wait. However, I don't know whether the MBA's processor is enough of an improvement to handle that.

And not shipping with Flash doesn't mean Flash is now useless. Plenty of video sites out there haven't switched to HTML5, not to mention product sites or Flash game sites.


Oh who cares, really. It just forces the user to go out and download the latest version if they really want Flash, which means Apple won't take the blame for buyers getting machines with outdated security-compromised versions of Flash.


MBAs have OS X, which I can't imagine is going to be losing Flash any time soon.


There may be a surprise coming your way.

See the link[1] and thread[2] about Flash-less MBAs.

[1] http://www.engadget.com/2010/10/20/macbook-air-all-substance...

[2] http://news.ycombinator.com/item?id=1814713


Well it doesn't come with Windows 7 by default either. It's just one of those things that everyone has to install whenever they reinstall Windows. I also don't believe it comes with Snow Leopard either -- maybe the only change was removing the option of installing 3rd party plugins in Safari.



Did they also disable JavaScript?


I'm using a 13" MBP (with Core 2 Duo processor) for doing music recording / production. I only record a couple tracks at a time (but mix and playback more), so this may not count as uber-power-usage, but it's more than text editing, and it works fine.


I make electronic music. And I basically stop making a track and render when I literally cannot do anything more to it. I don't like bouncing things down, but by the end of my productions I literally have every single track frozen (so that I can still mess around with the MIDI/automation and such.)

I put a ton of stuff on my aux busses that I can't freeze, so that's where the CPU goes, but man, I would LOVE LOVE LOVE if I had something faster (4 GB MBP 15" 2.53 GHz Core 2 Duo.)

And really, why settle for what we have anyway? Don't get me wrong: I'm very happy with the way things are now, and I humbly understand what it was like back in the day... but this is how we advance things: by questioning, by being unsatisfied with what we have now, and by making the future.

[For those curious: http://soundcloud.com/gau5/spotlight -- listen and download there. Any other music people on here, send me some links! :)]


Nice tracks. I had never heard of soundcloud before.

I did not mean to imply that we should settle for current technology, nor that the Core 2 Duo processor was the most totally awesomest thing ever. I just meant to posit that it was perfectly adequate for more than the most trivial of computing tasks, and, implicitly, many MBA users will probably be fine with it.


You're probably not using a low-voltage version of the processor though. This could make a difference (I'm assuming that Apple is using the lv or ulv version of the Core2Duo in the MBA).


There is no performance difference for Core2Duo generation CPUs with identical clock speeds and cache size but different TDP ratings. The LV/ULV ones are the same chips as "regular" mobile (or even desktop) dice but hand-picked for their low consumption. It gets murkier with the Core iX generation due to the automatic overclocking mechanism.


> I'm assuming that Apple is using the lv or ulv version of the Core2Duo in the MBA

They do already (the low-profile LV and ULV), and I'm pretty sure they'll keep on doing just that. I'm not even sure there's a non-ULV C2D at such low frequencies.


The Air is in a completely different league from netbooks, for a number of reasons. Even the slower CPUs in the 11-incher are 30-60% faster than comparable Atom parts, yet consume 30% less power. CPU performance of the 11-inch model is comparable to the 2006 MacBooks and the 13-incher to the 2007/2008 MacBooks.

The big performance story is the GPU and SSD.

For the majority of Air buyers, their new machine will feel quicker than any computer they've ever used, by simple merit of the SSD. Cold boot times of 15 seconds are being reported and programs will load with similar haste.

The GPU is a big deal, both for graphics performance and OpenCL. OS X has always leaned heavily on the GPU. With the introduction of Snow Leopard, the vast floating point performance of the GPU is available to the whole system. If you've been following the Folding@Home project, you'll know how big a deal that is. Third-party developers are proving fairly slow to make use of it, but I will bet lumps of my own flesh that iLife '11 will be heavily optimised for OpenCL.

We've already mostly dispensed with the megahertz myth, but we're going to have to confront the idea that CPU speed is a relatively minor part of real-world computer performance. The old Air felt miserably slow, mainly due to hard drive throughput. The new one will feel very fast indeed, in spite of a relatively modest CPU.


I think you misread the article. The power difference stated is between the 11" and 13" processors. Atom processors use around 2.5W, about a quarter of what the 11" Air's processor uses.

http://www.intel.com/products/processor/atom/specifications....


The low power Atom parts aren't remotely comparable in terms of performance, so I didn't compare them - the 2GHz Atom Z550 at 2.5 watts TDP has a Passmark score of just 386, compared to 964 for the 1.4GHz Core 2 Duo.
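A quick back-of-the-envelope check of those figures (note: the 10 W TDP for the 1.4 GHz Core 2 Duo is my assumption for an SU9400-class part, not a number from the article):

```python
# Rough perf and perf-per-watt comparison from the Passmark scores quoted above.
# The C2D's 10 W TDP is an assumed figure (SU9400-class), not from the article.
atom = {"score": 386, "tdp_w": 2.5}   # Atom Z550, 2 GHz
c2d  = {"score": 964, "tdp_w": 10.0}  # Core 2 Duo, 1.4 GHz

print(f"raw speed ratio: {c2d['score'] / atom['score']:.1f}x")
print(f"Atom perf/W: {atom['score'] / atom['tdp_w']:.0f}")
print(f"C2D  perf/W: {c2d['score'] / c2d['tdp_w']:.0f}")
```

Under these assumptions the Atom actually wins on perf-per-watt; the Core 2 Duo's advantage is raw single-machine speed, which is what you feel in a laptop.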


Thanks, I wasn't aware there was such a large performance difference.


TL;DR Summary: Intel's newer CPUs are physically larger and thus harder to fit into the form factor, use somewhat more power, and their integrated GPUs don't support OpenCL and are lacking in performance.


So the headline could have been "Apple chooses form factor, power usage, and OpenCL performance over newer generation CPU in MacBook Air". The words "saddled" and "gimped" probably get more page views.


Except that your leaden, verbose headline would have been terrible from an editor's perspective. Their headline is far more succinct and uses language that speaks to the Ars Technica audience, which tends to be both specs-obsessed and a little puerile.

Besides, the use of scare quotes ought to suggest to any reader that the headline is tongue-in-cheek.


The article underplayed the importance of thermal design envelope to the situation, and your summary completely left it out.

I think that many people- especially consumers who know enough to reach the wrong conclusions- don't understand the impact and importance of dealing with heat from CPUs and GPUs. If you look closely at the inside-the-case view of the Air, you'll notice that the motherboard is at one level, in a plane with the keyboard, and then above it (toward the camera) is space that exists over the CPU board but not over the batteries (because this is where the case gets thicker). On the left, this extra space is taken up with an SSD card that is attached on top of the motherboard... on the right, it is taken up with a heat-pipe that hits the CPU and GPU and goes to the fan.

See how much space the fan takes with the heat pipe? The heat pipe alone doubles the thickness of the machine (assuming some battery space could be given up to fit the SSD in elsewhere.) But you can't go without it and you can't put it in plane with the motherboard.

Laptops are all about power, heat and heat management.


Since Apple typically caters their decisions to the majority of their buyers (as opposed to those power users who look specifically for the most up-to-date tech specs) this decision makes a lot of sense. Having more battery time is a rather large plus.


So right. Out of all this stuff listed in the article, the only thing that matters to the _average_ Mac consumer is the battery life.


Heat management also matters for at least two reasons. Directly, people don't want to be burned by their laptops. Indirectly, overheating tends to lead to component failures, and people care about reliability.


Here is a conspiracy theory I've got, and I think this article and Apple releasing the App Store for the Mac back it up: Apple is going to drop Intel chips in the next 3 years. My timing may be off but I really think this is what's going on.

All of this fighting between Intel and Nvidia is really only hurting customers; namely Apple. So what can Apple do? Create an app store that makes devs standardize on an API and shift the underlying arch. An arch that 67% of their product sales are using.

Don't get me wrong, this is going to be a difficult transition, I think apps like Steam are really going to get screwed, but this is Apple's end game. They control not only all the software but also all the hardware.


It certainly meets the "improbable and kooky" parts of a conspiracy theory. There are several problems with it:

1) Apple already managed one processor transition without herding everyone onto a standardized API. Surely they'd go that route again.

2) If they weren't going that route, then surely they'd be trying to herd as many people as possible onto the standardized API. But the App Store as Stealth API Standardizer doesn't hold water; if that's what it were intended to do, they would have tried to make it appeal to every important vendor on the platform. Instead, it's squarely aimed at small indie developers and excludes or is otherwise not strategically interesting to important 3rd party vendors like Adobe, Valve, VMWare, Mathworks, Microsoft, etc, etc. Are they not coming along to the transition?

3) What is Apple going to transition to? ARM? Seriously? They're great low-powered CPUs, but they're not within a million miles of Intel's Core i* CPUs for the "truck" computing that Apple's pro machines are used for. Compiling software on a 1 GHz A4 would be painful; editing video on it would be downright insane. This may change in years to come, but we're nowhere near close enough for it to happen any time soon.


1) Apple already managed one processor transition

Two, actually. 68k to PPC to x86.


Good point.


1) Good point. Using the backdoor of the app store would just be easier for them to do (I would imagine).

2) Maybe they will be bringing along the big guys in the future. Java is being deprecated (http://bit.ly/cFp0GX); down the line they may require apps to use the standard API.

3) ARM chips are getting much better. The Cortex-A15 is going to be multicore and run at up to 2.5 GHz (http://bit.ly/bhVKBP). It is totally possible for Apple engineers (acquired with PA Semi) to string together more cores to make a more powerful processor. Difficult and costly, but possible.

The problem I see with this line of thinking is that ARM is only a 32-bit arch right now and I don't know of any plans to change that. Also, having the big guys convert their software over to ARM instructions instead of Intel's (point 2) would be really painful, I think. LLVM and some good virtualization might help mitigate this, but I would have to research that more.

Also, I'm probably completely wrong and I'm fine with that. You have very good points and I appreciate all of them.


I was thinking about this yesterday as well. I know Apple builds their own processors for the iPhone. I'm not sure they'd run their desktops off an Apple A4 chip - but I mean - look at AMD. Market cap is $5 billion. Apple has $50 billion in cash.

Apple could buy them and REALLY control the computer, end-to-end.


This is not as easy as it sounds. AMD will lose its x86 license if acquired without Intel's permission. Designing an efficient decoder for a different instruction set can take years.


I'm pretty sure that the patent situation between AMD and Intel is one of mutually assured destruction, given that AMD created the 64-bit extensions to x86. Intel could complain a lot and file some lawsuits, but if they seriously tried to block the acquisition of AMD, they would be putting their whole patent portfolio at risk and opening themselves up to billions of dollars of punitive fines from antitrust regulators.


Not really: "However, the agreement[43] provides that if one party breaches the agreement it loses all rights to the other party's technology while the other party receives perpetual rights to all licensed technology."

https://secure.wikimedia.org/wikipedia/en/wiki/X86-64

http://contracts.corporate.findlaw.com/operations/ip/802.htm...


That sentence doesn't seem to be supported by the citation as redacted and posted online, or I just can't find the relevant language. It looks to me like the list of sections that survive the termination of the agreement doesn't include section 3, which is the actual cross-licensing section. I'm not a lawyer, though, so please point me to the section that has that effect.

Even if the agreement does stipulate that AMD's patent license to Intel becomes perpetual upon termination due to change of control, I really doubt that it is legal in the US, since it basically means that a third party can't buy their way into the market unless they bribe both AMD and Intel to weaken the duopoly. That would seem to amount to a cartel.


Exactly. Both Intel and AMD require licenses from the other to produce current x86 chips.


I can't see Apple buying AMD; like in the G4/G5 days, they'd just have to get up on stage every year and acknowledge the large performance gap between themselves and Intel and insist that it would be reduced sometime in the upcoming year. (Edit: And as a sibling post points out, this isn't even practicable. Without an x86 license they'd be dead in the water).

On the other hand I could see them buying NVidia and taking control of their GPU woes directly. Intel Integrated graphics aren't going to stop sucking, and OS X isn't going to stop growing its dependency on a real GPU anytime soon.


Given that Apple is in a much different place now than it was during the PowerPC days, maybe this isn't such a terrible idea. (And PPC wasn't Apple so much as Moto/IBM.) They're proving it works right now with the A4... so maybe you're right.

That said, I'd hate to have them go down the Sun route and just get left behind if/once Intel gets off their ass.


I don't think Apple has any reason to buy AMD (too much of a bother, they're already busy with their fabless ARM acquisitions, and they're not into big buys), but they sure as hell could "orient" AMD's future decision making, especially when it comes to LV/ULV CPUs and IGPs.


AMD spun off their fab business in mid-2008, so an acquisition of AMD wouldn't put Apple into the fab business.


Why not ship an A4 processor alongside the Intel chip? It only costs $10.75 for the A4 vs. over $100 for the Intel chip.


> Here is a conspiracy theory I've got, and I think this article and Apple releasing the app store for mac backs it up: Apple is going to drop Intel chips in the next 3 years.

With all the Intel/NVidia crap going on, that's pretty likely. But not to a different architecture.

> All of this fighting between Intel and Nvidia is really only hurting customers; namely Apple. So what can Apple do? Create an app store that makes devs standardize on an API and shift the underlying arch. An arch that 67% of their product sales are using.

Uh no, that's not going to work unless they mandate that everything on the MacStore be UB x86/ARM, in which case you'll see the transition coming from a solar system away. Furthermore ARM chips simply don't have the oomph to drive big systems right now, you can put as many Cortex-A9 cores as you want on a chip, you won't be building something that can rival the current 12-core mac pro.


Moving to Intel was the best thing that ever happened to the Mac. Literally — look at the sales charts for the past decade. Apple would have to be insane to drop them.

Beyond that, Apple has already done more to make devs standardize on an API than the App Store ever could. Cocoa is the sole API that can create 64-bit apps on Mac OS X, and lots of system APIs are now only available through Cocoa. I don't see how the App Store adds any pressure at all, to be honest.


The fact that the iPad uses a new processor and Apple's continued interest in LLVM would seem to support this, as well.


I don't see how LLVM has anything to do with this. LLVM is just a replacement for GCC, which is woefully bloated, has almost no good way to extend its functionality, and is locked into the GPL.

That being said, Apple employs the guy that builds LLVM/clang, so it only makes sense that they have an interest in it :P


> LLVM is just a replacement for GCC

You're confusing LLVM and clang. LLVM is a bunch of tools to build compilers with, one of which is clang. With Apple moving to tools that support LLVM, they could easily switch out backends to generate, say, ARM rather than x86_64 instructions, allowing them to keep the same software, but run it on different hardware.


No, not really confusing them at all. I know what LLVM is; I used it in broader terms, in that clang is a subproject of LLVM and wouldn't exist without it.

LLVM in the strictest sense is replacing GCC in that clang is being brought in, along with various other tools that are normally part of GCC.

Also, GCC currently compiles for ARM, so what makes LLVM somehow better for generating ARM code than GCC? Your logic here makes absolutely no sense whatsoever.


I was speaking in generalities about ARM. One of the main aims of the LLVM project is to be as modular as possible, and GCC's plugin architecture is less than stellar, as others have pointed out in this thread. Embracing flexibility would enable Apple to make this move, and they've jumped processor architectures twice in the past, and once with the iPad, so it's not out of the question that they wouldn't do it again.


Still has nothing to do with choosing LLVM over GCC. Currently GCC is happily generating object files for me that contain i386, x86_64, and ppc:

  $ file x.o
  x.o: Mach-O universal binary with 3 architectures
  x.o (for architecture i386):    Mach-O object i386
  x.o (for architecture x86_64):  Mach-O 64-bit object x86_64
  x.o (for architecture ppc7400): Mach-O object ppc

Going to another architecture doesn't mean changing the compiler at all, just like they didn't change the compiler when going from PowerPC to Intel. Back on PowerPC it was GCC and now on Intel it is GCC. I believe that GCC is also used to compile for the iPhone/iPad.

Please don't get me wrong, I am really happy Apple is embracing LLVM with the various tools surrounding it and is moving away from GCC, however that is not a sign that they are planning on moving CPU architectures again.


That Apple is deploying older generation CPUs in its latest generation MacBook Air is a further sign that the x86 architecture is in the early stages of being disrupted. Drawing on work by Clayton Christensen, the classic signs of disruption are as follows:

1. The current technology is overshooting the needs of the mass market.

Due to a development trajectory that has followed in lockstep with Moore’s Law, and the emergence of cloud computing, the latest generation of x86 processors now exceed the performance needs of the majority of customers. Because many customers are content with older generation microprocessors, they are holding on to their computers for longer periods of time, or if purchasing new computers, are seeking out machines that contain lower performing and less expensive microprocessors.

2. A new technology emerges that excels on different dimensions of performance.

While the x86 architecture excels on processing power – the number of instructions handled within a given period of time – the ARM architecture excels at energy efficiency. According to Data Respons (datarespons.com, 2010), an “ARM-based system typically uses as little as 2 watts, whereas a fully optimized Intel Atom solution uses 5 or 6 watts." The ARM architecture also has an advantage in form factor, enabling OEMs to design and produce smaller devices.

3. Because this new technology excels on a different dimension of performance, it initially attracts a new market segment.

While x86 is the mainstay technology in PCs, the ARM processor has gained significant market share in the embedded systems and mobile devices markets. ARM-based processors are used in more than 95% of mobile phones (InformationWeek, 2010). The ARM architecture is now the main choice for deployments of Google’s Android and is the basis of Apple’s A4 system on a chip, which is used in the latest generation iPod Touch and Apple TV, as well as the iPhone 4 and iPad.

4. Once the new technology gains a foothold in a new market segment, further technology improvements enable it to move up-market, displacing the incumbent technology.

With its foothold in the embedded systems and mobile markets, ARM technology continues to improve. The latest generation ARM chip (the Cortex-A15) retains the energy efficiency of its predecessors, but has a clock speed of up to 2.5 GHz, making it competitive with Intel’s chips from the standpoint of processing power. As evidence of ARM’s move up-market, the startup Smooth-Stone recently raised $48m in venture funding to produce energy efficient, high performance chips based on ARM to be used in servers and data centers. I suspect we will begin seeing the ARM architecture in next generation laptops, netbooks, and smartphones (e.g., A4 in a MacBook Air).

5. The new, disruptive technology looks financially unattractive to established companies, in part because they have a higher cost structure.

In 2009, Intel’s costs of sales and operating expenses were a combined $29.6 billion. In contrast, ARM Holdings, the company that develops and supports the ARM architecture, had total expenses (cost of sales and operating) of $259 million. Unlike Intel, ARM does not produce and manufacture chips; instead it licenses its technology to OEMs and other parties and the chips are often manufactured using a contract foundry (e.g., TSMC). Given ARM’s low cost structure, and the competition in the foundry market, “ARM offers a considerably cheaper total solution than the x86 architecture can at present…” (datarespons.com, 2010). Intel is loath to follow ARM’s licensing model because it would reduce Intel’s revenues and profitability substantially.
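The scale difference in cost structure is stark when the two expense figures quoted above are put side by side (a rough calculation using only those numbers, which compares companies of very different scope):

```python
# 2009 combined cost of sales + operating expenses, as quoted above.
intel_expenses_usd = 29.6e9   # Intel
arm_expenses_usd = 259e6      # ARM Holdings

ratio = intel_expenses_usd / arm_expenses_usd
print(f"Intel's expense base is roughly {ratio:.0f}x ARM Holdings'")
```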

In short, the ARM architecture appears to be in the early stages of disrupting x86, not just in the mobile and embedded systems market, but in the personal computer and server markets, the strongholds of Intel and AMD. This is evidenced in part by investors’ expectations for ARM’s, Intel’s and AMD’s future financial performance in the microprocessor markets: today ARM Holdings has a price to earnings ratio of 77.93, while Intel and AMD have price to earnings ratios of 10.63 and 4.26, respectively.

For Intel and AMD to avoid being disrupted, they must offer customers a microprocessor with comparable (or better) processing power and energy efficiency relative to the latest generation ARM chips, and offer this product to customers at the same (or lower) price relative to the ARM license plus the costs of manufacturing using a contract foundry. The Intel Atom is a strong move in this direction, but the Atom is facing resistance in the mobile market and emerging thin device markets (e.g., tablets) due to concerns about its energy efficiency, price point, and form factor.

The x86 architecture is supported by a massive ecosystem of suppliers (e.g., Applied Materials), customers (e.g., Dell), and complements (Microsoft Windows). If Intel and AMD are not able to fend off ARM, and the ARM architecture does displace x86, it would cause turbulence for a large number of companies.

I just posted this as an article to HN: "The End of x86?" I'd appreciate an upvote. Thank you!


Currently Apple relies on Intel for a major component in a key product. Strategically, Apple doesn't like to have to rely on a single source or supplier for key products. Apple will do whatever is possible to remove this reliance.

Hence a prediction: within less than 5 years a Mac will be running on an Apple designed ARM processor.

How? By slowly, step by step, providing a way towards this.

Step 1. Migrate your OS to the new architecture (e.g. iOS already, OS X not far behind) - done

Step 2. Migrate your developer base onto developer tools which you control and can easily change the architecture it targets (e.g. Xcode and LLVM) - done

Step 3. Provide a space where problematic applications which use other VMs or rely directly on getting too close to the hardware are not welcome (e.g. a Mac App Store) - announced

Step 4. Change the marketplace behaviour so that you control how the majority of applications are distributed and can quickly provide updates without user intervention. Such as an App store.

Step 5. Release a new Macbook with an ARM processor, absolutely killing on form factor, price and battery performance that Intel cannot compete with. Encourage your Mac App Store developers to flick a switch in Xcode, to recompile and upload their new Universal (x86 & ARM) versions of their Apps to the Mac App Store.

Result: you now control the processor direction and application distribution mechanism for a key product and no longer rely upon the whims of Intel.

Apple is all about controlling an integrated experience for their customers. Currently Intel is getting in the way of this for the Mac product.


"Hence a prediction: within less than 5 years a Mac will be running on an Apple designed ARM processor."

That assertion presupposes that Apple will give up the professional content creation market entirely. If they've killed off Shake and curtailed development on Final Cut Pro, then I'd guess that your theory has some merit.

The reason? The upcoming 2.5 GHz ARM core will most likely be performance competitive with today's x86, but by the time it comes to market, we'll most likely be looking at a Sandy Bridge refresh, and even the first Sandy Bridge processors will most likely smoke the best that ARM can throw at it.

The ARM will almost certainly continue to dominate the low-power markets though. I don't see a high likelihood of x86 making it there. I'd even go out on a limb and suggest that if Intel wanted to compete head-to-head with ARM, their best bet would be a low-power Itanium, since it's not saddled with x86 hardware any more, and therefore requires considerably less logic to match x86 performance with the same manufacturing technology.


And driving all your customers in design, media, print, video, music etc into the arms of Microsoft?

Or were they intending to pay to port Photoshop, Word, and a bunch of video and music editing apps to ARM?


Apple didn't pay Adobe or Microsoft to port any of their apps from PowerPC to x86, and yet it happened. So I don't see why they would pay them to port from x86 to ARM.

They'd either port and retain a large part of their market, or don't and leave an opening for competitors to take their customers.


Apple hadn't at that time just pissed Adobe off by banning Flash, and MSFT didn't see Apple as a threat.


Thanks to the millions of Apple-loving creative types in the design industry, Adobe makes way too much money on the Mac to abandon it as a platform.


Just like with the PowerPC to x86 conversion, Apple will have an emulation layer when they ultimately add ARM support to OS X. It was all seamless to end users. Photoshop et al ran under emulation for years before they eventually made native apps.

I wouldn't be surprised if Lion contained ARM support courtesy of iOS. The "Back to the Mac" theme hinted at that.


How is Intel in the way of "controlling an integrated experience for their customers"?

And Apple never did #5. Ever. Well, maybe within their product line the new models are killers.

What they do is screw suppliers on price. Just that.

They secure a great deal with companies that will go to great lengths doing designs for them cheaply, in hopes that Apple will be a recurring client. Hell, they even pulled that on IBM! They will probably pull that one on ARM too. Use the new ARM design to exhaustion for a couple years, and then move back to whoever is willing to offer them a better price.


By not providing a version of their most recent processor that offers OpenCL compatibility at a cost, performance, and thermal envelope that would enable Apple to produce a notebook as thin and small as the MacBook Air.

The new MacBook Air uses older, slower Core 2 Duo processors with Nvidia chipsets and graphics specifically because Intel's most recent chips (with integrated graphics) do not meet these requirements.

This affects the customer experiences (and therefore the products) Apple would like to be able to provide.


Arrandale is a weird stopgap chip for Intel. Necessary, but not the one I'd look at to see if the company is at an inflection point.

With that said, Intel is always at these weird disruption points. Intel faced similar talk in the past against RISC, then AMD64, then ARM, and now against ATI/Nvidia.

Probably of all tech companies, Intel is the one I'd be most hard-pressed to bet against in their core market.


"Probably of all tech companies, Intel is the one I'd be most hardpressed to bet against in their core market."

True, in Intel's core market (high-end server, desktop and laptop CPUs), Intel is nigh unassailable. No other semiconductor manufacturer in the world can afford to keep up to date with semiconductor manufacturing R&D on their own any longer.

The handheld market isn't part of Intel's core market, though, and the MacBook Air doesn't quite fit, either.

CPU margins are quite small now, outside of high-end parts. That's less true for graphics processors right now, which is probably why Intel's going after the graphics market, but Intel doesn't have much expertise in that area.

Also, as graphics processors become more general-purpose, the CPU will gradually become less critical to application performance...


Your analysis in #5 isn't strictly comparable. Intel fabricates semiconductors; ARM does not. Intel designs whole chips; ARM designs cores. To get good numbers here, you would need to separate out only the CPU design groups' accounting information from Intel, or somehow isolate the integration and fab contributions of companies like Qualcomm/TI/Samsung/TSMC/etc. and add them to the "ARM" total.

But your broader point, that the desktop CPU market is at or near its peak, is I think spot on.


Good point on the whole chips versus the cores. Does anyone have any rough estimates on this? As a company, if I licensed the latest generation ARM processor (e.g., Cortex-A15) and then factored in any additional design and manufacturing costs (e.g., via a foundry), what would be the total cost advantage of using a Cortex-A15 versus a comparable Intel Atom chip?


I cannot tell you about the cost advantage, but I can tell you about the risks of building your own system from the ground up.

One of the advantages of Atom-based systems is that you can use Atom today, while the A15 isn't due until next year. And it will not be available as silicon, only as a licensable IP core.

If you have some domain-specific IP that gives your system-on-chip an advantage, it would be wise to license the A15 as a CPU. Otherwise, you'd be better off going with ready-to-use chips, be they Atom or another ARM.

By deciding to create your own system-on-chip and your own PCB, you introduce substantial risks into your product. Even the PCB alone involves a quite burdensome testing process, which can cost you time to market.


Unless you're designing and manufacturing the CPUs yourself, a custom Cortex implementation would probably cost you quite a bit more than just buying Atom or someone else's Cortex implementation. You'd have to pay for the custom design, fabrication, and validation yourself, whereas if you buy someone else's chip, whether it's Atom, Bobcat, or a Cortex, you'd be sharing the design, manufacturing, and validation costs with all of their customers.


Posting an article as a comment due to its relevance, without even a link to the HN submission for said article, is awesome and selfless, but my redundancy-hating programmer brain wishes you had just posted a link to the submission. My brain is confused.


"The End of x86?" Really? I think a few x86 processors are being used in some servers somewhere too. Or really in any computer that isn't meant to be carried.

True, there are similar (slower-core, lower-power) alternatives in the server market (Sun SPARC CMT systems come to mind), but you didn't mention any of those, and they don't have a significant foothold.


I think you have a typo in there:

today AMD has a price to earnings ratio of 77.93

should be ARM Holdings, right?


Indeed, this was a typo. Thank you! It's now corrected.


11.6" Air + Mac App store = iPad with a keyboard (approximately)

That's why core2 is fine.


It would be interesting if they could make the screen swivel around to be used like an iPad (as Dell and some others are doing). It would be hard to fit the hinges on such a thin device, though.


It's been almost a decade since my personal computer had a CPU that felt slow enough that I considered an upgrade specifically for a faster CPU...


Same here. It was a 2001 white iBook. 500 MHz G3.

With modern machines, I think the main cue isn't slow perceived performance, but the sound of my laptop's fans ramping up, indicating that it's working hard.

That said, my boss' MBA is awfully leisurely at times. That may be due to the OS throttling or halting the CPU in order to keep the temperature down.


Have you ever tried to stream a full-screen movie on Netflix using an MBA? It can't keep up, and it's very annoying. I thought there was full-screen GPU acceleration for Silverlight, but it didn't help in my experience.


I have a desktop with a 1.8 Ghz Core Duo. I have absolutely no problems with performance. Of course, I'm only running browsers, email, Gimp, etc... but what are people going to use a 11" MacBook Air for? Probably not games and professional Photoshop work. The processors in these machines are more than adequate and were probably chosen to balance speed with heat, weight and battery life.


Note the coincident announcement about the Mac App Store... I bet the software you can get there will run snappily enough.


If Intel doesn't either make the Atom line higher performance or the Core line more portable friendly then I could see Apple switching to ARM. The high-end ARM chips aren't quite there yet in terms of performance though, so I figure Intel has at least one more rev to the Core line before that's a danger.


Why? There are already Core i3/i5 ULV laptops around that get 12 hours of battery life. It's only Apple that cares about that extra 1% less size; the rest of the industry is just fine with Intel's new gen.


If I recall correctly, Arrandale does not yet have OpenCL support. It's in the works, but it's not there yet. In terms of raw performance, Arrandale is competitive with the 320M (maybe a wee bit slower). Once it gains OpenCL support, Arrandale will be attractive to Apple, since they would no longer need two chips -- just the CPU package, which has everything it needs (graphics, memory controller, etc.). That frees up space for better thermal solutions, more memory or SSD storage, a larger battery, or other functionality.

Hence the decision to stay with the current Nvidia 320M and Intel Core 2 Duo; once Arrandale supports OpenCL, we'll see Apple transition, not only the MacBook Air but likely also the MacBook, low-end MacBook Pro SKUs, and the Mac Mini.


they no longer need two chips

Today they have C2D + 320M; in the future they'll have Sandy Bridge + southbridge. It'll still be two chips. I also wouldn't bet on Apple ever using Arrandale in the low end, since it's going to be obsoleted soon anyway.



