CES2011: The end of the PC era (asymco.com)
138 points by roadnottaken on Jan 6, 2011 | 74 comments



Intel's arch-enemies have been AMD and IBM -- and MS has, in the past, supported architectures from both of those companies over Intel's.

Remember that x86-64 came from AMD, and Intel supported it only after it was clear that it, not Itanium, was going to be the 64-bit architecture. And Intel was pretty POed that MS went with IBM over Intel for the Xbox 360.

My point is that these things have happened in the past and will continue to happen.

The end of the PC era began with the emergence of the web. Everything else we're seeing is simply a side-effect of the web. Ironically, MS saw this coming 15 years ago, and their attempts at slowing it may have inadvertently accelerated it.


I think what I'll be using in 5 years is a 4 GHz device the size of my phone, which can be connected to external displays and input devices. I can already use Bluetooth keyboards and mice with my phone. Once it can attach to a 1900x1200 monitor I won't need anything else.


It might take less than 5 years. Both Motorola and Lenovo have introduced Android phones with a "laptop dock". Google for the "motorola atrix dock". It'll be a little while before it's useful, but it doesn't look like it'll take 5 years.


It might be that long before we get what I really want, a monitor that rolls up into a little scroll I can stick in my pocket!


I would think that a 2 or 3 GHz device with between four and eight cores would be more likely. However, everything else you suggest seems spot on - or at least I would hope it to be spot on, not being a time traveler.


That's what I've thought for a while, too. As far as enabling technology, we're pretty much already there, with phones such as HTC's EVO 4G that can output HDMI. Add some VRAM and you'll have desktop resolutions. There are probably already half a dozen devices at CES that do that right now.

On the UI side, there's some more work to be done. I haven't yet seen anything that comes close to transitioning between the very different worlds of direct-touch smartphone interfaces and decoupled desktop interfaces. The broadening of handheld interfaces into tablet form factors as well as the gradual introduction of mobile UI elements and paradigms into desktop UIs, though, should ultimately bridge the gap.


Definitely, there are some really big opportunities regarding interfaces in that department. When someone plugs an Android device into a normal monitor and uses a mouse, I guess a different launcher/home screen could come up?
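
Android actually already has a hook for something like that: the OS sends a sticky ACTION_DOCK_EVENT broadcast when the dock state changes, so a launcher could in principle switch home screens on it. A rough sketch (assuming hypothetical DeskLauncherActivity/PhoneLauncherActivity classes, not anything shipping today):

    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;

    // Sketch only. Register at runtime, e.g.
    //   registerReceiver(new DockReceiver(), new IntentFilter(Intent.ACTION_DOCK_EVENT));
    // (this broadcast isn't delivered to manifest-declared receivers).
    public class DockReceiver extends BroadcastReceiver {
        @Override
        public void onReceive(Context context, Intent intent) {
            int state = intent.getIntExtra(Intent.EXTRA_DOCK_STATE,
                    Intent.EXTRA_DOCK_STATE_UNDOCKED);
            // DeskLauncherActivity / PhoneLauncherActivity are hypothetical:
            // one laid out for a mouse and a big screen, one for touch.
            Class<?> home = (state == Intent.EXTRA_DOCK_STATE_DESK)
                    ? DeskLauncherActivity.class
                    : PhoneLauncherActivity.class;
            context.startActivity(new Intent(context, home)
                    .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK));
        }
    }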


Hey, you're already a little bit of the way there with stuff like AirPlay.

Please excuse my Mac Fanboyism, but it is really freakin' cool to just be sitting on the couch, pull out your phone and bam tune-age.


Very interesting prediction. That's the first serious alternative to traditional desktops that I've ever considered. Essentially you're using your phone/mobile-computer as a desktop when you need it... so you're not really moving away from Desktop computing. But I could see this taking hold.


Yeah, I don't see the need to use a desk going away any time soon, after all. Laptop docks are already popular for this reason; why not mobile-device docks that work the same way?


Why wait five years? You should be able to get a Motorola Atrix within a couple of months:

http://www.engadget.com/2011/01/06/motorola-atrix-4g-hd-mult...


Combining HD telepresence with such a phone and a dock connected to a big-screen TV could be a really nice feature. It might even make HD telepresence more viral.


Except long-elusive battery breakthroughs.


I disagree. The PC era is evolving, not ending. How many people still have and use a traditional desktop or laptop computer at home? Microsoft is filling a growing demand for an OS on smaller form factor devices by supporting ARM processors with the next release of Windows, but the traditional PC market is still alive and kicking; it's just not "cool" anymore. It's seen as more of a utilitarian tool than a novelty at this point, whereas smartphones and iTablets currently retain the novelty factor, thus their pervasiveness at CES.


I hope you don't mind. I just thought I'd try this on for size...

I disagree. The Mainframe era is evolving, not ending. How many people still have and use a mainframe when they interact with their bank? PCs are filling a growing demand for an OS on a smaller form factor, but the traditional mainframe market is still alive and kicking; it's just not "cool" anymore. It's seen as more of a utilitarian tool than a novelty at this point, whereas PCs currently retain the novelty factor, thus their pervasiveness at BigIronExpo.

Yup. I actually recall people making this argument back in those days. And they were right in a sense: big iron still exists in its niche, after all. Regardless, the PC revolution was still a massive shift — large enough to justify describing the rise of the PC as heralding the demise of big iron.


When I saw the iPad launched I was very concerned that we are moving towards a market of consumer-only devices: devices which you can't use creatively to make content, program applications, etc. Things like programming will be considered a "specialist" activity, and new generations of nerds will have nothing to cut their teeth on because the only PCs will be development workstations in corporate offices. Consumer computing will turn into pressing buttons and seeing magic happen, with no ability for curious people to figure out what's going on behind the scenes or modify it. A good litmus test is: can you use applications on the device to make entirely new applications for that device? You can do this on a TI-86 calculator, but not on an iPad (please correct me if I'm wrong).

I'm not entirely convinced things will pan out like this, but the thought still occurred to me.


We've been heading in that direction for some time, but it's only because of the increasing complexity of our devices.

Televisions, radios, cars, and home appliances are all devices that are harder to tinker with today than when they were first invented. Computer hardware, for the most part (outside of hobby kits), is all magic black boxes plugged into each other. Something wrong with your video card? There's not much you can do about it.

Luckily, the hobby/amateur market in computer hardware/software (and other hobbies, like cars) is still alive and kicking. IMO though, corporations seem to like taking steps to make that sort of work harder (e.g., using encryption to ensure only "blessed" hardware/software can be used).


> IMO though, corporations seem to like taking steps to make that sort of work harder (e.g., using encryption to ensure only "blessed" hardware/software can be used).

I've given some thought to this, and I don't think it can mostly be blamed on the corporations. Sure, they're part of it, but I think the ever-increasing complexity of technology is mostly what makes things harder to tinker with. There's far more to understand than in older technology, and that makes tinkering/hacking on things far more difficult.

It's kind of like classic cars vs. cars today. Older cars were much more tinkerable/hackable/etc., but newer ones have certainly benefited from the increased complexity in the form of better gas mileage, safety, etc., at the cost of hackability.


I don't think complexity is nearly the enemy of hackability that manufacturer lockdown is. Modern cars can still be hacked -- everything from OBD-2 interfaces to modified firmware for engine control units. FPGAs provide a way to tinker with hardware concepts.

The demise of hackability is and will be due to DRM and the hardware equivalents (like impossible-to-find proprietary screwdrivers).


To me, that's the critical distinction. A Motorola Atrix (http://www.motorola.com/Consumers/US-EN/Consumer-Product-and...) is a "PC" to me. An iPhone in a dock with a keyboard isn't, even though they're very comparable otherwise.


I think the biggest difference is that mainframe customers were primarily limited to large businesses, whereas the PC has gained the mindshare of the masses in the last 15 years, giving it an inertia unseen before in tech. Unlike the transition from mainframes to PCs, I think that the iTablet/Smartphone market supplements the PC market, rather than outright replacing the core functionality provided by PCs.


But 'the PC' is a slippery term. That analyst didn't declare the death of 'the PC' in the sense of "a traditional desktop or laptop computer", he declared the death of 'the PC' in the sense of Wintel. So for example, the Mac is a PC in sense 1, but (still, mostly) not in sense 2. Of course the two phenomena are closely related by now, and the sense-2 PC is under a certain amount of pressure too. But the fate of the traditional desktop/laptop is a much wider question, and more tied in to Big Questions about the future of software.


The PC era won't end as long as people need to use keyboards to input text. A lot of home/casual computing might evolve away from PCs, but they'll always be on your desk at work and I'm guessing most people will keep at least one traditional desktop/laptop at home for decades to come.


> I'm guessing most people will keep at least one traditional desktop/laptop at home for decades to come.

I think you're being very shortsighted. "Most people" haven't even had a laptop for a single decade yet; most people have had one for 1-10 years, I'd say. Also, "most people" haven't had desktops for more than a decade and a half.

What on Earth makes you think these form factors / devices will still exist in most people's homes for decades?! To me, "decades" implies at least 20 years....

So do you think, in the year 2031, we'll still use desktop computers? That's crazy. No, we will have moved on.

Keyboards will be phased out for most users by 2031. I imagine devs will still use them, and they'll be common in specialty scenarios, but they won't be for "most users".

> they'll always be on your desk at work

Also shortsighted. Workstations will evolve too, even if this is slower.


I'll bet you $1000 that most people will still regularly use keyboards in 2031. Isn't there a website somewhere where we can do this?


http://www.longbets.org/

Also, most people don't use keyboards now, unless you count cellphone keypads or restrict "people" to mean "people with income above $1000 per year" or something like that.


Haha okay, I will probably take that bet! But we need to decide on more specific definitions, as hinted by pingswept. What defines a keyboard, and what defines "most people"?

Does "keyboard" include both physical and virtual keyboards? Most people on planet Earth, or most people in the USA? Or by income?

If we can settle these specifics, I think we should indeed register at longbets.


Maybe we should just do a HN Poll. Note: I'm also betting on the fact that I'll have an extra $1000 by 2031. :)



We'll probably still be using some sort of device that involves tapping keys with our fingers. But I'd be very surprised if that device is anything like current keyboards.

(And then there's the small chance that brain-computer interfaces will allow us to type with our minds, but I'm not betting on that.)


Worth considering that QWERTY keyboards have been the method-of-choice for text-entry for over 100 years. I'm not saying it will last forever, necessarily, but it's a very efficient technique. Call me old-fashioned, but I don't see it being supplanted by voice-transcription or gestures or whatever for a very long time.


Agreed. I'm predicting more along the lines of really good on-screen keyboards, or ThinkGeek's laser keyboard (http://www.thinkgeek.com/computing/keyboards-mice/8193/), or something entirely new but still in that vein. In other words, keeping what works but evolving past the big physical slab of plastic.

> Call me old-fashioned, but I don't see it being supplanted by voice-transcription or gestures or whatever for a very long time.

Well, we can type way faster than we can talk, so even if voice transcription were perfected, it would still be impractical.


If you said most engineers I would agree with you. If you said most secretaries, I might still agree with you. But most people won't. (An awful lot of construction workers and truck drivers don't regularly use keyboards, even today.)

Don't confuse your workflow with the rest of the world's.


Don't forget - most people in any sort of school - something everyone in every generation has to do.


I bet truck drivers do, especially drivers with terminals in their trucks.




Intrade doesn't accept arbitrary contracts, wouldn't accept this one as it's currently too ill-defined and too long-term, and depending on how their fees work, a 20-year bet may be way too expensive.


The typewriter was commercialized around 1870. What makes you think keyboards will go away, and which market do you have in mind when you say "most users?"


The OP should have titled it "The end of the Wintel era". The original article specifically mentions that they're not talking about form factor.

FWIW, going along with the misunderstanding in this thread, I agree with you. I have a hard time seeing any creative work done in a non-desktop environment (maybe work that involves drawing/writing can be done on a tablet/drawing surface, but that's about it).


When is the end of jumping the gun going to begin? :p

Some people are just too eager to see the PC die - rest assured that it will some day, but you need to be a bit more patient. Windows on ARM or ARM-based tablet announcements do not make the PC dead - not so fast.


The PC doesn't have to die for the PC era to gracefully decline, any more than Henry Ford had to personally shoot every horse.


But for horses to have prevailed even a little longer, someone would have had to have monkey-wrenched an awful lot of cars (IE6).


So it's not really an END of the PC Era as the article proclaims but merely a beginning of the end?


[OT] That's a brilliant way of putting it, and had me in chuckles. Thanks for driving the point home :)


Henry Ford... "Driving" the point home.

Oh you guys..


Who were the Windows-exclusive OEM customers? I thought all of them tried releasing Linux netbooks; they just didn't sell.


I have an HP Mini 110 that runs Windows 7 32-bit. I got it as a gift this Christmas and I remember thinking "oh boy, let's see how this thing runs". But it does just fine. I mean, it's no speed demon, but for the limited purpose of the form factor, it's perfectly fine. I have to admit I was surprised that it works so well. I kinda expected Windows to be a hog on that hardware (1GB RAM fixed, with no upgrade path), but it really isn't. Even the 5400 RPM HDD seems to chug along well, contrary to my prior experiences with laptops running slower hard drives. I would have preferred an SSD though. And the battery life is insanely good.

For ~$350 or so, it's a really good device.


I don't know why commentary has taken this bizarre turn lately of expressing predictions 5, 10, or even 15 years in the future in the present tense.

Sure, the mobile OS market has crazy momentum right now and may well end up obsoleting the entire PC industry. However, that hasn't happened yet. The PC industry is still bigger, people developing PC apps are still making much more money.

This is not the end of the PC era, at best it's the prelude to the beginning of the end.


Why Intel? Some years ago AMD was ahead of them in terms of x86 CPUs... even more years ago there were companies like Cyrix and VIA, all building x86 chips that ran Windows very well.


One thing's for sure...it's not going to happen anytime soon.


The irony is we're now finally giving up x86 years after the point where moving to a RISC architecture would have offered the biggest jump in price/performance (the P5 era).


What's the era after the PC called? Is this the mobile computing era? Or the ubi-comp era? I'm inclined towards the latter.


PC++.


Augmented Reality


Not yet. One important thing about real reality is persistence. Looking through a pocketable screen can give you a window into augmented reality, but there is a lack of persistence to that user interface. AFAIK the only thing that will really bring about an era of augmented reality is good HUDs that carry little to no social stigma.


One thing is abundantly clear:

If you're a developer or a designer and you aren't learning at least Android or preferably all three mobile platforms, you're looking at having a huge hole in your skill set by the end of the year.


Really? All my enterprise DB systems are going to be obsoleted by a cell phone that can do Facebook?

I remember 10 years ago when the web was going to make big iron irrelevant, and 20 years ago when PCs were going to replace mainframes.


All three? I think you may have miscounted:

BlackBerry, WebOS (if HP keeps it up), WinMo, Android, MeeGo, Symbian, iOS. Did I miss any?


Intel exclusivity? What?

x86 is an architecture, like ARM, that has seen many contenders come and go over the years. Intel's dominance is because of engineering excellence, albeit coupled with some questionable business practices. Show me an ARM device that compares with a Core i7-950. Show me one that compares with it even with 3 of the 4 cores turned off.

Microsoft, for that matter, has had versions of Windows for alternate platforms going back well over a decade. Those platforms got abandoned not out of some strategic affinity to x86 (much less to Intel), but simply because they didn't offer the advantage to be worth the bother: instead, they were pursued during the "the Future is RISC" era, in preparation for the purported demise of x86.

As to manufacturers that were "Microsoft exclusive", show me even one. Such a thing hasn't existed in years.

Everything I have ever read on Asymco is full of glossy big conclusions, yet absolutely falls apart under any scrutiny at all. It is garbage analysis.

I love the smartphone era. I love the fact that embedded chips in everything from your HVAC control to your PVR to inside televisions and cars, etc, are all massively benefiting from this explosion of reasonable performance, low power devices.

But let's get real. The only reason x86 is behind right now is because ARM had been working on low power processors for years, and that turned out to be the hot area. Intel isn't sitting on their hands, though, and I have no doubt they'll have some killer solutions in the very near future. People are so quick to jump to conclusions though, carried on by simplistic analysis of the sort that Asymco performs.


"But let's get real. The only reason x86 is behind right now is because ARM had been working on low power processors for years, and the chips came up that it was the hot area. Intel isn't sitting on their hands, though, and I have no doubt they'll have some killer solutions in the very near future."

This reads like something from 2007, back when the Atom was almost ready.

When Atom arrived, it simply drew too much power for it to compete with ARM SoCs, and in the intervening years, the ARM chips have been improving more quickly than the Intel chips.

It's been more than 2.5 years since the Atom shipped. When are we going to start seeing the results of Intel's earnest effort?


Apparently the Atom SoC (system-on-a-chip) is shipping any day now (early 2011).

Intel also has a 32nm SoC process developed, unlike the cheap Taiwanese/Korean fabs that churn out most ARM-based designs.


Atom was created for the netbook market, where the tolerance for power consumption was significantly higher than it is for smartphones and, by extension, tablets.

How many netbooks shipped with ARM? Atom did what was necessary to win that race, even if it wasn't prepared for the subsequent very-low-power device explosion. The Atom chips themselves are actually remarkably efficient -- with a TDP down to 0.65W for some models, well into ARM territory -- but Intel never bothered to make efficient supporting circuitry, and failed to fully incorporate it into an SoC.

ARM is improving dramatically because it became a critical field with some low-hanging advances that were possible. The pace of innovation is impressive, but it has slowed dramatically in the past year -- recall that the Tegra 2 was demoed over a year ago, yet today the high point of the ARM ecosystem is still the Tegra 2. People today are doing the sort of naive extrapolation that we've seen over and over again.


Intel did bother to make Atom SoCs.

They enabled other companies to do an Atom SoC using a third-party fab; nobody took them up on the offer.

At the same time, they did their own SoCs, for example the CE4100, which is especially fitted for digital TVs and includes graphics (PowerVR) + memory controller + DSP + HD decode on chip.

It took the place of the Tegra in the Boxee Box, because the Tegra wasn't able to decode High Profile H.264 HD streams. Pretty disappointing for the Tegra after all the hype.

But still, the power consumption is high: 11.2W for the Boxee Box (it's common for streamers to have half that), though Intel claims it hadn't done a sleep mode on the CE4100.

Edit: Sony will also use the CE4xxx (Sodaville) for TVs.


"Those platforms got abandoned not out of some strategic affinity to x86 (much less to Intel), but simply because they didn't offer the advantage to be worth the bother."

I disagree. MIPS and Alpha offered significant performance advantages during the '90s.

Even though Windows NT was available on these platforms, very little third-party software was provided as Alpha-, MIPS-, or PA-RISC-compatible binaries. In industries where key applications were available, such as 3D animation, Alpha was quite popular, and the others somewhat so.

Running these applications, Alpha offered 2x-3x the performance of anything available from Intel, and so compared well on a price/performance basis despite the higher cost. If they had ever shipped in volumes comparable to the P5, the advantage would have been staggering.

If you read the archives of comp.arch you'll see a general sentiment of how highly regarded the architecture was, including postings by Intel's own designers at the time. In fact I'd say the only thing more highly praised there would be the transputer (which was probably 3 decades ahead of its time).


> The only reason x86 is behind right now is because ARM had been working on low power processors for years

That's kind of a big reason though.


Yes, Intel is a great company, but:

1. Intel has a certain business model that probably requires some level of monopolistic or quasi-monopolistic control of the market (with 80% profit margins).

2. Its strength is in x86. Maybe x86 can't be competitive with ARM on power/performance? Time will tell.

3. It is not perfect. It has already failed in the ARM market (it sold its ARM unit), and probably failed in GPUs.

4. Performance: in today's world, where things like memory access, cache sizes, and multi-core connectivity are very important to performance, Intel has some contenders in those areas, for example IBM (CPU+DRAM manufacturing on the same die, optical connectivity) and Tilera (a 64-core chip with claimed better performance than Intel at an older process node).

5. Performance: it might just be that for consumer parts that require multicore (math-intensive apps), a GPU is preferred, and a 4-core CPU is enough.

6. ARM's biggest advantage is its open business model, which leads to a very large cooperative ecosystem at both the hardware and software levels.

And if Microsoft is serious about the ARM move, Intel has just lost a large part of its biggest competitive advantage.

So I wouldn't be so sure about Intel's bright future.


I knew a senior manager at Intel. The company encourages you to use the words 'significant competitive advantage'. Not that ugly 'M' word.


> So I wouldn't be so sure about Intel's bright future.

I'm not invested in Intel. I wouldn't invest in Intel. I don't know where the market will go. However, I'm not so knee-jerk as to instantly discount them.

It is truly remarkable how many prophetic statements of absolute truth we've seen over the years based upon temporary states in the industry.

> 6. ARM's biggest advantage is its open business model, which leads to a very large cooperative ecosystem at both the hardware and software levels.

Intel used to be one of those businesses. Again, way back in the late '80s, ARM and other RISC architectures were seen as, without fail, the imminent successors of the universe. Intel got heavily involved. It didn't pan out, so Intel dumped it.


Show me an Intel device that compares with a 2W ARM CPU.


Some of Intel's Atoms run at 2W or less. A year ago I built an Atom-based nettop whose total power drain is something like 4W.


If true, I am impressed; I thought the accompanying chipset pushed the baby Atoms up to ~12W total.

Anyway, the Tegra 2 is <2W for the complete SoC, including both cores, NVIDIA graphics, RAM, and 1Gb storage.

e.g. http://www.netbooknews.com/4891/nvidia-tegra-2-module-fits-o...

TBH, the T2 is from 2009; we know Tegra 3 is done, and they are already working on Tegra 4, per an interview with the CEO last year.


> I thought the accompanying chipset pushed the baby Atoms up to ~12W total.

This is one of Intel's own motherboards; let me dig it up...D945GCLF.

And I was wrong. The Atom itself has a TDP of 4W. The motherboard claims its total draw is 75W, with hard drive, optical, "and all board peripherals enabled". In practice, I kept it down a lot by installing (a) a laptop hard drive and (b) nothing else. I have a USB optical drive I hook up when I need it.

I should plug it into my Kill-A-Watt tonight and report back--I know I measured it before, but I didn't write down the numbers. I know the idle draw was stupidly low, like under a watt. (That's cheating a little, though, since it means unplugging the dongle for the wireless mouse&keyboard. Even when it's plugged in, it's on a USB hub, which I wasn't measuring.)


Thank you for following up. Much of that will be the spinning hard drive, which makes it unfair to compare to an SSD.

For what it is worth, I do like Intel; my Core 2 Duo is so good it will tide me over until Ivy Bridge (22nm).



