Analyses that repeat the “post PC” mantra in its various forms may be correct, but the mantra is getting threadbare. Having read this sort of thinking so many times (desktops are dead, Intel is doomed, ARM is the new hotness, etc.), I don’t find it terribly interesting to see the same thesis restated. Don’t get me wrong, I appreciate the detailed analysis the author provides, but the conclusion is unsurprising.
Here’s what I would like to read, if a technology journalist could dig it up: What kind of strategic planning is going on within the halls of Intel, Dell, HP, Lenovo, et al. with respect to keeping the desktop PC relevant? Put another way: I find it astonishing that manufacturers have let several years pass without a response since desktop performance became “good enough.” The key is disrupting what people think is good enough.
The average consumer or business desktop user does consider their desktop’s performance to be good enough. But this is an artifact of the manufacturers failing to give consumers anything to lust after.
Opinions may vary, but I strongly believe that the major failure of desktop PCs in the past five years has been the display. I use three monitors (two 30” and one 24”) and I want more. I want a 60” desktop display with 200dpi resolution, and I would pay dearly for it. I want Avatar/Minority Report-style UIs (well, a realistic and practical gesture-based UI, but those science-fiction films provide a vision most people can relate to).
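For concreteness, here’s the back-of-envelope math on what such a panel would mean in pixels (a rough sketch; I’m assuming a 16:9 aspect ratio):

    import math

    # Hypothetical 60" 16:9 panel at 200 pixels per inch
    diag_in, ppi = 60, 200
    scale = diag_in / math.hypot(16, 9)  # inches per unit of the 16:9 ratio
    px_w = round(16 * scale * ppi)       # ~10459 px wide
    px_h = round(9 * scale * ppi)        # ~5883 px tall
    print(px_w * px_h / 1e6)             # ~61.5 megapixels

That’s roughly thirty 1080p screens’ worth of pixels.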
I can’t even conceive of how frustrating it would be to use a desktop PC with a single monitor, especially something small and low-resolution like a 24” 1920x1080 panel. And yet most users would consider 24” 1920x1080 to be large and “high definition,” or in other words, “good enough.”
That’s the problem, though. As long as users continue to conceive of the desktop in such constrained ways, the desktop seems like a dead end. You only need so much CPU and GPU horsepower to display 2D Office documents at such a low resolution.
CNet ran a great photo in one of its reports (I grabbed a copy for my blog [1]) showing a user holding and using a tablet while sitting at a desktop PC.
In the photo, the PC has two small monitors and is probably considered good enough to get work done, yet the user finds the tablet more productive. The user should be excused for the seemingly inefficient use of resources, because it’s probably not inefficient at all: the tablet is probably easier to read (a crisper, brighter display) and faster, or at least feels faster than the PC simply because it’s newer.
Had desktop displays kept advancing over the past decade, the PC would have needed to keep up: beefier CPU, GPU, and memory, and most likely more disk capacity and network bandwidth, to drive a large, high-resolution display.
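To put a number on “beefier,” here’s a rough sketch of the raw video bandwidth such a display would demand (assuming the ~61.5-megapixel panel from the sketch above, 24-bit color, and a 60 Hz refresh rate):

    # Uncompressed bandwidth for a hypothetical 10459x5883 panel
    px_w, px_h = 10459, 5883
    bits_per_pixel, refresh_hz = 24, 60
    gbit_per_s = px_w * px_h * bits_per_pixel * refresh_hz / 1e9
    print(gbit_per_s)  # ~88.6 Gbit/s, several times the ~17 Gbit/s of a DisplayPort 1.2 link

And everything upstream of the cable (GPU fill rate, framebuffer memory, the bus) scales with the pixel count, which is exactly the kind of pressure that forces upgrades.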
So again, what are the PC manufacturers doing to disrupt users’ notions of “good enough,” to make users WANT to upgrade their desktops? I say the display is the key.
In general I agree with your ideas, but I think there's a missing component: the general use case. Most people only need their computers for email and web browsing. We can simplify that further and just call it communicating.
Sure, a developer or designer salivates at the idea of more screen real estate, but that's because there's a practical use for it. PC manufacturers follow, rather than decide, the needs of their users.
I love my screen real estate because I actually need it. If I just browsed Facebook, wrote a Word document, and maybe planned out my finances with a spreadsheet, I'd have a hard time justifying some giant monolith of a monitor. I think this is a largely overlooked factor in the success of the mobile market.
I don't know if everyone's forgotten this already, but when the iPad first came out, most people were thinking: what in the actual eff is Apple thinking? Sure, we all knew it'd sell because, well, Apple is Apple. But if I recall correctly, most people were scratching their heads asking, "so, it's a big iPhone, right?"
And guess what? It is just a big iPhone! And it succeeded NOT because it was the next "cool" thing but because it was designed to do what most people needed their computer to do, namely, send pictures of their grandkids to each other.
Once a monitor is so large that you need to move your head to view it all, you've gone too far. Almost no one wants to stare at a wall-sized monitor from 2-3 feet away.
[1] http://tiamat.tsotech.com/i-see-the-problem