I've felt this way since I built my last desktop in 2008. I was sort of waiting for the "gee, it's time to upgrade" mark to roll around in 3 or 4 years, but it hasn't happened yet. It still runs any game I want to play very well, and it still feels very fast to me even compared to modern systems.
When my friends ask for advice, I tell them that if they like the keyboard and screen, it's just plain hard to be disappointed with anything new.
I think I can pinpoint when this happened. It was the SSD. Getting an SSD was the last upgrade I ever needed.
~~~
Something does worry me slightly about the large shift to tablets, which are great devices in their own right. It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.
I think it's extremely healthy to have the lowest bar possible to go from "Hey, I like that" to "Can I do that? Can I make it myself?"
I think it's something hackers, especially those with children, should ask themselves: would I still be me if I had grown up around primarily content-consumption computing devices instead of more general-purpose laptops and desktops?
That's about the time I noticed that I no longer felt the need to upgrade.
And, yeah, tablets feel very constrained for creation. It is a consumption device. Perfect term. I am typing on a Galaxy Tab now, and after over a year, it still drives me slightly nuts to type more than a couple sentences; not to mention the thought of actually designing or developing on this thing.
Maybe that will change as new input devices and peripherals are developed. But at some point, if it evolved enough in that direction, it would no longer be a tablet per se, but a PC in a different form factor.
>yeah, tablets feel very constrained for creation. It is a consumption device.
Which is why I like the idea of the Ubuntu smartphone so much: a portable consumption device that can easily turn into a development machine. At least, that's my hope.
Just to note that some of us are the exception to these feelings of adequacy, though that is possibly because I am a software developer. My current workstation has 32GB of RAM and an eight-thread i7. That is the maximum amount of memory the motherboard can take, and Intel had a few faster i7 models, but none sufficiently faster single-threaded to justify the massive price difference. Even my laptop is an i7 with 16GB of RAM (again, both maxed out).
I am having to go in and make all the work I do increasingly parallel and pipelined. This is a lot more effort than straightforward code.
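To give a feel for what that shift looks like, here is a minimal hypothetical sketch using Python's standard library (not my actual workload): the same batch of CPU-bound jobs goes from a serial loop to a pool of worker processes.

    from concurrent.futures import ProcessPoolExecutor

    def expensive(n):
        # stand-in for a real CPU-bound task
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [1_000_000] * 8
        # serial version: results = [expensive(j) for j in jobs]
        # parallel version: one worker process per core by default
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(expensive, jobs))
        print(len(results), "jobs done")

And that's the easy case; pipelining stages with queues between them is where the real bookkeeping starts.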
I used to have a first-gen i7 and am now on a third-gen i7. I recently transcoded a DVD I had transcoded before, and the performance difference was incredible. Before, it would take several hours; now it takes 10 minutes! (Improved software may deserve a lot of the credit too.)
I will admit that sometimes I do feel the adequacy. During the tail end of the Sudoku boom I decided to write a solver as an intellectual exercise. I started out with a brute-force solver that tried every possibility. It was written in Python, and I gave no thought to optimising data structures or the code - I just needed a baseline. It solved most puzzles before the return key had sprung all the way back. It took less than 5 seconds on "hard" puzzles. The most difficult puzzle I could find took under 45 seconds. I gave up at that point.
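For anyone curious, the whole thing really is tiny. Here's a sketch of that kind of no-optimisation backtracking solver (written fresh for illustration, not my original code):

    def valid(grid, r, c, d):
        # row and column check
        if d in grid[r] or any(grid[i][c] == d for i in range(9)):
            return False
        # 3x3 box check
        br, bc = 3 * (r // 3), 3 * (c // 3)
        return all(grid[br + i][bc + j] != d
                   for i in range(3) for j in range(3))

    def solve(grid):
        # grid is a 9x9 list of lists, 0 marks an empty cell
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0:
                    for d in range(1, 10):
                        if valid(grid, r, c, d):
                            grid[r][c] = d
                            if solve(grid):
                                return True
                            grid[r][c] = 0  # undo and try the next digit
                    return False  # no digit fits here: backtrack
        return True  # no empty cells left: solved

Naive lists, recursion, zero cleverness, and the hardware still barely notices.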
>Something does worry me slightly about the large shift to tablets, which are great devices in their own right. It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.
Depends on the type of content.
For writing text, painting, or making electronic music, for example, there are tons of extremely easy-to-use apps for tablets.
For example, with something like Procreate, people can draw nearly as intuitively as with paper and brushes, something that is difficult on a PC unless you have one of those Wacom monitors/tablets.
> It's hard(er) to create new content on a tablet, and I don't really want that becoming the default computer for any generation.
At the risk of using yet another car analogy, I see PCs becoming the pickup trucks of the computer industry. There will always be plenty of them around, and they will be primarily designed and built for utility. Many people who don't really need the utility will still buy them because they like the style. Over the years, the pendulum will swing back and forth between mobility and power, and there will inevitably be the SUV crossovers as well.
I think his point was that this would be a very bad end result. If there are no desktops or laptops in a household it becomes very hard for a kid to be immersed in development from a young age as tablets and phones are almost exclusively meant for consumption.
What mitigation steps are you taking for when your SSD dies a horrible death? Do you keep all your important files saved on another SSD or in the cloud? And you'll just buy a new SSD and re-image? Or are you doing something else?
There was an article posted yesterday in which IDC blames Microsoft and Windows 8 for the decline in PC sales, but it's refreshing to see proper journalism acknowledging that isn't solely the case and that, in fact, Apple is selling fewer Macs as well. The world is moving toward mobile devices. PCs will always serve a purpose, but for some people a PC isn't needed at all. As a developer and a bit of a designer, I couldn't picture myself coding on a tablet, nor designing on one (prototyping a design, maybe). It's a changing landscape, and the likes of Google Glass give us a glimpse of what a future without desktop-computer domination looks like.
The real issue here, as touched upon in the article, is the fact that new computers don't really offer an advantage over older computers. Upgrading from a 386 to a 486 back in the day was a reason to upgrade, but my current machine, a spec'd-out Core i7, will last me until it stops working in 4 to 5 years' time. The only sector of computing probably still thriving is storage; people probably upgrade their hard drives more than they do their computers. Computing has reached a point where a CPU will last 4 years but a hard drive only lasts as long as it has space left.
>As a developer and a bit of a designer, I couldn't picture myself coding on a tablet nor designing on one
That pretty much describes me, and the thing is that even though I still need a PC for the same reasons as you, I haven't felt the desire to upgrade the way I used to. The reason? Pretty much exactly what the article states.
I used to upgrade every couple of years, sometimes more frequently, and I would see significant performance/productivity gains with each upgrade.
But, unlike before, I don't feel any performance "pain" with my current desktop or mobile workstation. They are both three-to-four-year-old Windows 7 64-bit machines, and they pack plenty of power and memory for my (heavy) use. So I no longer feel a need for better performance.
There was a time when software (including the OS) pushed the hardware, such that users were ever hungrier for more power. But now hardware seems to have gotten out in front permanently, for all intents and purposes. A slightly above-average consumer rig with sufficient memory can handle just about any task most folks will throw at it, with virtually no latency.
Part of the reason that software isn't pushing up against hardware limitations as hard anymore is the relatively old and underpowered current generation of consoles. Games were traditionally at the front of the hardware curve, but over the last few years only a small proportion of the game market has had fundamentally high hardware requirements.
It'll be interesting to see what upcoming console releases will do for the hardware market.
There's that, and also the fact that CPUs aren't getting faster the way they used to. I imagine that if my CPU were still doubling in single-core power every couple of years, there would be dev tools that could make use of that power, and I would want to upgrade.
And games would definitely be doing much more as well.
But it's not practical to make software that will take twice as long to run if CPUs only speed up 10% every two years.
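To put rough numbers on that (my arithmetic, not the parent's):

    import math

    # At +10% single-threaded speed every two years, one doubling takes
    # log(2) / log(1.1) ~= 7.3 two-year steps, i.e. about 14.5 years,
    # versus roughly 2 years under the old regime.
    print(2 * math.log(2) / math.log(1.1))  # -> 14.5...

No one designs software around a performance budget that arrives 14 years late.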
True. But I think that's the other side of the coin. That is, I'm not so sure we need faster CPUs the way we once did.
Save for extreme gamers and other more esoteric applications (e.g. CAD, video transcoding, etc.), most folks (i.e. the majority of the PC market) wouldn't benefit much from a CPU that's much faster than those now commonly fitted to the slightly-above-average consumer rig.
So, in general, I don't think software makers are holding back from making software that pushes the hardware. I just think it's more difficult to push today's more powerful hardware with typical software applications.
Agreed. There are a host of other potential applications that could push hardware as well.
And, I do believe that if one were to have mass market appeal (i.e. broad utility and demand), then we may see increased PC demand again (provided the PC is the appropriate platform).
But, do you think that there are a significant number of such applications waiting in the wings for PC hardware advances, or do you believe that perhaps no such applications are ready for prime-time as of now?
There is a certain class of developer for whom something like the ASUS Transformer is perfect.
They can:
- Play all sorts of media
- Download things with uTorrent
- Browse the web with WebKit
- Access terminals
- Use SSH
- Write their PHP websites with a syntax-highlighting text editor
- Edit images
Getting all of that in a package that is:
- Under 2 pounds
- Good for 10 hours of battery life
- $300 if you shop around, almost disposable
- Backed up to the cloud
Add an mSATA SSD and more RAM and it would work for even more developers, photographers, and other professionals. A student who just needs to write papers and use Facebook? Especially them.
To me the Asus Transformer is more of a netbook than the tablet it is marketed as. It's about the same size as an Acer Aspire One, which I owned a few years ago (the battery life sucked, though). To me a tablet is a device without a keyboard; a device with a keyboard is merely a netbook or, depending on screen size, a laptop. I could definitely see myself coding on a Transformer, but staring at a small screen gives me a feeling of tunnel vision; I feel as though I can't concentrate, and it distracts me. There's nothing better than coding in Sublime Text full screen on a 27" monitor.
Side note: the Asus Transformer is a beast, you don't get any better for that kind of price. I'm going to buy one for the commute to and from work on the train.
I expect that tablets will one day get the knack of supporting multiple resolutions and multiple monitors, and serve just as a secondary input device when on your desk.
I'd go even farther than that: for a lot of people, a PC is something they never really wanted. For a long time it was the only practical way to get the things they did want: Email, YouTube, online shopping, social media, music, the web in general. Now that those things can be done, and done well, from a phone or a tablet, and now that you can watch Netflix on your TV with a game console or even directly, a PC is just clutter.
The sad thing about this is that tablets and phones aren't nearly as good at content creation. A physical keyboard is still the fastest brain --> computer interface in town. Also, tablets and phones aren't self-hosting. You can't develop iOS apps on an iOS device. This makes it much harder for inexperienced people to get into programming. Taking the plunge into programming will be like deciding to buy an instrument and learning to play it.
While most people are never going to write software, those who do will be hurt by the drop in PC sales. In the past, PC R&D costs were borne by the general public. Now the public is moving to mobile devices, but developers still need to buy full-fledged computers. Lower PC sales means costs will go up (since R&D can't be spread across as many units) or manufacturers won't develop new features as quickly.
There's some silver lining: the technologies used in tablets overlap quite a bit with those used in laptops. Developers won't be stuck completely in the past, but future PCs might be a little too tablet-y for their tastes. (This is already happening with Windows 8).
I don't think the price of PCs will matter. You can always grab a Bluetooth keyboard if you want to type something out, and there are even web-based IDEs available, which means you're not limited to one platform. Right now, you can add a keyboard to your Droid or iPhone and start hacking away.
Additionally, the programming experience is in many ways focused too much on the text based code itself, and less on the act of creation. It may be that changing the PC/developer interface causes a revolution in the way that people program.
Yes, you are. The parent was arguing that PC hardware would become specialized, high-dollar, developer-only equipment, but as you say, all you need to recreate a PC from a mobile device is a couple of peripherals.
>The sad thing about this is that tablets and phones aren't nearly as good at content creation. A physical keyboard is still the fastest brain --> computer interface in town.
Tablets are a faster brain --> interface than a keyboard for visual arts like drawing, painting, video, and photography. Arguably music too, because you can simulate many different types of input, from drum machines to strings to even wind instruments (a la Smule's Ocarina).
>tablets are a faster brain --> interface than a keyboard
Whoa, I could not possibly disagree more with this statement.
I currently work in post production and thus spend my days in front of either Premiere or Pro Tools. Being able to turn around good work, and turn it around fast, comes down to knowing your shortcuts. I cannot see how a device like a tablet could top having dozens of tools literally under your fingertips. My left hand is constantly changing tools, issuing commands, zooming, deleting, etc.; my right hand takes care of the mouse. I can't imagine anyone who has had any experience with an editor being willing to give up their keyboard anytime soon.
Moving a finger 1/4" > moving your entire arm to poke a button on a screen.
>I cannot see how a device like a tablet could top having dozens of tools literally under your finger tips.
Can you see how a physical mixer, with faders and all, beats a virtual mixer on the Pro Tools screen as an interface in some cases? A tablet beats the keyboard and mouse in the same way.
Not to mention that most people don't work as experienced shortcut automatons, so trading the tedious mouse work and/or keyboard dance for immediate visual feedback on a touch screen is even better for them.
>I don't think you understand how shortcuts work..
I've been using computers since the late eighties, have worked on everything from SunOS to OS X 10.8, and have used Vim for decades, so I think I do.
But I don't find them that useful anyway.
As they get more numerous (aside from the standard stuff), they only serve to give your mind a slight pause (to trigger shortcut recall, because not all of them are in muscle memory), and they trick you into thinking you're doing something useful for 0.5s, which for a lot of operations is mostly the same time it would have taken you to do it with a mouse. Just that with a mouse your mind is not working as hard (whereas the effort of remembering the shortcut makes your mind think less time has passed).
You might think that doesn't apply to you. You'd most likely be wrong, though (unless you compared with a stopwatch). That's the kind of trick the mind plays. I'm not speaking out of my ass here; here's UI expert Bruce Tognazzini:
"We’ve done a cool $50 million of R & D on the Apple Human Interface. We discovered, among other things, two pertinent facts:
-- Test subjects consistently report that keyboarding is faster than mousing.
-- The stopwatch consistently proves mousing is faster than keyboarding."
I think you forgot to change accounts while agreeing with your post...
At any rate, you've narrowed your original broad statement,
>Tablets are a faster brain --> interface than a keyboard for visual arts like drawing, painting, video and photography. Arguably music too,
down to a specific use case -- and one I actually agree with. However, writing off the value of all shortcuts as "tedious" is fundamentally ridiculous.
I respect that you've "used computers" for a long time. However, the following assertion makes it relatively clear that you've got no idea how people interface with non-linear editors.
>As they get more numerous (aside from the standard stuff), they only serve to give your mind a slight pause (to trigger shortcut recall, because not all of them are in muscle memory), and they trick you into thinking you're doing something useful for 0.5s, which for a lot of operations is mostly the same time it would have taken you to do it with a mouse. Just that with a mouse your mind is not working as hard (whereas the effort of remembering the shortcut makes your mind think less time has passed).
Given that this is your frame of mind, I'm just going to leave it be. There seems to be an entire way of using a computer that you are unfamiliar with.
I recommend you drop into a post shop one day. You can see how we use keyboard shortcuts ;)
(This is the strangest disagreement I've ever had online)
While I think we all know that iOS will never be self-hosting, Android is most definitely capable of this. [0] It may be tedious on a 4" phone, but it is actually comfortable on a Nexus 10 (though I prefer a bluetooth keyboard).
Last year I bought a new motherboard with an eight-core processor when my old one died. Just a couple of days ago I realized that my experience with that setup is exactly the same as with the previous one I bought in 2008. The 1TB hard drive I bought in 2009 is only half full. 16GB of RAM runs no better than 4GB did. Two cores humming along at 3,000 MHz can handle everything I throw at them. The others sit idle.
It's a huge change from ten years ago when I would anxiously await the day when I could afford a new rig because I was already pushing my three-year old one to its limit. Desktop PC technology has clearly reached the point where its capabilities far exceed the needs of ordinary users.
The ordinary-user argument. I argue it too, because it's true for now. Facebook and email don't require 8 cores, a Kepler card, and 32GB of RAM. They require an $80 Pentium 4 machine. YouTube HD is about the only thing a normal person uses that would push that, other than games.
The best counter-argument I've seen so far is that these modern machines are capable of great but uninvented or unpopularized things. If developers give users a reason to upgrade, they will. Nvidia wouldn't exist if game developers hadn't made 3D games to take advantage of their hardware. Same with PCs, developers have to give them a purpose.
I agree with that argument, and I hope someone capable steps up and makes it happen.
I still think the problem is the complete lack of innovation with desktop displays for the past twelve years [1]. I very badly want a home computing environment that features a ~50 inch high-DPI screen that I view at a distance of approximately 2 to 3 feet.
I feel high-resolution, high-density displays would reinvigorate what we currently call "desktop" computing.
I have a dual monitor setup, with 27 inch (2560x1440) and 24 inch flat IPS displays.
They are fantastic to develop software on, and they cost less than my PC.
I couldn't have afforded anything like this 12 years ago.
I guess it depends on what you call 'innovation', but from a making-my-work-experience-nice perspective, I'm delighted with the progress of desktop displays.
I'm not sure how much benefit I'd get from higher res, given the distance the displays are from my eyes.
50 inches high? Do you mean wide? Given the physiology of our two horizontally placed eyes, we're better suited to a wide screen than a tall one. I'm not convinced your field of view could even make use of an entire 50-inch-high screen at that distance.
On my dev computer at work, I used to be able to run the whole application environment. No more. Getting a bigger and bigger machine is not affordable for me, not to mention the space it takes up, the heat it produces, and the noise from a clumsy PC box.
So we started building our own infrastructure, enabled virtualization, and gave everyone what they need, growing as required. It feels like a natural development to me.
The modularization of racks keeps getting better: separate hot-swappable, interconnected CPU, fast RAM, and slower storage units. It feels like a PC in itself again. Maybe that will shrink and we'll get it at home again.
I like the idea of owning my own PC, but I think it is getting more and more difficult to have everything on it. I will end up with a lot of servers anyway.
On the other hand, a lot of people develop web applications. For that I don't need much power.
-> I need more power
-> Some people don't need much power
=> Maybe that's one reason why the PC market is shrinking
Are they taking into account the global economic crisis?
Because I'm pretty sure that businesses were the main buyers of new computers, and that they're not going to buy new computers unless they really need to in this climate.
I agree that Vista, when launched, put a lot of people off.
I think you are overestimating how bad the "global economic crisis" is. Besides, computers are vital to many companies--they're not an expendable "luxury."
But they've bought the computers. Where they would have been replacing everything every year, now they're replacing every three years; and they're not replacing everything, they're shifting the machines down through the ranks.
The PC (or the Mac) is, of course, a long way from perfect, but you can count me as another data point for the theory that hardware pretty much reached "good enough for just about anything" a few years back, and I'm saying that as someone who is a programmer and a gamer, so for Joe Q. Public running Office and Chrome this point was reached even sooner.
Core 2 Duo w/ 4 gigs of RAM was, I'd guess, basically the tipping point for normal users; Nehalem w/ 8 gigs of RAM, a GeForce 4xx, and an SSD for the system disk was the tipping point for people like me.
I used to upgrade my system yearly (buying parts off Newegg, reusing existing bits where they made sense to do so) but now it is more like every three years and growing each time.
There are, of course, lots of ways PC manufacturers can turn this around: novel input methods, more hybrid devices, and especially an increased focus on higher-resolution screens (which has a multiplier effect, because if you truly boost your on-screen resolution, you'll soon start feeling cramped by your CPU, GPU, and memory again). But the days of tossing out more powerful CPUs, GPUs, and a bit more RAM, and then calling it a day as far as new features are concerned, are over.
Upgrading from my C2D to a first-gen i7 was the first time I didn't feel pain and the need to upgrade. I went with an SSD at that time, and spent about $1500 on that desktop IIRC (main case, not monitors, keyboard, etc.). I recently replaced it with AMD's top eight-core option, which works better for me than an i3/i5 at that price.
I only upgraded because my system was unstable, and it was likely the motherboard (which would mean replacing the motherboard and CPU together) or the power supply; either way, pretty much the same effort/cost as upgrading both. The new system runs great. The irony is that most of my non-work stuff gets done on my HTPC in the living room or my Nexus tablet; my C2D MacBook Pro and my desktop aren't used that much.
I must say, despite the gloomy outlook, isn't the fact that most of the market is running on old machines actually not such a bad thing?
Why do you buy a new PC? Why, of course, to do something your old one couldn't. Right now we have six-year-old consoles, so your standard port won't be incredibly pretty or taxing.
Give it a few months after the launch of the next-gen consoles and PC sales should see an uptick as people start buying the awesome-looking ports that are being crapped out by the dozen by the big AAA dev shops.
While it won't fix the market, it should have a serious effect on the profitability of a PC business. Combined with the fact that your tablet, which was nearing current-console power, is now looking pretty bad, that is even more reason for the market to keep on chugging, with consumers realizing their all-in-one isn't the beauty they thought it was.
You guys say you haven't upgraded in years because you can run the new games at a very decent level? Just you wait.
I would guess that Windows Blue is another data point to support the thesis. Windows sales have traditionally been tied to hardware sales. With declining hardware sales it seems like a way to try to keep milking the cash cow.
Basically, PCs are dumb, boring work terminals for old people.
Children coming of computing age when the iPhone was released are now 9 years old, perhaps on their second or third portable gaming device, and lobbying their parents for an iPhone.
What on Earth would they possibly want a slow, dirty, heavy keyboard computer for? You can't even take a picture with it unless it's an Apple.
A PC is a "Personal Computer," though traditionally it only refers to IBM-compatible computers (or "a Windows machine" if in the context of Mac vs PC).
This "moving to mobile devices" debate seems to put "PC" in the context of "desktop and laptop computers" vs "tablets and smart phones" with ChromeBooks in a kind of middle-ground.