Why Ubuntu Linux is now ready for primetime (redferret.net)
87 points by jaggs on Aug 27, 2011 | 120 comments



Nope. Ubuntu won't be ready until upgrading Firefox (or any individual package) doesn't force insanity like opposite window controls or Unity without asking. Regular people aren't interested in seeing drastic and seemingly nonsensical changes like that for no reason, and almost certainly aren't interested in being part of some grand OS experiment. They want something that is consistent and works.

While package management is an excellent concept, the way Linux packages are handled--with shared dependencies--is what engenders the ridiculous milestone-distro system that I think is keeping desktop Linux back. Having to swallow upgrades to every single package on the system just to get one package (like FF) upgraded means that instability and unwanted change--exactly what regular users hate--will be ingrained in the experience.

The solution, I think, is either to get rid of the shared-dependency concept and move to a more Windows-like static install system, or form a truly stable rolling-release distro. The only big one is Debian/Mint, but from what I hear that's still too unstable for even power users.


This is one of the major reasons I don't suggest Linux to anyone. There was a time I'd get people to switch from Windows to Linux; now I tell them to either upgrade to Windows 7 (if they're not on Windows 7 already) or get a Mac.

I'm also concerned about three secondary issues. One, there is a lack of strong leadership in the Linux community. Shuttleworth is the only person who can shake things up, and whenever he does everyone whines as if Shuttleworth had caused them some grave personal injustice.

Second, of course, is fragmentation and the fact that Linux developers change things needlessly (see GNOME 3, KDE4, PulseAudio, etc.). The end result is different distros doing things in slightly incompatible ways. These days I see more and more people package their software for Ubuntu only, and provide a generic .tar.gz for other distros. I'm no Ubuntu fan, but I jump for joy every time I see this. Distros need to die.

Third is politics. I know people who contribute to a few high-profile open source projects and I've watched them waste their time on bikeshed discussions on mailing lists and IRC. I've even watched one guy work on a project for three months only to have the design team discard it because they had a different idea in mind. The result was a long battle between programmers and designers. I guess this is a direct result of lack of leadership.

I've been hearing talks about a GNOME OS. I sincerely hope the GNOME folks go ahead and build it. As I've said before, distros need to die. So much manpower is wasted on just packaging things for different distros. From a consumer standpoint, choice is okay if you have to choose between two or three things, each with their clear merits and demerits. Choice is fatal if you have to choose between ten things which are all nearly identical.

Sometimes I feel Linus is the only guy who knows how to run a project properly. He knows how to say "no". If only we could coax him into building a distro ...


I would advise the ordinary non-technical user to go Mac. Fewer headaches all around, largely through restricted choice. The platform drives me bananas. Windows makes me postal though.

Linux / Free Software has strong leadership, but that leadership generally has limited scope. The GNOME team have certainly taken bull-headedness to new heights within their own limited realm of control. To somewhat muted applause as I recall.

OpenBSD (granted, not Linux) isn't particularly renowned for having a shrinking violet at its head. Nor Linux itself, nor the FSF. Other major projects have a mix of leadership styles, itself part of the (r)evolutionary culture of trying different technical, social, organizational, and leadership structures.

That diversity (including various ill-advised developmental branches) is part of Linux / Free Software's strength: a great many parallel paths are being tried. Some will succeed, some will fail. The better alternatives generally at least get a good, if not fully fair, hearing.

Politics is inevitable when you're dealing with more than one individual or organization. Grow up and get over it. You'll note that these same political discussions exist in the proprietary world, you're just not (usually) party to them.

The proper counterproposal to a bikeshed argument is to demand construction of a nuclear power plant.

Those "creative differences" are called "forks" and again, are a key strength to Free Software (see previous comments).


"There was a time I'd get people to switch from Windows to Linux; now I tell them to either upgrade to Windows 7 (if they're not on Windows 7 already) or get a Mac."

What's changed?

"Shuttleworth is the only person who can shake things up, and whenever he does everyone whines as if Shuttleworth had caused them some grave personal injustice."

People get comfy in their distributions. When Shuttleworth dictates "design" decisions with the reasoning of "just because," and as a result many people no longer find their distribution of choice comfortable, can you really blame them for being pissed off?

I know many people in this situation since Ubuntu 11.04. I'm one of them. I wasn't happy with Unity. I had to spend the time to migrate my desktop and laptop to Fedora, and get readjusted. This was annoying, but at least I'm comfortable again.

"Second, of course, is fragmentation and the fact that Linux developers change things needlessly (see GNOME 3, KDE4, PulseAudio, etc.)"

Fragmentation is inevitable in free software. People build what they want - you can't stop them, and you don't have to use what they make. The solution is not to provide the "one distribution to bind them all" and somehow dictate that everyone should use it, but to get the different ecosystems and environments following standards. They're doing that pretty well.

Anyway - you wish for "strong leadership" in the community. There will always be more than one center of strong leadership (because of personal preferences), and hence, fragmentation results.

"These days I see more and more people package their software for Ubuntu only, and provide a generic .tag.gz for other distros. I'm no Ubuntu fan, but I jump for joy every time I see this. Distros need to die."

It seems you wish for the same universality for Ubuntu as exists for Windows and Mac OS X. Do you know why those platforms are universal? Because it's relatively difficult for graphical programs to be ported between them. That problem doesn't exist for the free software ecosystem thanks to portable graphical toolkits, freedesktop.org, etc. Any good software written for distro X will soon be running on distro Y (if it doesn't already from the get-go), and that will never change.

"Third is politics. I know people who contribute to a few high-profile open source projects and I've watched them waste their time on bikeshed discussions on mailing lists and IRC. I've even watched one guy work on a project for three months only to have the design team discard it because they had a different idea in mind. The result was a long battle between programmers and designers. I guess this is a direct result of lack of leadership."

I agree that this is a huge problem. Design by committee is inefficient and ineffective. I think it would be a good idea for GNOME to have leadership elections at the end of each release cycle, to get someone to lead the GNOME project through to the next release.

"I've been hearing talks about a GNOME OS. I sincerely hope the GNOME folks go ahead and build it. As I've said before, distros need to die. So much manpower is wasted on just packaging things for different distros."

By those distro maintainers' own initiatives. They do it because they want to.

"Sometimes I feel Linus is the only guy who knows how to run a project properly. He knows how to say "no". If only we could coax him into building a distro ..."

Linus is (a) busy, and (b) has no clue about usability. He runs Xfce, for heaven's sake :)


This is such a great reply. Thank you, majika.

> What's changed?

Nothing much on the Linux side; Windows has just become better. Windows XP as I remember it was utter shit. Windows 7 is actually pleasant to use. I also got myself a MacBook in 2008, wrote many Cocoa applications, and found that the Mac was a better work machine than my Linux machine. So if someone can't afford a Mac, I tell them to use Windows 7. Otherwise it's time to hit the Apple store :)

> People get comfy in their distributions. When Shuttleworth dictates "design" decisions with the reasoning of "just because," and as a result many people no longer find their distribution of choice comfortable, can you really blame them for being pissed off?

Totally agreed. So can we say that not only does the Linux world lack leadership, the little leadership it does have makes terrible decisions? :p

> Fragmentation is inevitable in free software. People build what they want - you can't stop them, and you don't have to use what they make. The solution is not to provide the "one distribution to bind them all" and somehow dictate that everyone should use it, but to get the different ecosystems and environments following standards. They're doing that pretty well.

Once again you make a very good point. But I'm looking at this issue from a consumer's standpoint. The consumer wants to deal with one entity; he doesn't like fragmentation. In any case, these mostly identical distributions serve no purpose. Most of them have no reason to exist. Personally, when I want to use Linux -- and I do use Linux on servers because that is where it shines -- I default to Debian and pretend that nothing else exists. I'm not good at dealing with choice, and I believe the man on the street is almost as bad as I am.

> It seems you wish for the same universality for Ubuntu as exists for Windows and Mac OS X.

I do. I honestly do. Universality is a good thing. I know a properly written GTK+ app will run identically on every Linux machine out there, but there are still some very tiny incompatibilities, especially w.r.t. packaging and distribution. As an indie developer, I don't want to deal with that stuff. I just want to create a DMG/MSI, ship it and expect it to work for the next 5 years without issues. The universality of Windows and the Mac guarantees that.

Oh, one more thing: Windows and OS X let even bad developers bang out working applications. All their assumptions about their OS, their hardcoded paths, their assumptions about certain libraries existing at certain versions, all their bad code ... it all just works. Small point, but I think it's an important one. All of us were ignorant once :)

I know I'm going to sound like a crazy person here, but I just feel there is something very wrong with fragmentation. I don't know; it's just something I feel. I guess I probably am a crazy person ...

> Linus is (a) busy, and (b) has no clue about usability. He runs Xfce, for heaven's sake :)

Hey, Xfce is pretty damn good :)


> Having to swallow upgrades to every single package on the system just to get one package ... [means] ...instability...

Just curious, have you really experienced instability by installing all updates on any mainstream distro? If so, could you share what happened? I've not had that experience but would like to know about it.


Yes. For example when Ubuntu switched to Pulseaudio, it borked my sound. Even with the release after that one (can't remember which one it was), I had to have a shortcut on my desktop for 'killall pulseaudio' because just playing music for more than 15 minutes would end up in garbled sound. Or in Karmic, when a regression in the Intel video drivers made even dragging windows stutter on my laptop with 4GB of RAM. Or when Lucid broke hibernate for my laptop. It was fixed in Maverick, then broke again in Natty. Literally every distro upgrade of Ubuntu for me has resulted in more than one serious regression in something that used to work.


Ubuntu and Fedora, though the most common desktop distros, are also notoriously NOT very stable. Fedora is Red Hat's beta-test playground, and Ubuntu is built from a Debian-unstable snapshot. OTOH many people complain that Debian, Red Hat or CentOS take too long to pick up the latest features, but this is the price to pay for extreme reliability. Know your needs, choose accordingly.


I just wish there was a middle ground, though, where you don't have to choose between extreme reliability + years of waiting and fast turnaround + notorious instability.


For desktops: Debian testing + pinning as needed to backports, sid, and experimental.

For servers: Debian stable + backports.

That last in particular is vastly preferable on servers to the usual "Oh, we've got to run Red Hat because it's industry standard, but we're going to have to rip out the guts and install a metric shit-ton of third-party / roll-your-own / nonstandard packages, and then hope that the poor third-generation admin who inherits this cesspit can make heads or tails of it."
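
For what it's worth, the pinning half of that desktop recipe is just a few lines in /etc/apt/preferences. A rough sketch, with illustrative priorities (adjust to taste):

    Explanation: prefer testing; pull from sid/experimental only on request
    Package: *
    Pin: release a=testing
    Pin-Priority: 900

    Package: *
    Pin: release a=unstable
    Pin-Priority: 300

    Package: *
    Pin: release a=experimental
    Pin-Priority: 200

With that in place, a one-off "apt-get -t unstable install somepackage" pulls just the newer package without dragging the rest of the system to sid.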


Debian testing is both reliable enough (I think there were no more than 2 cases when an upgrade would lead to an unusable state in the past 8 years) and not too far behind the bleeding edge. Or you could use a rolling release distro such as gentoo, arch, or in a related world FreeBSD.


The problem with Debian Testing is that once something breaks badly, you usually have to wait two weeks until it is fixed, because changes need to trickle down from Sid. In my experience, Sid is more useful as a desktop. But it's certainly not for the beginner.


Since my favorite Debian combo was already mentioned, I should mention my usual plan-B: CentOS with CentOS Plus enabled on the server. What flavor of Unix you run on the desktop is irrelevant, although I frequently find myself in dependency hell when I am working on a Mac. Installing the qtconsole version of iPython is next to impossible.


It has happened to me once in a while. In fact my full upgrade to (k)ubuntu 11.04 ended up leaving the machine in an unbootable state. Not a big deal since I had data backups. But still time consuming.


+1 here. mysql-server is completely broken in 10.04 LTS.


First: Linux does expect (and reward) a certain level of technical literacy. If you don't want to acquire or exercise this, well, you have alternatives and you've mentioned several. Those do come with trade-offs of their own.

If you don't like system upgrades for single-package updates, learn to use your distro's 'backports' feature. What? Your distro doesn't have backports? Shame.
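
For reference, on a Debian system of that vintage the backports dance is roughly two steps (the archive line and package name here are only examples):

    # enable the archive once
    echo 'deb http://backports.debian.org/debian-backports squeeze-backports main' |
        sudo tee /etc/apt/sources.list.d/backports.list
    sudo apt-get update

    # then pull individual packages from it, on request only
    sudo apt-get -t squeeze-backports install some-package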

Or switch to Gentoo and install/upgrade just the packages you want to use. Yes, you'll have to learn how to compile shit.

The UI in Linux is highly modular. You can switch to a different (or no) desktop environment pretty easily. There's a switcher in your log-in manager. Try it.

There's a reason for package dependencies, and there's a huge payoff (single-point-of-control whole-system upgrades with very low failure risk). It's not perfect, but on systems which have developed a mature packaging system over time (Debian, derivatives), it works exceptionally well, and provides access to over 30,000 packages.


Firefox is upgraded as soon as it's released by Mozilla in all supported versions of Ubuntu (except 10.04, which will be upgraded when Mozilla EOLs 3.6.x for good).


Even with 10.04 you can use the FF PPA for the latest version.
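
Something along these lines, assuming the Mozilla team's PPA (I'm quoting the PPA name from memory, so treat it as illustrative and double-check it):

    sudo add-apt-repository ppa:mozillateam/firefox-stable
    sudo apt-get update
    sudo apt-get install firefox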


> The only big one is Debian/Mint, but from what I hear that's still too unstable for even power users.

You're forgetting about Arch, which is absolutely amazing. Simple, yet powerful.


The thing I hate about Arch: the AUR. It's horrible. There is no official tool to handle it, the package stability or coverage is spotty at best, and the unofficial tools (yaourt, etc.) suck.

But you have to use AUR, because that's where the breadth of the packages are. Arch's team is relatively small (compared to Debian's anyway) and they can't maintain that many packages. I agree with their choice of letting the community handle the rest — from their perspective it makes sense.

From my perspective, I'm never returning to the nightmare of AUR again. a) if I want to wait for stuff to compile for ages and then fail, I can just use Gentoo. b) Gentoo is at least more or less stable and has good coverage/stability. c) Portage is at least a good package manager (the second-best in Linux-land, imho, with dpkg/apt-get leading the pack.) Not to say pacman is bad, but yaourt just isn't good.


Can you give some examples of must-have software that are only in AUR? Having used Arch for years on multiple machines, I've rarely had to use AUR. And when I do, I've found it to be a marvelously simple mechanism - a bit like homebrew on a Mac, where you can easily edit the recipe if needed. It sure as hell beats installing unsupported software on Ubuntu.


I'll second that. You can get similar simplicity in many other distros; what really makes Arch great for me, though, is the community: they have one of the best wikis I have ever seen.


Debian Stable with Backports[1] seems a decent solution to me.

[1]: http://backports-master.debian.org/


I'm guessing that you also think web apps are not ready for the average user? After all, there you don't even get a say in when the upgrade happens. And you have no way of delaying, cancelling, or undoing it.

As far as users are concerned, random stuff changing for reasons unrelated to their actions is all the same.


Upgrading Basecamp doesn't force an upgrade to Gmail. Your comparison doesn't really apply here.


Does that matter? In both cases, stuff still gets changed under you at more or less random times. With web apps, you don't even have control over when random stuff could possibly change.

In Linux terms, that would be like your package manager constantly and silently updating everything in the background all the time. How dependent or independent packages are doesn't matter in this sort of scenario.

I'm not entirely sure that it makes much of a difference. Chrome shows that gradual silent updating works for local apps. Linux needs to handle that better, sure, but I think that's the way to go.


Gradual silent upgrading is poised to be a huge problem for users and yet another reason I stay away from Chrome. You should never impose an upgrade on a user, at most just strongly suggest they install and make it easy if they choose to do so.

Upgrades have the potential to break things users may find critical. Say for example in a browser, User A uses a plug-in that they find essential to their daily work and it hasn't been updated to the latest release (developer is on a sabbatical roaming the world for 2 months). Now you force an upgrade to their machine and you break their plug-in leaving them with no recourse. This is only one slim example but it's potentially a huge issue. A couple years back a Windows auto-update once killed a client's server for 2 days until I could identify the culprit. There at least I had recourse to uninstall and reject the update, with something like Chrome I don't.

In a web app things don't inter-relate the same as they do on a local OS so it's mostly a non-issue outside of user experience. Forcing upgrades to locally installed software is just asking for trouble all around.


Chrome's updates have been nothing like KDE4 or Unity or Gnome3.

Also, people have revolted against and deserted versions of web apps like Digg v4, so it's the experience that matters, not just change in general. There should be no need to foist updated versions of other apps on users just to get a new version of one app.


I haven't seen an app that depended on KDE4, Unity, or Gnome3, or couldn't be run outside of them.


> Nope. Ubuntu won't be ready until upgrading Firefox (or any individual package) doesn't force insanity like opposite window controls or Unity without asking.

Last time I checked, Ubuntu kept existing setup, desktop etc. when upgrading. Anyway your rant is nonsensical: where is my good ol' "Start" menu organisation when I upgrade to Win 7?

> The only big one is Debian/Mint, but from what I hear that's still too unstable for even power users.

I'm speechless, you seem to know nothing of Linux but quick to make judgements. Debian is the most stable (too stable, may be argued) system; I have several machines that were upgraded flawlessly from 3.0 to 6.0 without ever breaking. On the other hand, if you like rolling release, there are many distros that work just like this, like gentoo, archlinux, etc.


> Last time I checked, Ubuntu kept existing setup, desktop etc. when upgrading.

Have you ever dist-upgraded an Ubuntu distro? Upgrading to Lucid moved your window controls. Upgrading to Natty turned on Unity. While it's true that upgrading from, say, XP to Win7 will change your start menu, the point is that you can still upgrade your "packages" on XP, WITHOUT having to get the new Win7 start menu.

> I'm speechless, obviously you're babbling without even knowing what you're talking about; ignorance and hate won't lead you anywhere.

Obviously YOU don't know what you're talking about; Debian Mint (LMDE) is based on Debian-testing, which is supposedly HIGHLY unstable.


> the point is that you can still upgrade your "packages" on XP

Like DirectX 10 or 11? The fact that you can upgrade is due to the abnormal shelf life of this particular release of windows. Try upgrading most programs on a win98SE or win2K machine.

> Obviously YOU don't know what you're talking about; Debian Mint (LMDE) is based on Debian-testing, which is supposedly HIGHLY unstable.

Mint is based upon Ubuntu, which is based upon debian-unstable (Sid). Debian testing is hardly highly unstable; most Debian users run testing on their desktop systems, and keep stable for the servers.


I think you're misunderstanding what I'm referring to. I'm specifically referring to rolling-release distros, of which LMDE is one. It's based on Debian-testing.

Regular Mint is indeed a milestone-based distro built on Ubuntu. But LMDE is a rolling-release based on Debian-testing, with no relation to Ubuntu. That's specifically what I'm referring to.

There are certainly other stable rolling-releases out there, but I think everyone can agree that a distro like Arch is about a lightyear away from being Grandma-friendly (which Ubuntu more or less aims to be).


> It's based on Debian-testing, which is more unstable than Debian-unstable

Wrong. Testing is more stable than Unstable, which is more stable than Experimental. Furthermore, Ubuntu does extremely heavy modification to the Debian base, repackaging quite a number of its core packages, and replacing a number of central parts at times (eg, the boot system).

If you're going to say that people don't know what they're talking about, please make sure you do.


You're right, I've corrected my post. But I still stand by my general point :)


You're doing a dist-upgrade, which is exactly like going from XP to Win7. Stay on LTS if it bothers you that much.


That's precisely my point. In the Linux world, your distro version is inextricably linked to the versions of every major and minor software package in a way that just doesn't happen in Windows.

If I stuck with Lucid, I'd still be using Firefox 3.6. (Yes you could add a PPA, but opening the terminal is beyond the technical skill of the average user.) Linux won't succeed on the desktop until the phrases "Upgrade Firefox" and "Potentially bork my entire system with unexpected regressions" don't go hand-in-hand.

FF will still auto-upgrade to 6 even on an XP machine.


Ubuntu has a security policy to upgrade supported versions to the latest supported browser version. So, in Lucid, if 3.6 is still supported by Mozilla, that is the version still maintained (you know, for consistency... the thing most people here are railing against).

However, as soon as mozilla drops support for 3.6, the latest supported version is installed. This happened for Lucid, as I understand it.

This is a sane policy, IMO.

If, however, you are on Lucid and want to run a different version of Firefox, it is rather trivial. You can find it in Ubuntu Software Center after adding a repo and install it. This is NOT what most non-technical people want to do, however, and that is why the policy makes sense IMO.

And, I agree, FF auto-upgrading itself to 6 is nice on OSX and XP, but that is a Mozilla thing, not Linux. Mozilla makes that difficult to do and the various linux distros need to deal with it.


> your distro version is inextricably linked to the versions of every major and minor software package in a way that just doesn't happen in Windows.

What's the big deal with versions? It's not like it's too much trouble updating Ubuntu from one version to the next (you know you can turn off Unity). Most of the time, you can even continue working while it upgrades. This is not like Windows where you have to reboot the machine with the install media and wait a couple hours to get the new version.

And while the Firefox team does an amazing job making FF 6 for XP, the same can't be said about the Internet Explorer team. If you want the latest IE, you need to upgrade Windows.


What an awful article. Nothing here was 'new'; it's the same things that have been said in nearly every other 'Year of desktop Linux!' article. Replace the post date with any year going back to ~ 2006 and it wouldn't be out of place at all. Also, any writer who suggests primetime usage of a Linux distribution and then suggests using WINE or a VM solution is out of his mind.

It's amusing how the writer referenced Ultimate in order to show the apparent high cost of Windows, yet how many people actually need Ultimate? Also, his 'success' story in regards to his wife isn't too realistic, or at least not common: how many people have a husband who is willing to hunt down DbxConv (a command-line app) and do all that work for her?

The site itself looks as if it's stuck a few years back. I'm a little disappointed Windows ME wasn't referenced and no use of 'm$'.


>Replace the post date with any year going back to ~ 2006 and it wouldn't be out of place

There have been breathless articles about this being the year Linux is ready for the desktop ever since the late 1990s.


He mentions he's fed up with the financial cost and the need to activate Windows, and cites these as reasons to use linux. He then later suggests running Windows in a VM to deal with "dodgy programs". At least I'm not the only one that finds the irony here hilarious.


Hey, the mouse "just works!". Out of the box! Linux is ready. It's always disappointing in articles like these not to see 200 corporate workers trying Libre Office (which would be fine for many many people).


I used Linux (Ubuntu primarily, but I dabbled in the other distros from time to time) as my only OS on my netbook for 5 years (through college and my first few internships / job). For that atmosphere it was great: it taught me the UNIX philosophy, I can work a command line shell and vim like a beast, I knew a lot more about OSs and programming in general than if I was using Windows, and it overall prepared me excellently for software development in the future.

That said, I was young and had the time to deal with its inadequacies. I could spend a whole weekend getting a driver to work properly, and fiddling with kernel options. These days I get a few hours of free time when I come home from work, and then I have my weekend. If I am on the computer at all, I don't want to waste my time dealing with that stuff any more. It's either to sit down and knock out a sizable chunk of a personal project, or play some video games for entertainment.

I have a proper place for all three OS flavors. My desktop runs Windows (for gaming, heavy VM work, and games development), my primary development machine is a Macbook Air (running OS X), and any servers I have run Linux. I feel like these are the perfect tools for their respective jobs, now that money isn't a factor as a poor college student. Using OSX over Linux as a development environment was like night and day.

For servers, Linux is king. Having it be a usable personal OS (for the average user and programmer) will take a lot more time. While it's true that most of my time is spent either in the terminal or in the browser (I'd say about 75% of my computer usage), for the other 25% paying for a machine that runs OS X is completely worth every penny twice over. The quality of tools, utilities, and applications that aren't a browser or terminal is light-years ahead of anything that Linux offers, in terms of usability, UI, and functionality.


> I could spend a whole weekend getting a driver to work properly, and fiddling with kernel options.

I haven't done that stuff since around 2005. I just install a standard distro, and leave it to do what it does. It Just Works out of the box. It even detects my Raid5 and sets that up for me. Debian is my choice because of the rolling updates in testing -- I don't even want to think about version numbers.

I use Linux because it means I don't have to think about package versions, I don't have to worry about finding drivers after installing, I don't have to search around for software, etc. Thanks to distro choice, I don't even have to think about what version I'm running.


If you're lucky enough to own a machine that works out of the box, with all of its components, while running Linux: all the power to you. I've owned many computers over the years, of all shapes, sizes, brands, and/or custom parts, and I've never had a single one "just work."


I've had it just work on a Fujitsu tablet, several Thinkpads, and a couple of HP laptops. And any desktop I've cared to try so far.


Pretty much what I was thinking. I've messed with Linux on HP machines, Lenovo Thinkpads and Ideapads, and most other major manufacturers' laptops. From Ubuntu, Fedora, Arch, Slackware, Mint and plain Debian, I've never had major issues with hardware.

IIRC, the last time I had a hardware/driver issue that pissed me off to no end was back before Fedora even existed.


I have an HP dm1z, and I still haven't gotten the wireless drivers to work (Ralink).


> If you're lucky enough to own a machine that works out of the box

Actually, it's the other way around. If you own a machine that doesn't work out of the box, you are terribly unlucky.


About a year ago I installed linux (ubuntu) for the first time in years on a dell machine. I had to spend a few hours fiddling with the video drivers to get it to work right.


No, it's not.

The article seems to suggest that Ubuntu is an OS that the non-geek Joe can install and use without experiencing any issues; it just works. Well, just wait until he decides to upgrade to the next Ubuntu release or tries to install some restricted driver. Good luck to these newbies if they don't have a friend who knows Linux.

I've just upgraded from 10.10 to 11.04 yesterday, and only after 5-6 hours did I find the right kernel+ATI driver combination to make my system work again (after the upgrade the GUI would freeze after a few minutes of use; I made a few tests patching 2.6.38 for BigKernelLock, trying different driver releases, checking the installed packages for incompatibilities, etc.). A newbie would never have come out of this alive.

Imho, 11.04 shows that the Ubuntu releases are still made in a rush, with unfinished things that get included anyway (Unity, but every release has its own unfinished or barely working new functionality). My suggestion for Ubuntu? Test more. And don't be afraid to push back the release date if there are still severe bugs open that need to be addressed.


> The article seems to suggest that Ubuntu is an OS that the non-geek Joe can install and use without experiencing any issues; it just works.

Granted, my mother (a 76-year-old lady) asked me to set up her computer, but she has been using her trusty IBM desktop, always running the latest Ubuntu, since 2006 or so. Never had an issue. She can read her e-mail and share files with friends. The only ability she lost was to infect her machine with the most horrid forms of malware. I don't think she misses it and I, certainly, don't.


So who's going to step up and become the premier Linux hardware vendor? I was just looking at Linux laptops this morning and the problem is battery life. The System76 Lemur 13 gets... 2-3 hours of battery. That's not going to work. The ZaReason Strata Pro 13 gets... I don't know because they don't advertise it and I can't find a review that mentions it.

The conventional wisdom is to get a Thinkpad, which I might have to come to, but it would be wonderful for one of these Linux vendors to step it up and put out something that's competitive in terms of battery life with the improvements that have come in recent years. Apple, Lenovo, Asus, even Toshiba are all putting out laptops with 8+ hours of battery. I wouldn't consider buying one that gets less than 6.


Asus tried Linux on the EEE PC. They unfortunately tried an odd version called Xandros, with some modifications and a cut down IceWM theme. I'm not sure they enjoyed supporting it. This article says they've stopped pre-loading Linux: http://www.pcworld.com/article/196987/has_asus_abandoned_net...

and this article says they're going to start pre-loading Ubuntu: http://www.theinquirer.net/inquirer/news/2075819/asus-preloa...


Good to know they know better than to write the whole thing off after making some bad decisions.


Dell have good support. They have some deal with canonical, so it seems most of their laptops work great.

Also, ASUS ones run quite well. ASUS even ship Linux on their netbooks. My girlfriend's netbook runs for ages on it. So much so that it has never run out of battery on us.

Sent from my Ubuntu Dell.


Any particular Dell models? I've heard bad things about Asus and compatibility with touch pads, etc. Not true?

(I had Thinkpads before, but am considering a change in the next month.)


I am very happy with the XPS 1330m laptop. I don't know if they still make it anymore, but I bought it with ubuntu pre-installed and it has been fantastic.


I thought a bit and then got the idea to use a web tool called "Google"...

http://www.ubuntu.com/certification

I found a couple of laptops which are locally available and suitable.


My Thinkpad T400s draws about 8 W when tuned with powertop2 (3G disabled, WLAN disabled). But as the battery only has a capacity of 43 Wh, that only comes to a bit more than 5 hours (43 Wh / 8 W ≈ 5.4 h).

I could however replace the disk drive for another battery and have about 9 hours of run time with it.


A few times a year, I download the latest version of star-office / open-office / libre-office and try it on a MSWord document from work. My most recent attempt was 2 weeks ago. I have yet to see a correctly-formatted result. In the beginning, simple things like bullets were wrong. These seem OK now, but equations are completely messed up, as are figure numbers, etc.

This stream of office suites is probably fine for creating new documents, especially if they are simple. But they still do not work adequately for any task that involves collaboration on MSWord files.

I despise MSWord as much as one can despise any software. But if that is what is demanded by a funding agency, I am quite sure I am not going to use anything else.


That says (and continues to say) much more about the inner brokenness of the Word "file format" than about the attempts to reverse-engineer it. That's a quoted term, as the format has little to do with a typical mark-up-version-saved-to-disk, as was standard in such modern formats as ... WordStar and WordPerfect in the late 1980s.

Granted, it doesn't help you much when you've got to interact with someone using some random version of a borken instance of MS Word.

I'm aware that MSFT have moved to an XML-based markup, and that support may have improved. I find the word-processing model (with its assumptions of a static, un-shared paper-based document) fundamentally broken. Most of my communications are in text files, Wiki documents, or on the odd occasion, Google docs or similar which allow simultaneous shared online edits.

It's also curious that nearly 30 years since its release, there's no simple reader for MS Word. Yes, Microsoft released one such. If you hit space (or any other character), say, to scroll through the document as you would a 'less' pager or PDF viewer .... a dialog appeared telling you that this was not an editor. Which you had to dismiss. Every. Fucking. Time.

And somewhere deep within Redmond, someone is still wondering why the MS Word DOC format didn't become a universal document interchange format.


Which is a pity when you think about how crappy and unfinished the Unity UI in Ubuntu 11.04 is. This is the first time in years that I've stopped using stock Ubuntu (now I'm using Gnome 3, which is almost as bad).


You can disable Unity and switch back to Gnome 2 at login time. After you enter or click your username (before you enter your password) you can select which login session you want to use down at the bottom of the login screen. If I remember correctly the default is "Ubuntu" but there is also an "Ubuntu - Classic" that sends you into Gnome 2 rather than Unity.


Correction: Ubuntu WAS ready for prime time


Unity is the incarnation of everything I hated (and worked hard to disable) in Windows 7. I rebooted and selected Ubuntu Classic as the default, and haven't looked back.


During an internship at MSR I recently used Windows 7 for real, and after I got through the first week the general experience was maybe more pleasant than Ubuntu with Unity. Even things that annoyed me at first (grouping windows by program in the task bar, for example) slowly grew on me.


Ubuntu moving to Unity and Gnome3 forced me to give Xubuntu a try; I like it ok. xfce is like a less-polished Gnome2.


Ditto; moved away from Ubuntu and found CrunchBang Linux, because of Unity.


Crunchbang isn't great. It breaks regularly; it's pretty big; and there are weird dependency problems. Compare it to distributions like Slitaz or Tiny Core, both of which really are small.


What I've done is to install a Debian base system (you can choose testing or stable), and Fluxbox / Openbox on top of it with all the applications you need. I'm happy with that.


Thanks for mentioning them, I'm going to try both of them :D


I'm happy with Lubuntu and Xubuntu variants. Maybe give those a try.


You should try out Mint Linux. It's basically a fork of Ubuntu with a much better UI (in my opinion).


Could this article be more wrong? I took the Linux challenge last month, making Ubuntu my main OS. The experience was terrible and I switched back to Windows. Linux is still almost awesome and I've retained a Linux install as a "work machine" to keep me away from my video games very nicely. But the fact that I can successfully use Linux as an "isolation chamber" betrays the fact that it's really not ready for prime time at all.

The first thing I'd say they need to fix is installation. There should be one install protocol to rule them all. Whether the program apt-gets, installs .sh, or just sits there and runs as is, it should be forced through a setup process to remove user confusion. That might be as simple as a Python script that asks where you want to copy files, but it just needs to be consistent for every program. I never have to read an installation instruction or ReadMe on Windows, why do I have to read one on Linux?


You pretty much just use the package manager for your distribution. The big distributions have pretty much every program you're going to need. I barely ever install something manually.

And for the things you do install manually, it's almost always one of two methods. If the software creator is kind enough to provide a package for your distribution, you download and double-click it. Otherwise, it's a .tgz file that you extract and (almost always) do:

    ./configure && make && make install


Yes, I'm aware and that's the problem: I have to be aware of this sort of thing even for consumer software. Contrast Windows: I download, double click and it goes. No knowledge needed.


But there is knowledge needed with Windows. You just already had that knowledge.


I switched from a Mac with OSX to Ubuntu about a year ago, and I haven't looked back. I'm a developer, and I have found that everything in the Linux world seems to be set up with me in mind.

Dependencies are easier to install, there is a larger, more knowledgeable support network (compare the Ubuntu forums to any Mac forum, it's not even close), and I have discovered how powerful the command line is as a development tool.

Also, when I run into a hitch during development, Google takes my operating system into account. Googling answers to development questions provides me much more relevant results on Ubuntu than it ever did on the Mac.

Plus, being able to develop web apps on the same or a similar operating system to the environment it will be deployed on is incredibly convenient.

I will say that most of my leisure time now goes to developing personal projects, rather than to gaming or other multimedia - things I often did on my Mac. But I think that's a good thing. When I do invoicing or I need to work with clients' data on spreadsheets, OpenOffice and LibreOffice, respectively, have served me just fine.

If you live in a multi-computer household like I do, I think it makes sense to have one machine that dual boots the latest Windows and Ubuntu to be able to take advantage of games and other multimedia. However, for the PCs that are only being used for day-to-day things like email, web surfing, and office tools, then Ubuntu is the best value there is.


>Also, when I run into a hitch during development, Google takes my operating system into account. Googling answers to development questions provides me much more relevant results on Ubuntu than it ever did on the Mac.

This isn't said enough. Type a coding question into google and 9 out of 10 hits seem to be good tutorials involving the linux terminal.


The point about it running well on old hardware is well taken. Installing a new OS onto an old box very often results in an apparent speed-up. I was amazed at how fast my old pc was when I upgraded to a new pc, and re-installed XP from scratch on the old one. It was amazing. Unfortunately we then started loading software on it, doing all the normal day-to-day stuff on it, and it's since slowed down again.

Unfortunately installing Ubuntu on _new_ hardware is a different thing altogether. There you're far more likely to encounter problems as you have newer hardware (especially graphics hardware). We got two new "desktop" machines a year ago to use as file servers. Nice Intel motherboards, integrated high-end graphics and so on. A week of trying though (and we tried hard) and we simply could not make the graphics work right. And don't get me started on the RAID support. In the end we started again with Fedora and were up and running in a day. Fedora was more conservative with window transitions and so on, but I could get it to run for more than an hour, and VNC worked.

We use Linux a lot for servers, but I'm in no rush to roll it out for workstations. We'll keep trying every couple years, but it seems like 2 steps forward, one step back sometimes.


> Unfortunately installing Ubuntu on _new_ hardware is a different thing altogether.

This. I made the unfortunate mistake of trying to install Ubuntu on one of the new Toshiba Porteges. I was not expecting such terrible device support (the built-in Intel graphics would freeze or kernel panic in full-screen; plugging into or unplugging from the HDMI port would cause X to run up to 100% CPU and become unresponsive to input; the screen dimmer would seemingly randomly dim and brighten while I was working; etc.).


Unfortunately laptop support somewhat lags behind in Linux, mostly due to bizarre hardware idiosyncrasies. Toshiba and Sony are particularly despised for their botched BIOS and ACPI, strange hardware management, and complete lack of support for alternate OSes. Most of the time Asus, Dell and Lenovo laptops work, though.

However, it's unfortunately good advice to always check with a live CD that your distro actually supports the laptop you're about to buy decently.


"6. Linux makes older hardware sizzle."

No it doesn't. It might seem so because Windows "gets dirty" with time if not properly maintained (running a defragmentation, removing unused programs and resident processes, deleting temporary files, ...: things most people don't know/care about). So to Linux's credit this should be "Linux does not get as dirty as Windows".

Even so-branded "minimalist" distributions make things like a Pentium III seem slow as hell. X itself is simply slow. On a netbook purchased in 2009, the difference between running Win XP and running X is actually visible (greater delays when switching windows, for instance).

"8. Ubuntu is totally non-geek friendly"

> When you have used it for a week you might say so. But changing even minor things to one's liking quickly becomes a pain in the ass and degenerates into wizard-level hackery. On other OSes you can often rely on installing a freeware app and having it work out of the box. But the hundreds of ways to do the same thing on Linux + packaging issues and dependency soup make it hell there.

"10. Security is a nice warm fluffy penguin feeling"

> Greetings from myth planet ! There's no reason things are inherently more secure on linux than on Windows, for instance. For both, the major problem is security breaches in applications.


I find the UX in Ubuntu much better than Windows and OSX (haven't tried Ubuntu 11), mainly because of how fast and instantaneous everything is. Every time I try a Windows machine I am appalled at how unresponsive it is and how often programs freeze. OSX is better but I still find it has usability issues. I think if I'd become a regular user of OSX and learned the keyboard shortcuts I would be a satisfied customer. However, for a casual user like me (pretty much just for Xcode), I find myself often annoyed at my Mac. I curse at OSX every time I use iTunes, every time I plug in my iPod and the photo application locks up my Mac for seconds even though I've never used that application, when windows pile up everywhere and there is no task bar to quickly switch between them, only slow multi-step processes with annoying animations.

I'm also surprised at how many OSX features are a poor implementation of old Unix desktop features. For example, I'm used to having multiple desktops on Linux. I've been using this since the mid 90s. Gnome has it, KDE has it, I've used it on CDE, the Sun interface. OSX has Spaces but it is horrible, mainly because there isn't a taskbar icon that allows you to switch 'space' with one click. Worse even, the Spaces dock icon is a tease that looks like a proper desktop switcher but doesn't work that way.

In terms of hardware support (and to a lesser degree software support), I admit Linux is behind. But this is a chicken-and-egg problem. If more people used Linux, hardware manufacturers would make sure their drivers worked with Linux. I have zero interest in doing manual configuration, so in the past ten years, buying a computer has consisted of bringing a live Ubuntu disk to Best Buy and popping it in to make sure everything on the computer works out of the box. By using this method I never have to fiddle with anything to get Ubuntu to work. It's very easy to do, and my computers end up costing me half the price of Macs by allowing me the choice of a smaller 1.5-hour battery (I'm almost always plugged in anyways) and a lower quality display (so the contrast is not as good, big deal). My current laptop is a one-year-old (Core i3, 4GB RAM, 500GB HD) that cost me $600 (useless Windows licence included!) + $100 for a two-year extended warranty, and Linux makes it feel much faster than any other computer I use. Chrome on Ubuntu is a crazy fast web experience.

I've been hesitant to suggest Ubuntu to non technical users mainly because I don't want to be responsible for providing technical support but I do believe Ubuntu is probably ready for it. Maybe I should suggest it more given that they call me for problems with their win7 or osx anyways and I often find myself thinking: this wouldn't happen on Ubuntu.


I've often been told that Linux will work well on old hardware. I can confirm this is true with distros like ArchLinux but I've never had Ubuntu run well on old hardware.

A year ago I got a free PC. A P2 with 128MB of RAM. Attempted to install and run Ubuntu. Incredibly slow. I also attempted a windows XP install and I was amazed to see that it ran just fine. Heck, I could even fire up VLC and watch a video. Yes, it was slow but it was certainly more usable than default Ubuntu.

Now my latest experience with Ubuntu comes from a new purchase of a netbook. Samsung N150. It was cheap, it has 1GB RAM and an Intel Atom 1.66Ghz. I decided to do a dual install of Windows 7 Starter and Ubuntu 11.04. Ubuntu just chokes. Windows 7 fares better. Both are slow but windows is not unbearably slow.

I've tried many distros and WMs and so far stock Ubuntu has provided me with the simplest install but the worst performance.

Anyway, just my experiences.


>A year ago I got a free PC. A P2 with 128MB of RAM. Attempted to install and run Ubuntu. Incredibly slow. I also attempted a windows XP install //

It doesn't sound like you're comparing like with like.

XP was released in 2001. Last year's Ubuntu was released last year, in 2010, and was targeted mainly at hardware that isn't 10 years old. You could at least try Xubuntu or something that was made for old hardware. The P2 came out in 1997, so XP working on it is not that great a surprise.

Try Win7 on that old PC and get back to us how well it runs ...?

FWIW I run Kubuntu 11.04 on an Athlon 1.1 with 768MB RAM - it's slow but works without any real issues. I've tried DSL and Puppy on a pendrive and they both pretty much blaze on this hardware.


There is a version of Linux that's ready for primetime. It's called Android. The desktop OS ship has sailed.


I actually agree with the article (having installed Ubuntu and Windows countless times over the past few months made me appreciate how much nicer Ubuntu is), but I take issue with one point:

"Windows is expensive and bad value for money."

No, it isn't expensive. You deliberately chose the version of Windows with extra features that mean nothing to the average user. A fairer comparison would be with W7 Home Premium, which at the linked site is £100. And that's only if you're upgrading; most people just leave whatever Windows version was preloaded on their system, and there's rarely any reason to actually upgrade (since W2000, anyway). It's essentially free.


I just returned to linux after a ~7 year hiatus and have found it to be a night and day experience. My xp install had gotten so bad that I was forced to escape and decided to try ubuntu on a whim. I'm so glad I did because the OS is really snappy and I'm able to do everything I was doing on xp and I'm not subject to random crashes. The reason I think ubuntu is ready for the masses is because the install process, which is the first obstacle to adoption, is mindblowingly fast and smooth compared to what it was like 7 years ago when I was using debian 2.x. Two major pain points that are now fixed (at least for me):

- The Wubi installer lets you effortlessly install a dual-boot Win XP / Ubuntu machine in about 45 minutes. Previously I would have to use QParted or Acronis to carve out a new partition and then run a Linux installer, which for me wasn't that bad, but would be daunting for a newbie. With Wubi, you set everything up inside of WinXP and it reboots your computer, installs the distro, and you instantly have Linux on a dual boot. Anyone, no matter how little computer knowledge they have, can give Ubuntu a try because of this.

- Driver support is much better than it was. It used to be a major pain to get important hardware such as wireless and video adapters working. When I installed this time, everything worked out of the box. This is huge for newbs.

After my latest ubuntu install, I'm going to try to fully transition my personal computer to linux. I'm also comfortable recommending it to those less computer savvy than me.


This is an awful post. Starts with saying how Windows XP crashed and continues with a bunch of anecdotes.

Here's another anecdote. I have yet to use Ubuntu on a machine where everything just worked. I love Linux and use it every day. It's not ready for prime time but it is, as it has been for almost 20 years, a reasonable alternative for those willing to learn.


I actually remember Windows ME as a better OS than Ubuntu 11.04. It's bad and not ready for primetime and it will take years before it's ready.

The last 6 months, I was first forced to switch from Gnome2 to Unity because of Ubuntu 11.04, and that made me ditch Ubuntu. I had heard so many bad things about Gnome3 that I stayed away from ArchLinux and moved to Sabayon, which still had Gnome2. One month ago Sabayon did an update to Gnome3, and after testing it for some time I decided to try Xfce (after the Torvalds post). I have spent so many hours tweaking and finding the source behind bugs that it nearly makes me cry.

There are so many small glitches and the only reason I stick with Linux on desktop is because it's free and it brings me freedom. That's the reason behind acceptance of all the glitches and mediocre user experience.


My parents (mid-50s non-techies) run Ubuntu and love it. I installed it at first, but they've been able to do the rest, including installing the updates, new printers, Chrome and similar things. I don't see how they can be that much of an exception to the average Joe. Just an anecdote...


I thought this was an old article, from a few years ago. Seriously, Ubuntu has been a decent OS for developers and also newbies for a long time. (I am prejudiced - I downloaded Slackware using a 2400 baud modem (in 1992?), and Linux has been core to my business since then.)

That said, Windows 7 and OS X are probably better for the consumer market. Non-techie friends just bought a Windows 7 all in one computer (like an iMac) and it is amazing what they got for $600 (including a touch screen!!), and it all just works for them. I tried to give them a Mac last month and they promptly returned it to me because of the learning curve for switching to OS X, and the switch to Linux would be even more difficult.


Could be ready for primetime, but Ubuntu hugely suffers from a lack of QA polish. Maverick was working well for me after lots of updates, but the upgrade to Natty was a disaster ... a broken window manager which had to be replaced. Volume doesn't stick after reboot, etc, etc.

Oh, and they love to remove working functionality and don't bother to replace it. The services control panel, gone, no replacement. Sessions gone in Natty, I now have to set up my terminal the way I like it every single time I log in, etc, etc. Will write a script I guess. Constant churn, see pulseaudio and unity.

I'm eager to hear recommendations for a more stable (fewer regressions) distro that keeps packages up to date.


I've been using a Linux as my desktop for around a decade. It's great for programming/programmers and sysadmin types. It runs the apps I need: perl, sh, apache, php, vim, and stuff from the various software repositories.

When I need to interact with data from coworkers at my non-techie organization, I use Windows. Why? Because the people there barely know how to use MS Office, much less something more complex. We use Linux on servers, but not desktops. Not yet at least.

When we need to deal with video, we send it out. We can do some in-house, but it's time consuming and kind of expensive. We use Sony Vegas on Windows (which is fine), but the real pros use Macs. So we have some Mac drives around.

The same goes for printing jobs. We do most of the in-house stuff in Office, and I've been pushing Publisher (which is a POS, but is available). For dealing with the outside world, though, I need to use Adobe Illustrator, because that's the standard. (At home I use Inkscape.) At the printers, they use Macs and PCs, but do most of the real work on Macs. The UI is just better.

The next platform is going to be the iPad. It's kind of a POS for non-managerial work, but some people just use email and don't really do computer-based work. They're already using phones, or just don't do the computer thing too much.

The Linux desktop is not going to displace Windows. Windows' edge is not the OS but the ISV (independent software vendor) environment. It's the most mature low-end ISV market. The Mac's was decimated with the demise of the old OS, and they're trying to rebuild it, but it's not doing well. The iTunes app store is potentially a challenger. The Android market is not a challenger. Linux's exists only for upper-end enterprise software.

The way Linux could displace Windows is by having a single dominant platform, with an app store for it that is successful enough to fund a software ecosystem and pay the programmers. Then people would choose Linux based on the apps.

At least that's the business model that worked for the Apple II / VisiCalc, the IBM PC / Lotus 1-2-3 / dBase II, the Mac / Aldus PageMaker, Sun / SunOS and later Java, and Linux as a datacenter node.


How timely. I just spent 2 hours re-installing grub2 after Windows wiped it out (I knew it was going to happen, but I hoped it wouldn't be any trouble to restore). As I use LVM, I had to copy the /dev/dm-* devices to /dev/mapper/<lv>-<vg> in order to install grub (figuring that out is what took 1h50min). Then I had to boot into Ubuntu and update the grub config again to pick up Windows.

That said, Ubuntu has improved a lot and despite the grub issue, I still love Ubuntu. I have converted both my parents to Ubuntu and I haven't had a support call from them since the switch.


That's not Ubuntu's fault. And I doubt it would be any easier to fix your boot loader in Windows.


Actually, it's pretty simple. Boot a DOS disk. Run 'fdisk /mbr'.

Mind: my preference would be booting FreeDOS, not some borken Microsoft crap. But all the same.

My experience is that fixing Linux GRUB issues is also pretty straightforward. Helps to understand chroots and stuff, but still: boot a bootable distro (Ubuntu or Knoppix on a USB stick rawks), chroot into your installed root FS, mount /boot if necessary, and run 'grub-install' or 'update-grub'.

Mostly just works.
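
Roughly, the recovery looks like this from a live session (just a sketch; the device names /dev/sda, /dev/sda1 and /dev/sda2 are assumptions, so check yours with 'sudo fdisk -l' first):

    sudo mount /dev/sda2 /mnt                 # installed root filesystem (device is an assumption)
    sudo mount /dev/sda1 /mnt/boot            # separate /boot partition, if you have one
    for d in dev proc sys; do sudo mount --bind /$d /mnt/$d; done
    sudo chroot /mnt grub-install /dev/sda    # put grub back in the MBR
    sudo chroot /mnt update-grub              # regenerate the config (re-detects Windows too)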


I know, I just happen to be using Ubuntu. However, 99% of users will blame Ubuntu. (I think the issue is already fixed in newer versions of lvm2 (or grub2, I can't remember which they patched.))

Am I missing something with Windows? Can it detect non-Windows partitions? How can I fix it?


As an Ubuntu user I think this is just wishful thinking. My estimation is that if Ubuntu has enough stamina and keeps accelerating over the coming years, then maybe in 10-15 years it will come much closer to where Windows is today.


I don't think so. The user interface still looks like a dog's breakfast. When was the last time you used Linux without opening a terminal session? Installing new software can be incredibly tedious and error-prone. Drivers may not work or be installed and need to be hunted down (the author raves about how everything just worked out of the box, except of course that HP printer driver, which he had to find - not easy - and install). Windows 7 and MacOS X are ready for prime time. Linux is still an OS for hackers.


I disagree that "installing new software can be incredibly tedious and error-prone." With centralised package management I much prefer installing something on Linux to installing it on Windows: all it takes is "pacman -S" whatever (or an alias to something shorter), and it also makes searching for software easy. On Windows I have to go hunting for it on the net.
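
For example, on an Arch-style system (the package name is just an illustration):

    pacman -Ss inkscape        # search the repositories (package name is illustrative)
    sudo pacman -S inkscape    # install it, dependencies included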


I was introduced to Ubuntu in the fall of 2005. I remember getting the free disc and struggling to get Ubuntu installed. The thing that sticks with me, though, is that the reason I was trying to install Ubuntu in the first place was that it was already said to be ready for primetime.

Ubuntu walks the edge between good enough for the casual computer user and good enough for the power user. But in reality the Facebooking/photo-sharing computer user would rather not do it on Linux.


Ubuntu is great; I've had it up and running on my laptop for about two years. But I have WinXP too, just because I need Fireworks (I absolutely can't get along with PS or Gimp), and Wine is okay, but it makes applications so slow...

That is really the only damned reason I still have to switch between Windows and Linux on my laptop. If Adobe's products ran flawlessly on Linux, I'd never touch Windows again.


I used to be like that, until I tried Inkscape in Ubuntu.

Get past the learning curve and you will never need Fireworks again.


Hm, yeah, I recently discovered Xara Xtreme, which seems quite similar to Fireworks from my point of view. I once tried Inkscape, but it grossed me out just like Photoshop did. Maybe I'll give it another try eventually.


Have you tried renice'ing the Fireworks process while it's running on Wine? I got Counter-Strike playable at about 30fps that way on a fairly weak machine (Core 2 Duo w/ 2GB RAM). Not bad, really.
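
Something along these lines should do it (a sketch; the process name Fireworks.exe is an assumption, and negative nice values need root):

    pid=$(pgrep -f Fireworks.exe)   # PID of the app under Wine (process name is an assumption)
    sudo renice -n -5 -p "$pid"     # lower nice value = higher scheduling priority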


It's also that several features don't quite work under Linux. But I'll try that.


The logic here is that Ubuntu is "ready for primetime" because of what happened with another OS? For a single user?

This type of commentary just creates headwinds for Ubuntu and other Linux OSes actually gaining desktop traction.


If you develop software that runs on anything that's neither an Apple nor a Windows machine, Ubuntu has been ready for prime time since 2006 or so. It all depends on what you use your computer for.


Ubuntu is no replacement for Windows. It can be a potential alternative, but it is in no way some sort of drop-in replacement.

This particular article reeks of someone who doesn't actually need to get work done: "Free is unbeatable!" and "Windows is a poor value for the price!" are uselessly ambiguous and all-inclusive statements.

If you are a PC gamer, big into video editing, or a professional image editor using the Adobe suite, Linux is not an option for you. Please ignore all the people who tell you WINE "totally works great" and has "no problems!" and is "rock solid"... I would recommend jamming glass in your ears if you see someone talking like that.

You are certainly welcome to try and take them up on that, but you'll decide in under a month that all the little gotchas you only find once you dive deep will drive you crazy.

I understand the intent of these articles ("The year of Linux on the desktop!"), but people have been writing them since 1998 and it has never been true. All that has happened is that each OS has grown up to fill its particular niche; if your use cases fall in that niche, then you are all set... otherwise the move will be a complete waste of time for you.

The only reason Linux has been able to make more headway (IMO) is that a good majority of our lives has moved online to web-based apps, so the need for Linux to reach parity with the software on other platforms has lessened.

There is no amazing new office suite for Linux; it's still OpenOffice and always will be. I don't know if KOffice will ever hit critical mass, or whether the individual GNOME-based efforts will grow beyond clean v1 implementations.

There is no fantastic PIM software that finally replaced Outlook.

There is no awesome, stable, robust video player, just 75 different shitty GUI interfaces to mplayer's codecs that work for the most part, then core dump sometimes, but hey, that's easily fixed by recompiling with the --bullshit flag turned on.

It just goes on and on... Of course your mileage will vary depending on what kind of user you are: the more intense/hardcore you are, the more warts you'll notice and try to fix or work around (key bindings, mouse button support, audio card, hardware acceleration, optimal power-saving management, etc.), and after a month of pissing half your day away on some broken configuration for the 10th time, you'll go back to some other OS.

If you're a lighter or more online-centric user, you'll probably be just fine.

That is my own personal opinion after 13 years of trying to finally adopt Linux as my primary desktop OS, even giving up games to make it happen, and finally giving up with Ubuntu 11.04 after realizing I was always going to face the same magnitude of problems... no matter what.

OK, I've stuck my neck out; everyone go ahead and comment about how their desktop is "rock solid" and how I must be brain-damaged and consistently doing something wrong for the last 13 years to always have this experience.


My 'buntu desktop environment is the most solid I've ever used, with the caveat that once a year I can expect to spend a few hours in driver hell. I think this balances out with the once-a-year BSOD hell I encounter on Windows.

I agree with you that WINE is terrible, but I have Win7 running brilliantly right now in VirtualBox. Other than that, linux has all the specialized tools I need for coding and web development, I switched over to console gaming a few years back because I was tired of the hardware upgrade treadmill, and the Microsoft productivity stack, while better than LibreOffice, isn't compelling enough to tie me to that platform. Also, I'm not sure what problems you had with video players; VLC has served me well on Windows and linux.

I guess the real killer app for me is sudo apt-get install [some incredibly useful open source library I really need right now, dammit]. The open source infrastructure on Windows is so anemic it drives me nuts.
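
For instance (the package name here is just a stand-in for whatever you happen to need):

    apt-cache search libxml2            # find the right package (name is a stand-in)
    sudo apt-get install libxml2-dev    # headers and library in one shot, dependencies handled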


So, 2011 will be the year when Linux goes mainstream on the desktop? At last!


Tried Ubuntu 11.04 yet? Try that and write back to us. Their new Unity launcher isn't very good. Yes, there are ways around it, but that's the default...


It can't sync music to iPhones or iPod touches. My dad loves his iPod and would scream if I tried to replace it.

The Ubuntu solution? Just run XP in a VM to run iTunes!

I bought him a Mac.


This is exactly the same article that has already been written by every geek who blogs and has recently tried Linux. Nothing new or interesting. Flagged.


Year n+1 is the year of Linux on the desktop!


In other news, we've had several legitimate years-of-the-Unix-desktop courtesy of Mac OS X :)



