
This is silly. Although I'm sure it's gotten much better since I last regularly used Linux on the desktop, you always had to do some amount of due diligence about which hardware worked well with it and which didn't.

For example, because I use OS X, I can go to any store, purchase my choice of Lexmark printer, and be on my merry way with a minimal amount of checking. The fact that I have to know that there's a "reference platform" is the problem. I don't want to have to think about that crap. I don't have to with my Mac. I want to save my mental cycles for important things, not struggling with which printer to purchase.

Or I can put it in simpler terms. There's a concrete social cost to using Linux on the Desktop. Let's say I want a printer for Christmas. If I'm running OS X, the amount of information I need to give my family in order for them to buy me one that works is orders of magnitude smaller than if I'm running Linux.

In fact, I'd sound like some sort of bizarro printer snob when making my request. "If you're looking at Lexmark printers, these models work, these ones don't. HP is good, but not models like X, Y, and Z."

Now, like I said, it's been several years since I regularly used Linux on the desktop. I'm sure it's better. But the problem is not that there aren't known-good Linux configurations. It's that (1) those configurations are essentially arbitrary and therefore not obvious and (2) they require time and energy to learn (memorize, really).

If it's changed, that's great, but that's also part of the problem. Now I not only have to keep a set of known-good configurations in my head but a rolling changelog, too, to make sure my knowledge is up-to-date.

I switched from Linux to OS X precisely because I decided my mental energy was better spent on other things than remembering, "Oh yeah, the Dell Lenovo X876123 mark 10 is a known-good Linux laptop."

I'll add that I'm sure in aggregate Linux supports a much wider range of hardware than OS X. That's great! That's why Linux is kicking ass in all sorts of non-desktop markets like mobile devices, embedded systems, servers, etc. But there is an inherent tension in an OS that's optimizing for that vs. constraining itself to the desktop and the desktop only.




I generally agree with your main point (that knowing about reference hardware is burdensome).

However, your specific example (printers) is outdated. Both Linux and OSX use CUPS, which was acquired by Apple in 2007. Linux printer support is pretty spectacular these days.

http://en.wikipedia.org/wiki/CUPS#History


CUPS drivers are only used on OS X when there aren’t any official drivers made available by the vendor. That’s mostly the case for very old printers (pre-OS X or non-USB). With CUPS drivers, you can get a printer to work, but the experience is nowhere near as good as using official drivers. So no, printer support on OS X and Linux is not equal.


Sure. I've used Linux on the desktop since then. Even with printers it was still hit or miss. CUPS just acted as a catch-all, lowest-common-denominator of printing.

I'm pretty familiar with CUPS, too. I helped build a school-wide printing system at my university based off CUPS as part of my job at the time. It had to interface with our internal authentication system, debit student printing credits, and interface with dozens of types of printers across campus.


Spectacular, but not reference. General printing is OK for almost any printer, but support for advanced features is limited (in my experience).


For all your nonsense, I've had far less trouble getting devices to work with Linux than any other OS over the last few years. Device support on Linux just works more often than it does for Windows or Mac OS these days. And I'm including laptops.


I think we're within the margin of error here. All devices I've ever bought for Mac "just work." Call it anecdotal, but that's the case.

Good for Linux if it has also reached that point, but don't call OSX worse when it isn't.


This comment was probably more relevant 10 years ago.


You have a valid point, but there's another side to it. The hardware that works great is the hardware that developers tend to use. The worst low-end Lexmarks, which are barely good enough for one cartridge cycle, are not something any computer enthusiast/programmer would spend time on making work.

Since I switched to Linux on my laptop 100%, sometime around 2003, I've always tried to pick hardware that is professional grade, not home/consumer grade. It's a bit more pricey to start with, but most of the things have lasted since then as well.


Sure. But still, let's imagine I did professional work that required professional hardware. Further, let's imagine that Linux supported my current setup perfectly.

My setup won't be my setup forever. Maybe some new hardware comes out and I want to experiment with it. Will it work with Linux? Will the person making that hardware have Linux in mind when they're building it? If my current hardware breaks, will it be easy for me to find suitable replacement quickly? As in same-day quickly?

Is memorizing the list of "professional-grade" hardware that works with Linux -- or perhaps just the list of exceptions if that's the smaller list -- worth my time as a professional?

For you, the answer to all these questions might be "yes" or even "yes, given how much I care about using Linux for other reasons." That's great.

However, the mere fact that one has to consider these questions increases the cost of adoption. It shouldn't be a surprise, therefore, that fewer people use the platform as a result, or that people point out this frustrates them. Maybe they even use tricky phrases like "hardware support", which can mean many things to many people, to describe their frustration.

If you want to think about it algorithmically, maybe try this picture. You have two points A and B on a map with varied terrain and a path-finding algorithm. You have N seconds to get from A to B, including the amount of time it takes to find a path.

You won't be able to take the fastest path if it takes longer than N seconds to find that path -- by that point you'll be out of time. Instead, the naïve solution would be to take the first path P where the time spent searching for P plus the time to traverse P is <= N, assuming you can find one. Maybe you modify this meta-algorithm slightly so that once you find a P, if the time spent so far plus the time to traverse P is some M < N, you spend another N - M seconds looking for a second, possibly-shorter path.
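
To make that concrete, here's a rough Python sketch of the meta-algorithm I'm describing (candidate_paths, traversal_time, and choose_path are made-up names for illustration; assume the iterator yields paths in whatever order the search discovers them):

  import time

  def choose_path(candidate_paths, budget_n):
      # candidate_paths: iterator yielding objects with a .traversal_time
      # attribute, in seconds (a hypothetical interface for illustration).
      # budget_n: total seconds allowed for searching plus traversal.
      start = time.monotonic()
      best = None
      for path in candidate_paths:
          elapsed = time.monotonic() - start
          # If committing to `best` right now is all the remaining
          # budget allows, stop searching and take it.
          if best is not None and elapsed + best.traversal_time >= budget_n:
              break
          # Take the first feasible path, then spend the leftover
          # N - M seconds looking for a shorter one.
          if elapsed + path.traversal_time <= budget_n:
              if best is None or path.traversal_time < best.traversal_time:
                  best = path
      return best  # None means no feasible path was found in time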

Those "N - M" seconds are really important in consumer psychology. If you've found multiple options and weighed them you'll feel much more comfortable in your decision. That comfort translates into confidence in the underlying platform.

On Linux the "time searching for P" is much larger than on OS X, but the original article is pretending as if it's only the existence of a fast-enough path that matters. It's not, especially if the time to find the right path dwarfs the amount of time you have to spend on the problem, or leaves you unable to weigh as many options as you would on another platform.

For me, personally, as someone who values his own attention very, very highly, my threshold for "time spent searching for P" is incredibly low.


  > Is memorizing the list of "professional-grade"
  > hardware that works with Linux -- or perhaps just
  > the list of exceptions if that's the smaller list
  > -- worth my time as a professional?
Why the focus on 'memorizing?' When you research the best appliance to buy, do you complain that you had to memorize the specs of (e.g.) different refrigerators? Or do you just do research/comparisons, buy the thing, and promptly drop all of that from memory?


Hmm, that's not quite what I meant. I know what I care about in a refrigerator and I know I'll be able to buy one with minimal hassle once I decide I want one. I'm free to research online and I'm relatively confident someone at the store will be able to answer my questions when I get there. I also have lots of friends (and family) who have purchased and installed refrigerators, so I can ask them for input if I want.

My dealings with hardware + Linux have not been like that at all. Instead, what works and what doesn't is essentially arbitrary. Some brands are generally more compatible, but there are always some make/model combinations you want to avoid. And even if it "mostly works", there's a good chance some of the less common features will work poorly, at best. When I say "memorize" I mean internalizing this essentially arbitrary list. The HP 450C is ok, but don't get the HP 451C! That sort of thing.

To top it off, if I go to the store and ask questions about Linux compatibility -- perhaps there's some information I couldn't find online, say -- I'm not at all confident I'll be able to find someone to answer my questions. And even if I did, I'd wonder whether what they knew was up-to-date or not.


There was a time when "time spent searching for P" was much greater for Apple hardware than Wintel machines.


To be fair, by that argument it's even better to use Windows.


Windows 8 seems to have magical printer support. My printer was just there and I could print... nothing required. It seemed to install the drivers as soon as it saw the printer on the network. (Apparently this isn't so nice if the computer reboots while trying to install drivers magically.)


Only if that's the only dimension you care about. It's one dimension and it's an important dimension (for me). More important than that (for me) is "it should be Unix-like and I should have easy access to a typical Unix toolchain."

My experience is that for the last 5 years OS X and Windows are in roughly the same ballpark in terms of how much I have to think about non-software issues, although Windows is better in that dimension, as you say. Linux is not really in the same ballpark, or at best it's way out in the nosebleed section.


Printing? In 2013? This has been the least relevant part of hardware support for me for about a decade, on any OS. And that goes for pretty much all of my friends too; no one owns printers any more these days. A decade, more or less, is how long I personally haven't needed a printer and hence haven't owned one. So while I can't counter your claims, I couldn't care less. And all the places I worked at in the last decade, in VFX, ran their printers on CUPS queues on Linux servers. They always worked, on the utterly rare occasions that I still needed to kill trees to relay information to some entity stuck in the past. ;)

How about an example just as preposterous: I salvaged a 20+ year old 9-pin printer (Star NL-10) from a stash of hard rubbish two months ago. I got it to work w/o problems on Linux (trivia: I used turpentine on the ribbon cassette to get the ink 'flowing' again). I still can't get the thing to do anything sensible when I attach it to my Retina MacBook Pro running the latest and greatest OS X. :P


Printers are still essential for nearly everyone I know: recipes (for those without an iPad), tickets to concerts/theatres/movies, flight boarding passes (these will fade as smartphones become more applicable), legal documents that require signatures, and photos!


It's not particularly bizarre to be a printer snob. If I were to buy a printer today, even without considering drivers I'd probably narrow it down to 2 or 3 options.

Either way, I've never had any issues with any printers I've thrown at Linux. There are issues with more obscure devices, but equally in the world of SoCs and generic mass storage/input devices that we live in I have far fewer expansion cards and peripherals than ever before, and those that I have use pretty standard protocols for communication.



