
Interesting. The general consensus I get when talking to people, especially devs, is that it's about Apple's hardware quality.



For me it's the software+hardware, and the killer feature is a unixy OS with great battery life and little to no fuss. Is it possible I could get similar results out of Linux on commodity hardware? I'd put it at long odds, but even so I bet I'd have to shut off features (losing convenience) and spend a lot of time tweaking it to make that happen.

It's not even just the OS. I've switched to Safari because losing 1.5-2 hours of battery life to use Chrome or Firefox isn't worth it. I'll just open Chrome when I really need it, use it for a few minutes, and close it.

There's also a "no one got fired for..." factor. OS X refuses to work with the client's projector in a meeting? Apple sucks. Arch Linux refuses to work with the client's projector? You suck.


> Is it possible I could get similar results out of Linux on commodity hardware?

I'm using a Sony VAIO Pro 13 with Ubuntu and EVERYTHING, yes everything, works perfectly. Touchpad, standby, brightness, wifi, ...

A friend of mine is running the Dell XPS 13 with Linux and also has no problems, AFAIK.


Yes, it's great quality hardware -- for a Unix laptop. That was one of the biggest reasons why a lot of developers who used to work with Unices on servers switched to PowerBooks. Back then, Linux really lagged behind in that regard, even on ThinkPads.

The next big boost was a GUI and market situation that made it possible to create some neat productivity tools (and profit from them).

The early Ruby on Rails community probably was the epitome of that setup: TextMate as a uniquely OS X editor, easy development on your laptop, then deployment to a Linux server. I think Keynote came out right at that time, too, which was really neat for a rather weirdly conference-driven community.

Then the pretty great Unibody systems came out, combining the aircraft-destroyer feel of the old ThinkPads with a sleeker design, and the first of the ultrabooks, the Air… And, of course, you basically have to use a Mac when doing iOS dev work.

But in recent years, I feel that Apple has lost it a bit. OS X is stagnating (what's the last feature that could compete with, e.g., Exposé? And still no decent first-party package manager). Third-party development/productivity software doesn't matter that much these days (it's either web apps or mobile apps; no really shiny single-platform editor anymore). Some of the Linux distros are pretty good with supported laptop hardware, so that's an option for the Unix hardcore.

And as this article shows, Windows itself is becoming more of an option even if you're not buying into the MS stack: RAM and hard drives are big enough to support virtual machines for your development environment (esp. with Docker/Vagrant/etc.) -- if you even need that, as high-speed internet is ubiquitous enough that you can do remote development throughout.

So, yeah, while I personally still think that OS X is a more developer/hacker-friendly environment than Windows, the incentives don't seem as big anymore. So if they lose their edge in hardware quality, style, and solid driver support, they'll lose a few more developers.


Nope. I switched to OS X in the early 2000s because of its Unix underpinnings, while still having a proper GUI for design apps.

Nothing beats having the ability to run Unix command-line tools.

MS really needs to add these to their base OS.


I guess it depends on which devs you're talking about. I'm mostly doing web development on a day-to-day basis; very little VM usage, and I've never really given my hardware a second thought. Maybe because I've always used a Mac? Perhaps I'd be pickier if I had to endure poor hardware.

Having said that, though, I'm much more concerned with the OS capabilities, user experience, aesthetics, toolchains, etc.



