So out of personal experience and anecdotal evidence, I would say that the conditional probability of a voluntary Windows user being a great coder is less than, say, a Linux user.
If this is true, it's only moderately true, and only recently. Historically, Windows had a pretty big advantage in terms of great coders, including virtually every game shop (with legends like Carmack, Sweeney, etc.).
That was partly just due to the fact that if you wanted to make money, you wrote Windows apps. It's like iOS is today, except an order of magnitude more dominant.
It's only in recent years, with the web and mobile, that this has begun to change. But I'd say that only mobile has typically had super strong devs. Until just the past year or so, the web has not been a place where you could hire generalists. We've really only just begun to see web devs that I'd hire to do any work at our company -- not just web work.
As a counter example, Carmack wrote Doom on a NeXT workstation[1].
These days OSX seems to be pretty hot for game dev, certainly if you want to do any iOS stuff, and it's creeping up as a gaming platform too, with Steam now available there. No doubt Windows will be king for a while yet, though.
Yes, Carmack did like NeXTStep a lot. That said, for most of his career Windows was his main dev box. Even post-Doom he went back to Windows as his main desktop:
"The upside is that windows really doesn't suck nowdays. Win 95 / NT 4.0 are pretty decent systems for what they are targeted at. I currently develop mostly on NT, and Quake 2 will almost certainly be delivered on win32 first."
http://rmitz.org/carmack.on.operating.systems.html
And more importantly, as it relates to DHH's statement: excluding Windows developers isn't excluding some odd niche; it's excluding what is probably a healthy percentage of the best developers in the world. Fortunately for DHH he's at a webdev shop where pure coding skill is probably less important than cultural fit.
Fortunately for DHH he's at a webdev shop where pure coding skill is probably less important than cultural fit.
I would argue that at most programming jobs, coding skill is less important than cultural fit, once you've passed the level of "reasonably competent developer."
Games seem to be a Windows thing. If you play games, you must use Windows; if you make games, you must write them for Windows, because your audience is on Windows. There are exceptions, like Minecraft running on Java (though Notch seems to be a Windows user anyway).
Games are a big chicken-and-egg problem for Linux and other-OS adoption. A big part of the issue is Direct3D vs. OpenGL: while D3D is clearly the superior API, OpenGL is the only choice if you want to support non-Windows OSes.
So if there were more games (big titles, that is) for Linux, there would be more gamers using Linux, and subsequently more games would be written for Linux. But since there is no initial group of customers big enough to attract publishers' attention, the situation is, unfortunately, not going to change.