Herd Mentality (daringfireball.net)
108 points by atestu on Oct 23, 2009 | hide | past | favorite | 80 comments



Gruber doesn't quite go all the way here, but it's important to note that no company can operate in a vacuum. However, what's important, vitally important, is that you don't operate such that one of your key suppliers is a monopoly! Just look at Apple's suppliers: Intel? They better keep up with AMD or else Apple will do to them what they did to IBM. Flash memory providers? Apple plays so many of them off each other that you can tell when a new Apple product is about to launch just by watching the market price of Flash. Graphics? Apple's been bouncing between nVidia and ATI like nobody's business...

...so really, it's not so much about hardware vs software vs hardware+software. It's that the smart businessperson doesn't knowingly make a deal with a monopoly.


It's that the smart businessperson doesn't knowingly make a deal with a monopoly.

I see what you're saying, but hasn't nearly every Apple customer effectively done that? A datacentre could maybe rip out Xserves and replace them with equivalent 1U boxes running some generic Unix apps (e.g. Tomcat) but anyone who has developed on the Mac platform is beholden to Apple.

Not that this is necessarily a bad thing, and not that there is an economic way to stay completely platform agnostic, but it does contradict what you say.


It doesn't contradict it. He talks about Apple; you talk about Apple's customers. From the company's position, it's others' monopolies that are bad for you, not necessarily your own monopolistic advantages.


Yes, and as for developers, they're developing for a market (i.e. Apple isn't supplying materials, just opportunity). It's sort of like how Detroit was developing for the petroleum market. The idea there, though, is that in developing for a market, you should have general domain abilities that can span markets. In other words, if you're an Apple developer, you depend on the Apple market to sell your product, but you should be able to take your development ability and develop for Windows or Linux should the Apple market collapse.

Obviously, it's not guaranteed that you'll be that smart. Detroit should have been smart enough to take their expertise in developing for the petroleum market and transfer it to success in the alternative energy market. But it's still not as bad as having a key component of your product be supplied by a monopoly (just ask your local jeweler).


I think a lot of people take Windows for granted. It's amazing that we have one operating system that can run smoothly on the most expensive and advanced PC hardware and on a netbook that costs less than a really good dinner out. Hardware manufacturers can put out their own drivers as easily as anyone can put out software. And the same software runs on PC models from dozens of manufacturers and on whatever Frankenstein-PC you can build yourself. Developers can target one API for all that hardware.

The author doesn't seem to think much of that.

Apple is an exception, an outlier, and a niche.


operating systems are a natural monopoly, due to network effects. it's no surprise that one of them 'won' and captured most of the market. that doesn't mean it's good for the industry, though.

microsoft has demonstrated time and time again that, with windows, they value backward compatibility ahead of almost everything else. it still bears the mark of design decisions made in the eighties. that makes it pretty difficult to innovate in the windows space, even if microsoft wanted to, and they don't.

apple's macintosh is a viable competitor, but they're after a pretty clearly-defined market as well. they've got millions of customers they can't afford to upset.

three primary desktop operating systems is not enough. it retards the spread of new ideas. we are taking too long to get to whatever comes after the desktop gui metaphor, because any company that created such a thing today doesn't stand much of a chance of being able to sell it.


Microsoft values backwards compatibility ahead of everything else because that's what both users and developers want. Because Apple has never been in the business market, there is very little custom software written for it. They can toss out backwards compatibility every few years and someone will write a new zip utility or port a web browser. If Microsoft broke backwards compatibility, zip utilities and browsers would be ported too. But companies that have invested literally millions of dollars in their software don't have the time or money to do that.

If Apple were ever to capture (and keep) more than 10% of the market, you'd see that they too would slave to maintain backwards compatibility. Their niche status affords them the ability to make the sorts of changes you want.

I don't understand this intense desire for speedy change -- do we really need to replace the desktop metaphor as quickly as possible? Doesn't that just hurt, rather than help, adoption and understanding of computers? Every time the GUI changes, half the planet has to re-learn how to use their computers. If we had more than 3 primary desktop operating systems, all implementing new ideas, the learning curve for users would be impossibly high.

The fact that one operating system won is good for the industry.


yeah, i don't believe any of that. and i'm surprised to see so much of this attitude here, of all places.

you could have used all those same arguments to claim that guis are bad, let's just stick with ms-dos. back in the eighties, i remember many people making exactly that argument. heck, in the back of my mind, i was sort of thinking that myself! i had a great deal invested in the status quo. i'd spent years learning 8086 assembler and all the tricks necessary to write a dos tsr that wouldn't bring down the operating system. it was an ugly, stupid, convoluted system, but i was a master of it, and so i wanted things to remain the same.

apple is never going to stop throwing out old stuff and replacing it with new stuff, no matter how much of the market they get. not so long as steve jobs is alive, anyway. that's what makes the mac a healthy, thriving ecosystem. once an idea has outlived its usefulness, it's chucked for something better. the fact that the people in charge of windows are not willing to do that means that it is just a matter of time before the entire operating system is obsolete.

i'm not saying there shouldn't be a windows. it caters to a certain crowd that i am not a part of, thankfully, and i think that holds for most of us around here. but the idea that it has to be pretty close to the only operating system is a bad thing. fortunately, microsoft doesn't wield the kind of power that it once did, so there's a chance we'll get to see some real alternatives in the next decade or so.


You might want to take off those rose-colored glasses and take a better look at Apple. The classic Mac OS existed until 2001 and still lacked memory protection -- you had to remember to save your work before opening Netscape! Then they didn't even develop their own OS, but instead purchased NeXTSTEP, which was already 12 years old at the time and circling the drain. Then, instead of throwing out the old stuff, they bolted the Carbon API onto it to allow classic Mac applications to be ported easily. So you really couldn't be more wrong about Apple.

On the other hand, the number of applications that Mac OS X must run to be accepted can be counted on the collective hands of the OS X engineers. They can almost afford to break whatever they want as long as Photoshop runs. Microsoft does not have that luxury.

The command line vs. GUI argument doesn't hold water either. Instead we're comparing the desktop metaphor with some concept that doesn't exist yet. Operating system research is always ongoing, and the really good ideas percolate up into all the current operating systems. You can't just radically change direction -- even Apple knows that. In fact, they've been removing features (like creator codes) that distinguished them from Windows and Linux, basically because being compatible with the universe is more important than being innovative.


I agree with everything you say except that "Apple is an exception, an outlier, and a niche." 10 years ago I'd have agreed with you, but I don't think you can say that anymore. Their market share continues to grow, their products continue to improve, & their customers are the most consistently fanatical I've ever seen.


You misunderstood me. They are very successful (and very profitable) but they're still an exception and a niche. If every PC manufacturer went the same route as Apple, they'd probably all fail.


I don't know where the folks responding to this statement are getting this line of reasoning: downplaying the significance of the OS by pointing to the increasing importance of the browser and the web in applications. It's a complete non sequitur.

Sure, web apps and AJAXian techniques may be putting a dent in domains formerly owned entirely by standalone desktop GUI apps.

But your browser is not going to implement your hardware device drivers, I/O subsystems, network stacks, filesystems, and, well, ${insert very long list of other ways in which the OS interacts with the fundamentals of the machine}.

To say, "Ah, well who cares, it's all about the browser now anyway" is really, really missing the point. Maybe Google Docs displaces Microsoft Office, but do you really think you're going to XMLHttpRequest your SATA driver, process scheduler, or IP routing FIB (forwarding information base) away?


The low-level aspects of an operating system are now available, for free, to whatever company wants them in the form of Linux. Both of Google's operating systems (Android and Chrome OS) are based on Linux. The Palm Pre is based on Linux. The fact is that even the device drivers, I/O subsystems, network stacks, file systems, and the X number of ways that an OS interacts with the machine are no longer significantly important.

What is important is how the OS interacts with the user. This is where Apple shines. This is also where Google is looking to shine as well. If the user experience can be provided entirely within the browser, why not?


The fact is that even the device drivers, I/O subsystems, network stacks, file systems, and the X number of ways that an OS interacts with the machine are no longer significantly important.

Because it is readily available in free form, it is no longer important, significantly or otherwise? Seriously?

Yes - it's become more commoditised. No, you can't just "replace" it with a browser. And yes, there is a qualitative distinction in degrees of hardware support and underlying engineering.


As you said, it has become commoditised. Is there really a qualitative distinction? Apple's operating system supports Apple's hardware (Mac and iPhone). Linux claims great compatibility with most PC hardware and tonnes of other random equipment out there. Windows is Windows. Do you think the average person cares what file system underpins the iPhone, or that Android and the Pre share an OS kernel?

But you are right, you can't replace an operating system with a browser. However, you can treat the OS as just another layer in the stack, in between the network card and the browser. It's even an interchangeable component -- swap out Linux or Windows or OS X and you still have something that can push bits onto the net and run a browser.


In discussing whether hardware vendors should make their own OSs, if the low level is commoditised enough you don't need to replace it with a browser, you just need to pick the one that better suits you (e.g., Apple with FreeBSD).

As far as the user experience goes, non-Windows systems are still handicapped by hardware and software compatibility issues which are decreasing in relevance, especially for a hardware vendor who can ensure that at least the hardware it ships is fully supported in their OS of choice.


I'm also confused -- we're going to rip out an entire OS only to replace it with a web browser which is, by design, unobtrusive and supposed to operate like pretty much every other web browser. If you're going to do that, what is the point of developing the OS? You'd be much better advised to just ship Ubuntu and Firefox with a custom theme and the metaphorical serial numbers filed off. (It's a bad idea, but it strictly dominates the worse idea of writing your own OS and your own browser.)


If Apple had to compete with someone else innovating on the PC, it would force them to break even further away from the herd. Macs are great, but they still resemble PARC's Alto from 36 years ago far too closely. The times have changed quite a bit. The iPhone is a much bigger leap as a personal computer (smaller leap as a phone, really).

The potential for innovation is there. It would be nice to see that happen on the PC. I hope that "someone else" shows up at some point. Better yet, I hope I can be a part of it. ;-)


"Imagine how much better the industry would be if there were more than one computer maker trying to move the state of the art forward."

It would also be a nightmare of complexity.


That was absolutely true way back when the industry was new and there were myriad system vendors with their own operating systems.

We have learned a lot since then, and we know how to address these problems: openly specified document formats and protocols. Programming languages with portable runtimes and libraries. Open source software. Web Applications.

The world is a much different place from when you felt trapped because your Visicalc files were on LDOS floppies that could only be read by a TRS-80 Model IV.


Not necessarily. Web standards could be a great foundation. The web is open and anyone is free to push the state of the art forward. Yet the result has often been more interoperability, not less.


Do you seriously believe that webapps would eventually replace desktop apps ?


Yes. Not in the way they are right now, but yes, web based applications, running off remote servers, maybe interfacing with a desktop client or something similar to that is the future.


Some desktop apps, yes; many of them have already been replaced. Others (data/processor-intensive: graphic arts, multimedia, etc.), I don't think so.


Could you give me some examples please ?


Adobe Premiere for example?


I was speaking of webapps that had actually replaced desktop apps.


Ah, in many cases webmail has replaced desktop client apps. And in my environment the official suite is Google Docs.


I think the benefits of having a diverse OS landscape are much greater than the cost of a little increased complexity. We already have languages that run on virtual machines, and we have the POSIX standard. Software portability shouldn't be a problem.


It should be a problem, otherwise the new OS would be so similar there would be little point in it existing...


A POSIX compatibility layer should be enough. You can easily run UNIX programs on Windows using Cygwin, for example.


What you want is a new and innovative OS that is also backwards/cross compatible and standards compliant?


Haiku actually does this. Haiku is not a UNIX, but there's a fairly complete POSIX layer for building UNIX programs. As far as I know, Syllable and SkyOS do something like this, too (I might be wrong, though).


What's the problem? The standards compliant layer would perhaps just not be as fast as the native metaphors.


You'd be surprised at what's possible. I'm actually working on one right now.


I agree. Ask someone who was involved with developing Borland's Turbo Pascal. When I used to help duplicate it (think digital Kinko's) we had what seemed like 50 masters to dupe from: DEC Rainbow, Heath/Zenith, HP, ACT Apricot, NorthStar, CP/M, Eagle, Pixy, many, many others. Oh, and MS-DOS and IBM DOS.


Because things are so simple on the Windows platform right now?

Every time I have to go back to using Windows I can't believe how many hoops you have to jump through to get anything done.


The irony of an Apple fanboi discussing conformity like this is deafening. Buying Apple is buying conformity.

I suppose with all the downvotes I'll expand.

The conformity is what makes Apple stuff work. A small range of hardware for the chosen operating system. Running Linux or Windows on Apple hardware is hard.

Conformity also happens with the iPhone. Unless I bother to jailbreak my iPhone, I'm forced to use the functionality on my iPhone agreed by my carrier and Apple through the carrier updates.


Did you ever stop to consider why Apple "fanbois" are? And why, for example, there aren't such people for other hardware or software manufacturers? (Microsoft has a good few, actually, and Linux certainly has some, but the numbers of fanatics for both combined are easily dwarfed by Apple. And no PC hardware company has anything like it.) Certainly many people will buy Apple just because it comes from Apple, but the point is that there was something about Apple that attracted them in the first place; hardware and software coming from a company that actually cares about good design and isn't as tonedeaf as Microsoft's attempts at it have been. There was & is actually koolaid to drink. In any case, Gruber is talking about industry conformity, not consumer conformity, which is a whole 'nother pigeon.

edit to reply to edit:

"The conformity is what makes Apple stuff work."

To a degree, yes. Or rather what makes Apple stuff "Just work." It certainly wouldn't be past Apple technologically to make OS X run on commodity standard boxes, but it wouldn't be a very good business decision. They're a hardware company.

"Running Linux or Windows on Apple hardware is hard."

No, not even a little bit. Well, Windows on PPC would be impossible (unless we're talking about maybe one version of Windows NT, I think?), but that's on Microsoft, not Apple, for only supporting Intel. Apple's OS these days comes with a utility that makes installing Windows dead easy if you want to, and even if you don't use that, it isn't any harder than installing it on any standard PC.


> Did you ever stop to consider why Apple "fanbois" are?

Some of them appear to need to follow the underdog, conform but not conform, or are spoken to through the marketing.

> In any case, Gruber is talking about industry conformity, not consumer conformity, which is a whole 'nother pigeon.

The industry is conforming to follow Apple. HP through marketing. Dell through design. MS through marketing/Zune/Pink. Google through Android.

edit: You jump through additional hoops to get Linux working properly. Video drivers and sound drivers are poor, sound quality is poor. Getting used to the synaptics driver for the glass trackpad was hard. And you need the drivers from Apple for Windows, which they tend to take their time with.


"Some of them..."

Some of them, certainly. Are you honestly claiming that all or even most Apple users do so because of marketing if there's no quality product to back it up?

Edit: as per your post history/above, you appear to use Apple hardware. Just because of the marketing?

"HP through marketing."

Without a product behind it.

"Dell through design"

It is to laugh.

"MS through marketing/Zune/Pink."

Which might do something for them, as by all accounts the new Zune is decent, though with some of their typical tonedeaf touches (30-second ad plays before running an app? Puh-leease.), though their smartphone OS is still atrocious.

"Google through Android."

With their 7% marketshare to iPhone's 40% (numbers from AdMob, which is naturally not entirely representative of the smartphone market as a whole, but close enough for the purposes of argument). And this despite Android's improved openness over iPhone OS and the wider variety of hardware. Or possibly because of it... Only having one platform to develop for is certainly a massive competitive advantage for the iPhone over virtually anything else out there.

"Video drivers and sound drivers are poor, sound quality is poor."

Considering that the video/sound chips used by Apple are all (reasonably) commonly used hardware these days, Linux not having drivers for them is squarely the fault of the Linux community/hardware manufacturers who don't provide drivers for them. Ditto goes for Windows, with the trackpad being the obvious exception there.


> Are you honestly claiming that all or even most Apple users do so because of marketing if there's no quality product to back it up?

You said 'fanbois'. I replied to 'fanbois'. You took the 'fanboi' reply and blew it out.

> as per your post history/above, you appear to use Apple hardware. Just because of the marketing?

I bought Apple because OS X 10.1 looked like a way of running Linux while still having access to commercial apps should I need them. That was in 2001. I've not bought a non-apple computer since (5 laptops, 2 desktops, 2 iPhones).

As for the other comments...

Dell's Adamo was meant to be a competitor in design and functionality for the MacBook Air but is strangled by CPU and price. There is also a thin 16" laptop, but it too is strangled by CPU. It looks like they inaccurately forecast a 15" MacBook Air and designed and implemented a competitor to it without Apple ever coming to the game, which they should have.

You also have to remember that Zune/Pink and Android are late to the market in comparison to Apple. If Apple were showing Android's AdMob numbers this early you'd be talking up their chances. Because Apple are leading the market and their competitors are in their early stages of market entrance it's acceptable to look at their total numbers and go 'pah' rather than focusing on the growth.


WinMo and RIM certainly aren't in the early stages of market entrance. For that matter, the only competitor that still is anywhere close to Apple's marketshare is Symbian (which surpasses it somewhat, I believe.) But sure, let's look at it in a year or two.

"I've not bought a non-apple computer since"

Why not?

"Dell's Adamo was meant to be a competitor in design and functionality for the MacBook Air but is strangled by CPU and price."

And thus isn't. And part of Gruber's point is that better designed hardware is worthless with shit software, which will still be Dell's problem no matter what they do.

Edit:

"You said 'fanbois'. I replied to 'fanbois'"

Well, technically I replied to you saying "fanboi" in the first place. For that matter why do you think Gruber specifically is a "fanboi"? He certainly doesn't blindly agree with everything Apple does like many Apple fans, and has been very vocally critical of them at times.


How is running Windows on Apple hardware hard?


Totally with you on this. And the herd votes you down (maybe me now too).


He seems to be neglecting the most difficult aspect of releasing your own OS: getting most developers to port/write software for it.

Apple has followed a peculiar path that has allowed it to successfully accomplish this, but it is very difficult to imagine how Dell could successfully found their own operating system at present.

Moreover, I'm not convinced the recommendation to license their operating system is motivated by herd mentality. It's probably just that people did not believe Apple could get as big a market-share for its own machines as it appears to be getting. As for the recommendation to put Windows on their machines, that just seems, well, stupid, but perhaps I am missing something.


That's kind of the key idea behind Microsoft's building of its monopoly: paying developers to produce software for its OS. Eventually you could get everything for Windows, and only scattered software for any other OS. The equilibrium here seems to be a single dominant OS maker.

Reminds me of the Dvorak v. Qwerty keyboard layouts. The market has space for a single dominant keyboard type. Individuals can choose between Dvorak (and variants) or Qwerty, but since so many people use Qwerty, everything is designed based on that. If Dvorak reaches a tipping point, everyone would likely abandon Qwerty.

Does anyone think the expansion of open source makes it more likely that the OS market is trending towards being a multi-power field?


That's kind of the key idea behind Microsoft's building of its monopoly: paying developers to produce software for its OS.

Well, it is more subtle than that. If you are a large corporation, you might have 100,000 desktops, and since Windows came out you might have bought 0.5-1M licenses. That's an awful lot. Why would you do this? Because the software you absolutely depend on to run this business is written internally, and the cost of doing this in a Microsoft tool like VB (historically) is low enough to make it worthwhile. Microsoft has always cultivated its developer ecosystem well. Apple are good now, but I remember the dark days of the 90s, when you never knew when Apple would simply abandon a key technology you depended on (e.g. OpenDoc).


There are companies that do well by making brilliant things, and there are companies that do well by making commodities cheaply. It's nice when a commodity company, like Acer, makes something that seems brilliant, like a good netbook. And it's nice when a company holds a streak of brilliance for many years, like Apple has.

What makes the HP example sad is that it's a company that has recently been running on the vapors of its brilliant ideas, and is now stuck competing in commoditized markets, where you can really only compete on price. That's not to say HP couldn't survive, or even become more profitable; but for hackers who cherish the creation of brilliant things, it is a painful thing to witness.

On the other hand, this is progress. We have Microsoft's early PC OS monopoly to thank for the wonderful range of cheap, standardized PC hardware that makes Linux boxes easy and Macs/iPods/iPhones mass-marketable.


The thing about HP is, they have always made crappy commodity consumer systems for as long as I can remember. Compaq, which they devoured, was the quality system maker.


I suspect you're a bit younger than the people who remember HP's greatness, especially their scientific instruments and engineering tools. They also used to be good at making big iron servers (I'm thinking pre HP-UX). Also, one of the companies HP devoured was Digital, who ruled the world with their VAX systems.

The hardware giants used to be the likes of IBM, Sun, Digital, HP, Data General, Wang, Honeywell, and a few others I'm forgetting. The only ones left now are IBM (a services company) and HP (a commodity consumer hardware company). I'm not counting Sun because I expect the Oracle deal to go through.


And HP's laserjet printers. Those things were amazing.


I don't know why he had to bash hp at the end; of course hp is going to be making bargain PCs, since they need to compete with other producers. apple is about the only one that does not offer full computers for ~$500, but apple is most definitely not their rival; the rest of the industry is.


> I don't know why he had to bash hp at the end; of course hp is going to be making bargain PCs they need to compete with other producers

So, you're saying that HP needs to follow the herd? The entire point of this article is that they don't and shouldn't. That's exactly why he needed to bash HP at the end.


Well, neither you nor he offered any useful alternatives for what hp should do, and I offered reasoning on why they need to sell cheaper PCs.


I'll offer one useful alternative: better software integration, and use their scale to push MS into tighter integration with HP's own software. This advice applies to Dell as well.

Suggesting that HP should build its own OS is a pipe dream, and completely unrealistic. They will be running Windows for the foreseeable future. The question is: why do they have to run the same shitty old Windows that everyone else is running?

I just had the fortune of happening upon a decent Gateway desktop at a bargain-basement price, and the sheer lack of software integration is appalling to someone who's been using a Mac for the past 4 years.

Let me enumerate some concrete examples:

- Display settings are handled by vendor-specific utilities that have zero UI consistency between Windows and the actual computer manufacturer (Dell, HP, etc).

- Sound settings are handled by vendor-specific utilities and drivers that are also not configurable in a consistent way. On an HP machine practically nothing is badged "HP" except the box itself - and certainly the usability reflects this.

- Complete and utter reliance on third-party (both in brand and design) software that fails to integrate deeply into the OS. The whole experience feels very jarring - because it's really a bunch of code haphazardly stuck together without any thought or concern.

Imagine if you buy a Windows laptop from, say, Dell, and out of the box you get a nice Dell splashscreen, and a streamlined setup process that takes care of all your basic configs transparently (as opposed to default Windows, which loads a ton of setup upfront for clueless users)? How about the ability to use your hardware volume up/down buttons without terrible screen flickering (bad drivers)?

There are a bajillion ways that all of the standard PC manufacturers can use software to compete in the marketplace - they simply aren't doing so right now.


Besides, HP already has its own OS... HP-UX


> why do they have to run the same shitty old Windows that everyone else is running?

I'd say Microsoft has a thing or two to say about that.


Microsoft will stop you deploying Windows with your own branded apps and custom drivers pre-installed? Really? I don't think so...


Of course not; the real problem is that hardware makers generally don't do good software. By good software I mean useful and elegant; there's a reason products like PC-decrapifier exist.


Yes, Sony is the classic example of this. A company like HP ought to be capable of it, tho'.


Sure he did:

> HP made news this week for unveiling a Windows 7 launch bundle at Best Buy that includes a desktop PC and two laptops, all for $1199. That might be great for Microsoft...

The OS vendor wins when computer hardware becomes a commodity. The implication is that HP (or a company like them, HP is just being used as an example here) should make their own OS. Actually, in the preceding paragraph, he more than implies that. He pretty much states it explicitly. We need more players willing to make desktop OSes to force the existing players to innovate more.


Something tells me that the author was never around for the rat's nest of incompatible microcomputers that arrived in the early 80's: kaypro, osborne, pet, commodore, texas instruments, ibm, apple, atari, coleco...

All of those computer companies made their own computers and OS's and a completely different stack of software and hardware peripherals. IBM getting into the game and creating a de facto standard with their hardware and software choices helped create a boom in the computer industry that we're still riding.


oh man, do i ever not agree with that. i was around for the birth of the microcomputer, and those were very interesting days indeed.

here's somebody else who thinks so as well: "[...] But I think that there will be a good long period of cheerful chaos, just as there was in the early days of microcomputers. That was a good time for startups. Lots of small companies flourished, and did it by making cool things."

http://www.paulgraham.com/road.html

as the micro situation "stabilized," with most of the competitors going out of business, the fun went with them. the darkest period in the days of high-tech was when microsoft so thoroughly dominated everything that they were stifling innovation.


Less fun, but it meant computers went from being a novelty to the core of the vast majority of businesses. This was not a coincidence.


The web is just another OS, with an enormous API and constantly-changing revisions of everything. OSes are not even what most folks mean when they mention Windows etc.; they mean the execution environment (runtime, libraries, shells). And the trend is to develop portable environments: JBoss, .NET, whatever. We are at the beginning of the end of the proprietary OS wars; eventually the OS will be no more important a decision than your editor or drive vendor.


Without competition you have one company telling you what a product is. With competition you have consumers telling you what a product should be.


"Operating systems aren’t mere components like RAM or CPUs; they’re the single most important part of the computing experience"

I think this is rapidly becoming less true with the increasing utility of the browser + web. These days, what I value most in an OS is speed, stability, and staying out of my way.


It was at the root of long-standing punditry holding that Apple should license the Mac OS to other PC makers

They did until Steve Jobs killed it.


It was self defense. No jury will convict him.

Edit: for those who don't recall: the clones were intended to speciate and fill all sorts of niches, but they mostly just built the Apple-provided reference designs with maybe the odd extra bus slot or two. They then cannibalized the high end and low end of the market.

They got the high end because, as Motorola was ramping up each faster chip, Apple would not release with it until they could ship in sufficient quantity to satisfy demand. The cloners had no such qualms: they would happily take piles of money off the high-paying 'must have the best' crowd with the trickle of fast chips at the front of the production runs, simultaneously robbing Apple of those lucrative customers and delaying the time when Apple could get enough chips to launch their own.

The cloners ate the low end of the market, despite their disadvantage of scale, by making machines a little bit crummier than Apple wanted to build, but also by not paying a large enough licensing fee for the OS to cover their share of development.


It's fun to talk about "breaking free" of your OS vendor. As a foundation for software, and about the cheapest software on the machine, I'm not sure I know what the benefit is supposed to be. "Less unhappy"? With a platform that by definition hosts incompatible apps?


This is a really misleading article. I think people wanted Apple to license its hardware in the days of Macs. Then Apple stopped producing original hardware for the desktop/laptop, and people wanted them to license their OS for the nearly identical hardware that other manufacturers were making.

The iPhone is a mobile device, and lots of other companies have produced their own mobile device and OS (or, more likely, one heavily customized from another base). Microsoft itself has produced the Zune and Xbox. Palm has its line. Nokia uses Symbian, which it now owns.

Apple has made good hardware and software, but this is fanboy drivel.


Let me put it another way. Apple has stopped making unique hardware for the PC. Yes, it sells PCs, but it doesn't innovate on the PC hardware anymore. It innovates on the cases and configurations, but these systems could all run Windows or Linux. Essentially Apple did start making its operating system for PCs; it just makes it a requirement that you buy the PC from them.

As for the iPhone and mobile market, any number of manufacturers have built hardware and software for mobile market. Apple simply did this by far the best.


Well, assuming by cases you include things like the MagSafe power cord, which is undeniably brilliant and which I hope all laptop manufacturers manage to skirt around the probable patents for. Or battery life for laptops, which, unless I've missed something, absolutely puts to shame all similarly-specced generic manufacturers.


Actually, the point I was making was directed more at the article in question, not at innovation at Apple in general. Specifically, I don't think maintaining the exclusive relationship between hardware and software has been necessary for innovation; if it were, there wouldn't be a need to specifically ban non-Apple PCs from running OS X. Expanding the point to the mobile market seems dodgy when you consider the number of manufacturers that have their own OS. Game console manufacturers have also controlled the hardware and OS in their markets. I don't think it's nearly as unique as this article tries to make it.


The value they offer is the UI software, but they're determined to get their money from marking up hardware. Presumably they have figured out people are more willing to pay the same premium when it's a smaller portion of the total price (even when it means they have to replace an entire computer to do it). In the same vein, they charged the cloners too little to cover the development costs of Mac OS, which is why losing desktop hardware market share to the cloners (who were more efficient hardware producers) almost killed them instead of freeing them from the 10% ghetto.


This is a short list of the computers and their makers who have broken from the herd:

Microsoft XBox

Sony Playstation

Nintendo Wii

Apple iPhone

Palm Pre

RIM Blackberry

HP digital cameras


The only equalizer will be applications online. Right now web apps suck compared to their PC equivalents, but when they don't, and people start using them instead of Office or other programs, then who gives a rat's ass what OS we use? We'll all be on dumb terminals and we won't give a shit. That will be Microsoft's fall... or will it? I heard somewhere that their online Office product was way better than Google's. I don't know.


Hey, it sounds like the late '80s or early '90s! Dumb terminals are coming back in style! Nice. *turns up the Nirvana*


What a fucking joke!

Because there aren't thousands of different types of PCs of every shape and size, tablet, netbook, professional laptop, gaming laptop, gaming desktop, small form factor desktop, workstation...

All of these computers could run OS X, but Apple doesn't want them to.

And lest we forget, Gruber's job is to talk up Apple('s stock price). Why anyone should take his opinion seriously is beyond me. He has a vested monetary interest in Apple fanboyism.

