The analogy of OSes as cars (Windows is a station wagon, Linux is a tank) is brought up in the recent Acquired episode on Microsoft, where Vista was a Dodge Viper but Windows 7 was a Toyota Camry, which is what users actually wanted.
And Neal Stephenson acknowledged it was obsolete in 2004:
"I embraced OS X as soon as it was available and have never looked back. So a lot of 'In the beginning was the command line' is now obsolete. I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."
But people still dredge this quarter-century-old apocrypha up and use it to pat themselves on the back for being Linux users. "I use a Hole Hawg! I drive a tank! I'm not like those other fellows because I'm a real hacker!"
Neal gave up on Linux because he wasn't a developer. He couldn't take advantage of the freedoms it provided, and it worked the same for him as any proprietary OS would. I.e., he had the excuse that programming is hard, a specialty that requires much practice. This is an ongoing issue with free software and is why it is niche... it primarily appeals to software developers, as they are the only ones that can take advantage of the freedoms it provides and are the ones that truly sacrifice that freedom when they use a non-free OS.
Yeah, this is basically my take too. I had a hardcopy of this sometime between '99 and '02, and read it several times.
At the time I was an embedded developer at Microsoft and had been a Windows programmer in the mid-'90s. It was pretty clear that there was some Dunning-Kruger going on here. Neal knew enough about tech to be dangerous, but not really enough to be talking with authority.
Given what OS X has become, it's un-obsoleted itself again.
It's kind of ironic that you're using a post from 20 years ago to invalidate an essay from 25 years ago, about an OS that's been substantially dumbed down in the last 10 years.
I also "embraced OS X as soon as it was available". My first Linux install was Yggdrasil, but I cut my teeth on SPARC stations. I respected two kinds of computer: Unix workstations, and the Macintosh.
So when Apple started making workstations, I got one. I've been a satisfied customer ever since.
I have no idea whatsoever what dumbing down you're referring to. The way I use macOS has barely changed in the last ten years. In fact, that's a major part of the appeal.
Just got a Windows 11 machine. Had to, to run Solidworks. Have it next to a new M3 iMac. They’re all configured with the same apps. Despite not having used Windows in 10 years, these machines behave identically. But Windows 11 is snappier. And you can actually find things even when you don’t know where they are!
With all the ideological hate against Microsoft/Windows (even among its long-term users, it seems), everybody seems to miss the part where Windows 11 is actually pretty good and, I would say, in some ways superior to macOS nowadays, especially with PowerToys.
For starters, it is much less annoying from a security/notification standpoint; you can tell it to fuck off and let you do your thing if you know what you are doing.
macOS isn't too bad yet but is clearly lagging behind; Apple is unwilling to meaningfully improve some parts and seems to refuse to sherlock some apps because doing so clearly goes against their business interests.
They make more money earning the commission on additional software sales from the App Store, a clear conflict of interest. They got complacent, just like Valve with all the money from running its marketplace.
> For starters, it is much less annoying from a security/notification standpoint; you can tell it to fuck off and let you do your thing if you know what you are doing.
macOS is behind. And this is speaking as someone who probably owns one of everything Apple makes, Apple stock, and was exclusively Mac for the last 10 years.
I have less than 50 hours of use on my Windows 11 machine, a midgrade Lenovo P358 rig I bought renewed because it had plenty of memory and an Nvidia T1000 card. Yet it taught me that the test of an operating system is how quickly you can navigate around, and how well it can find things, given only clues. Windows 11 is just snappier, quicker, than the latest macOS running on a new M3 Mac.
This is also my experience.
Worse, it is snappier on a 10-year-old computer (top of the line, though) than on an expensive 24GB M2 MacBook Pro.
There is some software that I find nice and convenient in macOS but it has gotten really hard to justify the price of the hardware considering the downsides.
I’m not knocking Windows; for me, I’m just kind of hooked on the Mac trackpad and a few little things about the Mac that I prefer now. I use a networked Windows machine semi-regularly and there’s nothing wrong with it. I just remember the days of BSODs and viruses and random shutdowns to install updates that couldn’t be stopped in the middle of the workday, and 1000 other little niggles that make me choose a Mac. I’m sure contemporary Windows machines, if configured right, are totally fine, better even - my housemate keeps touching my Mac screen to scroll because she has a Windows laptop that comes with a touchscreen, and I can see how that would be handy on the Mac.
I will never understand the sentiment that macOS has been “dumbed down.”
It’s a zsh shell with BSD utils. 99% of my shell setup/tools on Linux just work on macOS. I can easily install the gnu utils if I want 99.9% similarity.
I very happily jump between macOS and Linux, and while the desktop experience is always potentially the best on Linux (IMO nothing compares to hyprland), in practice macOS feels like the most polished Linux distro in existence.
Do people just see, like, some iOS feature and freak out? This viewpoint always seems so reactionary. Whereas in reality, the macOS of the past that you’re pining for is still right there. Hop on a Snow Leopard machine and a Ventura machine and you’ll see that there are far, far more similarities than differences.
I stopped using Mac because of the planned obsolescence, which is a huge problem with machines even 4 years old. Can't even update those BSD utils anymore because you can't update the OS, because you don't have the latest hardware, because you don't want to spend 2k more to get back the basics you had already paid for.
MacOS Sequoia supports every Mac 4 years old or newer. Almost no apps or CLI programs will absolutely require it unless they’re written specifically to use some brand-new feature. I’m running it on my 2018 Mac Mini.
If you wish Apple supported computers longer, fine. I’d personally disagree because I’ve had wonderful luck with them supporting my old hardware until said hardware was so old that it was time to replace it anyway, but would respect your different opinion. Don’t exaggerate it to make a point though.
I never said any such thing. I said it’s not a problem for me but that others may have a different opinion. And you could still use an older OS on that Mac; the ones it shipped with don’t magically stop working.
You're not a typical Mac OS user. The typical users I know do not know how to use Finder, let alone shell commands, to navigate the file system. Personally I have no issues developing on Mac, Linux, or Windows because I'm at an advanced level. But for the same reason, I prefer Linux or even Windows because those provide more freedom to the developer.
You are getting downvoted by the fanboys but this is exactly my experience too.
There is a type of macOS user that is an expert (in technology in general), but those are by far the minority (and their number is shrinking).
For the most part, the macOS user is of the religious-zealot type and barely knows how to do the basics, far worse than your average seasoned Windows user, even though in principle macOS should be easier to handle (in practice that's not exactly true, but still...).
People here who seem to think otherwise really live in the reality distortion field, and it seems to be linked to the mythical Silicon Valley "hacker". At first I drank the kool-aid on that definition, but it actually seems pretty disrespectful to "real" hackers; but whatever, I guess.
In what way has it been “dumbed down?” I use modern MacOS as a Unix software development workstation and it works great- nothing substantial has changed in 20 years other than better package managers. I suppose they did remove X11 but it’s trivial to install yourself.
Not GP, but usually when people talk about the "dumbing down" of macOS, they refer to new apps and GUI elements adopted from iOS.
macOS as an operating system has been "completed" for about 7 years. From that point, almost all additions to it have been either focused on interoperation with the iPhone (good), or porting of entire iPhone features directly to Mac (usually very bad).
Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.
There are so many known systemd-resolved bugs [1][2] that I can't tell which one was breaking both of my simple Ubuntu desktop machines. Systemd-resolved sets itself up as the sole DNS resolver and then randomly reports it can't reach any DNS servers.
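A quick way to check whether the stub itself is the broken piece (rather than the uplink) is to fire the same query at 127.0.0.53 and at an upstream server directly. A rough standard-library sketch, assuming the default Ubuntu setup where the systemd-resolved stub listens on 127.0.0.53 (1.1.1.1 here is just an arbitrary public resolver used for comparison):

    import socket
    import struct

    def dns_a_query(name, server, timeout=2.0):
        """Send a bare-bones DNS A query for `name` to `server` over UDP
        and report whether any answer came back. Diagnostic only."""
        # Header: ID, flags (recursion desired), 1 question, 0 other records.
        header = struct.pack("!HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
        qname = b"".join(
            bytes([len(label)]) + label.encode("ascii")
            for label in name.rstrip(".").split(".")
        ) + b"\x00"
        question = qname + struct.pack("!HH", 1, 1)  # QTYPE=A, QCLASS=IN
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.settimeout(timeout)
            try:
                sock.sendto(header + question, (server, 53))
                reply, _ = sock.recvfrom(512)
            except OSError:
                return False
        # ANCOUNT is the fourth 16-bit field of the reply header.
        return struct.unpack("!H", reply[6:8])[0] > 0

    for server in ("127.0.0.53", "1.1.1.1"):
        ok = dns_a_query("example.com", server)
        print(server, "answered" if ok else "no answer")

If the stub times out while the upstream answers, the culprit is systemd-resolved rather than the network.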
Yes and... the tools are now highly distro-specific. I don't want to allocate my study time to resolvectl, I want to allocate it to programming, but my home server requires me to be a beginner again in something that was easy a decade ago. And I am not getting anything of value for that trade.
> Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.
Many of those ideas came from NeXT, so more like 30 years ago.
I don't see how any of that is an issue... basically you can now run iOS software, which is great, and there are some interface and design elements from iOS - which frankly has a great interface - and they're improvements I like.
I agree there is some conceptual inconsistency- which I see on almost all OSs nowadays, but Windows 8 being the most egregious example, where you are mixing smartphone and traditional desktop interface elements in a confusing way.
However, and unfortunately, I feel your last statement is spot tf on! Our only hope, I guess, is that they have incurred enough tech debt to be unable to enshittify themselves.
For those not in the know, Apple is an OG hacker company; their first product was literally a blue box! Why this matters (and why the GP is correct, why Linux peeps get in a tizzy, and what Stephenson was getting at with the Batmobile analogy) is that traditionally, if hackers built something consumer-facing, they couldn't help themselves but bake in the easter eggs.
With each new version it has become increasingly hostile to installing new software, particularly open-source software that hasn't been "signed" by a commercial developer, throwing up huge warning windows suggesting that anyone daring to run such stuff is taking a huge risk. And many of the standard UNIX locations have become locked down, making it impossible to install stuff there. It's clear that Apple would like to see a world where everything is installed on a Mac via their App Store and everyone writing Mac software is an official paid developer, as with their phones.
I don’t understand this sort of comment. The warning windows aren’t “huge”. In practice is clicking through the dialog any more cumbersome than typing sudo and entering your password? In reality is the dialog any less appropriate for the average Linux desktop user?
Is locking down the System folder any more problematic than AppArmor, and any less useful for system integrity? Putting everything from brew under /opt follows UNIX conventions perfectly fine, definitely more than using snaps in Ubuntu for basic command line utilities. And installing whatever you want on macOS is just as easy as it is on Ubuntu.
This sort of complaint just gets so boring and detached from reality, and I’m not saying that you don’t use macOS, but it reads like something from someone who couldn’t possibly be using it day-to-day. For me it’s a great compromise in terms of creating an operating system where I can do anything that I would do in Linux with just as much ease if not more, but also not have to provide tech support for my elderly parents.
I wouldn't mind in the least if it were a matter of using sudo. That's a logical elevation of privileges. MacOS already does this at points, asking you for your password (which, if you are an administrator, is basically running sudo for you). These warning messages and the locking down of the /usr hierarchy (even with sudo) are different, as they aren't asking for more access but merely spreading FUD about open-access software (yes, you can use brew if the program you want is in it, but that is just adding another garden, even if a less walled one, and it works because someone in the Homebrew project is signing the binaries).
I have used UNIX/Linux on a daily basis for over 30 years, and OSX/MacOS daily for over 15 years. I know how UNIX systems work and where things traditionally are located. And until a few years ago MacOS was a reasonable UNIX that could be used more or less like a friendly UNIX system -- but it is becoming increasingly less so.
You are moving the goalposts. Not only are there some "security" features that you can't disable and that are of dubious actual usefulness, like the system partition, but they make it much harder to actually hack around the system and modify stuff as you see fit. It has also complicated the installation and use of a range of software, which is more annoying than it should be.
The openness and freedom to modify it like an open UNIX was a major selling point; losing all that for "security" features that mostly appeal to the corporate world is not great. Those features also need to be proven useful, because as far as I'm concerned it's all theory; in practice I think they are irrelevant.
The notification system is as annoying and dumb as in iOS, and the nonstop "security" notifications and password prompts are just a way to sell you on the usefulness of biometrics; which Apple, like the big morons they are, didn't implement as Face ID in the place where it made the most sense to begin with: laptops/desktops. Oh, but they have a "nice", totally-not-useless notch.
Many of the modern apps are ports of their iOS versions, which makes them feel almost as bad as webapps (worse, if we are talking about webapps on Windows), and they are in general lacking in many ways, both from a feature and a UI standpoint.
Apple Music is a joke of a replacement for iTunes, and I could go on and on.
The core of the system may not have changed that much (well, except that your data is less and less accessible, forcibly stored in their crappy, obscure iCloud folders/DBs with rarely decent export functions), but as the article hinted very well, you don't really buy an OS, just like nobody really buys solely an engine.
A great engine is cool and all, but you need a good car around that to make it valuable and this is exactly the same for an OS.
It used to be that macOS was a good engine with a great car around it, in the form of free native apps that shipped with it or 3rd party ones. Nowadays, unless you really need the benefits of design/video apps heavily optimized for Apple platforms, it increasingly is not a great car.
Apps around the system aren't too bad but they are very meh, especially for the price you pay for the privilege (and the obsolescence problem already mentioned above).
It's not really that macOS has regressed a lot (although it has in some ways, in the iOSification process); it's more that it didn't improve a whole lot, while the price and other penalty factors increased a lot.
But I doubt you can see the light; you are probably too deep in your faith.
These are great features IMO, as a unix savvy 'power user.'
A system should be heavily locked down and secure by default unless you really know what you are doing and choose to manually override that.
Modern MacOS features add an incredible level of security- it won't run non-signed apps unless you know what you're doing and override it. Even signed apps can only access parts of the filesystem you approve them to. These things are not a hassle to override, and basically make it impossible for hostile software to do things that you don't want it to.
I stopped using it a few years ago, but IMO it was definitely being dumbed down and not respecting users any more. Things like upgrades resetting settings that I went out of the way to change - Apple has a "we know better than you" attitude that's frustrating to work around.
Just using it at a barely advanced level for 20 years or so, as I do, the other comment was correct in that it's the changes that have seemingly been made to make it more familiar to iOS users and “idiot proof”.
Mainly the slow hiding of buttons and options and menus that used to be easily accessible, which now require holding a function key, re-enabling in settings, or using the terminal to bring them back.
- The settings app is now positively atrocious, "because iPhone"
- SIP is an absolute pox to deal with.
- "Which version of Python will we invoke today" has become a fabulous game with
multiple package managers in the running
- AppCompat games.
- Continued neglect of iTunes (which is now a TV player with an "if we must also provide music, fine" segment added - but it still thinks it should be a default client for audio files)
- iCloud wedging itself in wherever it can
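On the Python point, a quick way to see which interpreter actually won today's round is to ask it directly instead of guessing from the shell. A minimal standard-library sketch (nothing here is macOS-specific; the paths it prints depend on whichever package managers happen to be installed):

    import shutil
    import sys

    # The interpreter that is actually executing this snippet.
    print("running:", sys.executable)
    print("version:", sys.version.split()[0])

    # The python3 a fresh shell would pick first on PATH -- often a
    # different one (system, Homebrew, pyenv, etc.).
    print("first on PATH:", shutil.which("python3"))

Comparing sys.executable with whatever is first on PATH usually explains the surprise.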
Yes, all of those can be overcome. That's because the bones are still good, but anything that Apple has hung off those bones since Tim Cook took over is at best value-neutral, and usually adds a little bit more drag for every new thing.
Don't get me wrong, I still use it - because it's still decent enough - but there's definitely a trajectory happening.
Settings - I preferred the rectilinear layout, but I don't see why making it linear makes it atrocious.
If you don't want SIP, it will take you a few minutes to reboot and switch it off permanently (or perhaps until the next OS upgrade). This is really the only one in the list which has to be "overcome", and personally I think that SIP enabled by default is the right choice. Anyone who needs SIP disabled can work out how to do that quickly - but it is years since I've had a reason to do it even temporarily, so I suspect the audience for this is small.
Multiple package managers and Python: that sounds like a problem caused by running multiple third party package managers.
If you want games, x86 or a console is the preferred choice. An issue for some, decidedly not for others. I'd much rather have the Mx processor than better games support.
iTunes - I can't comment, I don't use it.
iCloud - perfectly possible to run without any use of iCloud, and I did for many years. I use it for sync for a couple of third-party apps, and it's nice to have that as an available platform. It doesn't force its way in, and the apps that I use usually support other platforms as well.
> There was a competing bicycle dealership next door (Apple) that one day began selling motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery.
Neal said the essay was quickly obsolete, especially in regards to Mac, but I'll always remember this reference about hermetically sealed Apple products. To this day, Apple doesn't want anyone to know how their products work, or how to fix them, to the point where upgrading or expanding internal hardware is mostly impossible.
The difference between Apple and IBM is that the latter lost control of their platform, and those who inherited it, arguably Intel and Microsoft, had no interest in exerting absolute control. (If they had tried, it likely would have backfired anyhow.)
As for Apple, their openness comes and goes. The Apple II was rather open, the early Macintosh was not. Macintosh slowly started opening up with early NuBus machines through early Mac OS X. Since then they seem to be closing things up again. Sometimes it was for legitimate reasons (things had to be tightened up for security). Sometimes it was for "business" reasons (the excessively tight control over third-party applications for iOS and the incredible barriers to repair).
As for the author's claims about their workings being a mystery, there wasn't a huge difference between the Macintosh and other platforms. On the software level: you could examine it at will. At the hardware level, nearly everyone started using custom chips at the same time. The big difference would have been IBM compatibles, where the chipsets were the functional equivalent of custom chips yet were typically better documented simply because multiple hardware and operating system vendors needed to support them. Even then, by 1999, the number of developers who even had access to that documentation was limited. The days of DOS, where every application developer had to roll their own hardware support were long past. Open source developers of that era were making a huge fuss over the access to documentation to support hardware beyond the most trivial level.
I think it's still quite relevant. There are still people who enjoy a more DIY OS. Nowadays they use Arch or something. That doesn't make them any better or worse than anyone else. Some people enjoy tinkering with their OS, other people just want something that Just Works(tm), and there is nothing wrong with that.
There is an implicit superiority in the text which is just as cringey now as it was at the time, but I think it's still a good analogy about the different preferences and relationships different people have with their computers.
"Obsolete" is too strong a word, I think. OSX isn't an evolution of the Macintosh's operating system; That'd be Pink, which was even mentioned, and it crashed and burned. OSX was far closer to a Linux box and a Mac box on the same desk, therefore the only change really needed is to replace mentions of Unix or specifically Linux with Linux/OSX as far as the points of the piece are concerned. If Jobs had paid Torvalds to call OSX "Apple Linux" (Or maybe just called it Apple Berkeley Unix) for some reason this would be moot.
I also primarily use Windows and don't have a dog in the fight you mentioned. I might actually dislike Linux more than OSX, though it has been quite a while since I've seriously used the one-button OS.
> OSX was far closer to a Linux box and a Mac box on the same desk
Setting aside the "more BSD/Mach than Linux", OS X pressed a lot of the same buttons that BeOS did: a GUI system that let you drop to a Unix CLI (in Be's case, Posix rather than Unix, if we're going to be persnickety), but whose GUI was sufficiently complete that users rarely, if ever, had to use the CLI to get things done. Folks who love the CLI (hi, 99% of HN!) find that attitude baffling and shocking, I'm sure, but a lot of people really don't love noodling with text-based UIs. I have friends who've used the Mac for decades -- and I don't mean just use it for email and web browsing, but use it for serious work that generates the bulk of their income (art, desktop publishing, graphic design, music, A/V editing, etc.) -- who almost never open the Terminal app.
> though it has been quite a while since I've seriously used the one-button OS
Given that OS X has supported multi-button mice since 2001, I certainly believe that. :)
macOS shares zero lineage with Linux, which itself shares zero lineage with the UNIX derivatives. It would make zero sense for Apple to call macOS "Apple Linux" when it doesn't use the Linux kernel. Mac OS X is closest to a NeXTStep box with a coat of Mac-like polish on top. Even calling it "Apple Berkeley Unix" wouldn't make sense, because the XNU kernel is a mishmash of both BSD 4.3 and the Mach kernel.
I used to have a coworker, a senior dev of decades of experience, who insisted that MacOS was a real Linux, "just like BSD". Sigh.
Of course, this belief probably had no downsides or negative consequences, other than hurting my brain, which they probably did not regard as a significant problem.
The crabs analogy isn't a good one, because they evolve independently to play well in a common environment. GNU/Linux is a rewrite of Unix that avoids the licensing and hardware baggage that kept Unix out of reach of non-enterprise users in the 80s and early 90s.
Your statement seems very strong. Is Mac OS X not based on Darwin? Are you defining "lineage" in some way to only mean licensing and exclude the shared ideas (a kernel manipulating files)? Thanks for the "carcinisation" link.
To be more precise: this simplified history suggests a shared "lineage" back to the original UNIXes of all of BSDs (and hence Darwin and OSX) and of the GNU/Linux and other OSes https://unix.stackexchange.com/a/3202
So, "zero shared lineage" seems like a very strong statement.
Apple was in some ways on the right track with an OS that had no command line. Not having a command line meant you had to get serious about how to control things.
The trouble with the original MacOS was that the underlying OS was a cram job to fit into 128KB of RAM, plus a ROM. It didn't even have a CPU dispatcher, let alone memory protection. So it scaled up badly. That was supposed to be fixed in MacOS 8, "Copland", which actually made it out to some developers. But Copland was killed so that Apple could hire Steve Jobs, for which Apple had to bail out the NeXT failure.
I feel differently now. Linux was never that much more powerful than other OSes. It felt more like a tractor 10 years ago, and now it is a good alternative to the "Toyota Camry", with a pretty good user experience.
Meanwhile Windows has become one of those cars with two 27" screens as a dashboard, which has a bad user experience and is full of advertisements.
Mostly in all of Sweden, all of Finland, probably still quite a lot in Denmark, but perhaps not so many in Norway any more (since they are rapidly electrifying[1] the entire national car fleet).
___
[1]: Ironically, the heavy state subsidies for buying an electric car are, like anything else in Norway, financed by... Oil money.
The way I explained my continued loathing of Microsoft to a friend was that back in the day, it was like having two choices of car: a Ford Pinto, or a pickup truck. Not everybody needed the heavy transport capabilities of the pickup, but their only choice if they didn't want the added bulk and fuel consumption was the Pinto, which had dangerous failure modes. Later, the Pinto would be withdrawn and replaced with a Ford Taurus, a serviceable but not particularly fun or performant vehicle which worked fine usually, but was based on a transaxle design, so when it did break you had to dismantle the entire front end of the vehicle in order to repair it. And now imagine that there were roads you couldn't drive on and places you couldn't go unless you had one of these three vehicles, simply due to Ford's market pressure on infrastructure planners, not for any good reason; and besides, people mocked you for wanting something else.
They were given “tablet friendly” user interfaces that made navigation extremely difficult. But Microsoft never finished the job, so you’d try to find something in the stupid sliding screen interface only to be thrown into a Windows 3.1-era control panel. Windows Server of that era was even worse, as mostly you administered it using RDP, but that never played nice with the “hover here to bring up the whole screen menu” required to drive it.
Underneath it was just Windows, but the interface ruined it
Vista significantly changed the security model, introducing UAC and requiring driver signing. This caused a bunch of turmoil. It also had much higher system requirements than XP for decent performance, meaning that it ran at least a bit shit on many people's machines when it was new, and that did it no favors.
> recent Acquired episode on Microsoft, where Vista was a Dodge Viper but Windows 7 was a Toyota Camry, which is what users actually wanted.
I assume if anyone associated with Microsoft compared Vista to anything other than an abject failure, it's because they are - at best - broken or defective people who were involved in the creation of Vista, and therefore not objective and not to be trusted in any way.
The first generation or two of Vipers were very raw & unrefined, with ~400 horsepower, no stability control, no anti-lock brakes, minimal safety features, cheap & janky looking interiors, etc. It sounds great to me (aside from the lack of a/c & nice ergonomic interior), but it would be extremely easy to have a bad time if you're an average driver.
I'm not sure about the analogy though, they might have been thinking of later Viper versions where the complaints would be more about cost, gas mileage, or general impracticality for daily use.