In the Beginning Was the Command Line (1999) (stanford.edu)
412 points by conanxin 3 months ago | hide | past | favorite | 260 comments



This essay by Neal Stephenson was first published in 1999. https://en.m.wikipedia.org/wiki/In_the_Beginning..._Was_the_...

The analogy of OSes as cars (Windows is a station wagon, Linux is a tank) comes up in the recent Acquired episode on Microsoft, where Vista is cast as a Dodge Viper and Windows 7 as a Toyota Camry, which is what users actually wanted.


And Neal Stephenson acknowledged it was obsolete in 2004:

"I embraced OS X as soon as it was available and have never looked back. So a lot of 'In the beginning was the command line' is now obsolete. I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."

https://slashdot.org/story/04/10/20/1518217/neal-stephenson-...

But people still dredge up this quarter-century-old apocrypha and use it to pat themselves on the back for being Linux users. "I use a Hole Hawg! I drive a tank! I'm not like those other fellows because I'm a real hacker!"


Neal gave up on Linux because he wasn't a developer. He couldn't take advantage of the freedoms it provided, so it worked the same for him as any proprietary OS would. That is, he had the excuse that programming is hard, a specialty that requires much practice. This is an ongoing issue with free software and is why it is niche: it primarily appeals to software developers, as they are the only ones who can take advantage of the freedoms it provides, and they are the ones who truly sacrifice that freedom when they use a non-free OS.


Yeah, this is basically my take too. I had a hardcopy of this sometime between '99 and '02, and read it several times.

At the time I was an embedded developer at Microsoft and had been a Windows programmer in the mid 90s. It was pretty clear that there was some Dunning-Kruger going on here. Neal knew enough about tech to be dangerous, but not really enough to be talking with authority.


Given what OS X has become, it's un-obsoleted itself again.

It's kind of ironic that you're using a post from 20 years ago to invalidate an essay from 25 years ago, about an OS that's been substantially dumbed down in the last 10 years.

Bad corporate blood will tell.


I also "embraced OS X as soon as it was available". My first Linux install was Yggdrasil, but I cut my teeth on SPARCstations. I respected two kinds of computer: Unix workstations, and the Macintosh.

So when Apple started making workstations, I got one. I've been a satisfied customer ever since.

I have no idea whatsoever what dumbing down you're referring to. The way I use macOS has barely changed in the last ten years. In fact, that's a major part of the appeal.


Seconded. Make it 20 years.


Just got a Windows 11 machine. Had to, to run Solidworks. Have it next to a new M3 iMac. They're both configured with the same apps. Despite my not having used Windows in 10 years, these machines behave identically. But Windows 11 is snappier. And you can actually find things even when you don't know where they are!

I was amazed.


With all the ideological hate against Microsoft/Windows (even among its long-term users, it seems), everybody seems to miss the part where Windows 11 is actually pretty good, and I would say in some ways superior to macOS nowadays, especially with PowerToys.

For starters, it is much less annoying from a security/notification standpoint; you can tell it to fuck off and let you do your thing if you know what you are doing.

macOS isn't too bad yet but is clearly lagging behind. Apple is unwilling to meaningfully improve some parts, and seems to refuse to sherlock some apps because doing so clearly goes against their business interests. They make more money earning the commission on additional software sales from the App Store, a clear conflict of interest. They got complacent, just like Valve with all the money from running its marketplace.


> For starters, it is much less annoying from a security/notification standpoint; you can tell it to fuck off and let you do your thing if you know what you are doing.

In many corporate environments you can't.


macOS is behind. And this is speaking as someone who probably owns one of everything Apple makes, Apple stock, and was exclusively Mac for the last 10 years.

I have less than 50 hours of use on my Windows 11 machine, a midgrade Lenovo P358 rig I bought renewed because it had plenty of memory and an Nvidia T1000 card. Yet it taught me that the test of an operating system is how quickly you can navigate around, and how well it can find things given only clues. Windows 11 is just snappier, quicker, than the latest macOS running on a new M3 Mac.


This is also my experience. Worse, it is snappier on a 10-year-old computer (top of the line, though) than on an expensive 24GB M2 MacBook Pro.

There is some software that I find nice and convenient in macOS but it has gotten really hard to justify the price of the hardware considering the downsides.


I'm not knocking Windows; I'm just kind of hooked on the Mac trackpad and a few little things about the Mac that I prefer now. I use a networked Windows machine semi-regularly and there's nothing wrong with it. I just remember the days of BSOD and viruses and random shutdowns to install updates that couldn't be stopped in the middle of the workday, and 1000 other little niggles that make me choose a Mac. I'm sure contemporary Windows machines, if configured right, are totally fine, better even. My housemate keeps touching my Mac screen to scroll because she has a Windows laptop that comes with a touchscreen, and I can see how that would be handy on the Mac.


I will never understand the sentiment that macOS has been “dumbed down.”

It’s a zsh shell with BSD utils. 99% of my shell setup/tools on Linux just work on macOS. I can easily install the gnu utils if I want 99.9% similarity.

I very happily jump between macOS and Linux, and while the desktop experience is always potentially the best on Linux (IMO nothing compares to hyprland), in practice macOS feels like the most polished Linux distro in existence.

Do people just see, like, some iOS feature and freak out? This viewpoint always seems so reactionary. Whereas in reality, the macOS of the past that you’re pining for is still right there. Hop on a Snow Leopard machine and a Ventura machine and you’ll see that there are far, far more similarities than differences.


I stopped using Mac because of the planned obsolescence, which is a huge problem with machines even 4 years old. Can't even update those BSD utils anymore because you can't update the OS, because you don't have the latest hardware, because you don't want to spend 2k more to get back the basics you had already paid for.


MacOS Sequoia supports every Mac that's 4 years old. Almost no apps or CLI programs will absolutely require it unless they're specifically built to use some brand-new feature. I'm running it on my 2018 Mac Mini.

If you wish Apple supported computers longer, fine. I’d personally disagree because I’ve had wonderful luck with them supporting my old hardware until said hardware was so old that it was time to replace it anyway, but would respect your different opinion. Don’t exaggerate it to make a point though.


It's ridiculous that you would claim this isn't a problem.

I'm typing this on a 12-year-old MacBook Pro running Debian whose hardware is perfectly fine, but which hasn't been supported by Apple in years.

FWIW, Debian supports it fine, though NVidia recently dropped support for the GPU in their Linux drivers.

I'm going to miss it when it dies, too. Plastic Lenovos just can't compare.


I never said any such thing. I said it’s not a problem for me but that others may have a different opinion. And you could still use an older OS on that Mac; the ones it shipped with don’t magically stop working.


You're not a typical macOS user. The typical users I know do not know how to use Finder, let alone shell commands, to navigate the file system. Personally I have no issues developing on Mac, Linux, or Windows because I'm at an advanced level. But for the same reason, I prefer Linux or even Windows because those provide more freedom to the developer.


You are getting downvoted by the fanboys, but this is exactly my experience too. There is a type of macOS user who is an expert (in technology in general), but they are by far the minority (and it is shrinking).

For the most part the macOS user is of the religious-zealot type, and they barely know how to do the basics, far worse than your average seasoned Windows user, even though in principle macOS should be easier to handle (in practice that's not exactly true, but still...).

People here who seem to think otherwise really live in the reality distortion field, and it seems to be linked to the mythical Silicon Valley "hacker". At first I drank the kool-aid on that definition, but it actually seems pretty disrespectful to "real" hackers; but whatever, I guess.


In what way has it been "dumbed down"? I use modern MacOS as a Unix software development workstation and it works great; nothing substantial has changed in 20 years other than better package managers. I suppose they did remove X11, but it's trivial to install yourself.


Not GP, but usually when people talk about the "dumbing down" of macOS, they refer to new apps and GUI elements adopted from iOS.

macOS as an operating system has been "completed" for about 7 years. From that point, almost all additions to it have been either focused on interoperation with the iPhone (good), or porting of entire iPhone features directly to Mac (usually very bad).

Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.


> Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then.

This also applies to Windows, by the way (except it’s more like 20-30 years ago).


Whereas Linux never stopped coming up with new ideas, but doesn't have the manpower to implement them.


systemd!

(Currently struggling with the way systemd inserts itself into the DNS query chain and then botches things.)


The how-hard-can-it-be-and-who-cares-anyway approach to replacing basic system components. Love it.


It likes to fall over to the secondary server, doesn't it?


There are so many known systemd-resolved bugs [1][2] that I can't tell which one was breaking both of my simple Ubuntu desktop machines. Systemd-resolved sets itself up as the sole DNS resolver and then randomly reports it can't reach any DNS servers.

[1] https://github.com/systemd/systemd/issues?q=is%3Aissue+is%3A...

[2] https://www.reddit.com/r/linux/comments/18kh1r5/im_shocked_t...


Yes and... the tools are now highly distro-specific. I don't want to allocate my study time to resolvectl, I want to allocate it to programming, but my home server requires me to be a beginner again in something that was easy a decade ago. And I am not getting anything of value for that trade.
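For what it's worth, the workaround that eventually stuck for me, rather than learning resolvectl properly, was a small resolved.conf override. This assumes the stock Ubuntu setup, and the resolver addresses below are just examples:

```ini
# /etc/systemd/resolved.conf (back up the original first)
[Resolve]
# Pin explicit upstream resolvers instead of whatever DHCP handed out
DNS=9.9.9.9 1.1.1.1
# Stop resolved from acting as the local stub listener on 127.0.0.53
DNSStubListener=no
```

Then `systemctl restart systemd-resolved`, and you may need to point /etc/resolv.conf at a sane target afterwards. No claim this fixes every bug in those threads, but it took resolved out of the failure path on my machines.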


It likes to botch things.


Which is why I gave up on it. I was tired of something in my workflow breaking every 6 weeks because "ooh shiny".


> Another point of view is that macOS is great, but all ideas that make it great come from 20 years ago, and have died at the company since then. If Apple were to build a desktop OS today, there's no way they would make it the best Unix-like system of all time.

Many of those ideas came from NeXT, so more like 30 years ago.


I don't see how any of that is an issue... basically you can now run iOS software, which is great, and there are some interface and design elements from iOS, which frankly has a great interface, and they're improvements I like.

I agree there is some conceptual inconsistency- which I see on almost all OSs nowadays, but Windows 8 being the most egregious example, where you are mixing smartphone and traditional desktop interface elements in a confusing way.


Yeah, that's fair and I concur, but GP is right.

However, and unfortunately, I feel your last statement is spot tf on! Our only hope, I guess, is that they have incurred enough tech debt to be unable to enshittify themselves.

For those not in the know, Apple is an OG hacker company; before founding it, Jobs and Woz literally sold blue boxes! Why this matters, why GP is correct, why Linux peeps get in a tizzy, and what Stephenson was getting at with the batmobile analogy, is that traditionally, if hackers built something consumer-facing, they couldn't help themselves but bake in the easter eggs.


With each new version it has become increasingly hostile to installing new software, particularly open-source software that hasn't been "signed" by a commercial developer, throwing up huge warning windows suggesting that anyone daring to run such stuff is taking a huge risk. And many of the standard UNIX locations have become locked down, making it impossible to install stuff there. It's clear that Apple would like to see a world where everything on a Mac is installed via their App Store and everyone writing Mac software is an official paid developer, as with their phones.


I don’t understand this sort of comment. The warning windows aren’t “huge”. In practice is clicking through the dialog any more cumbersome than typing sudo and entering your password? In reality is the dialog any less appropriate for the average Linux desktop user?

Is locking down the System folder any more problematic than AppArmor, and any less useful for system integrity? Putting everything from brew under /opt follows UNIX conventions perfectly well, certainly more than using snaps in Ubuntu for basic command-line utilities. And installing whatever you want on macOS is just as easy as it is on Ubuntu.

This sort of complaint just gets so boring and detached from reality. I'm not saying that you don't use macOS, but it reads like something from someone who couldn't possibly be using it day-to-day. For me it's a great compromise: an operating system where I can do anything I would do in Linux with just as much ease, if not more, but also not have to provide tech support for my elderly parents.


I wouldn't mind in the least if it were a matter of using sudo. That's a logical elevation of privileges, and macOS already does this at points, asking you for your password (which, if you are an administrator, is basically running sudo for you). These warning messages and the locking down of the /usr hierarchy (even with sudo) are different: they aren't asking for more access but merely spreading FUD about open-access software (yes, you can use brew if the program you want is in it, but that is just adding another garden, even if less walled, and it works because someone in the Homebrew project is signing the binaries).

I have used UNIX/Linux on a daily basis for over 30 years, and OSX/MacOS daily for over 15 years. I know how UNIX systems work and where things traditionally are located. And until a few years ago MacOS was a reasonable UNIX that could be used more or less like a friendly UNIX system -- but it is becoming increasingly less so.


Having to go into the terminal to run xattr in order to remove the quarantine attribute is a lot to ask of a non-technical user.


You are moving the goalposts. Not only are there some "security" features that you can't disable and that are of dubious actual usefulness, like the sealed system partition, but they make it much harder to actually hack around the system and modify stuff as you see fit. It has also complicated the installation and use of a range of software, which is more annoying than it should be.

The openness and freedom to modify it like an open UNIX was a major selling point; losing all that for "security" features that mostly appeal to corporate buyers is not a great trade. Those features also need to be proven useful, because as far as I'm concerned it's all theory; in practice I think they are irrelevant.

The notification system is as annoying and dumb as in iOS, and the nonstop "security" notifications and password prompts are just a way to sell you on the usefulness of biometrics; which Apple, like the big morons they are, didn't implement as Face ID in the place where it made the most sense to begin with: laptops/desktops. Oh, but they have a "nice", totally-not-useless notch.

Many of the modern apps are ports of their iOS versions, which makes them feel almost as bad as webapps (worse, if we are talking about webapps on Windows), and they are in general lacking in many ways, both from a feature and a UI standpoint.

Apple Music is a joke of a replacement for iTunes, and I could go on and on.

The core of the system may not have changed that much (well, except that your data is less and less accessible, forcibly stored in their crappy, obscure iCloud folders/dbs with rarely decent export functions), but as the article hinted very well, you don't really buy an OS, just like nobody really buys solely an engine. A great engine is cool and all, but you need a good car around it to make it valuable, and this is exactly the same for an OS. It used to be that macOS was a good engine with a great car around it, in the form of free native apps that shipped with it or third-party ones. Nowadays, unless you really need the benefits of design/video apps heavily optimized for Apple platforms, it increasingly is not a great car.

The apps around the system aren't too bad, but they are very meh, especially for the price you pay for the privilege (and the obsolescence problem already mentioned above).

It's not really that macOS has regressed a lot (although it has in some areas, in the iOSification process), but also that it didn't improve a whole lot while the price and other penalty factors increased a lot.

But I doubt you can see the light, you probably are too far in your faith.


These are great features IMO, as a unix savvy 'power user.'

A system should be heavily locked down and secure by default unless you really know what you are doing and choose to manually override that.

Modern MacOS features add an incredible level of security- it won't run non-signed apps unless you know what you're doing and override it. Even signed apps can only access parts of the filesystem you approve them to. These things are not a hassle to override, and basically make it impossible for hostile software to do things that you don't want it to.


I stopped using it a few years ago, but IMO it was definitely being dumbed down and no longer respecting users. Things like upgrades resetting settings that I went out of my way to change; Apple has a "we know better than you" attitude that's frustrating to work around.


Which settings? I am a long term MacOS (and Linux) user and have not noticed such problems.


The "Allow Apps from Anywhere" setting, amongst others.

Linux works better for me, anyway.


Just using it at a barely advanced level for 20 years or so, as I do, the other comment was correct in that it is the changes that have seemingly been made to make it more familiar to iOS users and "idiot proof".

Mainly the slow hiding of buttons and options and menus that used to be easily accessible, and now require holding a modifier key, or re-enabling in settings, or using the terminal to bring them back.


Off the top of my head:

- The settings app is now positively atrocious, "because iPhone"

- SIP is an absolute pox to deal with.

- "Which version of Python will we invoke today" has become a fabulous game with multiple package managers in the running

- AppCompat games.

- Continued neglect of iTunes (which is now a TV player with an "if we must also provide music, fine" segment added, but it still thinks it should be the default client for audio files)

- iCloud wedging itself in wherever it can

Yes, all of those can be overcome. That's because the bones are still good, but anything that Apple has hung off those bones since Tim Cook took over is at best value-neutral, and usually adds a little bit more drag with every new thing.

Don't get me wrong, I still use it - because it's still decent enough - but there's definitely a trajectory happening.


Settings - I preferred the rectilinear layout, but I don't see why making it linear makes it atrocious.

If you don't want SIP, it will take you a few minutes to reboot into Recovery and switch it off permanently with csrutil disable (or perhaps until the next OS upgrade). This is really the only one in the list which has to be "overcome", and personally I think that SIP enabled by default is the right choice. Anyone who needs SIP disabled can work out how to do that quickly, but it is years since I've had a reason to do it even temporarily, so I suspect the audience for this is small.

Multiple package managers and Python: that sounds like a problem caused by running multiple third party package managers.

If you want games, x86 or a console is the preferred choice. An issue for some, decidedly not for others. I'd much rather have the Mx processor than better games support.

iTunes - I can't comment, I don't use it.

iCloud - perfectly possible to run without any use of iCloud, and I did for many years. I use it for sync for a couple of third-party apps, and it's nice to have that as an available platform. It doesn't force its way in, and the apps that I use usually support other platforms as well.


OS X started going downhill as soon as they replaced Spaces and Exposé with Mission Control.


Consolidating Spaces and Exposé is not one of the things they did that hurt Mac OS X.


I could not disagree more strongly.

Merging them, and more importantly taking Spaces from a 2D array of desktops to a 1D array of desktops, ruined it substantially.


> There was a competing bicycle dealership next door (Apple) that one day began selling motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery.

Neal said the essay was quickly obsolete, especially in regards to Mac, but I'll always remember this reference about hermetically sealed Apple products. To this day, Apple doesn't want anyone to know how their products work, or how to fix them, to the point where upgrading or expanding internal hardware is mostly impossible.


The difference between Apple and IBM is that the latter lost control of their platform, and those who inherited it, arguably Intel and Microsoft, had no interest in exerting absolute control. (If they had tried, it likely would have backfired anyhow.)

As for Apple, their openness comes and goes. The Apple II was rather open, the early Macintosh was not. Macintosh slowly started opening up with early NuBus machines through early Mac OS X. Since then they seem to be closing things up again. Sometimes it was for legitimate reasons (things had to be tightened up for security). Sometimes it was for "business" reasons (the excessively tight control over third-party applications for iOS and the incredible barriers to repair).

As for the author's claims about their workings being a mystery, there wasn't a huge difference between the Macintosh and other platforms. On the software level, you could examine it at will. At the hardware level, nearly everyone started using custom chips at the same time. The big difference would have been IBM compatibles, where the chipsets were the functional equivalent of custom chips yet were typically better documented, simply because multiple hardware and operating system vendors needed to support them. Even then, by 1999, the number of developers who even had access to that documentation was limited. The days of DOS, where every application developer had to roll their own hardware support, were long past. Open source developers of that era were making a huge fuss over access to documentation to support hardware beyond the most trivial level.


I think it's still quite relevant. There are still people who enjoy a more DIY OS. Nowadays they use Arch or something. That doesn't make them any better or worse than anyone else. Some people enjoy tinkering with their OS, other people just want something that Just Works(tm), and there is nothing wrong with that.

There is an implicit superiority in the text which is just as cringey now as it was at the time, but I think it's still a good analogy about the different preferences and relationships different people have with their computers.


> I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."

IME he hates to revisit anything he's already written so this claim is polite but implausible.


"Obsolete" is too strong a word, I think. OSX isn't an evolution of the Macintosh's operating system; that'd be Pink, which was even mentioned, and it crashed and burned. OSX was far closer to a Linux box and a Mac box on the same desk, so the only change really needed is to replace mentions of Unix, or specifically Linux, with Linux/OSX as far as the points of the piece are concerned. If Jobs had paid Torvalds to call OSX "Apple Linux" (or maybe just called it Apple Berkeley Unix) for some reason, this would be moot.

I also primarily use Windows and don't have a dog in the fight you mentioned. I might actually dislike Linux more than OSX, though it has been quite a while since I've seriously used the one-button OS.


> OSX was far closer to a Linux box and a Mac box on the same desk

Setting aside the "more BSD/Mach than Linux" point, OS X pressed a lot of the same buttons that BeOS did: a GUI system that let you drop to a Unix CLI (in Be's case, POSIX rather than Unix, if we're going to be persnickety), but whose GUI was sufficiently complete that users rarely, if ever, had to use the CLI to get things done. Folks who love the CLI (hi, 99% of HN!) find that attitude baffling and shocking, I'm sure, but a lot of people really don't love noodling with text-based UIs. I have friends who've used the Mac for decades -- and I don't mean just for email and web browsing, but for serious work that generates the bulk of their income (art, desktop publishing, graphic design, music, A/V editing, etc.) -- who almost never open the Terminal app.

> though it has been quite a while since I've seriously used the one-button OS

Given that OS X has supported multi-button mice since 2001, I certainly believe that. :)


> Given that OS X has supported multi-button mice since 2001, I certainly believe that. :)

And since MacOS 8 before that...


>Given that OS X has supported multi-button mice since 2001, I certainly believe that.

Just a joke, mate.


macOS shares zero lineage with Linux, which itself shares zero lineage with the UNIX derivatives. It would make zero sense for Apple to call macOS "Apple Linux" when it doesn't use the Linux kernel. Mac OS X is closest to a NeXTSTEP box with a coat of Mac-like polish on top. Even calling it "Apple Berkeley Unix" wouldn't make sense, because the XNU kernel is a mishmash of both 4.3BSD and the Mach kernel.

Linux and the UNIX derivatives are not even cousins. Not related. Not even the same species. They just both look like crabs, a la https://en.wikipedia.org/wiki/Carcinisation.


I used to have a coworker, a senior dev of decades of experience, who insisted that MacOS was a real Linux, "just like BSD". Sigh.

Of course, this belief probably had no downsides or negative consequences, other than hurting my brain, which they probably did not regard as a significant problem.


The crabs analogy isn't a good one, because they evolve independently to play well in a common environment. GNU/Linux is a rewrite of Unix that avoids the licensing and hardware baggage that kept Unix out of reach of non-enterprise users in the 80s and early 90s.


Your statement seems very strong. Is Mac OS X not based on Darwin? Are you defining "lineage" in some way to only mean licensing and exclude the shared ideas (a kernel manipulating files)? Thanks for the "carcinisation" link.


To be more precise: this simplified history suggests a shared "lineage" back to the original UNIXes of all of BSDs (and hence Darwin and OSX) and of the GNU/Linux and other OSes https://unix.stackexchange.com/a/3202

So, "zero shared lineage" seems like a very strong statement.


>OS X is closest to a NeXTStep

This is already in the piece. Why waste time repeating it?


Because the person they replied to used a term like "Apple Linux", which is ridiculous. So somebody on the internet was wrong and needed to be corrected.


> somebody on the internet was wrong and needed to be corrected

https://xkcd.com/386/


I used it, and on purpose. If you missed why, you should go back and read again before replying.


Apple was in some ways on the right track with an OS that had no command line. Not having a command line meant you had to get serious about how to control things.

The trouble with the original MacOS was that the underlying OS was a cram job to fit into 128 KB plus a ROM. It didn't even have a CPU dispatcher, let alone memory protection. So it scaled up badly. That was supposed to be fixed in MacOS 8, "Copland", which actually made it out to some developers. But Copland was killed so that Apple could hire Steve Jobs, for which Apple had to bail out the NeXT failure.


Yes, I, too, have read the OP before.


Linux is not just another word for Unix or Unix-like and Mac OS X/macOS has never used nor shipped with the Linux kernel.


I feel differently now. Linux is not that much more powerful than other OSes. It felt more like a tractor 10 years ago, and now it is a good alternative to the "Toyota Camry", with a pretty good user experience.

Meanwhile, Windows has become one of those cars with two 27" screens as a dashboard: bad user experience and full of advertisements.


The actual comparison is that Vista was the Ford Edsel (a hilarious and widely mocked failure) but 7 was the Comet (a huge hit).

It's a close analogy, because the Comet was actually the next model of Edsel; it was sold under new branding (eventually as the Mercury Comet). Same with Vista to 7.


Windows XP would probably be an old 1970s Volvo that is still, somehow, running in all kinds of places otherwise forgotten by the waking world.



Mostly in Oregon, Northern CA, and parts of Washington. But parts are increasingly hard to find.


Mostly in all of Sweden, all of Finland, probably still quite a lot in Denmark, but perhaps not so many in Norway any more (since they are rapidly electrifying[1] the entire national car fleet).

___

[1]: Ironically, the heavy state subsidies for buying an electric car are, like anything else in Norway, financed by... Oil money.


OS/2 also! :D


The way I explained my continued loathing of Microsoft to a friend was that back in the day, it was like having two choices of car: a Ford Pinto, or a pickup truck. Not everybody needed the heavy transport capabilities of the pickup, but their only choice if they didn't want the added bulk and fuel consumption was the Pinto, which had dangerous failure modes. Later, the Pinto would be withdrawn and replaced with a Ford Taurus, a serviceable but not particularly fun or performant vehicle which worked fine usually, but was based on a transaxle design, so when it did break you had to dismantle the entire front end of the vehicle in order to repair it. And now imagine that there were roads you couldn't drive on and places you couldn't go unless you had one of these three vehicles, simply due to Ford's market pressure on infrastructure planners, not for any good reason; and besides, people mocked you for wanting something else.


I didn't know myself, let alone computers, back then, but does anyone know why Vista and Windows 8 got such bad reps?


They were given "tablet friendly" user interfaces that made navigation extremely difficult. But Microsoft never finished the job, so you'd try to find something in the stupid sliding-screen interface only to be thrown into a Windows 3.1-era control panel. Windows Server of that era was even worse: mostly you administered it over RDP, which never played nice with the "hover here to bring up the whole-screen menu" gesture required to drive it.

Underneath it was just Windows, but the interface ruined it


I'm one of the few people who thought Vista was fine. It mostly got a bad rap because of bad drivers and high system requirements.


Vista significantly changed the security model, introducing UAC and requiring driver signing. This caused a bunch of turmoil. It also had much higher system requirements than xp for decent performance, meaning that it ran at least a bit shit on many people's machines when it was new, and that did it no favors.


> recent Acquired episode on Microsoft, where Vista was a Dodge Viper but Windows 7 was a Toyota Camry, which is what users actually wanted.

I assume if anyone associated with Microsoft compared Vista to anything other than an abject failure, it's because they are - at best - broken or defective people who were involved in the creation of Vista, and therefore not objective and not to be trusted in any way.

Dodge Viper? WTF?


Dodge Vipers get a bad rep even in Need for Speed, lol, but I don't know enough about cars to say why


The first generation or two of Vipers were very raw & unrefined, with ~400 horsepower, no stability control, no anti-lock brakes, minimal safety features, cheap & janky looking interiors, etc. It sounds great to me (aside from the lack of a/c & nice ergonomic interior), but it would be extremely easy to have a bad time if you're an average driver.

I'm not sure about the analogy though, they might have been thinking of later Viper versions where the complaints would be more about cost, gas mileage, or general impracticality for daily use.



One major advantage of the CLI is that instructions/fixes etc are very concise and can be easily communicated. If someone has a Linux system that needs a known fix, then it's trivial to send someone the commands to copy/paste into a terminal. However, if there's a known fix for a graphical program, then it suddenly becomes much harder to communicate - do you go for a textual instruction (e.g. click on the hamburger menu, then choose "preferences"...), or a series of screenshots along with some text?
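For example, a fix for garbled non-ASCII terminal output might travel as two pasteable lines (the locale value here is purely illustrative; any installed UTF-8 locale would do):

```shell
# Pasteable fix: point the shell at a UTF-8 locale for this session.
export LANG=en_US.UTF-8
export LC_ALL=en_US.UTF-8

# Confirm the setting took effect.
echo "$LANG"
```

Two lines of text versus a page of annotated screenshots, and the recipient can't "click the wrong thing".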


I think your trailing question is a hypothetical, but to answer:

Screenshots along with text, then use Microsoft Paint to mark up the screenshots, for example by circling the appropriate menu option with a thick red line. Sadly, I do not know how to graphically convey the double-click operation.

It's a time-consuming and error-prone process, best used if the buggy GUI application is abandonware. Otherwise, the new version will have the same bug or non-intuitive process, but the GUI will be unrecognizable. Probably because some manager, somewhere, read too many HN articles and decided a user-facing refactor was the most important thing they could possibly do.


You raise a good point about GUIs changing over time. Obviously, CLIs can change, but that happens less often and is usually in response to a poorly thought out first attempt at providing functionality (or when additional functionality is bolted on).


Scripts are UI glue in more than one sense.

The usual interpretation is that scripts glue together system commands that don't otherwise incorporate one another. This is tremendously useful and is by itself a huge advantage over GUIs.

But the second sense is that vast libraries of scripts keep developers bound to their previous UI promises. You don't capriciously change command-line options without raising all kinds of ire. I've seen GNU tools (tar comes to mind) in which options have been "deprecated" for decades, yet are still available. I seem to recall a very small number of cases where options had their senses entirely inverted, to much protest, and ultimately abandonment of those tools. In other cases you've got generations of programs which support backwards-compatible options with precursors: exim and postfix with sendmail, ssh with rsh, alpine with pine, off the top of my head. Developer headspace and script compatibility are valuable.

On that note, whilst I find the *BSD's extreme conservatism annoying myself, I can appreciate that it does lead to increased consistency over time as compared with GNU userland.


There's also a lot of text based tools that are designed to be used as part of a script and don't provide much functionality in themselves - the philosophy of do one thing (and do it well). GUIs tend to be designed to provide the entire functionality that's wanted and little thought is given to them being used as part of a toolchain.
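A minimal sketch of that philosophy: four single-purpose tools, none of which knows anything about the others, composed into an ad-hoc word-frequency counter.

```shell
# Word-frequency counter built entirely from single-purpose tools.
printf 'the cat sat on the mat the end\n' |
  tr ' ' '\n' |   # split: one word per line
  sort |          # group identical words together
  uniq -c |       # count each group
  sort -rn        # most frequent first ("the" appears 3 times)
```

No GUI equivalent exists: you'd need someone to have anticipated this exact feature and built it in.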


Right. CLIs / TUIs offer composability. GUIs don't.

If someone makes a GUI that does specifically what you want, you're in luck. If not, you're S.O.L.

With CLIs, you can almost always find some way to string things together to get what it is you want.

The hard case is working in domains which aren't inherently textual: graphics, audio, video, etc. Even there, you can often get much more accomplished from the terminal than you'd think, though setting up the process can be challenging at times. Not so useful for one-offs, but invaluable for repetitive processes.


The correct response to an API change is to re-version the API for the library, e.g.:

bsdtar -> libarchive -- Arguments as presently known

bsdtar2 -> libarchive -- Some new overhaul that does things way differently


Sure, for APIs or CLI tools.

Doesn't work so well for GUIs, which was sort of the original point.


Since CLIs are used in scripts and automation, there's much more pressure for them not to change. Essentially, they become an API, so breaking changes are a big deal. GUIs, on the other hand, are very difficult to automate, so they can't be used that way even if people wanted to.

As an aside, I think a great foundation for a GUI application is a "server" API that the GUI interacts with. You can automate a lot of testing this way and you can give power users the ability to automate workflows. Web apps are already built this way, but if you're making a standalone app you even have the privilege of making a very chatty API, which makes testing even easier.


> As an aside, I think a great foundation for a GUI application is a "server" API that the GUI interacts with. You can automate a lot of testing this way and you can give power users the ability to automate workflows.

An idea already implemented to varying degrees of success in places like BeOS, Haiku (operating system), and to a much lesser extent, AppleScript! You could also throw in the COM interop of Windows and OS/2.


On windows, steps recorder can be useful. https://support.microsoft.com/en-us/windows/record-steps-to-...


Agreed. Poor scripting/replay inherently limits the GUI.

That said, the best late 90s expression of the core advantage of the GUI over the TUI/CLI is that it demands less of the user:

"recognize and click" vs

"remember and type"

That seems very fundamental to me.

I have not seen as succinct an expression of the tradeoffs for V(oice)UIs or L(LM)UIs


The 90s UIs all had 'hints' for how to activate UI features using keyboard shortcuts.

Want to know how to copy and paste quickly? It's Right There in the menu where you found the action. Don't know that yet? Alt+E (the E underlined in Edit) plus another key jumps to the action in the list; or, now that you can see the list, abort the command sequence with Esc and start memorizing the shortcuts shown there: Ctrl+C and Ctrl+V.


How about the new kid on the block here, the chat interface? Neither "recognize and click" nor "remember and type", esp. if done over voice.

Maybe the chat interface does away with the first half of the GUI/CLI schemes, skipping over learning the affordance part of the interface.


This seems to deny that any sequence of actions over a bounded space of entities (named widgets, thus type:id) and actions can be mapped to another 'representation' (e.g. text):

   { select[i]@dropdown:states > click@button:submit }
The fact that we don't have this (yet) does not mean it is not possible. In fact, given that the current darling of tech LLMs can 'compute visually' based on tokens (text) should make it clear that any 'representation' can ultimately be encoded in text.

So a 'record' feature on GUI 'a' can create action encoding that can be emailed to your peer looking at GUI 'b' and 'pasted'.


I really wish one of the little GUI frameworks would develop this, but alas, the evidence is that "nobody" actually wants this. As evidenced by things like AppleScript, which existed and died.

I feel like it could be done if it was really a goal from day one, and there were things like "record this set of actions as a script" built into the toolkit. Even Applescript was still an afterthought, I think, albeit a very well supported one.

In the meantime, given the comprehensive failure of UI toolkits to support this style, it is completely fair for people to act and speak as if it doesn't exist.


MacOS kind of had this with AppleScript, including recording. It's a little disappointing that it isn't widespread now but I realize that demand for GUI automation is extremely niche.


AppleScript is so clunky and annoying. I have a few AppleScript launchd functions on my Mac for some personal projects, and only because Apple forces you to go through things like AppleScript to use things like the system notifications. It was a pain, as the documentation for AppleScript is just so poor, clearly neglected by everyone at Apple even when they built the tool; it's not as if good documentation only came about in recent years of programming. Then again, most of what Apple documents is crap. The launchd documentation isn't any better, and I get undefined behavior from it that I can't make head or tail of (basically functions working fine when run as a user but failing silently when the same functions are run through launchd). I'm guessing it's some permissions-related issue between launchd running as a certain privileged process and maybe the limits of their System Integrity Protection, but I can't be sure because the documentation is so poor. What's even worse is that far fewer people use tools like launchd than use systemd, so the various Stack Overflow answers and blog posts you could consult for the more popular tools just aren't there for the Apple tooling.


Both Applescript and Automater are still around, and still provide recording.


Not only is it good for sharing but also for automation in scripts. Imagine managing an IT environment where you have to click around the GUI on each and every workstation, versus what IT professionals actually do: write a configuration script that performs all their setup steps, which they can push to every networked workstation from their office with a single keystroke.

To do that in a GUI-centric fashion, the only tools we actually have are textual commands that direct GUI actions. Essentially it's a middleman step: if we are already writing our AppleScript, say, we might as well just issue directly the underlying commands that the AppleScript is trying to trigger through the GUI.


Or even write a script which is just those commands. Meanwhile, every step a user has to perform is an opportunity to cause a new and different problem.


To be (re)read together with The Unix-Haters Handbook https://web.mit.edu/~simsong/www/ugh.pdf to realize that what we need is remade LispM or Smalltalk workstations: the OS as a single application, a framework open down to its lowest levels, in the user's hands, fully end-user programmable, discoverable, and fully integrated. A 2D, or even 3D/4D, CLI as the UI; that is, DocUIs with eventual 3D and video/multimedia elements.

As a conceptual framework http://augmentingcognition.com/assets/Kay1977.pdf


Riotous applause from the cheap seats


Thanks :-) but allow me to expand, in my poor English: "we" Westerners descend from those who invented IT and modern industry. We are evidently in a declining phase. Evidently others are trying to emerge, and the most prominent new power, China, is emerging by doing what we did when we were at the top of the world in industrial, public-research, and educational terms. Meaning that such a "technical model" works.

Evidently financial capitalism worked for a certain period but does not work anymore. So why keep committing suicide? We started with WWI, kept going with WWII, and continue now.

We are still the leaders in IT, and we know what works: the classic FLOSS, or at least open, IT model; the one where you sell iron, not bits; the one where customers own their systems and bend them as they wish; the one where communication is not owned by walled gardens but open, as on Usenet and classic mail (as opposed to webmail, which hides the decentralized capability from most users, who do not own the web MUA). To continue the China comparison: I recently bought some battery tools, very cheap stuff but enough for domestic use, and I observed that the batteries have a standard connector. I can swap them between different brands without issue; batteries and chargers are "standard". I also own some high-end domestic battery tools, which happen to have a damn NFC tag inside the battery tying it to the device, even though inside the battery are ordinary connected li-ion cells. I observed the same with BEVs: some friends and I have three Chinese BEVs from different brands/models, and they have a gazillion common parts. So to say, "open/standard pays back": yes, it might erode SOME OEMs' margins, but it pays back society as a whole and, as a result, the OEM itself.

The same is valid in software terms. My desktop is Emacs/EXWM; I use consult/orderless/vertico as a search-and-narrow framework, and they simply plug into any other package, because the system is a single application, end-user programmable at runtime. You do not need to "specifically support" a package to integrate it, and you do not need third parties to explicitly support your package to use it with theirs. And what do we do instead? Our best to create walled gardens. We had XMPP and SIP/RTP, and now most people are on Zoom/Teams/Meet/Slack/Skype/*, all walled gardens. Many even push to substitute email with some new colorful walled garden. Similarly, every app tries to add features some other app has, since it's impossible to just use them together, like a sexp in a buffer.

As a result, modern IT has gone from Unix, where the user could at least combine simple tools with some IPC in a script, to being limited to cut & paste, drag & drop, and not much more. Everything is terribly complicated because there are no shared data structures in a shared environment: every "app" does everything on its own, often with a gazillion dependencies that each have a gazillion dependencies of their own, with a big load of parsers of every kind; and even the "standards for passing data" that try to emerge (like JSON) are still not a shared data structure in a single integrated environment.

All of this is the old Greenspun's tenth rule at work, and it is killing our IT innovation, killing one of the last sectors we still lead, for essentially the same reasons that killed our industry.


I 100% agree, and your points about Chinese tools are particularly incisive.

As an aside, but I think relevant and you might find it interesting:

A decade or so I discovered Oberon, the last masterwork of the great genius of programming languages Niklaus "Bucky" Wirth. A complete OS, UI and compiler, in about four and a half thousand lines of code.

I have it running in various forms.

I introduced it to the Squeak Smalltalk community, and when I told them what I was looking for:

« a relatively mature, high-performance, small OS underneath, delivering some degree of portability -- something written in a type-safe, memory-managed language, with multiprocessor support and networking and so on already present, so that these things do not have to be implemented in the higher-level system. »

That is how I found Oberon. They told me such a thing did not and could not exist.

I gave them some links and people were amazed.

It seems deeply obscure in, as you say, the West.

But I have found an active community working on it and developing it. It is in Russia.

It may be that in other countries now the target of Western sanctions, we may inadvertently be fostering some very interesting tech developments...


I know Oberon only by name. I used (and honestly did not love) some Pascal dialect back in high school, but that was not real programming, and the introductory course was very badly organized, so it's hard to tell. I probably encountered Oberon in a river-navigation application around 10 years ago, but I wasn't really involved, so I can't say much; essentially I know nothing but the name. If you have some interesting links to share, I'll skim them with pleasure.

In broader terms, I do not pay much attention to any specific programming language, even if an OS-as-single-application is clearly tied to one; there are many languages, and many factions loving one and hating the others. The point is offering something usable at the user level, like "just type a sexp and execute it", even from an email, because integration also means an immense diversity of small pieces in heavy dialogue, and so innovation. With such a model we can keep our supremacy, and more importantly we cannot lose it, because essentially everything floats in a common sea.

The main obstacle to such a goal, I think, is the general culture of the masses. Today most people, many programmers included, think that IT means computers, which is like saying that astronomy is the science of telescopes. Of course computers, like pen and paper, are an essential tool, but they are a tool; the purpose of IT is information, and that is not a narrow technical task but a broad concern involving essentially all disciplines. Until this is clear to everyone, there is little hope that people will understand the need for, the power of, and the issues with IT; they will keep looking at the finger pointing at the Moon instead of at the Moon.

The rest comes after. Even the most basic computer skills come after, because to learn we need to be motivated, and learning "just because you have to", as a social rule, is not productive.


> if you have some interesting links to share I'll skim them with pleasure.

I do not advise skimming.

I've been a full-time tech journalist for 2 & a half years now (I was in the 1990s as well but the 21st century is very different) and I find the majority of readers who angrily disagree with my articles did not in fact understand the article because they tried to skim it and they didn't get the gist.

(In a previous job I was a TESOL/TEFL English teacher. "Skimming for gist" is a skill we test for, and many people don't have it and don't know they don't have it. I am not accusing you here, but you did mention your own English in negative terms.

For example, I was at a talk at FOSDEM in February -- https://fosdem.sojourner.rocks/2024/event/3113 -- and it seemed to me that most of the audience angrily arguing about what the GPL meant and implied had not genuinely read and understood all 6 pages of the GPL.)

Executive summary of Oberon:

https://ignorethecode.net/blog/2009/04/22/oberon/

13 page academic assessment, but very readable and accessible:

"Oberon – The Overlooked Jewel" https://dcreager.net/remarkable/Franz2000.pdf


> I do not advise skimming

That's very right, but modern life is complicated, so studying something properly demands much time, and quickly seeing the concepts can help. The textual-UI concept is definitely not alien to me, since my desktop is EXWM, with almost all my digital life in org-mode and org-roam-managed notes. It's still very different from the Oberon (or Plan 9) desktop, but the textual concept, and org-mode links that can execute sexps on click (a feature I use a lot, for instance to link specific mails/threads in notes and to create interactive presentations), are similar. The 2D "spaced" desktop concept is something I saw in the far past in Sun's Looking Glass (LG3D) concept desktop, https://en.wikipedia.org/wiki/Project_Looking_Glass. And yes, while the links above help, they still can't tell me what's inside the package, meaning what's behind the UI concept and the language grammar; I still miss the architecture.

However, from what I've seen so far, I suppose it's not really usable in real life, so it's a nice-to-know project but stops there, like Lisp Machine Genera or Plan 9. Emacs at least can be used for real today. It's sad that the IT industry has pushed what I call the glorification of ignorance, but I think we can do no more than preserve knowledge for a more civilized world.

Over the last decades most of the old valid ideas have eventually been accepted anyway. For instance, widget-based UIs have essentially failed and are more and more replaced by web UIs, which are read-only DocUIs, or notebook UIs, which are limited 2D CLIs, something close to a DocUI; they are mostly text-based as well. So maybe in 10+ years we will finally have something like an Oberon or LispM desktop, surely returned with many anti-user aspects but still offering something of the past glory, and these memories will help keep correcting the aim and reducing the loss. Anyway, until people realize the substantial importance and role of IT, there is little hope for a more civilized era...


I really do feel your pain. ;-)

No, it's not viable as a general-purpose OS these days. At one time it was and was deployed to non-technical staff inside ETH.

The last development in the line, not from Wirth himself, has a zooming GUI, resizable overlapping windows, SMP, a TCP/IP stack, an email client and a very basic HTTP only web browser. It is closer than you might expect.

I believe the core OS is on the order of 8000 LOC.

You may enjoy my FOSDEM talks if you're interested in this kind of thing.

I did one involving rebooting the local OS stack based on Oberon and Smalltalk, or maybe Newspeak:

https://archive.fosdem.org/2021/schedule/event/new_type_of_c...

I turned it into an article recently:

https://www.theregister.com/2024/02/26/starting_over_rebooti...

And this year a more Linux centric one based around 9front:

https://fosdem.org/2024/schedule/event/fosdem-2024-3095-one-...

That became an article series:

https://www.theregister.com/Tag/One%20Way%20Forward/


> That is how I found Oberon. They told me such a thing did not and could not exist.

Long ago:

https://blackbox.oberon.org/

https://github.com/excelsior-oss/xds


Thanks for the links!


> Buyer: "But this dealership has mechanics on staff. If something goes wrong with my station wagon, I can take a day off work, bring it here, and pay them to work on it while I sit in the waiting room for hours, listening to elevator music."

Bullhorn: "But if you accept one of our free tanks we will send volunteers to your house to fix it for free while you sleep!"

Did Linux distros actually offer support at some point? (By what I assume would be some project contributor ssh-ing into your machine)

My impression was always the arguments were more like "Well yes, but we have this literal building full of technical manuals that describe every bolt and screw of your tank - and we can give you a copy of all of them for free! And think about it - after you have taken some modest effort to read and learn all of them by heart, you'll be able to fix and even modify your tank all on your own! No more dependence on crooked car dealers! And if you need help, we have monthly community meetups you can attend and talk with people just as tank-crazy as you are! (please only attend if you're sufficiently tank-crazy, and PLEASE only after you read the manuals)"

(This was decades ago, the situation has gotten significantly better today)


> Did Linux distros actually offer support at some point?

Ever wonder how Red Hat became a billion-dollar company before it was bought by IBM, and now makes up a huge segment of IBM's revenue stream?

Have you noticed SuSE is still around?

Have you ever speculated on how Canonical keeps its lights on?

Paid support, my naive friend. Linux support is big business and is what keeps the popular distros alive.


Arch Linux seems to be an exception.

https://wiki.archlinux.org/title/Arch_Linux

> Arch developers remain unpaid, part-time volunteers, and there are no prospects for monetizing Arch Linux


Good point!


> Did Linux distros actually offer support at some point? (By what I assume would be some project contributor ssh-ing into your machine)

I don't think that was the intended implication. I think the analogy is more akin to: "If you send us a bug report, we'll fix it and ship a new version that you can download and use for free." In the olden days, you'd have to buy a new version of commercial software if it didn't work on your machine; complimentary patches were rare.


Depends on where you lived. In the early days there were lots of LUGs in some areas, usually around colleges, but some would be hosted at a few companies.

I think DEC had one or two. And you could find someone who would meet you somewhere to help you out, it was an exciting time. Also there were lots of install fests for Linux.

Most activity took place on USENET, so getting help was rather easy.

For example, I had asked how I could connect 2 monitors to my 386SX, one controlled by a VGA card, the other via a mono-card, each monitor with a couple of VTs. That was doable with Coherent on install. A day later I got a patch.

Things moved very quickly back then :)


"By what I assume would be some project contributor ssh-ing into your machine"

Who would want that?

"Stay away from my house, you freak!" would be the normal reaction. Unless some serious trust is developed, I would not let people into my house while I sleep.

Also the actual usual reaction would have been more like: "hey it is open source, you can fix anything on your tank yourself"

You need a new module to connect with your other devices, just build it yourself, no big deal!


I know you shouldn't look too closely at analogies, but I think there is an interesting inconsistency in the "car dealerships" story:

The tank people offer to send someone to look into the car (rsp. tank) but the buyer rejects them from entering their house.

That's significant, because a car is much less private than a house. In the real world, if my car had an issue, it would be perfectly reasonable to give it into the hands of a mechanic, even if I don't know them personally. (And evidently the reputation of the dealership isn't the deciding factor either, otherwise all the independent repair shops wouldn't exist)

On the other hand, I'd be much more wary to let strangers into my house without supervision, because I have far more private and valuable possessions there than in my car.

So the question is whether computers are more like cars or like houses. I'd argue, they sort of blur the line and have definitely moved closer to "house" in the last decades. But it might have been different back then.


That nitpick over car/house analogies is rapidly breaking down as cars have increased compute and storage themselves, and integrate with other personal electronics, most especially "smart" phones.

Some recent discussion on the border phone search ruling:

<https://news.ycombinator.com/item?id=41084384>


I'd say the reputation of dealerships is why independent repair shops exist. :)


Obviously a computer is like an RV.


It might also be a bus.


Companies want that when prod stops.


Reminded me of an extreme case. In 2012, the largest Russian bank brought its prod back online, and then posted literally this:

> We invite specialists to take part in a professional discussion to identify the causes of Oracle DBMS failure

> System logs and a more complete description of the situation will be posted in the blog. To participate in the discussion, you must fill out the registration form and wait for an invitation to your email.

https://web.archive.org/web/20120716225650/http://www.sbrf.r...


In case your browser isn't okay with the long lines:

wget -O - https://web.stanford.edu/class/cs81n/command.txt | nroff | less


I love that you can do that with the text.


My favourite part explaining how unix/linux users feel regarding windows: 'THE HOLE HAWG OF OPERATING SYSTEMS'.

This essay should be mandatory reading for all CS students and anyone wanting to call himself a hacker.


I teach data analytics and it’s required reading in the first week. It doesn’t soften the blow of throwing people into the CLI, but it provides perspective on why I do.


Just tell them it's the 'chat interface' to the computer.


Yes, and the part about Reagan.


I read this during my university graduation ceremonies. It was hidden in a fold in my robes. It was the most fitting thing I could have done, as I immediately changed my life direction and focused on exactly the ideas outlined in this work. My goal: move the world from the command line. I've almost managed to.


I like the problem to the move away from IP Chains.

We should move to the Command Table.


I just wrote a command-line interface for LLMs in (almost) pure Bash[1]. I'm optimistic about the future of LLMs because of the points in this article: people talk to LLMs the same way we talk to CLI shells, and everything is plain-text based (it's the Unix philosophy!). I should have read this earlier to get more ideas before writing the CLI client.

[1]: https://github.com/simonmysun/ell


Does this win the HN prize for Most Often Reposted Article? It's been posted on average twice a year for 17 years: https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...


I'm pretty sure it's far from #1 (consider "Story of Mel", "You and Your Research", and many others), but I don't have that list or even know how to create such a list, as it's perturbed by variations in both titles and URLs.


I’ve got this in soft-cover. I think I read it back before the turn of the millennium. As a BeOS aficionado I loved the reference to batmobiles.


Yes. Cults are real. At the same time very telling. Some people really believe they are better because they have a different machine than someone else. And that shit runs deep.


I still remember one of the quotes from The Metaphor Shear chapter, where he tried to explain how he felt after some of his files on his Mac got corrupted to such degree that he just couldn't recover: "It was sort of like watching the girl you've been in love with for ten years get killed in a car wreck, and then attending her autopsy, and learning that underneath the clothes and makeup she was just flesh and blood."


Using plain $LANGUAGE to communicate with LLMs is an interesting contender, which is even easier for end users than GUIs and even less precise and is further removed from the underlying computations. In spirit, it is definitely a step in the same general direction, which makes a lot of formulations in this article sharply prescient.


> Yet now the company that Gates and Allen founded is selling operating systems like Gillette sells razor blades. New releases of operating systems are launched as if they were Hollywood blockbusters, with celebrity endorsements, talk show appearances, and world tours.

I was a kid at the time, but did many people actually buy Windows? I know about the ad thing where the cast of Friends or whoever bought Windows 95, but as I recall, even back then the OS just came with the device. The only exception was OS X, which was a “Big Deal”; even non-technical people downloaded it.

Anyway, it is funny to see this in retrospect. Nowadays, operating systems have become so commoditized that you can’t even make a business selling them.

I love Linux but his description is quite optimistic.


Absolutely -- yes Windows came with your PC, but you'd buy the new version as a (cheaper) upgrade if you didn't want to wait until the next PC you bought.

Today new OS versions aren't such a big deal, but when Windows 95 came out, and then XP, they were huge, with total interface redesigns.

On the other hand, I don't think people went out of their way as much to buy smaller upgrades like Windows 98.


I think the fact that OS updates are free now kinda killed a lot of the marketing/promotion since most consumers will only update if their computer is old.


> I was a kid at the time, but did many people actually buy windows?

Yep! Windows did indeed come with machines, but the upgrades were always a big seller. I remember when Windows 3.1 hit the shelves and seemed to be everywhere. Same with Windows 95 but that one was a tougher upgrade because of the increased system requirements.


> I was a kid at the time, but did many people actually buy windows?

Definitely. A new boxed OS version would often be the only updates anyone ever applied to their system. Even if you had Internet access, dial-up speeds and limited disk space meant downloading OS updates was often impractical. Even relatively small updates took forever to download.

There was also the relative cost of a computer. A $2,000 computer in 1995 would be about $4,000 in today's dollars. Buying an OS update was a relatively inexpensive way to upgrade the capabilities of your expensive computer without completely replacing it. Going from some Windows 3.1 release to Windows 95 would have been a nice upgrade in system stability for many people. Certainly not everyone, but for many.


It feels like while the marketing of launches may still be like that, it is purely to usher along adoption of the target future state, which seems to be doing away with versions and simply locking you in to forever paying an ever-increasing subscription fee to continue using your computer, in addition to data collection which can be separately monetized.

This is markedly different from how it was in the past when they needed people to get up, go to a store, and buy the disc containing the new version of the OS.


I actually bought Windows 95 shrink-wrapped in store, because I was so excited about the new UI, for my previously DOS-only 486DX. Also OS/2 a bit later.


Yes - people bought it to upgrade, as the step up from dos to 3.1 to win95 was huge.


Thanks for posting. A very interesting read. This is my first time seeing this well-written piece. Written in a text file, a sign of his distrust of software that mangles your ASCII :)


It may be outdated, but it's still one of my favorite Stephenson texts. It's been a while since I read it, but the best part for me is his explanation of abstraction in graphical systems, which I think is still valuable today as we get further and further away from whatever the fundamental interface is between human and computer.


In the beginning was a switch.



> It is commonly understood, even by technically unsophisticated computer users, that if you have a piece of software that works on your Macintosh, and you move it over onto a Windows machine, it will not run. That this would, in fact, be a laughable and idiotic mistake, like nailing horseshoes to the tires of a Buick.

Not anymore. https://justine.lol/ape.html


My first 486DX PC had a "turbo" button. You pressed it, and it wasn't just a teletype anymore.


Related:

In the Beginning was the Command Line (1999) - https://news.ycombinator.com/item?id=37314225 - Aug 2023 (2 comments)

In the Beginning Was the Command Line - https://news.ycombinator.com/item?id=29373944 - Nov 2021 (4 comments)

In the Beginning was the Command Line (1999) - https://news.ycombinator.com/item?id=24998305 - Nov 2020 (64 comments)

In the beginning was the command line (1999) [pdf] - https://news.ycombinator.com/item?id=20684764 - Aug 2019 (50 comments)

In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=16843739 - April 2018 (13 comments)

In the Beginning Was the Command Line (1999) - https://news.ycombinator.com/item?id=12469797 - Sept 2016 (54 comments)

In the beginning was the command line - https://news.ycombinator.com/item?id=11385647 - March 2016 (1 comment)

In the Beginning was the Command Line, by Neal Stephenson - https://news.ycombinator.com/item?id=408226 - Dec 2008 (12 comments)

In the beginning was the command line by Neil Stephenson - https://news.ycombinator.com/item?id=95912 - Jan 2008 (5 comments)

In the Beginning Was the Command Line - https://news.ycombinator.com/item?id=47566 - Aug 2007 (2 comments)

(Reposts are fine after a year or so; links to past threads are just to satisfy extra-curious readers. In the case of perennials like this one, it's good to have a new discussion every once in a while so newer user cohorts learn what the classics are.)


Shouldn't this be written in Latin?


Hebrew, you mean?


Such an underrated book.


The section on "THE INTERFACE CULTURE" is critical to understand in today's digital media landscape. Disney's Animal Kingdom is to the written works of Lewis Carroll and J.M. Barrie as the GUI is to the command-line interface. The message of one medium is audiovisual spectacle and immersive experience; the other, cold intellectualism demanding participation from the reader to paint a picture in his mind's eye through the interpretation of raw text, words on a screen, or a piece of paper.

    But this is precisely the same as what is lost in the transition from the
    command-line interface to the GUI.

    Why are we rejecting explicit word-based interfaces, and embracing
    graphical or sensorial ones--a trend that accounts for the success of both
    Microsoft and Disney?

    But we have lost touch with those intellectuals, and with anything like
    intellectualism, even to the point of not reading books any more, though we
    are literate.
Elsewhere [0] I have called this concept "post-literacy," and this theme pervades much of Stephenson's work - highly technologically advanced societies outfitted with molecular assemblers and metaverses, populated by illiterate masses who mostly get by through the use of pictographs and hieroglyphic languages (emoji, anyone?). Literacy is for the monks who, cloistered away in their monasteries, still scribble ink scratchings on dead trees and ponder "useless" philosophical quandaries.

The structure of modern audiovisual media lends itself to the immediate application of implicit bias. On IRC, in the days of 56k before bandwidth and computer networks had developed to the point of being able to deliver low-latency, high-definition audio and video, perhaps even for "real-time" videoconferencing, most of your interactions with others online were mediated through the written word. Nowhere here, unless some party chooses to disclose it, do race, gender, accent, physical appearance, or otherwise, enter into the picture and possibly cloud your judgment of who a person is - or, more importantly, the weight of their words, and whether or not they are correct, or at least insightful; consider Turing's "Computing Machinery and Intelligence" paper which first introduced what is now called the "Turing test," and how it was designed to be conducted purely over textual media as a written conversation, so as to avoid influencing through other channels the interrogator's judgment of who is the man, and who is the machine.

    The only real problem is that anyone who has no culture, other than this
    global monoculture, is completely screwed. Anyone who grows up watching TV,
    never sees any religion or philosophy, is raised in an atmosphere of moral
    relativism, learns about civics from watching bimbo eruptions on network TV
    news, and attends a university where postmodernists vie to outdo each other
    in demolishing traditional notions of truth and quality, is going to come
    out into the world as one pretty feckless human being.
Moreover, the confusion of symbols for reality, the precession of digitized, audiovisual content from a mere representation to more-than-real, digital hyperreality (since truth and God are all dead and everything is merely a consensual societal hallucination), leads people to mistake pixels on a screen for actual objects; narrative and spin for truth; influencers, videos, and YouTube personalities for actual people; or words from ChatGPT as real wisdom and insight - much in the same way that Searle's so-called "Chinese room" masquerades as an actual native speaker of Mandarin or Cantonese: "What we're really buying is a system of metaphors. And--much more important--what we're buying into is the underlying assumption that metaphors are a good way to deal with the world."

    So many ignorant people could be dangerous if they got pointed in the wrong
    direction, and so we've evolved a popular culture that is (a) almost
    unbelievably infectious and (b) neuters every person who gets infected by
    it, by rendering them unwilling to make judgments and incapable of taking
    stands.

    It simply is the case that we are way too busy, nowadays, to comprehend
    everything in detail.
The structure of modern short-form, upvote-driven media lends itself to the production of short-form messages and takes with laughably small upper bounds on the amount of information they can contain. In a manner reminiscent of "you are what you eat," you think similarly to the forms of media you consume - and one who consumes primarily short-form media will produce short-form thoughts bereft of nuance and critical thinking, and additionally suffer all the deficits in attention span we have heard of, as the next braindead 10-second short or reel robs you of your concentration, and the next, and the next...

Beyond the infectious slot machine-like dopamine gratification of the pull-to-refresh and the infinite doomscroll, the downvote has become a frighteningly effective means of squashing heterodoxy and dissent; it is only those messages that are approved of and given assent to by the masses that become visible on the medium. Those who take a principled stand are immediately squashed down by the downvote mob, or worse, suffer from severe censure and invective at the hands of those zealous enforcers of orthodoxy. The downvote mechanism is reminiscent of the three "filters" Chomsky wrote of when he was discussing the mass media in "Manufacturing Consent," and the way advertisers, government, and capital all control and filter what content is disseminated to media consumers.

The message of modern, audiovisual, short-form, upvote-driven social media is bias and group compliance bereft of nuance. If you want to produce and consume novel ideas you are better served by media based on the written word.

[0] https://news.ycombinator.com/item?id=39990133


> Elsewhere [0] I have called this concept "post-literacy," and this theme pervades much of Stephenson's work - highly technologically advanced societies outfitted with molecular assemblers and metaverses, populated by illiterate masses who mostly get by through the use of pictographs and hieroglyphic languages (emoji, anyone?).

He returns to it rather explicitly in _Dodge in Hell,_ with the odyssey through the "Facebooked" wastelands of -- where was it, Idaho / Montana / the Dakotas? Something like that -- where the MAGAfied barely-literate natives hound the scientist / tech-type heroes.


This is timeless


The command line is still king.

Whenever I see new coders struggle, it usually is because they:

    - Don't know the context of what they are executing

    - Don't know about the concept of input and output
On the command line, the context is obvious. You are in the context. The working dir, the environment, everything is the same for you as it is for the thing you execute via ./mything.py.

Input and output are also obvious. Input is what you type, output is what you see. Using pipes to redirect it comes naturally.

Not being natively connected to context, input and output is often at the core of problems I see even senior programmers struggle with.
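To make the shared-context point concrete, here's a minimal sketch (mything.sh is a made-up stand-in for the ./mything.py above):

```shell
# A script you run from the shell inherits your context: the same
# working directory and the same environment variables.
export GREETING="hello"
printf '#!/bin/sh\necho "$GREETING from $PWD"\n' > /tmp/mything.sh
chmod +x /tmp/mything.sh
/tmp/mything.sh                         # prints: hello from <current dir>
/tmp/mything.sh > /tmp/mything-out.txt  # and output redirects just as naturally
```

No hidden launcher, no mystery environment: what you see in your shell is what the script gets.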


> On the command line, the context is obvious. You are in the context

While I share the opinion that the command line is _the one true way_ of computing, I don't think this is really all that true. Computers are alien. GUIs are alien. CLIs are alien. Everything is learned. Everything is experience. Everything is culture. Learning, experience, and culture blind us from the experience of the novice. This is "expert blindness".

Why Western Designs Fail in Developing Countries https://youtu.be/CGRtyxEpoGg

https://scholarslab.lib.virginia.edu/blog/novice-struggles-a...


More succinctly, "the only intuitive interface is the nipple."


Not even that. Newborns have to learn how to suckle, and their mother has to learn how to hold everything in the right positions so it can work. It’s a tricky skill and many aren’t successful even if they want to breastfeed.


Newborns don't have to learn how to suckle. It's a reflex, called a suckling reflex. Lacking a suckling reflex is an indicator of disease. Basically a newborn suckles everything that goes into the mouth, nipple or not.


Yes they have a suckling reflex, but they might not latch on correctly and then breastfeeding is extremely painful.


Are you a woman? Have you had children? Have you partnered a woman as she and your child cried through the night, both trying to make this "breastfeeding" thing work; both failing? "How can something so intrinsic to basic survival be so hard!" And yet it is.

Talk confidently when you have experience.


Having a reflex and breastfeeding are two different things. The baby can have the reflex, but the mother has no milk or the latching technique is poor (as mentioned elsewhere in these comments). So the child not breastfeeding doesn't mean there is no reflex. And yes, the process can be painful and stressful to many mothers most especially first timers (experience also counts).


That doesn't negate the fact that newborns have a suckling reflex.


Sucking is a reflex; the rest is pseudoscience. Your personal anecdotes don't count.


Some babies actually do not have that intuition and rather bite and need time to learn.


True. All interfaces are abstractions, and I think the command line interface is the best one we have. It gives you the maximum power. It does have a steep but surmountable learning curve, though, and the effort is usually worth it given its ubiquity and programmability.


I mean yeah, you are right, it is learned, but "CLI is the one true way" for me because it is much more powerful than most GUIs I have seen. I am much more productive using the CLI, etc.


>Why Western Designs Fail in Developing Countries

Why add this?

Whats the solution then?

We have already given so much in the western world.


>On the command line, the context is obvious.

But CLI contexts are only obvious if the computer user is already familiar with the CLI, which biases the learned mind to perceive things as obvious when they really are not.

A lot of CLI command syntax is based on position instead of explicit argument names.

E.g. creating file system links via CLI has opposite syntax positions in Linux vs Windows:

  - Linux:  ln srcfile targetlink

  - Windows :  mklink targetlink srcfile
If one goes back & forth between the 2 operating systems, it's easy to type the wrong syntax because the CLI doesn't make it obvious. On the other hand, using a GUI file browser like Ubuntu Nautilus or Windows Explorer lets a novice create links ("shortcuts") without memorizing CLI syntax.

This gap of knowledge is also why there is copy&paste cargo-culting of inscrutable ffmpeg, git, rsync, etc commands.

E.g. using ffmpeg to convert a DVD to mp4 by manually concatenating *.VOB files has very cumbersome and error-prone syntax. It's easier to use a visual GUI like Handbrake to click and choose the specific title/chapters to convert.

CLI vs GUI superiority depends on the task and the knowledge level of the user.
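For what it's worth, the positional trap is easy to demonstrate in a throwaway directory (file names below are made up for illustration):

```shell
# GNU ln takes the source first and the link name second;
# Windows mklink reverses that order.
mkdir -p /tmp/ln-demo
cd /tmp/ln-demo
echo "data" > srcfile
ln -sf srcfile targetlink     # Linux:   ln SOURCE LINK_NAME
readlink targetlink           # prints: srcfile
# Windows cmd would be:       mklink targetlink srcfile
```

Nothing in either command line tells you which argument is which; you just have to know.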


Context isn't the same as syntax?

Yes, the command line suffers from poor discoverability, and different applications (such as ln/mklink) may not be consistent.

It is one of the bigger problems (imho) of the cli but it doesn't go against GPs point.

The command line does have a learning curve (partly because of the above), but it is also quite rewarding.


When I start typing a formula in LibreOffice Calc, there is a popup showing possible matching functions, then when I choose the function, the popup shows the required syntax for the function and where I currently am within that syntax. A bash plugin that would do that would be an absolute game changer imho.

The cli excels because it is extremely flexible, with far more options available than a set of buttons could ever display. But discoverability rounds down to 0, and there are footguns. It seems like spreadsheet software has found an almost drop in ui that would greatly enhance the cli.


Tab completion can get you much of the way there.


It's not the same thing. Tab completion is useful and will complete something you already know of. But it does not help you discover something you don't know, or show you the syntax of the command after it is entered. The problem I would like to solve is discoverability.

It's a 3-part problem: available commands, their options, their syntax. Part one would need to capture prompt input before enter is hit, using solutions similar to those found at [1]; perhaps the most useful but least complete one there is the one that uses apropos, so something like `apropos -s 1 '' | sort | grep calc | less`. Similar solutions would be required for parts two and three. The roughest and easiest prototype would probably be two tabs in a split-screen view, which would allow for selection of displayed matches to then modify the prompt creating those matches in the other tab. But Calc-style popups directly attached to your cursor would be more useful still.

[1] https://stackoverflow.com/questions/948008/linux-command-to-...
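As a rough bash-specific sketch of part one (what commands exist at all), compgen can stand in for apropos when the man-page index isn't available:

```shell
# List every command bash can find (builtins, functions, PATH binaries),
# then filter -- a crude "what is available?" discoverability query.
bash -c 'compgen -c' | sort -u | grep -x 'ls'   # prints: ls
```

It gives you names but not descriptions, which is why apropos is the more complete half of the pair.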


These days it’s easier to ask ChatGPT for the ffmpeg command line to do the thing you want, imo.


Everything is "easy" when someone else is doing it for you. Results are best when you develop and apply your own expertise to a problem, but it's not feasible to do that for every problem domain -- relying on others is often important.

It's just that if I'm going to rely on someone else, I'd prefer for that someone else to be one of the people who has developed and applied their own expertise to that problem domain, rather than a statistical model that only actually knows how likely words are to appear in proximity to each other, and has no capacity to validate its probabilistic inferences against the outside world.


When some big shit hits the fan, either with ChatGPT or internet connections, the only people able to fix systems without an internet connection will be us, the millennials.

The rest will be promptly fired on the spot.


I've found that as far as generational cohorts go, the most adept people are the late Gen-Xers and very early Millennials -- the people who grew up surrounded by C64s and Apple IIs and later had to fiddle with their config.sys and autoexec.bat files to get their game to fit into conventional memory.

People whose first exposure to computing was in the mid-'90s or later seem to have less depth of understanding of the fundamentals and less experience tinkering and using process of elimination to solve problems. I started encountering people who didn't understand certain fundamental concepts and had no exposure to the CLI, but still had CS degrees and were applying for developer positions, about 15 years ago.


Yeah, these too. But lots of millennials learnt with Red Hat and Debian, and they knew a bit of DOS for some issues with Windows 98 and potential recovery from disasters. Still, far more able than the ChatGPT-aided people blindly pasting commands into a shell, which is a recipe for a huge disaster.


We had the same conversation when Gmail came out.

A true test for a programmer is to see what they can do on a computer without the Internet.


- Debian/Devuan/Gnuinos users with offline DVD ISOs mirrored on a USB drive: easy mode. Most of the docs are already there; just mount the ISOs as loopback devices and run apt-cdrom add against the mount points.

- Windows user: hell.

- Ubuntu/Arch/any distro without full offline mirrors: hell too.


Yep, I ran a short "introduction to the command line" course for the devs in my team. Afterwards, I noticed that their usage of the vscode terminal was much higher, and folks were more comfortable exploring on their own.

Quick outline of the course, in case anyone wants a starting point:

  * Introduction
    Quick history of Unix

  * When you log in
    Difference between login and interactive shells
    System shell files vs user shell files
    .zshenv for environment variables like PATH, EDITOR, and PAGER
    .zprofile for login shells, we don't use it
    .zshrc for interactive shells
    Your login files are scripts, and can have anything in them

  * Moving Around

  ** Where am I?
    pwd = "print working directory"
    stored in variable $PWD
    Confusingly, also called current working directory, so you may see CWD or cwd mentioned

  ** What is here?
    ls
    ls -al
    ls -alt
    . prefix to filenames makes them hidden
    . is also the current directory!
    .. means the parent directory
    file tells you what something is
    cat displays a file
    code opens it in vscode

  ** Finding my way around
    cd
    cd -
    dirs | sed -e $'s/ /\\\n/g'

  ** Getting Help From The Man
    man 1 zshbuiltins
    manpage sections

  ** PATH
    echo $PATH | sed -e $'s/:/\\\n/g'
    zshenv PATH setting
    which tells you what will be run

  ** Environment Variables
    env | sort
    EDITOR variable

  ** History
    ctrl-r vs up arrow

  ** Permissions
    Making something executable

  ** Prompts
    zsh promptinit
    zsh prompt -l

  ** Pipes and Redirection
    Iterate to show how pipes work
    cat ~/.zshrc | grep PATH
    ls -al > ~/.tmp/ls-output.txt

  ** Commands

  *** BSD vs GNU commands
    BSD are supplied by Apple, and Apple often uses old versions
    GNU are installed via homebrew, and match those commands available in Linux


Reductio ad absurdum: we should all just use an interactive assembler. Then you really are "in the context".

There is a level of abstraction that makes sense; which level that is depends on your objectives.


My first real programming experience was through the browser's console (JavaScript) and IDLE's REPL (Python). The short feedback cycle works wonders for understanding, compared with struggling through the multistep process of C compilation or Java verbosity. I tried my hand at reverse engineering, and using a disassembler like IDA gives the same feeling of immediacy. Great DX for me is either a good debugger or a proper REPL, whatever the abstraction level.


Agreed. When writing the comment I returned seconds later and added the word "interactive".


> the multistep process of C compilation

it can be a single make command


DDT, the shell for the ITS operating system, was also an assembly-level debugger.


I agree with you, but I think there's a caveat. The command line is king in Linux, BSD, MacOS, AIX, and to a lesser extent Windows. These operating systems were crafted from the bottom up with the commandline as a foundational layer. The idea of the context, of the "working directory", the "environment", were concepts that were lifted from that commandline centric world, into what we run now.

I think Windows very much wanted to be something different with COM. Instead of starting a shell in the context of the program, you'd connect some external "shell" into the very object graph of your program to inspect it. It turns out to be very difficult, and Windows has largely retreated to a commandline-centric architecture, but I think there was some essential attempt at foundational innovation there.

I would argue that the commandline has very much proven to be the best trade-off between being useful and being simple, but there is no saying if there exists some alternative.


Not every version of MacOS. Classic MacOS, System 1-7 and MacOS 8-9, were definitely not crafted with the command line environment as a foundational layer. Using it was like being wrapped in several layers of bubble wrap. You were trapped in the GUI and if you wanted to do something the GUI didn't allow for, you were "using it wrong".


> The command line is king in Linux, BSD, MacOS, AIX, and to a lesser extent Windows. These operating systems were crafted from the bottom up with the commandline as a foundational layer.

This is definitively not true for macOS.


MacOS is a very complete, very well funded desktop environment targeted towards the general user. If you want anything extra, you land in applications using private APIs and the command line.


The command line is not a "foundational layer" in macOS, that was my point. It exists on the same layer as the GUI does.


The foundation of MacOS is a fully-compliant Unix:

https://www.opengroup.org/openbrand/register/apple.htm

The GUI is built on top of the Unix foundation and does not stand alone or work without it.

The 'FoxTrot' comic made a big deal about this not long after Mac OS X was released:

https://www.gocomics.com/foxtrot/2002/02/25


The foundation of macOS contains elements of UNIX (or rather BSD) and the OS is UNIX certified; I'm fully aware of that. But these are two different things.

For one thing, UNIX != command line.

In the same vein, Windows NT is not based on DOS anymore, even if it has a command line which resembles (parts of) DOS.


The more experience I accumulate, the more I rely on GUIs. Explanation: when I was younger I used exclusively the CLI and underestimated GUIs. Now I tend to appreciate GUIs and use them more.


If you're experienced with the command line, it's easy to use GUIs and get good results.

If one starts with GUIs and doesn't really understand what is behind, then all kinds of trouble happen.

So I guess, as with any tool, understanding is key.


Not all GUIs are just a graphical wrapper for a CLI. But in general sure, understanding the tech behind helps.


No doubt, if you're working in AutoCAD, there's no command line that you need to understand first.

But then again, if you're working in AutoCAD, you'd never say "I used to work in CLI only, now I use GUIs more and more".

Clearly they meant GUIs that have CLIs behind, or at least CLI alternatives.


AutoCAD is an unlucky choice of example here, because it's one of the few GUI drawing tools that actually does have a command line behind it that you have to understand sometimes! Look up a screenshot of AutoCAD and you can see the command prompt at the bottom of the window.

And if you were using AutoCAD in the 80's you can say exactly that you used to use the CLI only!


CorelDraw? Word? Audacity?

Or maybe it will turn out that all GUI software has command line somewhere inside. I think my original point stands even better in that case.

Yay for hair-splitting, I guess.


"Or maybe it will turn out that all GUI software has command line somewhere inside"

Not at all. Most maybe, but since I wrote GUIs without a CLI I can say for sure that not all have them.


I mean, every web GUI, which are now probably the majority of GUIs do not have a CLI anymore..


Can you give some examples? Which GUIs are you using?


I have used git extensively in the terminal. But nowadays I see myself more and more relying on GUIs like the ones integrated in Intellij IDE, Source Tree, etc.

Another example could be qemu and the GUIs that we have nowadays. One final example would be simply drag and dropping files via Finder instead of using cp/mv


I used to sort of like the Azure GUI (yes I'm a total psychopath), but then they changed it 9 billion times and now I just use the CLI. It's frankly often experiences like this which drive me back to the CLI. I like Gitkraken, but then it does an update and forgets my SSO, or it doesn't support a specific thing I need to do, and then I'm using their console.

I’m not really religious about anything, but I often end up going back to the CLI for a lot of things because it’s just less of an annoyance.


Oh believe me, I wish what you wrote was true, but it isn't.

I've seen people think they have a specific Python environment active just because they were in their project's directory on the command line.

I've seen people not understand that "python -m pip" is a command and even if they are in a directory which has "python" in its name, they still have to type "python" for that command.
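A quick terminal session shows the first confusion in action (the venv path here is a throwaway example):

```shell
# Being in (or near) a project directory does not activate its venv;
# sourcing the activate script is a separate, explicit step.
python3 -m venv --without-pip /tmp/demo-venv
cd /tmp
echo "${VIRTUAL_ENV:-none}"     # prints: none (unless a venv was already active)
. /tmp/demo-venv/bin/activate
echo "$VIRTUAL_ENV"             # prints: /tmp/demo-venv
```

The directory and the environment are simply two different pieces of context, and nothing links them automatically.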

PS: The command line might even be an emperor. And the emperor could be naked...


> I've seen people think they have a specific Python environment active just because they were in their project's directory on the command line.

I wrote python-wool as a simple wrapper around python to make that true, because it's just easier that way. Direnv can also be configured to do that.

http://GitHub.com/fragmede/python-wool
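For the direnv route, the whole per-project setup can be a one-line .envrc, assuming direnv's stock python layout helper (the project path is hypothetical):

```shell
# ~/myproject/.envrc -- after running `direnv allow` once, entering the
# directory auto-creates/activates a venv, and leaving it deactivates.
layout python3
```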


The command line is king, but sometimes the king is mad. Which is to say, it can be difficult to work with the monarchy when the syntax is shit. And there's a lot of bad syntax out there: overly cute, so terse as to be cryptographic, the straight-up baffling ...

Outside of the syntax (which seems to live forever), you have things like non-sane defaults, obscurantist man pages ... the list goes on.


The ability to hit up and edit and retry is such a redeeming feature. Repeating the same actions in a GUI with no keyboard shortcuts is an exercise in frustration.


GUIs can have keyboard shortcuts. Have you honestly never pressed Alt+F, x to exit a GUI program or Ctrl+S to save a document in a GUI editor, or Ctrl+Tab to switch tabs in a GUI browser or Tab to move focus between fields, or the context-menu button next to Alt Gr and then a keyboard accelerator key for the menu, or Ctrl+C then Ctrl-P or anything?

Repeating the same actions in a CLI with no readline is an exercise in frustration, but ... that's not what happens most of the time.


> the syntax is shit

When you’re typing a lot, you really don’t want to do a lot of typing for each step. And the shell scripts were for automating some process, not solving a particular problem (you use a programming language for that). The workflow is to have the references ready in case you forgot something.

That brings up the man pages, which can vary in quality but, for most software I've used, tend to be comprehensive. However, they assume that you're knowledgeable. If you're not, take some time to read a book about system administration (users, processes, file permissions,…).


I'm not sure if I have O'Reilly's System Administration book. I used to get pre-prints back in the early nineties when they were still quite new. In any case, yes, I have read and I have Been Around.

And I still think that we can improve. Moreover, we ought to improve.


This. And it does not even exclude having a (T)UI. Modern terminal tools like neovim, lazygit, zellij, btop++ or yazi can do many things, such as window management, image previews and colors, as well as having mouse support.


Are there any good tools to ssh into a machine and preview images or render markdown inline in the CLI?


For rending markdown, I use https://github.com/charmbracelet/glow


Sixel support lets you display images on the command line, for terminal emulators that support it.


This is such generic advice about computing, it’s like saying:

“To make a building, you need to have a foundation, something to keep the roof up, and a way for people to move inside.”


The analogy I would make is that living in the command line is like using a CAD program while living in IDEs is like using CorelDraw to design houses.

CorelDraw feels more efficient because one quickly has what looks like a beautiful, colorful house on the screen. And then one does not understand why the doors don't work correctly.


> On the command line, the context is obvious.

You sound like someone who never tried to write a cronjob script…


> Input and output are also obvious. Input is what you type, output is what you see.

Input maybe, although realizing that instead of typing, you can pipe, is a major conceptual breakthrough when new users are learning to work the command line.

But output? The existence of stdout and stderr as two sources of text with no visible distinction is highly nonobvious and continues to trip me up in practical situations to this very day.
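A quick demonstration of why the distinction trips people up (the file and directory names here are just for the demo):

```shell
# stdout (fd 1) and stderr (fd 2) are separate streams, but a terminal
# interleaves them with no visual marker. Redirection proves they're distinct:
ls . /no-such-dir-demo > /tmp/out-demo.txt 2> /tmp/err-demo.txt || true
grep -q 'no-such-dir-demo' /tmp/err-demo.txt && echo "the error text went to stderr"

# A pipe only carries stdout; merge stderr into it (2>&1) if you want both:
ls /no-such-dir-demo 2>&1 | grep -q 'no-such-dir-demo' && echo "now the pipe sees it"
```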


"Don't know about the concept of input and output"

Wow, that seems quite fundamental. Computing 101.

I'm not a "coder" and I spend "99%" of time on the command line. Because I prefer it. Ever since the 80s when I first used a VAX.


  On the command line, the context is obvious
Hardly.

If you’re lucky, you might know what your current directory is.

More often than not, at any particular point, your command line is paused in the middle of some likely ad hoc multi-step process. A process with a bunch of state stored as opaque blobs of data scattered across the file system. Exacerbated further in my case, as those files are likely cleverly named x, x1, x2.

Modern systems benefit from things like command history, scroll-back buffers, and similar constructs that can be leveraged to help you, as the user, restore the current context. But for a very long time, many simply returned you to a $ and a soulless, cold, blinking cursor, callously expecting you to recall where you are and what you’re doing.

The tools are there to help you dig and perhaps restore the current context (current directory, latest files, etc.) but that’s a far cry from “obvious”. Lots of folks return, blow their internal call stack, and just start over when they come back from lunch (if practical).
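The digging amounts to a handful of one-liners, which is perhaps why old hands mistake it for "obvious" (a sketch; exact flags vary by platform):

```shell
# Reconstructing lost context at a cold prompt:
pwd                 # where am I?
ls -lt | head -5    # what did I touch most recently? (hello, x, x1, x2...)
jobs                # anything suspended mid-process?
```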


Its unsung strength is in having multiple terminal windows and a browser open, and the simplicity of being able to hit up-arrow, edit, and retry a failed command. I can't do that in Photoshop.


The capitalist system loves the CLI.

That complicated series of commands you just ran? Copy and paste them into the Jira ticket so the junior employee who makes half your salary can run them next time.


> Copy and paste them into the Jira ticket so the junior employee who makes half your salary can run them next time.

Or, more likely: so that you yourself can remember what you did the next time the problem arises. Or your colleague, who is senior but does not know this part of the codebase or infra well. Heck, you can even write a shell script, automate things and increase your productivity!

These things will be just as true and useful in some communist FOSS context as they are in a capitalist system.
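The "write a shell script" step can be as small as this (a hypothetical sketch; the function name and its body are made up for illustration):

```shell
set -eu                      # abort on the first failed command or unset variable
# Hypothetical wrapper around the commands that were pasted into the ticket:
redeploy() {
    app=$1
    echo "redeploying $app"  # the real incantation would go here
}
redeploy myapp
```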


It’s not capitalist oppression to not want to reinvent the wheel every time.

If the solution to a problem is as simple as “copy and paste this command”, I want to hand that off to a junior employee so I can go do other things that require more creativity.


Because I had to get through this before falling asleep, I had the audio going at 2x, so it sounds like a fifties cartoon.


The CLI has a massive blind spot in today’s operating systems: it knows nothing useful about events.

Yet events are the primary way anything happens on a computer, whether it’s a user system or a server.


Annoying though it may be, you can run a program in the background that can write to your open terminal.

Just in userspace you have:

   dmesg -w

   tail -f /var/log/messages

There's also dbus to monitor on Linux systems, and a lot of kernel hook tricks you can use to get a message to pop up when an event happens.

Because it gets annoying to have a process splurge notification stuff to a term you are working in, that's why you have info-bars which many terminal emulators support.


That's the specific hole which Expect fills for scripting:

<https://en.wikipedia.org/wiki/Expect>

Much of a modern operating system is event-based and relies on polling for specific actions, devices connecting / disconnecting, etc. Linux device management is based on this, for example. There are numerous scripts which fire as a service is enabled or disabled (networking "ifup" and "ifdown" scripts, for example), or services are started / stopped. Systemd extends these capabilities markedly.

And of course there's a whole slew of conditional processing, beginning with init scripts, [], test, &&, ||, and specific conditional logic within shells, scripting, and programming languages.
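The conditional building blocks mentioned ([], test, &&, ||) are small enough to show inline:

```shell
# && runs the next command only on success; || only on failure
mkdir -p /tmp/evt-demo && echo "created (or already present)"
[ -d /tmp/evt-demo ] && echo "test/[ ] signal success via exit status"
[ -d /tmp/evt-demo/missing ] || echo "...and failure the same way"
```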


In the unix world, everything is a file, so you can poll a file waiting for something to happen. And there’s the whole signals thing, and dbus exists.
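A minimal sketch of event-by-polling (on Linux, inotifywait from inotify-tools gives you the same wait without the busy loop):

```shell
# Simulate an external event: something creates a flag file a second from now
( sleep 1; touch /tmp/flag-demo ) &

# Poll until the "event" becomes visible in the file system
until [ -e /tmp/flag-demo ]; do sleep 1; done
echo "event observed"
```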


Yeah, these are the event paradigms that I meant by “nothing useful.”

Files are not a good abstraction for events. Signals are broken in many ways. And DBus is both extremely clunky to use and non-portable.

There isn’t a built-in event paradigm similar to how streams and pipes are an integral part of the Unix-style CLI.


> Files are not a good abstraction for events

Why is that? On the low level everything is a state of electronic cells. Files address those cells in a suitable fashion. Modern programming abstractions such as async/await are very simple, but fail miserably if you need something really complex and efficient.


I can't even understand this. How does the CLI relate to events? Are you saying that all servers secretly have an invisible GUI and handle events there?

This just seems like an odd non sequitur.


You mean, like entr triggering commands?


huh? DBUS is very much a thing and has CLI-tooling?


Me: seems like my sort of thing.

Me: navigate to linked website, see wall of text.

Me: clicks reading mode

Me: *193 - 245 minutes*

Me: bookmark to read later; probably not


It’s a short novel.

Putting it in a browser window gives it bad odds. You can also listen to it:

https://youtu.be/KpaUg6WwdzU

Begins at 01:30, 25 minutes.


> 25 minutes

That link is only part 1 (of 7). It's still around 2 and a bit hours of listening in total. https://www.youtube.com/@robertreads4323/videos


Thanks for correcting me.


Thanks for the link! I'm 3/7ths of the way through it.


It's very engaging, please try reading it.


You can skip ahead to his playful thesis: the universe emerging from the command line.

In his book The Life of the Cosmos, which everyone should read, Lee Smolin gives the best description I've ever read of how our universe emerged from an uncannily precise balancing of different fundamental constants. The mass of the proton, the strength of gravity, the range of the weak nuclear force, and a few dozen other fundamental constants completely determine what sort of universe will emerge from a Big Bang. If these values had been even slightly different, the universe would have been a vast ocean of tepid gas or a hot knot of plasma or some other basically uninteresting thing--a dud, in other words. The only way to get a universe that's not a dud--that has stars, heavy elements, planets, and life--is to get the basic numbers just right. If there were some machine, somewhere, that could spit out universes with randomly chosen values for their fundamental constants, then for every universe like ours it would produce 10^229 duds.

Though I haven't sat down and run the numbers on it, to me this seems comparable to the probability of making a Unix computer do something useful by logging into a tty and typing in command lines when you have forgotten all of the little options and keywords. Every time your right pinky slams that ENTER key, you are making another try. In some cases the operating system does nothing. In other cases it wipes out all of your files. In most cases it just gives you an error message. In other words, you get many duds. But sometimes, if you have it all just right, the computer grinds away for a while and then produces something like emacs. It actually generates complexity, which is Smolin's criterion for interestingness.

Not only that, but it's beginning to look as if, once you get below a certain size--way below the level of quarks, down into the realm of string theory--the universe can't be described very well by physics as it has been practiced since the days of Newton. If you look at a small enough scale, you see processes that look almost computational in nature.

I think that the message is very clear here: somewhere outside of and beyond our universe is an operating system, coded up over incalculable spans of time by some kind of hacker-demiurge. The cosmic operating system uses a command-line interface. It runs on something like a teletype, with lots of noise and heat; punched-out bits flutter down into its hopper like drifting stars. The demiurge sits at his teletype, pounding out one command line after another, specifying the values of fundamental constants of physics:

    universe -G 6.672e-11 -e 1.602e-19 -h 6.626e-34 -protonmass 1.673e-27....

and when he's finished typing out the command line, his right pinky hesitates above the ENTER key for an aeon or two, wondering what's going to happen; then down it comes--and the WHACK you hear is another Big Bang.


I ran it through an LLM and asked it to summarize and answer questions about it. Worked great to get the gist.


Are you purposefully being ironic?


I am curious, why is this comment being down-voted? I mean, I would like to hear an opinion against it (not that I care about the points).


> Please don't comment about the voting on comments. It never does any good, and it makes boring reading.

https://news.ycombinator.com/newsguidelines.html


I am just simply asking for the opinion of people who disagree with OP. I do not care about the down-vote per se, more so about the opinion of people who disagree indicated by the down-votes.


Because HN has a hate boner against "AI". :)

The reality is that summarizing text and answering questions about it is one of the best use cases for what we currently call "AI".



