In the Beginning was the Command Line (1999) (inria.fr)
179 points by BerislavLopac on Nov 5, 2020 | 64 comments



"In other words, the first thing that Apple's hackers had done when they'd got the MacOS up and running--probably even before they'd gotten it up and running--was to re-create the Unix interface, so that they would be able to get some useful work done. At the time, I simply couldn't get my mind around this, but: as far as Apple's hackers were concerned, the Mac's vaunted Graphical User Interface was an impediment, something to be circumvented before the little toaster even came out onto the market. "

This is Unix revisionism. Most of the early development of the Macintosh is documented on folklore.org, and they certainly didn't rebuild Unix to fit on a Mac. They bootstrapped the Macintosh using the Lisa development environment, which itself was bootstrapped with Apple ][s. Unix was far too large and unwieldy for microcomputer hardware of the time, and Apple didn't have a license for it anyway. There are plenty of stories of early/hobbyist Mac buyers realizing their $2k computer had no development tools, calling up Apple, and being told that they'd need to buy $10k Lisa machines if they wanted to do real app development. MPW didn't come out until two years later.

(If you just wanted an easy-to-use development environment, Apple HAD worked on a GUI-capable BASIC for the Macintosh. But Bill Gates got wind of this and refused to renew their Apple ][ BASIC license unless they canned the project. Since no BASIC license meant no more Apple ][s, Apple caved, and the version of BASIC that Microsoft did ship on the Mac had no GUI support whatsoever. They would eventually ship HyperCard three years later, of course.)


> This is Unix revisionism (...) they certainly didn't rebuild Unix to fit on a Mac

I think you're reading too much into this.

MPW, which was made and used by Apple developers and sold by Apple, featured a command line interface, which the Mac otherwise didn't have. They implemented it because a command line interface is a powerful tool for software development; most developers know this and will eventually want one. That's what Stephenson is saying here.

The Lisa Workshop, which was used for software development on the Lisa, was also a text-based interface.

> Unix was far too large and unwieldy for microcomputer hardware of the time

No it wasn't. Xenix was released for the Lisa in 1984.


https://macintoshgarden.org/apps/macintosh-programmers-works... describes Macintosh Programmer's Workshop (MPW) as "a Unix-like integrated development environment for the Mac OS."

http://www.math.columbia.edu/~om/mpw-88.pdf is a presentation by two of the authors of MPW (Richard Meyers and Jeff Parrish).


And Cromemco had CROMIX (a Unix clone) for their 8-bit Z80 machines in 1979.

https://en.wikipedia.org/wiki/Cromemco


I loved MPW back in the day, but I was an outlier compared to my other Mac developer friends.


I wrote the CodeWarrior tooling integration for MPW and presented the finished work to the CEO of MetroWerks :-) Given the engineering quality on both sides of that wide divide, I felt some honor at the time -- I was paid but not that much.


I think Neal Stephenson was overly optimistic about the audience for this book, and I think he would regard it as a failure of the book that the only people reading it are people like us who can make the distinction you just made. He seemed to be writing for people who knew nothing that he hadn't already told them, and identifying CLIs with Unix was a simplification that I think was supposed to make it easier for his target audience to follow.

I did give the book to my computer-phobic father to read, because he was a historian who was interested in social and cultural changes and also kind of curious about what I did for a living, and he said it was interesting, but he also said he didn't really understand it at a concrete level and was relying on the vivid metaphors to get any meaning out of it. I don't think he really considered it worthwhile for someone like him to read, which I think meant it was a fundamentally ill-conceived book. Still a fun read, though.


I don't know; to me this essay practically screams preaching to the choir. It's almost like a patriotic rant to operating system fanboys. I don't think the audience was ever going to be non-computer nerds.


> Unix was far too large and unwieldy for microcomputer hardware of the time,

I think that the article means that they implemented a Unix-like command line, but not a full, POSIX-compliant Unix. Just a mock sh and a handful of text-based utilities to be able to work in. At least this is how I read that paragraph.


The story of Mac Basic getting canceled is indeed a terribly sad moment in Mac history, documented here:

https://www.folklore.org/StoryView.py?story=MacBasic.txt


> Unix was far too large and unwieldy for microcomputer hardware of the time

No. Xenix ran on an 8086. Unisoft built a whole business around porting Unix to 68000s (not 68020s in the beginning, not even always 68010s...68000s). There were many others.


There was a native FORTH environment that came out fairly soon after the 1984 Mac announcement.


Just as fungi terraformed the land on early Earth, blazing a trail for plant life to follow, Forth is often the first colonizer of a new computing platform.


I'm a bit surprised. BASIC is not that hard to implement. Apple could have made their own.


The folklore article in a sibling comment to yours clarifies it a bit. Apple did make their own, but it wasn't running on the Apple ][ or compatible with what was running on it. That was still their biggest revenue stream, and they couldn't afford MS withholding a license for BASIC on it or fragmenting the community (two versions of BASIC on the same platform depending on date of purchase). So they made a pragmatic choice to avoid getting screwed by MS, who went and did what they always did (especially back then): screwed them anyway.


Simply starting a credible project to do their own could have caused Microsoft to soften its terms.

As for a different version of BASIC, nothing stopped Apple from making a work-alike BASIC. After all, that's how the Compaq was made, along with plenty of other work-alikes.

From a modern point of view, the software in those days looks pretty simple. I'm surprised there weren't a lot more clones.


I can only take the folklore article as accurate here, but it seems that it boiled down to timing and cost. They probably could've done it, but their existing BASIC project had already taken a couple of years. If they'd elected to replace MS's BASIC implementation with their own, they'd have had a year or so to get it done. And failure would've been very costly.

Corporate risk tolerance comes into play at that point.


> Unix was far too large and unwieldy for microcomputer hardware of the time

“UniFLEX was very similar to Unix Version 7”:

https://en.m.wikipedia.org/wiki/UniFLEX


There was also Coherent which ran on very modest hardware: https://en.m.wikipedia.org/wiki/Coherent_(operating_system)


I love this essay. I’ve been hoping that Stephenson would rewrite it with 20 years of updates.

It describes the landscape better than anything else I've found, as Stephenson is a user and a great writer, I think. Most other accounts are by people who make their living in journalism, or hardware, or software.

Stephenson is also an example of someone who is really into computers, and programming I suspect, but whose primary goal is writing. I like it when non-programmers program (e.g., Jake VanderPlas [0] wrote chunks of SciPy even though he's an astronomer, though he works as a programmer now).

[0] http://vanderplas.com/media/pdfs/CV.pdf


If you like programmers who write fiction, you might also be interested to know that Mark Russinovich (of Windows Sysinternals fame) writes tech thriller novels.

(I haven't read them but they have been well-received.)


Vernor Vinge is another good author in the category of people really into computers who write fiction (I remember one of his novels had a side plot about an interplanetary Usenet having routing failures).


It is _A Fire Upon the Deep_ and the interplanetary Usenet is a core plot point, IMHO.


> There was a competing bicycle dealership next door (Apple) that one day began selling motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery.

In retrospect, I think Neal was referring to a specific aspect of Apple's products when writing 'hermetically sealed', but I view almost any Apple product, as a whole, that way. (Apple doesn't want you to service them yourself, or know how the software works.) Even after 20 years, some things never change.


> Reagan would describe the scene as he saw it in his mind's eye: "The brawny left-hander steps out of the batter's box to wipe the sweat from his brow. The umpire steps forward to sweep the dirt from home plate." and so on.

It continues to "blow my mind" that people can do this. What a gift, and possibly a curse!


> When the cryptogram on the paper tape announced a base hit, [Reagan] would whack the edge of the table with a pencil, creating a little sound effect, and describe the arc of the ball as if he could actually see it. His listeners, many of whom presumably thought that Reagan was actually at the ballpark watching the game, would reconstruct the scene in their minds according to his descriptions.


That is telepathy. Amazing use of language.

Storytelling in real time. Love it.


“But around the same time, Bill Gates and Paul Allen came up with an idea even stranger and more fantastical: selling computer operating systems”

This article must be coming from some parallel universe. As I recall it, Microsoft got a contract from IBM to supply an OS for their low-spec personal computer. They didn't have one, so Microsoft bought in 86-DOS from Seattle Computer Products, using the IBM money to pay for it up front. Rather than sell it to IBM outright, Microsoft persuaded IBM to license a copy of DOS for each IBM PC sold. Later on, with Columbia Data Products, Compaq, and others figuring out how to clone the PC without paying IBM, Microsoft was more than happy to license DOS to them as well.

“Columbia Data Products”

https://en.wikipedia.org/wiki/Columbia_Data_Products

“Joint Development Agreement between International Business Machines Corporation and Microsoft Corporation”

“With respect to Phase I Output, to the extent such joint ownership is prevented by operation of law each party hereby grants to the other a non-exclusive, royalty-free, worldwide and irrevocable license to use, execute, perform, reproduce, prepare or have prepared Derivative Works based upon display, and sell, lease or otherwise transfer of possession or ownership of copies of, the Phase I Output and/or any Derivative Works thereof.”

http://edge-op.org/iowa/www.iowaconsumercase.org/011607/0000...


The novelty was in selling individual licenses of operating systems rather than bundling them only with hardware. Computers had OSes, and of course they were bought and sold among companies, but they were included with computer purchases.

So while IBM licensed the OS from Microsoft (and MS just bought DOS from someone else), Microsoft sold the same OS to lots of others as well, and even at retail for upgrades and changes that didn't come from the hardware vendor.

This is the same universe we’re all in.


CP/M and UCSD-Pascal, two operating systems available for the IBM PC besides DOS, had already been sold to individual users for many years.

Microsoft itself announced Xenix a year before the PC (August 1980).


I think, like pretty much everything Microsoft did, they didn't innovate by doing it first; they innovated by popularizing it.

I can’t find sales figures for CP/M and UCSD Pascal, but I imagine they weren't anywhere near what Microsoft started generating from their OS.


A Byte magazine editorial praised the newly launched PC as the "Rosetta Stone" of computing for offering such a choice in operating systems. In practice, with PC-DOS being five times cheaper than the other two, it was the standard from the start.

I mostly used QNX with PCs myself until Linux came along. So though it took a long time, PCs did eventually run nearly all known OSes.


"History" is always horribly wrong as written, if you were there at the time, I suspect. Add OS/9 and Flex to the list. There were countless others.


That's OS-9, with a hyphen: https://en.wikipedia.org/wiki/OS-9

Not to be confused with the IBM family of OSes with a slash, notably OS/2 but also including OS/390 and OS/400.


Actually, no: DEC PDPs had multiple OSes, and not all of them were bundled: RT-11 vs. RSX-11 or RSTS/E.

You bought the system that suited you; we (as a lab) ran RT-11.


I think that refers to https://en.wikipedia.org/wiki/Open_Letter_to_Hobbyists. If you could say microcomputers in the '70s had an OS, BASIC was it.


Microsoft was selling software (licenses) way before the IBM deal happened. That phrase is about BASIC.


2004 commentary reflecting developments in computing between the original writing and then:

http://garote.bdmonkeys.net/commandline/index.html

Written with Neal Stephenson's permission.


per Wikipedia page: With Neal Stephenson's permission, Garrett Birkel responded to "In the Beginning...was the Command Line" in 2004, bringing it up to date and critically discussing Stephenson's argument. Birkel's response is interspersed throughout the original text, which remains untouched.

http://garote.bdmonkeys.net/commandline/index.html


> The U.S. Government's assertion that Microsoft has a monopoly in the OS market might be the most patently absurd claim ever advanced by the legal mind. Linux, a technically superior operating system, is being given away for free, and BeOS is available at a nominal price. This is simply a fact, which has to be accepted whether or not you like Microsoft.

AFAIK this is a very common misunderstanding. The term 'monopoly' is being (ab)used by a lot of people as meaning 100% market share, which is NOT how the courts define it. (They just tend to agree that less than 50% market share is not a monopoly.)

And while market share is an easy-to-prove (OK, not that easy; the hard part is figuring out what actual market is being talked about) and useful warning sign, the courts tend to take more interest in the actual anti-competitive practices.

> The old robber-baron monopolies were monopolies because they physically controlled means of production and/or distribution. But in the software business, the means of production is hackers typing code, and the means of distribution is the Internet, and no one is claiming that Microsoft controls those.

Hmmm… did anyone say 'Github' ?


Great book. A little dated, but still worth a read.

Love the HOLE HAWG analogy about tools that do what you tell them to, immediately and sometimes dangerously, regardless of whether what you told them to do was right.


And here is the mandatory AvE Hole Hawg tear-down: https://www.youtube.com/watch?v=qoR59rzqlxw


Though he sang the praises of the Hole Hawg, it's worth noting that he later switched to OS X. Usability still matters.

"You guessed right: I embraced OS X as soon as it was available and have never looked back. So a lot of "In the beginning was the command line" is now obsolete. I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely."

From question #8 of an interview with him in 2004 at https://slashdot.org/story/04/10/20/1518217/neal-stephenson-... His responses to the other questions are entertaining and worth a read as well.


That can describe Unix command-line tools: no "are you sure y/n?" unless explicitly asked for via a flag, unlike DOS.


This is true, but the concept was taken further in the Oberon operating system, which has a design goal of never asking the user questions at any point.


Interesting. Another thing I vaguely remember reading about Oberon was that any subroutine in the OS could be used from any program, or something like that, for high code reuse. I'm sketchy on the details; I read it quite a while ago, maybe in a BYTE article about Oberon. Not sure if that implies all programs were in one address space, or what.


I think they are, but OTOH I think that's acceptably safe when it's a single-user, client-side-only OS written in a rigorously bounds-checked, type-safe language.

I suspect that the obsession with process isolation in xNix reflects its origins: as a terminal-based multi-user OS written in perhaps the least-safe high-level language ever developed, one in which it is necessary to deploy terrible techniques such as pointer arithmetic just to get anything done at all.


Mostly agree. I didn't know Oberon was a single-user OS, though I did read that Wirth was also involved in creating the Lilith workstation (he missed the chance to call it a Wirthstation :), so it would make sense that Oberon was single-user too.

What's so wrong with C pointer arithmetic, per se? I know about the issues with pointers in general, having used C a lot, earlier. But the arithmetic?


There are multiple issues with pointer arithmetic and it is widely regarded as one of the weakest, most failure-prone points in the C language.

Some discussion: https://www.cs.swarthmore.edu/~richardw/classes/cs31/s18/off...

http://web.cse.ohio-state.edu/~reeves.92/CSE2421au12/SlidesD...
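
A minimal C sketch of the classic failure mode (my own illustration, not taken from those slides): the arithmetic itself is legal, but nothing ties the pointer to the buffer's bounds, so walking one element too far is accepted by the compiler and becomes undefined behavior at runtime.

    #include <stdio.h>

    int main(void) {
        int buf[4] = {1, 2, 3, 4};
        int *p = buf;

        /* Off-by-one: "<=" walks one element past the end of buf.
           Forming the pointer buf + 4 is legal, but dereferencing it
           is undefined behavior -- it may print garbage, crash, or
           silently read adjacent memory. */
        for (int i = 0; i <= 4; i++) {
            printf("%d\n", *(p + i));
        }
        return 0;
    }

In a bounds-checked language like the Oberon mentioned upthread, the equivalent access would trap immediately instead of silently misbehaving.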


Thanks, will look at those.


Hehe, this has become my favorite saying for computers lately: 'do what I want, not what I told you to do!' Computers have a lovely way of merrily going along and breaking things at a fairly fast pace.


Lately I find myself saying “do what I told you to do, not what you think I want to do”

Mainly this is due to the autocorrect, autocomplete on most devices nowawadys. I’m sure it’s very helpful, but I seem to notice the mistakes more than the successes. (Eg, trying to type “nowadays,” I had to break out of typing on my iPhone 3 times to backspace and stop it from changing it to other words and expressions)


Hehe, that is awesome; it's the opposite of mine but also so true! I turned off autocorrect on my phone. Suggesting is fine, but just changing it... not so much.


Many CLI tools have a dry run option for expensive (time/resource wise) or risky commands (one way, irreversible or reversible only with a lot of effort). It would be interesting to see this become the default for some of them, with a separate flag `--now-i-mean-it` to actually execute.
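
A minimal sketch of that idea in C (the `--now-i-mean-it` flag name is just the hypothetical from the comment above, not a real tool's option): the destructive path only runs when the flag is present; otherwise the tool just reports what it would have done.

    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv) {
        int armed = 0;
        for (int i = 1; i < argc; i++) {
            if (strcmp(argv[i], "--now-i-mean-it") == 0)
                armed = 1;
        }

        if (armed) {
            printf("deleting the old backups for real...\n");
            /* the irreversible work would go here */
        } else {
            printf("dry run (default): would delete the old backups\n");
            printf("re-run with --now-i-mean-it to actually do it\n");
        }
        return 0;
    }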


I wish more tools had a dry run option. I've been using it with Ansible quite a bit in the past few weeks. Look ma, I can mess up 50 computers all at once!


Yes, like

make -n

https://man7.org/linux/man-pages/man1/make.1.html

In fact, the above man page shows that one long form of the -n option is named --dry-run :)


I’ve spent so much time with rsync’s -n (I think it supports --dry-run as well).


Shouldn't dry run be the default, with the "prod" run requiring the switch?


This is one of those essays that pops up on Hacker News every so often, and I'm always happy to reread it. Regardless of how the technologies have changed or whether the prognostications were correct, this article is about timeless and relevant themes.

Abstraction is the most powerful force within the human mind. It is perhaps solely responsible for the world we've built around us. But it is also absolutely terrifying how increasingly reliant we are on it.

"Contemporary culture is a two-tiered system, like the Morlocks and the Eloi in H.G. Wells's The Time Machine, except that it's been turned upside down. In The Time Machine the Eloi were an effete upper class, supported by lots of subterranean Morlocks who kept the technological wheels turning. But in our world it's the other way round. The Morlocks are in the minority, and they are running the show, because they understand how everything works."

We are more and more surrounded by technology that the majority of us don't understand even at a fundamental level (for the record, I include myself in the Elois). More often than not, these technologies are essentially taken for granted as magic. While no reasonable person should expect everyday people to understand the inner workings of their handheld supercomputers or their cable TV box, we would all be better off if we better understood the fundamental building blocks of the technologies we are surrounded by -- whether that's basic logic gates, the simple patterns of conditional statements in programming languages, what caching and cookies are on the web, or how a hard drive works at 30,000 feet.

"So GUIs use metaphors to make computing easier, but they are bad metaphors. Learning to use them is essentially a word game, a process of learning new definitions of words like "window" and "document" and "save" that are different from, and in many cases almost diametrically opposed to, the old."

Like Stephenson so vividly describes, the way our technology mixes metaphors is not instructive to what's actually happening on the metal. These lossy abstractions don't seem harmful at face value because they aren't, but, as they compound and more complex technologies are adopted in our homes and places of work, they threaten to make us less efficient at our jobs, more reliant on manufacturers for repair and troubleshooting, more susceptible to disinformation and encroachments on our privacy, and, in my opinion most importantly, at risk for critical failures in our infrastructure (what if there aren't enough Morlocks?) ((the IoT, machine learning, and social media algorithms are what really frighten me)).

And we haven't even mentioned how we are now increasingly reliant on fragile systems that no single person can understand, and the dynamic nature of software means that many, many applications out there are essentially ships of Theseus that could sink at any time.

This isn't hyperbolic Doomerism. I don't think this is the Decline and Fall or anything. It's also not a condemnation of super-abstractions or casual technology use. I, like Stephenson, am a paying Disney World customer if you will. I just believe we need to invest more into the right kinds of high-level technocratic education for the general populace (and continue to combine it with liberal arts, of course), and our technologists need to invest in redundancies and stable technologies.

Luckily, we have built the Library of Alexandria 2.0 in the internet; we just need to use it.

Some fun, relevant links off the shelves of that library:

https://reasonablypolymorphic.com/book/preface.html

https://blog.nelhage.com/post/computers-can-be-understood/

https://www.nand2tetris.org/

https://mcfunley.com/choose-boring-technology

https://cs.stanford.edu/people/nick/how-hard-drive-works/

https://singularityhub.com/2016/07/17/the-world-will-soon-de...

https://en.wikipedia.org/wiki/The_Machine_Stops

https://medium.com/message/everything-is-broken-81e5f33a24e1

P.S. I love this related thought experiment --> https://www.scientificamerican.com/article/rebooting-civiliz...


Wow, a ~38k-word essay about the command line.


I take it you've not read Neal Stephenson before. I often describe him as someone who uses 1000 words where others would use 500 - but I love him for it; he's probably my favorite author.


Same! I love the degree of plausibility in his novels. Even the most outlandish elements have been well researched and thought out.


It's about more than just the command line.


I'd describe it as an essay about complexity, computing history, open vs. closed systems, and personal responsibility.



