The Xerox Smalltalk-80 GUI Was Weird (collindonnell.com)
136 points by ingve on July 2, 2023 | 92 comments



I'm assuming the "by the Bluebook" implementation they're referring to is this: https://github.com/devhawala/ST80

It's nice to use, but awkward because almost nobody has a three-button mouse laying around. One of the cool things about the Blue Book is that it lays out exactly how to implement e.g. graphics primitives and even the object system itself. In fact, the implementation I linked above uses the exact implementation of these things as outlined in the Blue Book, just written in Java.

> The first thing is that the Smalltalk environment wasn't really an operating system the same way something like Mac OS was. It’s more like an IDE that runs on bare hardware.

I disagree with this. It is an operating system. Just because it doesn't have barriers like a kernel and doesn't treat debuggers as third-class citizens doesn't make it any less of an operating system. It just makes it "not Unix". Lisp machines were also very similar in this regard: there is no "system" in the Unix sense: your compiler, debugger, and runtime were a whole unit and that is what you booted into.

> The Smalltalk environment was a revolutionary GUI, but it was still a system you would have had to have been a computer operator or something to really use.

Probably true. But I'd also like to mention the images in the GitHub link I posted are very early snapshots of Smalltalk-80 and may lack features that were added later. More importantly, the author is missing that these machines came with a manual that likely described how to use the interface. But I do have to concede: these machines were extremely expensive and definitely not targeted towards "everyday people", but can we even say the same about the Lisa? I know the Macintosh was definitely designed for everyday people, but the Lisa was pretty expensive even for its time, and many attribute its commercial failure to this.


> almost nobody has a three-button mouse laying around

With the exception of a trackpad or Magic Mouse (or whatever Apple calls it), doesn’t everyone using a mouse have a three button mouse? Is the scroll-wheel click no longer used as middle click by default? Even Apple’s best mouse, the Mighty Mouse, can middle click.


Holding down a middle click on a wheel is moderately awkward. Grabbing a mouse with a few more buttons and binding one to “middle” click is much nicer.


I bought an actual three button mouse to work with acme precisely for that reason


If you look beyond Apple, there are lots of mouse variants with three or more buttons available in every computer shop.


That… is their point?


I probably meant to reply one level up...


The apple trackpads support multi-finger clicks and swipes, though of course the chording is different (you can’t simply press the “middle” button by itself).


I've faced similar issues when trying Plan 9, which assumes a three-button mouse and makes heavy use of mouse chording. I had difficulties using my normal mouse wheel USB mice. I ended up purchasing some old Sun USB three-button mice for experimenting with Plan 9.


> I had difficulties using my normal mouse wheel USB mice.

Weird. I do this dozens of times an hour on mine, no problem.


Not by default, they don't. I recently got a MacBook from work, an M1 running the latest macOS, and there is no middle-click. There is no built-in function to enable it, either.

This being the Mac world, there's a well-known commercial tool to provide it: https://middleclick.app/

I found a FOSS equivalent: https://github.com/artginzburg/MiddleClick-Ventura


Since OP was talking about mice, I wanted to differentiate from trackpads. Maybe people don’t use mice anymore and I’m just out of the loop (though my 12yo uses a mouse with his iPad) or maybe OP is thinking of pre-scrollwheel three button mice.


With a trackpad you can two-finger-tap for right click and three-finger-tap for middle click, at least on Linux. This even works on Macs that have had Linux installed.


> the Lisa was pretty expensive even for its time, and many attribute its commercial failure to this.

A Lisa was about $10,000.

The Alto wasn't for sale to the public, but its successor, the Xerox Star, was.

A Xerox Star "starter kit" was about $75,000, and included a workstation for end user use, a workstation to act as a file, print and mail server, and a laser printer all networked together.

Once you bought the "starter kit" you could get additional workstations for $16,000.

There was a recently linked blog post about restoring an HP workstation from the same era that retailed for $25,000.

https://hackaday.com/2023/06/20/repairing-a-25000-hp-worksta...


>> The first thing is that the Smalltalk environment wasn't really an operating system the same way something like Mac OS was. It’s more like an IDE that runs on bare hardware.

> I disagree with this...

Totally, the idea of something like an OS is there, even if it doesn't match the textbook definition. If the author wants to go deeper into the OS ideas in Smalltalk, they can follow these links, where even TCP/IP is implemented in Smalltalk: [1] [2].

[1] http://squeaknos.blogspot.com/

[2] http://swain.webframe.org/squeak/floppy/


> I'm assuming the "by the Bluebook" implementation they're referring to is this:

Or this: https://github.com/rochus-keller/Smalltalk/



Sorry, no. I saw a demo of a Xerox workstation at PARC in 1982, probably not that different from the demo Steve Jobs got. It was a lot more advanced than suggested by this article, which seems to miss that office applications were written on top of the system the article describes and provided much of the needed functionality. As a programmer you had a much worse interface, but the programs that had been written did the work. The Lisa end user experience wasn't that different from the Xerox Alto. The Mac made a lot of simplifications to make things smoother.


"Sorry, no" no to what? "this article" - what article?

Jobs came to PARC in December 1978. 4 years later was an eternity. By 1982 PARC had Dorado-based systems on Cedar.

"The Lisa end user experience wasn't that different from the Xerox Alto" -- except that the Alto had many different programs, each with their own interface, so I don't know what you're referring to.

I guess you're talking about Smalltalk, but that was a tiny part of what people at Xerox were doing. It had a negligible influence on the Star.


>"Sorry, no" no to what? "this article" - what article?

TFA ( https://collindonnell.com/the-xerox-smalltalk-80-gui-was-wei... ).

>Jobs came to PARC in December 1978. 4 years later was an eternity.

Not in desktop GUI land, since nobody did anything interesting commercially in desktop GUIs for those next 4 years.

And of course in those 4 years Apple started the project, designed the hardware and implemented the OS, GUI, and basic programs. Lisa actually began in 1978 itself, so it took all those years until its release in 1983. By which time they'd probably also had a look at the Star's ongoing developments.

Doesn't mean they didn't take a lot of their inspiration from that demo.

>"The Lisa end user experience wasn't that different from the Xerox Alto" -- except that the Alto had many different programs, each with their own interface, so I don't know what you're referring to.

He's referring to the GUI concepts as implemented in it (and exhibited in those programs), not necessarily some overall GUI shell or some particular individual program.

>I guess you're talking about Smalltalk, but that was a tiny part of what people at Xerox were doing. It had a negligible influence on the Star.

TFA mistakenly talks about the Smalltalk GUI as if the common understanding is that Smalltalk-80's GUI was what Apple saw and copied from Xerox.

TFA then goes on to reject that Apple copied Xerox, because he finds that the Smalltalk-80 GUI was hardly like a desktop, but more like an IDE.

That description of the Smalltalk-80 GUI is of course correct, but he missed the crucial part: people don't say Apple copied Xerox because of Smalltalk-80, but because of the Alto's (and perhaps the Star's) GUI concepts.

So it's not the parent who is confused, he merely replies to TFA.


> nobody did anything interesting commercially

Except the Xerox Star, which was "interesting" enough to attract spillover crowds to the Xerox booth at the May 1981 NCC. Including Jobs himself.

If you mean "users didn't buy it" you're certainly correct.

> He's referring to the GUI concepts as implemented in it (and exhibited in those programs),

GUI concepts: if you mean mouse manipulation, you're right. However, the Lisa had a unified interface that enforced some choices, while the Alto let every developer make his or her own choices. Also, as far as I remember, dropdown menu bars across the top weren't prominent on the Alto.


>Except the Xerox Star, which was "interesting" enough to attract spillover crowds to the Xerox booth at the May 1981 NCC. Including Jobs himself.

Yeah, I mean except Xerox. The context here is you saying that "Jobs came to PARC in December 1978. 4 years later was an eternity" and I take it you implied with this that Xerox didn't influence the Mac/Lisa.

So, my point is that not a lot of developments happened between that and 1983 in the commercial space for it to actually be "like an eternity". Star did happen which is also Xerox, but the "let's copy this" spark was already lit in Jobs from 1978 (and Lisa itself started being designed after that time).

>GUI concepts: if you mean mouse manipulation, you're right. However, the Lisa had a unified interface that enforced some choices, while the Alto let every developer make his or her own choices. Also, as far as I remember, dropdown menu bars across the top weren't prominent on the Alto.

Mouse manipulation, but also a hodgepodge of GUI ideas from the programs he saw demoed (and later on Star). E.g. Alto had Bravo with WYSIWYG, scrollbars, etc.

TFA was like: (a) People say Apple copied Xerox GUI ideas because Jobs saw a Smalltalk-80 demo (b) But the Mac GUI is quite different from Smalltalk-80, so this is wrong

And basically I'm pointing out that people don't just say it based on Smalltalk-80, but also Alto stuff Jobs saw in that visit, and Star stuff which came out before the Mac.

Love the Future book by the way!


I said 1978 but should have said 1979.

Anyhow, using a mouse to manipulate objects on a bitmapped screen was the influence. There were many ways to do that.

Not everyone in the computer industry bought into that. It seemed like a waste of the computer to them.


Didn't they demo the Smalltalk system to him?


Yes, but in addition to the other stuff they had:

"Steve Jobs visited Xerox PARC, where he was shown the Smalltalk-80 object-oriented programming environment, networking, and most importantly the WYSIWYG, mouse-driven graphical user interface provided by the Alto."

"According to Jobs, “I thought it would be an interesting afternoon, but I had no real concept of what I’d see.” What he and his Apple engineers did see was the Alto with its bitmapped display and mouse-driven GUI, the graphical word processor, Bravo, and a few demo applications in Smalltalk, PARC’s revolutionary object-oriented programming language. This was a carefully curated demonstration created by PARC’s researchers to give a taste of what they had created without giving away any secrets deemed too precious. “It was very much a here’s-a-word-processor-there's-a-drawing-tool demo of what was working at the time,” according to PARC researcher Adele Goldberg. “What they saw everyone had seen.”"


> "and most importantly the WYSIWYG, mouse-driven graphical user interface provided by the Alto"

This would be better phrased as "available on the Alto". This wasn't in the/any OS, nor were Smalltalk or Bravo/Gipsy, for that matter, the sole software available on the Alto. The wording suggests a consistency which simply wasn't there.

Notably, what was going on at PARC wasn't a secret: PARC published papers and key developers at Apple, like Bill Atkinson, were familiar with at least some of them. The latter were also those who wanted Jobs to see this in real life and who accompanied him on the trip, and they came prepared. Moreover, numerous tours of PARC and its software had been given to outsiders before (compare A. Goldberg), some of these tours even more complete. However, Jobs was the first – at least in part (he later said he totally missed out on the network and its importance) – to grasp what he was shown, which in turn convinced some PARC personnel like Larry Tesler (who was giving part of the presentation) to switch to Apple, in order to see this technology become a reality. (Jobs getting the GUI must have been quite an experience.)

Obligatory link in this context (a series of polaroids from Lisa development): https://www.folklore.org/StoryView.py?story=Busy_Being_Born....

(Note that bitmapped graphics were documented before the visit and, probably also before this, there was a first prototype of a windowed document. It is also interesting to see how the influence of the PARC visit shows up as something of a distraction, and it's only when they gradually inch back towards the original window design that things start to come together. May we interpret this as a process of emancipation from PARC's influence?)


https://www.livingcomputers.org/Blog/What-Really-Happened-St...

What Really Happened: Steve Jobs @ Xerox PARC '79


This is fantastic—thanks for the link. I always defend Apple when people say they "stole" their GUI from Xerox, but this post has a lot of great details, including what Xerox's VC arm did with their $1M of pre-IPO Apple stock (sold it for $1.2M :facepalm:)


"Dealers of Lightning" by Michael Hiltzik is an interesting book about the story of Xerox Parc.

About Steve Jobs and his programmers' visit: <<They asked all the right questions and understood all the answers. It was clear to me that they understood what we had a lot better than Xerox [corporation] did.>>


If they had kept it, at some point it would have been worth more than Xerox.


Even more amazing, in the 80s they finally realized they should capitalize on all this technology, and allowed a bunch of entrepreneurial employees to use it (although they didn't invest $$$). Several of these (Spectra Diode Labs, Synoptics) could have made them a rival to Berkshire Hathaway.

https://www.albertcory.io/lets-do-have-hindsight


To be fair, Star development started about as soon as this had become a viable project. Compare the availability of the D-processors, and things like Smalltalk-76, the Gipsy editor, and Mesa, the base operating system, being less than a year old at the time.

I think, the narrative of Xerox just not knowing what they had doesn't add up with the timeline.


Well, they certainly weren't interested in stepping outside their business model. I think I showed that pretty conclusively.

As far as "knowing what they had" -- Jerry and I did write about that:

https://www.albertcory.io/lets-do-have-hindsight

it's not so much "not knowing", it's not being interested in taking over the computer industry and not wanting to understand it. They wanted a machine they could sell through their copier sales channel, period.

The part where Don Massaro says, "The desktop is OUR market. The 820 is just to hold our place on it until the Star is ready" (and yes, he really did say that) betrays a stunning blindness to what was going on:

"People are bringing their Apple II's to work so they can use VisiCalc? Well, we'll put an end to that stupidity!"


I think it's complicated. As I see it, Xerox tried to follow through on the paperless office vision, but it was either too early from a technological/price/scale perspective, and/or the organizational challenges and requirements for any adopters of such a complex system were too great and it was too early for these to be tackled in all seriousness (even Steve Jobs got just half the message), and/or it was already too late, as alternatives that required much less adjustment and promised better scalability (in adoption/deployment as well as in production) were already on the market, and these also had far less development cost to cope with. From this perspective, the Star was doomed from the beginning; nevertheless it was still an important and remarkable project. — Maybe Xerox could have tried more intensely, maybe they could have been more interested, but I'm not sure it would have made much of a difference. (However, people like Larry Tesler switching to Apple illustrate that there may have been a systemic problem as well. In contrast to you, I wasn't there, and I haven't witnessed the frustration.)

Having said that, I'm definitely ordering your book (probably today)! :-)

---

Edit:

Regarding the appearance of the IBM PC in the article, "IBM ignoring the personal computer revolution" is very much a popular myth, as is the IBM PC coming out of nowhere. In actuality, IBM had several projects over the 1970s, like Yellow Bird and Aquarius (both home computers, also under the lead of Bill Lowe); the latter, which would have included an app-store-like ecosystem of software on bubble memory cards, evolved into fully functional production prototypes, but was canceled last-minute in the meeting where it should have been finally approved, over concerns regarding the reliability of bubble memory. Various other PC prototypes were designed at Eliot Noyes Associates. None of these internal projects succeeded. By then, the frustration had apparently grown to the extent that there had been deliberations about acquiring Atari's computer division instead (at least, there were design prototypes for this). Project Chess (the IBM 5150) was just the final step in this process, which must have come with its own frustrations.

IBM Yellow Bird (1976) and IBM Aquarius (1977) home computer prototypes: https://images.app.goo.gl/JH9JLUrjHbUSfeQu7

IBM "Atari PC" design prototype (1979): https://images.app.goo.gl/Efh6kusDYboytWJ18

Eliot Noyes Associates PC prototype (1977): https://web.archive.org/web/20161025155337/http://danformosa...

(Images of Yellow Bird, Aquarius, and the "Atari PC" from “Delete.” by Paul Atkinson, where there's more of their story.)

The "birth of the personal computer" was not an easy one… even Apple failed with their more up-to-business Apple III…


Thank you, sir! If you get a paper copy, you can mail it to me for autographing.

(Media Mail at USPS makes shipping about $3, so it isn't really that expensive.)


Thanks for the kind offer! But this would be Europe/Austria — I guess, I'd rather spare ourselves the postage. (Let's see…) :-)

PS: Lowe is quoted in "Delete." with, "The Aquarius would have blown the socks off of everybody. I felt, and a lot of people felt this was going to be a big deal and make IBM believe in this whole business." Now imagine that frustration… They had invented, designed and developed iPhone-like home computing in 1977, and they believed in it and in its transformative potential, and then they came out of the meeting that should have approved it all with the project canceled.

(I understand that IBM didn't want to risk the reputation of their core business by deploying a potentially unreliable technology for what must have still been an experiment to them. [Well, that changed with the Jr. ;-)] But, I think, this also illustrates that personal/home computing ultimately had to come out of startups and not from industry leaders. As you rightly observed, IBM eventually managed to square the circle by implementing a startup-like process and putting their weight behind it. But they probably learned it the hard way.)


It's true, Xerox could have retired by now!


The demo Jobs & Co were given did not involve Xerox Star. (It was still the experimental PARC software.) The Xerox Star interface and OS were developed by Xerox SDD.

(One rather problematic side of the Star in practice was the extensive licensing that came with it. You needed an extra license for just about every feature. Italic text? Get a new license, etc. This probably hindered its success as much as its – by virtue of this licensing policy, ever growing – price tag. Also, the Star's still discrete processor may have been somewhat challenged at times. On the other hand, unlike the Lisa, it featured full network integration. Regarding the end user experience, it was probably noticeably different from the Lisa, as there was no drag&drop and direct manipulation on the OS side of things. Rather, it was: click to select, press a keyboard key to select a function, use the mouse to execute. Technically, the Star relied on property sheets for objects, which could be presented either as menus, button arrays, or dialogs. Notably, this did not involve hierarchical structures, so there were no hierarchical menus.)


People can try the emulator.¹

Key to understanding is that GUI operations were two-handed, descending directly from NLS (people have all seen the ‘mother of all demos’², right?). Right hand on the mouse, left hand on the left function key cluster³ — one of the Star papers is explicit that the left cluster was a direct substitution for the NLS/Alto chord set.

A very noticeable difference from the familiar Lisa lineage is Move and Copy vs Cut and Paste — there was no clipboard. (Here I think Star got it right and Tesler got it wrong: the condition of the clipboard containing invisible fragile state is a mode, and a worse one than move-in-progress.)

¹ https://github.com/livingcomputermuseum/Darkstar

² https://dougengelbart.org/content/view/209/

³ https://digibarn.com/friends/curbow/star/keyboard/


> The Lisa end user experience wasn't that different from the Xerox Alto.

The Alto had a command line interface. The Lisa had a standard GUI.

Some of the programs you ran on an Alto implemented their own custom GUI, but each program was free to follow its own GUI conventions. There was no standard.

The two systems really were very different.

You can still find copies of the Alto User Handbook online so you can read about how it worked.

http://www.bitsavers.org/pdf/xerox/alto/Alto_Users_Handbook_...


> The Alto had a command line interface.

That was true, but you really didn't spend a lot of time on the command line. It wasn't like Unix. People didn't spend their time inventing options for their command lines; they sunk it into the GUI instead.


Before I go into a rant, I will acknowledge that Apple did usability research with the introduction of both the Lisa and Macintosh, along with acknowledging that most popular GUIs are based upon the foundations they laid.

... but ...

It is important to remember that people were still figuring things out in the early days. There were many variations in how GUIs worked, some of which were minor and others of which were quite significant. None were predestined to be the final word in what constitutes a good GUI. In fact, one could argue that the Macintosh (and particularly the Macintosh of 1984) did not prevail. Far more people use Windows to this day, and most people will interact with touch interfaces more often than desktop interfaces. Those touch interfaces have ditched most of the traditional forms of interaction and the metaphors they are based upon.

As for the interface as an IDE, I think a lot was lost when the developer tools were separated from the OS. That's not to say people should be required to use them, but the ability to probe more deeply and modify if desired is amazing. The presence of such tools doesn't have to take away from usability either. Just consider the developer tools found in most web browsers. Everyone has those tools (at least on desktop browsers), and they don't take away from the UX.


An example of what was possible on the living Smalltalk-78 system: https://www.youtube.com/watch?v=eEz08IlcNMg


Thank you for your thoughts. I wonder if downvoters have managed to comprehend what you’ve said. I doubt it.


> The first thing is that the Smalltalk environment wasn't really an operating system the same way something like Mac OS was. It’s more like an IDE that runs on bare hardware.

This was true of Xerox’s InterLisp-D environment and the MIT Lispms and their descendants as well.

For the Lispms we typically didn’t save the world (essentially the memory image) but instead used cloud storage.

As the article says, on the PARC machines we mostly saved a band (memory image). Filesystems and many other things we would consider local services were (LAN) cloud services instead.

This was late 70s/early 80s and mostly took decades to become mainstream.


The Xerox Interlisp-D system also then got a user interface metaphor: rooms.

https://dl.acm.org/doi/pdf/10.1145/24054.24056

Rooms: The Use of Multiple Virtual Workspaces to Reduce Space Contention in a Window-Based Graphical User Interface

D. AUSTIN HENDERSON, JR., and STUART K. CARD Xerox Palo Alto Research Center


While we're at it you can add the ucsd p-system to that group, although there was actually one iteration of it that came bundled with its own hardware (by Western Digital, with an overly complicated CPU design implemented partially in discretes!) rather than the other way around using platform-independent VMs. Sort of anticipating the similarly forgotten PicoJava project.


> The Xerox Smalltalk-80 GUI Was Weird

"Weird" means only that you haven't used other similar UIs. Smalltalk 80 was a development environment. There were applications written in it with more extensive UI examples.

> At first glance, this looks incredibly similar to something like the desktop of the Apple Lisa or early Mac OS.

Not at all. But the Smalltalk 80 UI was only one of several that were developed. It would be better to compare the Mac OS UI to the office system from Xerox. An early application domain for the Mac was "Desktop Publishing" (Apple LaserWriter, Adobe Postscript, Aldus Pagemaker).

> The first thing is that the Smalltalk environment wasn't really an operating system the same way something like Mac OS was. It’s more like an IDE that runs on bare hardware.

The Mac also didn't have much of an OS in the early days. It was like running C/Pascal/assembler code on bare hardware with a loader and a system library.

> You can’t even move or resize a window by clicking and dragging.

The "Direct manipulation interface" was made popular by the Mac.

https://en.wikipedia.org/wiki/Direct_manipulation_interface

A good idea to compare those systems is to check how early applications looked and worked. On the Mac that would have been MacPaint and MacWrite. You'll find earlier such applications for the Xerox systems, too.

The Xerox Star UI:

https://www.youtube.com/watch?v=xJzYRgmnJrE

http://toastytech.com/guis/star.html

Apple also had Smalltalk 80 running on the LISA:

https://www.macintoshrepository.org/23277-lisa-smalltalk-80


Sometimes I wonder, if Smalltalk had thrived, how different the world might be.

For example, we’ve experienced a trend over the last decade of building platforms out of microservices. They have interfaces, which are sometimes statically typed and serialized with RPC libraries, and ultimately they send messages to each other, because they are isolated.

Now imagine Smalltalk. It’s virtually the same thing, but 50 years ago (and not networked). Still, each object is isolated, and they only communicate by sending messages. Everything is an object that’s inspectable and can be “independently deployed” by editing it in real time.

This seems really nice.

Are we living in a worse version of the world (technologically) than what we could have had?

I don’t know a lot about Smalltalk though, so perhaps someone here can chime in and say why it wasn’t that great. To me though it seems like it was a serious system that was more than just a good demo. I can only imagine how it could have developed over 50 years if it had been embraced.

To me it still looks futuristic.


I did work in Smalltalk at university and I must say that in practice it was horrible to work with.

It was very common to have the image corrupted -- imagine that you add a bug to your code by mistake. Now the live version you're dealing with is just unworkable and you have to find the last version that worked (because in practice the code you add lives alongside the code of your whole system in a single place, so anything bad you did affected not only your target env, it also affected your dev env).

-- version control was basically saving the whole image and no, you didn't have a way to diff 2 different images.

I think that the current state where you edit text files and run your code based on those (and which you can sanely version control) and edit with a separate IDE is a much saner approach.


> version control was basically saving the whole image

No, it really wasn't!

Is it possible that there were ways to archive code with Smalltalk that you did not know?

iow you were doing it wrong.

"Within each project, a set of changes you make to class descriptions is maintained. … Using a browser view of this set of changes, you can find out what you have been doing. Also, you can use the set of changes to create an external file containing descriptions of the modifications you have made to the system so that you can share your work with other users.

The storage of changes in the Smalltalk-80 system takes two forms: an internal form as a set of changes (actually a set of objects describing changes), and an external form as a file on which your actions are logged while you are working (in the form of executable expressions or expressions that can be filed into a system). … All the information stored in the internal change set is also written onto the changes file."

1984 Smalltalk-80 The Interactive Programming Environment page 46

https://rmod-files.lille.inria.fr/FreeBooks/TheInteractivePr...

----

"At the outset of a project involving two or more programmers: Do assign a member of the team to be the version manager. … The responsibilities of the version manager consist of collecting and cataloging code files submitted by all members of the team, periodically building a new system image incorporating all submitted code files, and releasing the image for use by the team. The version manager stores the current release and all code files for that release in a central place, allowing team members read access, and disallowing write access for anyone except the version manager." (page 500)

1984 "Smalltalk-80 The Interactive Programming Environment"


I wanted to play with Smalltalk on a new machine similar to a Raspberry Pi but I was struggling to port Squeak to it. The generated C source didn’t compile even though the Raspberry Pi version should work there.

So now what? The only way forward is to really dig deeply into the “beauty” of C to make it work? This is exactly what I was trying to avoid in the first place!

Perhaps you are the right person to ask. Do you think it is a design flaw to have the VM tightly coupled with C and all its compilation nightmares?

It doesn’t compile/run on the machine, possibly due to the lack of sound hardware, but excluding the sound package didn’t help. Where/what is the best way to get help with that? Perhaps you can give some wise advice on that.


Fortunately things have changed in the intervening years. Pharo, for example, includes a complete git integration:

https://github.com/pharo-vcs/iceberg


Thanks for sharing your experience with working with Smalltalk.

I wonder in a parallel universe, a smalltalk one, what VCS would have been like? Would the "last version that worked" also just have been an object?


You've been given a mistaken impression of the Smalltalk-80 development process.

Changes were automatically logged to an external text file and could be recovered.

By the late '80s there was multi-user fine-grained version control: ENVY/Developer

https://www.google.com/books/edition/Mastering_ENVY_Develope...


Check out the Monticello VCS for Smalltalk.

https://wiki.squeak.org/squeak/1287


What is the difference between "sending a Smalltalk message" and "calling a method" (i.e., jumping the central thread of execution to a particular label until a ret instruction is reached)? This isn't truly distributed, it's still centralized. If one method gets stuck in a loop, the entire system grinds to a halt. If each "object" or "service" gets its own process or thread however, this doesn't happen, and you get graceful system degradation which can be mitigated by spinning up redundant services.

You're right, most businesses don't need microservices, indeed "monolithic" (not distributed) architectures are totally fine and they have less overhead. And yeah probably a lot of businesses that do use microservices don't even make use of the benefits of a distributed system yet have all of the drawbacks. That's just people being people (stupid).


If we had gone down the path of Smalltalk rather than the UNIX-like path we have taken, I imagine we'd see systems with asynchronous message-passing semantics using processes/threads. Think something like E (or now Spritely Goblins, which takes inspiration from it) with at least one separate vat per "application". This would result in a system like Smalltalk, but with concurrency to prevent the situation you're describing.

I think we'd be in a much better place today had we taken that path, for a number of reasons.


In Smalltalk, calling a method is what the system does after a message is sent and the system looks up the receiver and its class hierarchy. This is a little different from C++ where the message dispatch is just vtable lookup by the calling code, because the system might have different behavior, such as changing the message to a #doesNotUnderstand: message.

However, in your question it is not much different because the default approach is synchronous. Yes, the calling process waits for the receiver to send a response. This has been one of the sticking points of remoting objects since the beginning, that you cannot treat the network as completely transparent. The options are the same as in other languages: promises, callbacks, forking a process, etc.

On the other hand, having a simple representation for network calls that the runtime can figure out for you automatically is pretty cool. You can ask Smalltalk to create proxies on the fly that act like the real thing but do the marshalling and network calls. The same was true for RMI in Java, which may have been based on Smalltalk and/or Modula-3. Comparing this to COM or CORBA, where the representation is defined externally, or to SOAP or some JSON api, where all bets are off, keeping everything in one system makes this stuff easy.
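
A rough sketch of that idea in Python rather than Smalltalk (the RemoteConnection and RemoteProxy names are invented for illustration): Python's __getattr__ plays a role loosely analogous to #doesNotUnderstand:, catching any message the proxy doesn't implement and forwarding it to a connection object that would do the marshalling.

    class RemoteConnection:
        # Hypothetical stand-in for whatever marshals a call and talks to
        # the network; invented for this sketch.
        def __init__(self, url):
            self.url = url

        def invoke(self, selector, args, kwargs):
            # A real implementation would marshal the call, send it, and
            # unmarshal the reply; here we just show what would be sent.
            print(f"would send {selector}{args} to {self.url}")


    class RemoteProxy:
        # Acts like the real object but forwards every message over the wire.
        def __init__(self, connection):
            self._connection = connection

        def __getattr__(self, selector):
            # Reached only for attributes the proxy doesn't define itself,
            # much as doesNotUnderstand: fires only for unhandled messages.
            def forward(*args, **kwargs):
                return self._connection.invoke(selector, args, kwargs)
            return forward


    account = RemoteProxy(RemoteConnection("example.invalid/account/42"))
    account.deposit(100)  # the caller can't tell this isn't a local object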


If we wrote things as objects sending messages we might be able to disregard their remoteness most of the time. The value could be in writing software which wouldn't need to change much as you changed the way it's getting used radically.

For a crappy example, a tool like rsync can talk to a copy of itself over any kind of stream and sync things that are remote; because it has this abstraction, you can rsync over ssh, but you're not limited to ssh. If some future stream protocol arises, rsync could probably be made to work over it without a rewrite.
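
A minimal sketch of that design in Python, with invented names (Transport, LoopbackTransport), not rsync's actual code: if the syncing core only ever sees an abstract stream, supporting a new protocol means writing a new transport class, not rewriting the tool.

    class Transport:
        # Abstract stream: the sync core only ever calls send() and recv().
        def send(self, data: bytes) -> None:
            raise NotImplementedError

        def recv(self) -> bytes:
            raise NotImplementedError


    class LoopbackTransport(Transport):
        # Trivial in-memory transport, standing in for ssh, TCP, or some
        # future stream protocol.
        def __init__(self):
            self.buffer = b""

        def send(self, data: bytes) -> None:
            self.buffer += data

        def recv(self) -> bytes:
            data, self.buffer = self.buffer, b""
            return data


    def sync(transport: Transport, payload: bytes) -> bytes:
        # The "rsync-like" core talks to its peer only through the abstraction.
        transport.send(payload)
        return transport.recv()


    print(sync(LoopbackTransport(), b"file listing goes here"))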


Smalltalk was, and in many ways remains, my favorite programming language - as a language. But the image-based development environment, and the performance of the system, were both awful.

As for message passing, and the potential for remote objects - well add CORBA into the mix of elegant, but ultimately unsustainable technologies alongside Smalltalk.

The world was not ready for "pure" OO in the 80s, and I'm not sure it's ready now. Some things really are better in the mind, than in practice.


Yes we are living in a worse version of the world that has come about because the more beautiful ideas didn't make it out in a wider way and with the right level of practicality-at-the-time to be scaled up.

We probably will get there eventually but by a very very circuitous route.


I think a better candidate for a comparison to the 1984 Mac would be the Xerox Star:

1. https://en.m.wikipedia.org/wiki/Xerox_Star#/media/File%3AXer...

2. https://en.m.wikipedia.org/wiki/Xerox_Star


You are missing the point here.

The point is the often-repeated story that "Apple stole the GUI from Xerox." It's not true, because when Apple saw the Xerox Alto, it was almost a prototype. It wasn't a commercial product, and it had no standard UI. Different tools had different UIs, a lot in the "apps" was keyboard-driven, and the famous GUI part, the Smalltalk system, did not have a lot of things we today take for granted.

The Alto didn't have them because they had not been invented yet. That is because Apple invented them.

Apple's Lisa OS, and later Mac OS, invented things like title bars with standardised control buttons, menu bars and pull-down menus, graphical iconic toolbars, dialog boxes with standard buttons in standard places, desktop icons, graphical file managers representing files and folders with icons that the user can position, and so on.

Now, all GUIs have these. Later Xerox GUIs have these, such as the Star. But the Alto didn't, because the ideas hadn't been invented.

Xerox came up with the GUI and overlapping windows and graphical menus and things like that, but Apple refined it into something more usable by non-experts, and also simplified it so that it was usable with a single mouse button.

What Apple saw in the demos was indeed weird, which is the point of TFA.


> For example, there’s no traditional file system... The whole desktop metaphor isn’t really there, because that isn’t really what it was.

The earliest Macs feel 'weird' in this way too, since the hierarchical filesystem wasn't introduced until the Mac Plus and multiple applications running simultaneously weren't officially supported until System 5 (with MultiFinder). It doesn't really start feeling like the modern 'desktop' until System 7, where they started treating the desktop like a folder.


> For example, there’s no traditional file system.

Is that correct?

The first screen-grab in the article shows a "System Workspace" window with some script snippets that seem to do "traditional file system" stuff.

"The Smalltalk-80 system includes classes FileDirectory, File, and FilePage to represent these structural parts of a file system."

etc etc

page 209 "Smalltalk-80: The Language and its Implementation"

https://rmod-files.lille.inria.fr/FreeBooks/BlueBook/Blueboo...


The Mac also had no preemptive task switching, no memory protection between applications and/or the OS. Early on there was also no software development on the Mac - one needed an Apple Lisa for that.


I think it's important to note that the Mac didn't have multitasking because it was so savagely cost-reduced, and the team had to fit a GUI OS into 128 kB of RAM.

The Lisa did have multitasking. It is one of the Lisa features that was cut from the Mac.

And then Apple spent a decade or so trying to put it back in again.


Though it was cooperative multitasking, something the Mac OS also got, years later.

A good overview of the Apple Lisa is this:

https://www.researchgate.net/publication/2996607_The_Archite...

"The Architecture Of The Lisa-Personal Computer"


> For example, there’s no traditional file system. Smalltalk environments were stored as images which contained the entire state of the system. That means all of the objects, code, data, whatever, were stored as Smalltalk objects.

> The creators of Smalltalk wanted to make a system where the person using it was also modifying and programming the system as they used it. That’s really cool as a programming environment, but not really how we think of normal people using computers today.

Seems like it was much more advanced than what we actually have today. Why do we need "filesystems" anyhow? Why do we "not think of normal people" programming their computers?


> For example, there’s no traditional file system.

Is that correct?

The first screen-grab in the article shows a "System Workspace" window with some script snippets that seem to do "traditional file system" stuff.

And "The Blue Book" says:

"Class FileStream is a subclass of ExternalStream. All accesses to external files are done using an instance of FileStream.

Classes ExternalStream and FileStream are provided in the Smalltalk-80 system as the framework in which a file system can be created. Additional protocol in class FileStream assumes that a file system is based on a framework consisting of a directory or dictionary of files, where a file is a sequence of file pages. The Smalltalk-80 system includes classes FileDirectory, File, and FilePage to represent these structural parts of a file system."

etc etc

page 209 "Smalltalk-80: The Language and its Implementation" "The Blue Book"

https://rmod-files.lille.inria.fr/FreeBooks/BlueBook/Blueboo...


Possibly the VM can operate without a filesystem but it might be necessary to have a way to interface with the external world.

https://en.wikipedia.org/wiki/Smalltalk#Image-based_persiste...

^^ Here I think the point is that you can stop the VM, restart it and you're back where you were before you left - there's no need to "save your document" because it's an object in the image and when you edit it, the persistence is handled automatically.

Just as today your IDE saves your source files for you every time you make a change but obviously not on every character you type.
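
As a very loose analogy in Python rather than Smalltalk (the World class and the "world.image" filename are invented, and a real Smalltalk image also holds compiled methods, processes, and UI state): image-based persistence is roughly "serialize the whole object graph on shutdown, reload it on startup", instead of saving individual documents to files.

    import os
    import pickle

    SNAPSHOT = "world.image"  # made-up filename

    class World:
        # Stand-in for "everything in the system": documents, tools, state.
        def __init__(self):
            self.documents = {}

    def resume():
        # If a snapshot exists, we wake up exactly where we left off.
        if os.path.exists(SNAPSHOT):
            with open(SNAPSHOT, "rb") as f:
                return pickle.load(f)
        return World()

    def snapshot(world):
        # Saving the "image" persists every object in one go; there is no
        # separate "save your document" step.
        with open(SNAPSHOT, "wb") as f:
            pickle.dump(world, f)

    world = resume()
    world.documents["notes"] = "edited in place, persisted with everything else"
    snapshot(world)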


Yes, Smalltalk-80 provided a way to snapshot and resume — the image.

And Smalltalk-80 provided a way to represent the file system — FileDirectory, File, and FilePage (and use text files to archive source code).

"Implementing a Smalltalk-80 File System and the Smalltalk-80 System as a Programming Tool"

page 287 "Smalltalk-80: Bits of History, Words of Advice"

https://rmod-files.lille.inria.fr/FreeBooks/BitsOfHistory/Bi...


You can play with another early Xerox GUI called Interlisp in your browser: https://interlisp.org/medley/using/running/online/

It was being developed in parallel with Smalltalk-80 (and earlier versions of Smalltalk): https://interlisp.org/medley/history/timeline/


I may be mis-remembering details, but

The first Mac development environment from Apple that you could run on a Mac was the text-based Macintosh Programmer's Workshop.

Superficially, it was a command-driven shell. But it was not hosted in a terminal emulator window.

The command shell was just a text document; you would type a command sequence, then tell the environment to execute it. I believe you could run it by typing Command-Return, but then I seem to recall you would execute the text by selecting it with the mouse and invoking "execute selected text" via a menu option.

There was no enforced distinction between input text or output text; commands usually printed something by way of feedback.

Overall, your command shell text document would soon get built up into something worth keeping, at which point you could save it to a file, or it was a complete mess you could discard. But visually, it looked more like Prof. Doug Engelbart's legendary demo, or like Plan 9 Acme, than Bash in a terminal session.

In much the same way that Emacs is a Lisp Machine environment. Is it a document? A command? M-x e.

Later on, NeXTStep used Smalltalk ideas; Objective-C is close to being Smalltalk in C.

The columnar browser for object/command hierarchy is still used in the AppleScript Script Editor app (which predates NeXT).

The NeXT file browser had an area (the "Shelf") above the hierarchy columns that you could use for keeping references to objects you wanted to use later. macOS retained the Shelf in the Finder until ?? Tiger 10.4? But over time the shelf degenerated into a semi-customizable toolbar. You can't drag files up there anymore.


This style of command interaction via buffer was kind of how a lot of early systems worked and it's kind of sad how this method disappeared and was replaced with the Unix shell (and I assume VMS) command/response flow.

Consider the C64 BASIC environment. In that case not a "buffer" like emacs, but really just the text screen memory, with the same effect. To repeat a command, move the cursor up with the arrow keys and hit return. You can execute any content off the screen. It's nifty.


>At first glance, this looks incredibly similar to something like the desktop of the Apple Lisa or early Mac OS. It’s easy to see why people might think that Apple sort of stole the graphical user interface from it’s rich neighbor Xerox. It’s not true, though.

People don't think Apple stole the GUI from Xerox because of the Smalltalk-80 GUI, but because of the Xerox Alto and Xerox Parc [1].

[1] Xerox Star


> For example, there’s no traditional file system.

Seems like Collin Donnell, the writer of "The Xerox Smalltalk-80 GUI Was Weird" is just wrong.

Yes, Smalltalk-80 provided a way to snapshot and resume — the image.

Yes, Smalltalk-80 provided a way to represent the file system — FileDirectory, File, and FilePage (change log, change set, sources).

These are not mutually exclusive. Smalltalk-80 provided both.

The first screen-grab in "The Xerox Smalltalk-80 GUI Was Weird" shows a "System Workspace" window with some script snippets that seem to do "traditional file system" stuff.


That's not how to review something from the past. Everything from those days "was weird". Because of mere chronology, your neural networks were trained to a completely different understanding of normalcy (as in not-weird), which your article doesn't address.


Great article helping to dispel the myth that Apple simply stole the GUI from Xerox. The Lisa and the Mac were truly revolutionary, introducing many GUI concepts that we take for granted today.

Another rather interesting Xerox PARC technology in the history of GUIs is the Xerox Star interface, which actually beat the Apple Lisa in releasing a GUI with a desktop metaphor. The Xerox Star was a commercial workstation and was thus one of the first GUIs that customers could actually purchase. However, it was very expensive (more expensive than the Lisa, which failed partly due to its $10K [in 1983 dollars] price point).

Even though the Xerox Star preceded the Lisa when it came to the desktop metaphor, the UI would be unfamiliar to Mac and Windows users, featuring rather complex mouse and keyboard gestures to use GUI features, compared to the Lisa’s conventions, which have largely endured to this day, to the point that it shouldn’t take too long for a Mac or Windows user to learn how to use a Lisa.


> the UI would be unfamiliar to Mac and Windows users

True. Star had something no other system's ever tried to do, namely dedicated MOVE / COPY / PROPERTIES / DELETE keys, which were supposed to be universal for all types of objects. Nowadays, cut / copy / paste have become the universal operations, and mouse button chording / two finger press have become the equivalent of the PROPS key. Also, Star had no menu bar across the top, another thing that would be pretty confounding for a modern user.

That's not to say dedicated keys were the right idea! It was just one of those things that sounded good at the time but didn't catch on.


Everyone involved in the efforts of developing a consistent and consumer-ready UI and stringent metaphors from the beginnings at PARC deserves due respect. E.g., the team at Xerox SDD (Systems Development Department) developing the Star, the team at PERQ (Three Rivers), and, of course, Apple (there were probably a few more). They probably stand somewhat undeservedly in the shadow of what were still early (and often inconsistent) experiments.

And, yes, the first commercial GUI systems were expensive, as they required fast processors and comparably lots of RAM, which was expensive. It was probably still too early for consumer-oriented systems, and the systems that existed had to address special niches, like knowledge workers and experts. Even the Mac just barely managed a certain level of affordability, and this only by (initially) including just a barely viable amount of RAM. (Compare 128K for the Mac vs 2MB for the Lisa, which also meant stripping advanced features like multitasking.)


Xerox did not "beat" Lisa. They showed what they had to Jobs and like all great artists, he stole . . .

https://www.newsweek.com/silicon-valley-apple-steve-jobs-xer...


Even if it were true that the Lisa had been a simple copy of Alto software (it wasn't), or if there hadn't been any prototypes for a windowed bitmap display at Apple before this visit (there were), it's technically somewhat difficult to steal what you own. Notably, the (in)famous PARC visit was coordinated in fulfilment of a condition for an exchange of stocks between Xerox (who had advanced Apple on the matter) and Apple, when Apple was still a private company. So Apple actually co-owned part of Xerox PARC. (In this regard, it may even be a bit ironic that the Apple delegation was shown less than what had been the usual program for similar tours, of which there had been many before this.) Nevertheless, Apple licensed quite a bit from Xerox.

Also, it may be important to note that the focus at Xerox PARC and Apple was quite different. If there were inspirations initially (of course there were), these developed quite differently, even radically so. You may also say Xerox had a fair chance with the Star, being on the market without competition for years before the Lisa was eventually released. But Xerox PARC was so much more than just Smalltalk, and Xerox still got a rich return on investment out of PARC. (So it wasn't a failure either.)


PS: If you really want a prime example of parasitic Apple, look no further than the Mac eventually finding its killer application in desktop publishing. In order to do so, Apple came up with the LaserWriter: They took Canon's LBP-CX laser printer engine (laser printing had been developed at Xerox PARC), bolted on their RIP utilizing PAL and RAM chips from other companies, and, as they were at it, also hooked up their LocalTalk network interface. The software, PostScript, of course, came from Adobe, the founder of which, John Warnock, had come from PARC. This all was covered up by a case by Frog Design, and Apple finally slapped their logo on it. As before, Apple fiendishly covered their traces by licensing what had not been developed in-house…

;-)

None of this would have been possible had Jobs not seen the Canon LBP-CX while negotiating for a supply of floppy drives, and had he not been aware of John Warnock and his PostScript development, and, of course, none of this would have been possible without Xerox PARC. Still, we call this legitimate development.


Because it absolutely IS legitimate development. Apple had the insight to take the printer engine, combine it with a networking interface, and bake in PostScript. If you have 1/3 or even 2/3 of those, you don't get the desktop publishing revolution.

Most of the time, innovation is combining a few things that exist, but putting them together in an inspired way and nailing the execution. That's the iPhone. It's synthesis.


This was really meant as a bit of comedy, but with a grain of seriousness. It was meant to demonstrate that, if you just try hard enough, you can twist the history of about any product in this direction.


You fell for the myth. Stanford explodes it:

https://web.stanford.edu/dept/SUL/sites/mac/parc.html

I was there. This is mostly correct. Jobs didn't "steal" it; he finally believed what his own people had been trying to tell him.


See also "On Xerox, Apple and Progress" by Bruce Horn

https://www.folklore.org/StoryView.py?&story=On_Xerox,_Apple...


This feels like hindsight. Even this is revolutionary compared to the text terminals that came before it. You could arrange stuff on screen, pull down menus etc? That was science fiction.


You hear a lot about the Xerox Star, but most of the articles (and there were many) are behind paywalls, or just plain lost. But they're all in one PDF on bitsavers. I posted them here.

https://news.ycombinator.com/item?id=36574631


weird is an odd way to spell brutalist ux.



