"...a language of description (programming language) that serves as an interface between the models in the human mind and those in computing hardware"
We do know what the models in our computing hardware look like, but do we actually know the models in the human mind?
Don't get me wrong, I love everything about this document. But I can't help but feel like that conceit is really dangerous. And seems common in this era of work.
I don't think he means the models of the human mind at the implementation level.
Rather he's referring to the mental conception of a specific problem or task, or even a natural language description of the problem, and how that is translated to a computer program that models that problem.
People are associative, not procedural, so objects do map somewhat better than functions. Many of the hardest problems in computer science are very short-term. Navigating a car across the country is not that hard; not hitting anything or breaking any laws in the next 10 seconds, and every 10 seconds along the way, is vastly more difficult.
It's easy to calculate a trajectory to toss a ball into a hoop, it's much harder to get a robotic arm to do so.
At a molecular level, we could argue that everything is procedural in humans as well. Of course things get complicated at biological neural networks where parallel processing becomes more important.
In the end, both humans and machines have to follow the laws of nature (physics) in order to process information. There is nothing magic about it, at a certain point of abstraction we both (all) work in the same information processing way.
Hmm. "Languages and systems that are of sound design will persist, to be supplanted only by better ones." I wonder if the author still thinks that, looking at the programming language landscape today. (Either he would have to say "it didn't work out that way", or to say that C++ and Java were better than Smalltalk.)
On the other hand, the big idea behind Smalltalk, message passing (i.e. programming with computers rather than just the programming of computers), keeps getting recapitulated in various forms such as Ruby, Erlang, Docker, etc. Even C#'s LINQ, in its most powerful form, is ultimately a message passing system (and version 4 of the language, which introduced `dynamic`, went even further).
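To make "message passing" concrete, here is a minimal sketch (standard Smalltalk, runnable in a Pharo playground): every "call" is a message sent to an object, and the send itself is a first-class thing you can assemble at runtime, which is the shape Ruby's `send` later copied.

    3 + 4.                                               "an ordinary send of #+ to the object 3"
    3 perform: #+ with: 4.                               "the same send, assembled dynamically (Ruby: 3.send(:+, 4))"
    #(1 2 3) collect: [:x | x * 10].                     "send #collect: with a block argument"
    #(1 2 3) perform: #collect: with: [:x | x * 10].     "the same send again, built at runtime"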
Consider that the first MVC paper concerned Smalltalk and was published sometime in the late seventies, but only discovered by the broader programmer community about 15 years ago. Certainly Microsoft developers only seemed to have become aware of it in the last ten years.
Consider that both the JVM (Hotspot) and Google's V8 owe their performance to Smalltalk hackers.
Consider that IDEs are becoming ever more tightly integrated with their client programming languages (Roslyn+Visual Studio, Atom/VSCode+JavaScript, VBA, Browser Dev Tools). Smalltalk was the canonical existence proof that this was a good idea (although Sketchpad did it first).
...
Dan Ingalls (along with Alan Kay and Adele Goldberg) deserves to feel pretty smug (although Kay was supposedly disappointed that they stopped evolving Smalltalk because it became too useful).
There were some insider events called Camp Smalltalk clustered around the year 2000, where people were saying to each other that we lost the battles, lost the war, but we won our cause. So many of the ideas that went into Smalltalk became mainstream. Java and then C#/.NET were the validation of that. People continued to say this at other conferences.
> Consider that IDEs are becoming ever more tightly integrated with their client programming languages
You should take a look at Smalltalk "IDEs" some time. There's hardly anything there! The meta level is so easy to access, along with the compiler. The debugger is mostly just an ordinary Smalltalk application. In VisualWorks, a good coder can start writing a debugger and be browsing stack frames from real exceptions, literally within minutes. A warty but fully workable Class Browser could be finished by one coder in a week. There is no "tight integration" with the language. It just seems that way because almost everything is just an ordinary first class Object.
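For what it's worth, here is a minimal sketch of why a debugger is a short project in Smalltalk (Pharo syntax; VisualWorks spells the exception and context protocol slightly differently): catch a real exception and walk its stack frames, which is most of what a debugger's stack pane does.

    | frames ctx |
    frames := OrderedCollection new.
    [1 / 0] on: ZeroDivide do: [:ex |
        ctx := ex signalerContext.            "the frame where the exception was raised"
        [ctx isNil] whileFalse: [
            frames add: ctx printString.      "e.g. 'SmallInteger>>/'"
            ctx := ctx sender]].              "walk up the call chain"
    frames do: [:each | Transcript show: each; cr]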
We had a nice discussion about this at Camp Smalltalk back in the day. In Smalltalk, everything is an Object, and we even use Objects to define Objects. In a proper relational server, everything is a table, and the tables are themselves defined using tables. SQL database servers are also very dynamic and everything is changeable at runtime, until you start restricting things. Even that last part is parallel to Smalltalk.
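A small illustration of the "Objects defined by Objects" point, runnable in any Pharo/Squeak workspace; the relational parallel would be querying information_schema.tables, a table that describes tables.

    3 class.                              "=> SmallInteger"
    3 class class.                        "=> SmallInteger class, a metaclass (an Object that defines a class)"
    3 class class class.                  "=> Metaclass"
    Metaclass class class.                "=> Metaclass again; the tower closes on itself"
    OrderedCollection isKindOf: Object.   "=> true: classes are ordinary objects you can send messages to"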
There are some things you can do as an advanced VisualWorks developer that look like queries of the codebase. You can search on one predicate like "sender of #message:" and take the intersection of that with another search, and have that pop up as a custom browser.
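A rough sketch of what such a query looks like as code (this uses Pharo's SystemNavigation; VisualWorks exposes the same idea through its own browser and query protocol, and the selectors queried here are just illustrative):

    | sendersOfAdd sendersOfRemove both |
    sendersOfAdd    := (SystemNavigation default allSendersOf: #add:) asSet.
    sendersOfRemove := (SystemNavigation default allSendersOf: #remove:) asSet.
    both := sendersOfAdd intersection: sendersOfRemove.       "methods that send both messages"
    both do: [:each | Transcript show: each printString; cr]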
> On the other hand, the big idea behind Smalltalk, message passing (i.e. programming with computers rather just the programming of computers), keeps getting recapitulated in various forms such as Ruby, Erlang, Docker,
"Recapitulated"... interesting choice of words.
Yes, this approach keeps being rediscovered, becomes a fad for a bit, and then gets forgotten again. And for good reason: it leads to spaghetti designs, a chaos of actors sending messages back and forth in an undebuggable mess.
Plus, it encourages the removal of types in programs.
Message passing has its place, some of the most impressive software projects ever built use it at their core.
That Smalltalk does this at the object level is quite possibly not the most useful thing (though it is interesting), but at other levels in an architecture it can be an extremely powerful tool.
You can make spaghetti designs with any architecture; that has nothing to do with message passing per se.
With respect to the semantics of the system, a Smalltalk object is supposed to be a full Turing machine (sometimes universal, but often special-purpose), which is the same thing as a process.
I mentioned Docker because of the similarity to the Smalltalk image.
Smalltalk didn't invent message passing. Even if it did, Ruby is an obvious descendant of Smalltalk. Messaging in Smalltalk and messaging in Erlang are almost completely different. There's nothing remotely asynchronous about messaging in Smalltalk. Docker and LINQ? You're insane.
MVC has to do with this how? MVC has enjoyed continuous popularity since its invention. And of course partly this is because MVC keeps getting post-facto redefined, much like messaging. The Controller in original Smalltalk MVC translated keycodes into messages to the model. This is pretty different from the role of the controller in Rails, where it acts as the be-all dispatcher. Also, look at the most direct descendants of Smalltalk's MVC concept: Morphic, which is an unmitigated disaster with no controller at all, and Cocoa, which eventually replaced almost all the controllers with three stock ones. Everybody doing anything like a GUI is using MVC, but almost nobody is doing it like Smalltalk originally did, including Smalltalk today.
JVM and V8 owing their performance to Smalltalk hackers just means that dynamic languages get optimized similarly to each other, and the people who did it the first time had the experience to do it again. There's almost nothing of Smalltalk in Javascript. Some of the same people being involved? So what, James Gosling and Guy Steele were huge influences in Java, do you see any goddamn Common Lisp in there? Gosling thought the central idea of Lisp was that it was garbage collected!
Atom isn't integrated with Javascript, it's an editor built on Javascript. If you think this is like Smalltalk, where the editor and the code are alive together in the same process image, you've totally missed it. And by the way, as wonderful as this is, it also means you can trash your image and lose code. This is an interesting idea but having lost work to it I am not at all convinced it is a good idea. VBA? Browser Dev Tools? I feel like I'm reading a recruiter-authored job post here, not a Hacker News comment.
Your interpretation of what has happened with Smalltalk is seriously confused. I think you've mixed up cause and effect, similarity with equivalence and allowed a cult of personality to override technical truths. This is the birth of a cult, not an honest appraisal of progress.
Smalltalk is a really interesting idea. Some of the insights from it have enjoyed wider success. But most of your heroes have moved on from it, and rightfully. If you want to keep the candle burning you should be honest with yourself about why that happened, rather than trying to rewrite history to fit your narrative.
> Even if it did, Ruby is an obvious descendant of Smalltalk.
So you agree with him. Ruby's most powerful feature is the combination of OO and blocks. That's Smalltalk through and through.
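For anyone who hasn't seen it side by side, the resemblance is striking once written down; a tiny sketch (the Ruby equivalents are in the comments):

    | squares evens total |
    squares := #(1 2 3 4) collect: [:x | x * x].       "Ruby: [1, 2, 3, 4].map { |x| x * x }"
    evens := squares select: [:x | x even].            "Ruby: .select { |x| x.even? }"
    total := evens inject: 0 into: [:a :b | a + b].    "Ruby: .inject(0) { |a, b| a + b }"
    total                                              "=> 20"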
> MVC has to do with this how? MVC has enjoyed continuous popularity since its invention. And of course partly this is because MVC keeps getting post-facto redefined, much like messaging. The Controller in original Smalltalk MVC translated keycodes into messages to the model. This is pretty different from the role of the controller in Rails, where it acts as the be-all dispatcher. Also, look at the most direct descendants of Smalltalk's MVC concept: Morphic, which is an unmitigated disaster with no controller at all, and Cocoa, which eventually replaced almost all the controllers with three stock ones. Everybody doing anything like a GUI is using MVC, but almost nobody is doing it like Smalltalk originally did, including Smalltalk today.
So, again, you agree with him? MVC has been refined, but it still has a basis in Smalltalk.
> JVM and V8 owing their performance to Smalltalk hackers just means that dynamic languages get optimized similarly to each other, and the people who did it the first time had the experience to do it again. There's almost nothing of Smalltalk in Javascript.
Of course he's referring to Strongtalk, which was the basis for the HotSpot JVM. Javascript is clearly inspired by Self, which is a descendant of Smalltalk.
You've got some good points but they'd be easier to take if you weren't being so rude about them.
> This is the birth of a cult, not an honest appraisal of progress.
How is it the birth of a cult when as you've said people are moving on?
I'm angry because the cult-building around Smalltalk is stupid and I fell for it and wasted years trying to see what was there.
- It's stupid because you can use Pharo today; Smalltalk isn't dead. Try it and find out if it's great. They're trying as hard as they can to keep it relevant and it's a constant struggle because they have to build from scratch everything Smalltalk didn't invent.
- It's stupid because tiny languages like Smalltalk beget huge libraries; I have owned ~20 books on Smalltalk and all but two of them were essentially useless tours of defunct libraries. (The other two are fantastic and useful no matter what language you're using, if it's OOP.)
- It's stupid because most of what was powerful about Smalltalk is either nowhere in sight or became mainstream years ago. I'm angry at the cult-building because it builds nostalgia instead of systems. It makes it easy to spurn what we have and make excuses for our own non-productivity—obviously I'd be so much more productive if only it were 1985 and I had Smalltalk! If Smalltalk GUI building is so great, how come nobody uses it today? Because it's maddeningly inferior to everything we have today. Take your "Javascript fatigue" and try out Morphic, tonight. Here's a bottle of Rogaine and vue-cli for when you lose your hair and sanity trying that little experiment.
The cult-building is maddening because it misses the truly amazing things. To appreciate the amazing things requires immersion, and immersion sucks! You know, if you make a web app with Seaside and it throws an exception, you can serialize the debug window, bring it down to your image and continue the debugging session locally. The serialization works on blocks, which is to say closures, so you get the whole state of the app locally. Has the OP tried this lately? Because it only works if you use the newest serialization method, if you have exactly the same code in both images, and once you make the change in your local image, how do you get it up to the server again? Oh, you either have to endure the awful homegrown Monticello VCS (inseparable from its insufferable GUI, because Smalltalk) or you have to do the file-out file-in trick with git, because Smalltalk hates external tools.
It's a cult because MVC, in the overarching sense that it would have to mean to be applicable to all the aforementioned systems, would have arisen organically without Smalltalk because it's just a sensible design. It's a cult because messaging in the broadest sense that must be meant by including completely different things like Erlang and LINQ predates Smalltalk. If you ascribe things to Smalltalk (or influential people who happened to use Smalltalk) that did not originate in Smalltalk, you are taking Smalltalk as more important than history or fact, and that is the essence of building a cult. Methods-as-messages originate in Smalltalk, sure. If you say Smalltalk invented that, that's true. But of the five things mentioned, only Ruby has that. Java, Javascript, LINQ and Erlang do not. They are not Smalltalk's inheritors; they don't owe their concepts to Smalltalk, even if the same people worked on it.
Smalltalk was great, is great. Pharo is great. What makes it a cult is the alpha-and-omega bullshit. Great ideas happened before and after Smalltalk. They chose general names for specific ideas. History is mutilated if we ascribe to Smalltalk the general ideas or give Smalltalk credit for unrelated things because some of the same folks were in the room.
> If Smalltalk GUI building is so great, how come nobody uses it today? Because it's maddeningly inferior to everything we have today.
I'd argue that the CompSci community DOESN'T choose the best thing that comes along, but merely the first one that fits the bill. It's a mixture of darwinian processes and market force.
If you take the example of Smalltalk vs UNIX vs DOS in the 80s then you get exactly that. People chose DOS+IBM PC because it was a shitload cheaper than a Xerox Star. They'd also choose UNIX (for mainframes) over Smalltalk because it was already there, everyone knew how to use it, and nobody wants to ever break backwards compatibility (so you're stuck).
(EDIT: also, compare to the WWW. Is it really so much superior to what we had before?)
Sure, SOME things have improved, but overall it's a complete shit show and I don't think that the things that have improved are THAT much better (while everything else is downright garbage).
Yes, but today they're both free. Why don't you download Pharo today and use Morphic? Then come back and tell me it's competitive with Cocoa, Qt, JavaFX. Morphic was a neat idea where it was born (in Self!) but there are reasons it hasn't taken off among hobbyists—it's bloated, it's slow, it is worse for most usecases. If it were better today, where's the community of people being hyper-productive and trumpeting it today?
That's not what I was arguing. I was arguing that a better idea does not (necessarily) get adopted, even if it is superior. Sure, Smalltalk might not compete today, but I'd say it was superior /then/ (and still didn't see wide adoption!). That's why I don't have any hope that a superior idea will get adopted in the /future/, whatever it may look like (and this is my main issue here).
I'm especially not thrilled about the underpinnings of current systems. You have to put in a huge amount of effort (millions of lines of code) for very little effect (web browsers, modern operating systems, etc). Just because something is the best that doesn't mean it is actually good. /I/ think, we could do a whole lot better. Just recently, I had a similar debate. Money quote: "nothing beats a nice C implementation".
What I'm always curious about: does this /Idea/ really not work, or did we simply not put enough effort into it? (Compare the amount of money, time, and manpower spent on modern systems with what went into Smalltalk or any less popular system; I'd say it isn't a fair comparison when it comes to the former leading to better results.)
I think we've reached agreement here, and I've calmed down a bit so I can restate things a little more dispassionately and maybe demonstrate that.
Fundamentally, I think if you're talking about Smalltalk, you're either talking about it in situ, or as it is today, or as an influencer. In the original context, it was thrilling, far ahead of its time in many ways, and lost out because of the reasons you mention: expensive machines, expensive licensing, performance.
If you're talking about it as it is today, it's a mixture of great and horrible and if someone is uniformly positive about it that tells me they have limited experience—either with Smalltalk itself or with competing modern systems.
As an influencer, there are things that can make a reasonable claim to inherit concepts from Smalltalk, especially Ruby. But the influence is not as wide as gets claimed by its proponents. Some of the ideas in Smalltalk are synergistic and don't work well taken piecemeal. Many of the claims about Smalltalk's influence above are unreasonable, either because they are a semantic ball-and-cup game or because they claim a kind of invisible human influence that bears no technical resemblance.
Smalltalk is worth using today, but it has actual faults that shouldn't be whitewashed, and I find the cult of Smalltalk distasteful because it prioritizes in-group mythbuilding over the truth. We should all download Pharo and use it for a few weeks. That direct, personal experimentation is the ultimate cure to technology triumphalism, whether we're talking about Smalltalk or anything else.
I certainly agree with you that we can do better than what we have today, and that Smalltalk has lessons for how to do that which would absolutely apply.
> If you're talking about it as it is today, it's a mixture of great and horrible
Most PCs run Wintel, and nowadays even more people use/care about ARM/Android/iOS. That's reality and I don't debate that. I'm only/mostly interested in research/CompSci as a discipline.
> But the influence is not as wide as gets claimed by its proponents.
To me, the problem always seemed to be less of how wide these ideas spread but how well they and their origins were understood. OOP made a lot more sense to me, when I found out what Alan Kay had in mind (Sketchpad, Arpanet, biology, etc.).
> Some of the ideas in Smalltalk are synergistic and don't work well taken piecemeal.
Exactly. I think of Smalltalk as more of a system and not just a language. In my opinion, C++, Java, et al. fail (or aren't as good as they could be) because they take just one idea from Smalltalk (e.g. classes; or the GUI, in the case of the Apple Macintosh or Niklaus Wirth's Lilith), when the good things about Smalltalk were due to the interplay of multiple ideas.
> Smalltalk is worth using today, but it has actual faults that shouldn't be whitewashed
I see Smalltalk (and its usefulness) more as a research system than a production system. I'm mostly interested in research and moving Computer "Science" as a discipline forward. I'm sick of people trying to sell me their 5% performance improvement as "new"/"research". Problem is, things haven't fundamentally changed since 1965 (the von Neumann bottleneck is still present, both physically and mentally). Smalltalk could have been the greatest piece of garbage in computing history and I'd still respect it (somewhat), because at least it tried to do things fundamentally differently.
> I find the cult of Smalltalk distasteful because it prioritizes in-group mythbuilding over the truth.
To be honest, you've got that problem everywhere. Heck, look at Apple, Linux, BSD, Unix/Plan9 …
Btw: I'm currently reading "Smalltalk-80: The Language and Its Implementation". What's your opinion on that book? And thanks for the other two book recommendations.
> Btw: I'm currently reading "Smalltalk-80: The Language and Its Implementation". What's your opinion on that book?
It's required reading and it will tell you how to build Smalltalk-the-language (inefficiently, but productively), but it won't really teach you how to use Smalltalk today. For your purposes, it's definitely the right book.
Where did Cocoa come from? It came from NeXT, which was Steve's second attempt at implementing the ideas he got from his PARC visit where he saw Smalltalk, the first obviously being Lisa/Mac.
And of course Cocoa is implemented in Objective-C, which is Smalltalk added to C.
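A tiny side-by-side to make that lineage concrete (the Objective-C spellings are in the comments; the dictionary and keys are just an example):

    | dict |
    dict := Dictionary new.                "Objective-C: [NSMutableDictionary dictionary]"
    dict at: #name put: 'Ingalls'.         "Objective-C: [dict setObject:v forKey:k] -- same keyword-message shape grafted onto C"
    dict at: #name                         "Objective-C: [dict objectForKey:k]"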
And I would claim that Cocoa is hands down superior to Qt or JavaFX (does anyone actually use JavaFX?), precisely for the reasons made possible by the dynamic nature of Objective-C. Having experienced 4GLs and other UI-building systems, I can say Cocoa managed, and still manages, to achieve the holy grail of combining 4GL-/screen-painter-like productivity with complete flexibility to implement anything you want, on a smooth curve.
We take that achievement for granted nowadays, but it is no mean feat.
But yes, Pharo/Squeak etc. are somewhat fossilized, but still amazingly productive compared to most anything else, and that's why there are communities of people committed to these fossils, being amazingly productive on them and trumpeting it.
No, it's the right amount of emotional. When people distort the truth, it makes me angry. This is misinformation well beyond the protection of Hanlon's razor. And I resent that you are unable to attack my facts, so you have decided to try to go after my tone instead. If you are in possession of better facts, you should trot them out. But if you're somehow harmed by my honesty or passion then that's your problem.
> But if you're somehow harmed by my honesty or passion then that's your problem.
You're harmed by your "honesty" and "passion". People don't listen to what you have to say if they are turned off by your tone.
Look, I've been there, too. I've gotten angry about Lisp and Haskell zealots, for much the same reasons you got angry about Smalltalk fans. And my anger was... unproductive at best.
I read your post a couple of times and I'm not even sure what you're angry about. You yourself said that Smalltalk is great. Maybe you had a few friends who were really obsessed with Smalltalk and you are upset at them?
I have always considered the standard library to be part of the language.
How big is Python the language? How big is Erlang/Elixir?
The power you get from each is that very powerful batteries are included, and most third-party libraries fall into the same patterns learned from the standard library.
They were better than Smalltalk, not in terms of the language, but in terms of availability to developers. C++ and Java were free; Smalltalk, during the time it could have risen, had no free implementations and cost a fortune to use. It's still a better language than either of those, but it lost out on the network effects Java gained by being free.
> They were better than Smalltalk, not in terms of the language, but in terms of availability to developers... Smalltalk, during the time it could have risen, had no free implementations and cost a fortune to use.
There were literally per-seat licenses at $5000 and $10000 at one point. There were free implementations around the time when Java was new, like GNU Smalltalk, but it would have been better if those hadn't existed, because they didn't convey the benefits at all.
LightTable had YouTube to enable people seeing the benefit of the environment. Smalltalk could have really used that at the right time. That still wouldn't have helped, because the Smalltalk community didn't understand the network-effect benefits to the same degree almost any programmer understands them today.
Smalltalk could have been Java. Sun actually approached the owners of the Smalltalk VM to ask if they could use it for their early set-top box project. They were rebuffed and went on to write what was to become the JVM. (Smalltalk becoming Java might well have resulted in huge early Internet security disasters blamed on it, because of the difficulty in securing the system, and the preference developers had for code signing as the primary security mechanism.)
Self was developed at Sun as a prototypal alternative to Smalltalk, which involved a lot of work on the compiler to make it performant. I'm pretty sure that effort fed into the JVM.
Self started life as Smalltalk-86 at Xerox PARC in a collaboration with Stanford, and then got implemented at Stanford before moving to Sun Labs. When Self was killed at Sun (the first time, in 1994) part of the group joined forces with the "Smalltalk with Types" group to implement StrongTalk.
Meanwhile the part of the Self group still at Sun created Pep, which was a Java on top of Self that was way faster than any other Java. But it was a quick hack with a few serious limitations.
Animorphic, the StrongTalk company, did a more polished implementation of Java on top of their VM technology and got bought by Sun to turn that into the HotSpot VM working together with the part of the Self group that had stayed at Sun all along.
Around 2000 Sun allowed the Self project to be restarted until they killed it again in 2006, at which time they open sourced StrongTalk (which had been hidden since Animorphic had been bought).
I think quite a few of the ideas behind Smalltalk were too radical for its time. Looking at the intellectual legacy of Smalltalk there is no denying it has been profoundly influential. It's not so much about C++ / Java vs Smalltalk but rather OOP vs Procedural Programming. There's a question about whether Java is OOP done right, but I think it's fair to say that it's done well enough to reap the majority of the benefits of OOP.
Lisp had image-based distributions and dynamic typing before Smalltalk. Those and extreme oop-ness (Generic Functions, ...) plus MOP are still in Lisp (-> CLOS).
Nobody would consider starting a large project with Python or Ruby these days.
Javascript will be the last mainstream dynamically typed language, and while it's going to be around for a while, even it is slowly being replaced by statically typed versions of itself (Typescript, Dart).
The trend is crystal clear for anyone who's been paying attention to the field for the past decade. There's simply no longer any good reason to use a dynamically typed language.
I believe there is a mistake in the user interface section. They use the word "esthetics" when they mean "aesthetics", which is
> a set of principles underlying and guiding the work of a particular artist or artistic movement.
The root of the word is possibly Germanic (in German: ästhetisch) or the Greek aisthetikos.
It's not clear from any citable source that there is a difference; however, deductive reasoning suggests that esthetics is concerned with cosmetology and estheticians.
The argument that it's just the Latin diphthong AE simplified to E is not valid in this case, as inspection of the Greek shows.
Edit: update. One given definition of esthetics is "relating to, involving, or concerned with pure emotion and sensation as opposed to pure intellectuality." [dictionary.com]