The Early History Of Smalltalk (1993) (worrydream.com)
121 points by ohjeez on Sept 4, 2018 | 68 comments



I really wonder what the software world would look like today if the Smalltalk environment had predominated. It seems like various ideas from Kay's group get reinvented from time to time, but they're not really mainstream. Even the RAD tools from the 90s somewhat fell out of favor, partially due to the rise of the web, which meant reinventing everything in the browser.

From hearing several talks by Alan Kay since then, it doesn't sound like software quite turned out the way he envisioned it, but he still thinks there's a better way to do things that's related to what he was thinking in the 70s with a living environment of little computers sending messages to one another. He's talked of the idea of modeling biological cells since then.


Kay found the web distasteful.

“his conception of the World Wide Web was infinitely tinier and weaker and terrible. His thing was simple enough with other unsophisticated people to wind up becoming a de facto standard, which we’re still suffering from. You know, [HTML is] terrible and most people can’t see it.”

https://www.fastcompany.com/40435064/what-alan-kay-thinks-ab...


I assume you're posting this to show that Kay "doesn't get it"?

The more I learn about cutting-edge technologies of the 60s, 70s and 80s, the more I'm convinced that he's entirely correct about the web and mobile. Most developers today are obsessed with tooling and features. Almost no one seems to care whether those tools and features have some kind of fundamentally valid idea behind them.

Just look at the proliferation of HTTP security headers for a clear-cut example of how this works out in the long run.


Considering a whole Smalltalk VM is smaller than a lot of hero images on websites, I get the feeling that Kay is correct. I would imagine a containerized / sandboxed environment would have developed. I cannot see how it could be worse than today.


> I cannot see how it could be worse than today.

I fear it would turn out equivalent. The starting stack might have been better, but the promise of money to be made and competitive pressures would be the same. We would "worse-is-better" ourselves to roughly the same spot we're in today. Bloat, tracking and clickbait.

Whenever I implore people to cut down on software bloat, I know what I'm really asking is, "fight against the market, slow the decay down just a little bit".


> I fear it would turn out equivalent.

Only some 30 or so years earlier, and who knows what we'd have come up with, given such an early head start?

I work in the space of dynamic language runtimes, and it hurts me to see how ideas pioneered by Smalltalk and subsequently the Self language[1] runtimes haven't yet been adopted by some of the more popular dynamic languages today.

[1]: http://bibliography.selflanguage.org/_static/implementation....


The worst part is the insistence of Ruby and Python on keeping JITs out of their canonical implementations.

Sure, there is PyPy and the "C JIT" in the upcoming Ruby, but they don't seem to be widely deployed.

Maybe with Graal's and J9's focus on increasing support for dynamic languages, that will eventually change.


> I fear it would turn out equivalent.

One argument against this is that the web today is not set up for easy authoring by end users (in fact, it's gotten more difficult to "make websites" as the web has progressed). But there was a hypermedia system around two decades ago that had authoring in mind from the start — HyperCard — that could have been a good prototype for a "real web".

If you have authoring as a primary consideration, then perhaps one-directional, consumer-oriented cultures of technology would not be so prominent. But who knows.


>I fear it would turn out equivalent.

I doubt it. Smalltalk had a coherent design that anticipated many needs that the web (and computers in general) faces today. What we now get as ad-hoc, piecemeal additions would have been an organic part of the overall infrastructure.


I recently started experimenting with Squeak. It's a bit dated, but holy crap, that thing is amazing. So many good ideas packed in so few megabytes. The startup time alone blows my mind. It's pretty much a virtual machine with a sort-of-operating-system that runs an IDE for all of its own source code, and it loads faster than 95% of all vanilla desktop apps I use today.


I could see having the same setup as a browser except every url downloads a vm. I do wish Squeak would update its look and feel though.


They are still using bitmapped graphics under the hood, which explains some of the limitations. The community has shrunk, which means there's no one to take them into the world of vector graphics. However, the Pharo people are implementing a new vector engine called Bloc.


Pharo is pretty much the future. I wonder if WebAssembly and its upcoming ecosystem will be useful to carry some of the load for a future Smalltalk?

[edit: since I'm getting the "posting too fast" message, here is another language mentioned in the article: KRL by Danny Bobrow and Terry Winograd https://pdfs.semanticscholar.org/033e/cd544c4e71b14a7d3ee061... ]


I think about that all the time. Imagine a Smalltalk VM written in WASM. It could load with every site and every site could just be live Smalltalk.


Although it's not WASM, perhaps you will prefer http://squeak.js.org/ , a Squeak VM written in JavaScript.


There's a guy who's written a nice-looking Smalltalk in Ruby that works perfectly well in the browser with Rails, I think.


>They are still using bitmapped graphics under the hood, which explains some of the limitations. The community has shrunk, which means there's no one to take them into the world of vector graphics.

Most (all?) commercial OSes don't use vectors for their GUIs either.


Android and Windows (UWP) do use quite a lot of vector based graphics.


Yeah, Apple has some APIs for rendering widgets from vectors too, and so does GNOME (for icons), but most widgets, UI elements and apps are still bitmap-based.


I've noticed this myself, and it makes me ashamed of modern computing honestly. It is as though the community has been taken over by some kind of religion that worships technology for its own sake. Technology is supposed to make our lives better, but there seems to be an increasing trend of sacrificing our own interests in service to increasing technological complexity, and these people embrace it.


He has a good talk where he shows that the original Smalltalk implementation had an IDE, graphics, a word processor, a VM, networking features, etc. in something like 10k lines of code. He then shows how big Microsoft Office is and refers to a visible bug that MS has been trying to fix for 20 years, but they can't find it. The take-home idea is that we've made things far more complex than we've needed to.


Computer programming was better when there was a constraint on it. Now that chips really can't get much smaller and Moore's Law is effectively dead, we are entering a new era of programming. Efficiency is going to be the low-hanging fruit for getting better programming solutions.

When I look at what was done with the C64 and Amiga, I am shocked at what that doomed toy company was able to do. The Amiga was easily 6 years ahead of MS and Apple at the time.

Amiga's solution to the constraint was to give us custom chips for graphics and sound that we could program to easily. The C64 in assembly was so simple you really could pretty much hold it all in your head. If you look at id and Doom with the 640K limitation, it is pretty amazing stuff.

Memory became "free" and people just ran with whatever they wanted. Parallel computing is proving not to be the easiest thing to implement, but efficient use of memory and cycles will become king for big projects.


At least in Europe, where Atari and Amiga reigned for a while, PC compatibles helped kill the Amiga.

If IBM had managed to keep their control over the PCs, the market would have looked much different.


Much more pleasant to work with.

Having personally experienced that era, it always saddens me to see an xterm maximized to full screen size, with text-mode vi or Emacs, replicating the same experience I had with actual VT100 terminals almost 30 years ago.

That is why I live in ecosystems where the communities love their IDEs.


Or you can use Emacs in its graphical mode:

https://www.gnu.org/software/emacs/tour/


Given my comment I guess you can imagine I know how to use Emacs in graphical mode.

Also, Emacs in graphical mode still isn't as feature-rich as XEmacs used to be, which was much closer to the inline REPL experience of Lisp Machines.

And in any case it falls quite short of the overall experience of using commercial Lisps, Smalltalk, Delphi and many other RAD environments.


I like the idea of software, libraries and application code all being accessible to each other and having the ability to call each other. Too much functionality is siloed behind application A or B, and the interface exposed by these applications is often woefully inadequate for anything interesting. You see this a lot with smaller projects, where they need to focus on getting the product feature-complete, as well as with much larger projects, where lock-in becomes a supposed requirement. The space in the middle seems to be ever shrinking.

I like the idea of everything being written in the same language and the code being accessible to anyone who is interested. That you can then open any running code in a debugger to see how it works, or in the integrated IDE to alter its behavior, is very appealing. For sure the ability to edit anything at any time can lead to instability, but there are ways to ameliorate that without being as dramatically inaccessible as today's environment.

I do feel that even in a modern Linux desktop environment (the only place where we might expect to see any Smalltalk-like accessibility and functionality), so much work needs to be done slogging through different languages to figure out how things work. Your email application might be developed in Vala, a lot of library code is in C or C++. Many tools turn out to be written in Python or Ruby or even in a shell scripting language. While it makes things more interesting for the career software developer it's clearly making things harder for the end-user who needs to alter a subtle behavior in an application they use every day; so much so that the idea of actually making that change seems foreign to, well, everyone.

Smalltalk has its obvious problems and warts and I can see why we aren't using it today. But in many ways, the Smalltalk we are looking at today is the very same Smalltalk detailed in the article, which is very similar to the Smalltalk of 1984. Work has been done and I do not wish to cast shade on projects like Pharo, which are doing some heavy lifting with their work on the language, compiler and runtime. But it seems to me that Smalltalk as a whole is moving away from the "anyone can use a computer" target and more towards the developers, where it may well end up another provider of a single application that is siloed away from the rest of the customer's environment rather than being that environment itself.


> I like the idea of software, libraries and application code all being accessible to each other and have the ability to call each other. Too much functionality is siloed behind application A or B, and the interface exposed by these applications is often woefully inadequate for anything interesting.

Maybe I'm skeptical, but I've gotten to the point where I believe that this specific problem is not going to be solved anytime soon.

Here's how I see it. I believe that unification, specifically within software, is universally based on the idea that we can clean up the world (where "the world" refers to a given software environment) if only we can reimagine it clearly enough and, to borrow a few technical concepts, compute a delta with enough detail and resolution that it'll carry us from where we are to where we want to be. (With the implicit [mis]understanding that all we need to do is make said delta exist, and then it'll do all the work of carrying us.)

I think the above is what we're trying to do - and that we're doing it because we're staring at/barking up the wrong set of trees (the associated forests of which got lost long ago).

Ultimately, the world is a disorganized mess. Any security researcher could entertain you for days with highly-detailed hair-raising examples and proofs of this. I've very recently (finally) realized this is because, at a very high level, the world operates somewhat similarly to the bazaar model Linux uses. Evolution is highly organic, often driven by social effects. There's no sort of cohesion (or "elegance", for want of a better word) that carries from the standpoint of some superscalar bigger picture all the way down to the smallest detail, and if anything is cohesive it only acts within the reach of some small-scale focus, is often further limited by the scope and motivation of that focus, and always ends up conflicting with _something_. A good example of this is software security - there's always going to be a way to break things, no matter the kind of protections that are applied.

After quite a few years' confused and anxious pondering, I've reached the conclusion that the planning parts of our brains that we use to architect scope-sensitive solutions to problems don't have limit switches, and will in effect run forever if we let them. This is a problem because they will also never report any sort of ground-truth that could be used as an objective sense of progress - which makes a lot of sense if the problem(s) being studied themselves are forever in motion and never stay put. Furthermore, these parts of our brains seem to have no cognitive load cost metric; "dreaming up castles in the sky" is something we have the potential to enthusiastically engage in all day; the sense of motion we seem to get from this seems to satisfy our "I'm doing something productive!" threshold. When all of these things come together, it's obvious that, sadly, some fall into setting all of their focus and attention knobs to prioritize this "planning and scheming" style of thought, at the cost of incorporating many other styles of architectural study.

For me, one new style I try to prioritize is (as might be obvious) considering the bigger picture, and particularly the lateral subjective connections and the network-effect/unintuitive/unintended ways things can influence each other. I read a comment on here some time ago that left an impression and got me started on this track. It was talking about how it's a highly effective use of one's time to become the best X within a given Y vertical - for example, being the best data scientist/programmer within a group of etymologists, or the best woodworker amongst a pile of photography chefs, or the best psychologist within the software development community. I think this is a remarkable approach to use for a couple of reasons: firstly, it mitigates the "front and center" style of focus that seems to be so incredibly devastating to mental organization, because I'm looking at X from the standpoint of Y, so I'm not organizing/weighing/judging X by its own merits, memes, culture and hangups, so I'll likely have a highly effective easygoing outlook on it, and much more than if I were to go "righteo then let's go be a psychologist" hefts Atlas's burden. Secondly, because I'm coming at this from a different set of mental models and perspectives, I'm basically getting some really cool procedural cross-pollination for free. This is the kind of mental hacking I think is cool.

Let me fold back to what you were saying before and stitch together a narrative out of some of the things you've said - I hope I'm not taking your words out of context here:

> I like the idea of software, libraries and application code all being accessible to each other and have the ability to call each other. Too much functionality is siloed behind application A or B, and the interface exposed by these applications is often woefully inadequate for anything interesting. ...

> I like the idea of everything being written in the same language and the code being accessible to anyone who is interested. That you can then open any running code in a debugger to see how it works, or in the integrated IDE to alter its behavior, is very appealing.

> ... Your email application might be developed in Vala, a lot of library code is in C or C++. Many tools turn out to be written in Python or Ruby or even in a shell scripting language. While it makes things more interesting for the career software developer it's clearly making things harder for the end-user who needs to alter a subtle behavior in an application they use every day; so much so that the idea of actually making that change seems foreign to, well, everyone.

Right. I think I'll address the language thing first.

So, first of all, we have: why so many languages? Why not just one?

Of course, this is a loaded question with some unspoken context. The real answer I want is, how can we eliminate all the languages out there?

Well, each language was created to solve a given problem. So now we need a rough mental sketch that covers all of that history - not all of it of course; perhaps enough to fit on an allegorical napkin.

Vala was written to solve the problem of easily creating applications on Linux. The unspoken sentiment was that it wanted to squash C's brittleness and make software development more straightforward. So it has a focus of accessibility.

C/C++ were of course grown very slowly (and you might even say haphazardly), with zero 20/20-equivalent foresight. As a relevant example http://pzemtsov.github.io/2016/11/06/bug-story-alignment-on-... popped up on here the other day; it covers a good example of how the C standard is unintuitive and hard to follow at the best of times, and also gives a sense of how a C compiler really is a thousand specifications tacked together that (amazingly) don't result in programs that spray singularities and black holes upon process creation. That organization grew out of slow iteration and evolution, not planning; as such we don't have the mental modeling ability to follow and reason about it very easily. (To reiterate, that doesn't mean there's _no_ high-level organization, of course.)

Python was written with the vision of a systems-capable, batteries-included variant of the ABC teaching programming language.

Ruby was written as a more-readable alternative to Perl, with care and attention paid to high-level language design and organization.

Python and Ruby both overwhelmingly occupy the tail end of benchmark graphs. :(

Python has scaled out so much that its creator left the project only a few weeks ago, IIUC citing bus factor and the need to preserve sanity and manage/mitigate burnout.

Ruby's focus on organization produced a language with a perceived high-level elegance; the "web designer" crowd of the early turn of the century bolted it onto Apache with initially heavily-hyped results. For some reason the current trending developer aesthetic seems to be relabeling medium-level languages as low-level languages, so we have people doing "suuper-looow level stuff" in Go, which lends itself heavily toward writing application servers.

Vala was written to make programming more accessible for everyone, but the status quo that has developed around it has kept it tied to the GNOME project, even though it can generate code for other platforms.

All of this is very interesting.

Python was written to "be useful"; it became so useful its creator had to run away from the ensuing chaos produced by all the people successfully using it.

Ruby was written with a particular sense of design aesthetic; it remained "in vogue" until appreciation for the aesthetic it used wore off, and now Ruby occupies this incredibly confusing space that seems to return true for all of "apathetic", "why are you even using this" and "it works really well". Real-world example (that might fall flat): you may be aware that GitHub is pretty much all Ruby on Rails. (...While GitLab is Go. Haha.)

Bash wasn't written. It grew. It evolved. It metamorphosed. It... shapeshifted? It grew some more, and partially fell over. Hmm, I'm reminded of one of the login fortune-quotes I saw earlier today!

> It took 300 years to build and by the time it was 10% built, everyone knew it would be a total disaster. But by then the investment was so big they felt compelled to go on. Since its completion, it has cost a fortune to maintain and is still in danger of collapsing. There are at present no plans to replace it, since it was never really needed in the first place. I expect every installation has its own pet software which is analogous to the above.

> -- K.E. Iverson, on the Leaning Tower of Pisa

Except with bash there was no actual intention to build anything in the first place. So, lacking a purpose, direction, or design spec, it... gained ad-hoc status as the first-class program we use to interact with our machines more than anything else. (What's X? :P)

Hahaha. Yeah, that totally makes sense, and explains UNIX a lot. Really shows the bazaar mentality is pretty old, too.

Okay I'm making bash - and shell{s, syntax} in general - look bad. Concatenative languages are really interesting IMO. I can do things in bash that take much more LOC in an actual programming language. (Case in point: I needed to port a shell script that was behaving erratically (due to misuse of subprocess control hacks :D) and rewrite it in an "actual" programming language, and it promptly grew from 114 to 780 lines.) Also - I've never used a functional language in my life -- yet I have.

Without purpose or direction, bash seems to have converged/settled upon some reasonably-global balance that permits "a reasonably fair bit" in less keystrokes than other languages require - because it is, of course, not a language, but a shell environment. (Now I think about it, I wonder if bash hasn't unintentionally picked up on the "be the best X in the given {{A..W},YZ} vertical" I was talking about earlier.)

I continue to look (honestly, genuinely) for something that allows me to interactively control a computer in the same minimal number of keystrokes that bash et al do. I haven't found anything similar up to now. (K is interesting but I haven't Van Gogh-ified my brain enough to be able to squint at it correctly... yet :P)

The reason for the outline above is to illustrate the extreme difficulty of writing One True Language To Rule Them All. We simply aren't capable of reaching that far - of correctly color-coordinating our brick fences from 10,000 ft up, if you will. The languages mentioned all had different focuses, differing origin stories, different motivations, different goals... and different userbases, different cultures, different community cliques (which surely influenced the language in good and bad ways), different sets of tradeoffs.

And this is why I say, we have to let go. Treating the world as suitable input to feed to a mental model of a REPL will leave us stuck at either the R or E stages (depending on how you look at it) for ever and ever and ever, because the input is constantly moving and changing.

I'm not yet sure how this works, to be honest. We take the slightly shorter view, we make things, and they're cool, and they have a real impact within the scope they target... and from the perspective of the bigger picture it'll all continue to be relative. This is admittedly the bit where I don't know how to emerge with a sense of grounded orientation AND a positive outlook; I've reached the point I describe above, but I don't have any sort of rationalization to justify one particular subset and set of compromises over another. I guess I can say that I've emphasized unbiasedness and not wanting to short-sightedly focus on "just do what makes you happy" at the cost of indecisiveness :S

One other thing.

> For sure the ability to edit anything at anytime can lead to instability but there are ways to ameliorate that without being as dramatically inaccessible as today's environment.

This is something I got stuck on for about... 6-7 years. The idea that it might be possible to create an environment where you really could edit things in such a way that the result was generally useful. Not perfect, but generally useful.

Heheh. Remember "mashups"? The idea that you could combine bits of data from around the Web onto one webpage? I think that idea kind of happened, but the nicely cohesive organized notion that it'd let us control the data flying around - and even lay it out - kind of never eventuated, and with perhaps obvious reason: such ideas require full-stack integration, down to the smallest detail, in order to work.

Knowing this, I set my sights on minimal levels of accessibility - enough to let people make it look like they put their screens through a blender and customize things fairly extensively, without needing to write code to do special-case rendering, ie where applications would be "editable" enough to be able to customize them appreciably.

I eventually reached the conclusion that the amount of effort required to architect such a system... and then all the applications for it... would get us to somewhere only slightly north of where we are now. Better, most definitely, and maybe even better enough to justify the cost, but in terms of this idea having a magic breaking-even point? No.

I thought, okay, Facebook is pretty widely hated, but a lot of people use that every day, and they don't seem to hate it enough to not use it, so let's say we apply the formularized, cookie-cutter model they use to UX, and let's say it catches on. Information would need to be expressed in specific ways so that the UX engine could pick it up and render it correctly.

And I decided that such a system would only possibly wind up "universally useful", which is the goal I'm aiming for. I can't really assure myself that such an idea would definitely work out.

NB. Bug #2691 in Arc: posting new comments can sometimes result in "that comment is too long", but I've discovered there are no size restrictions for edits. :>


I'm no programmer, but I remember back when Ruby 1.8.7 got really popular because of Rails even though Ruby 1.8.7 was not that secure or stable compared to other environments. (Yes, I know Ruby is better now.) But, I was wondering why someone didn't take Smalltalk (or Squeak or Pharo) and find a way to make it more popular? We have Node, Electron, Java, etc. So why not Smalltalk? Wouldn't it be easier to develop, run faster, and be more cross-platform than those popular environments? Is there something in Squeak or Pharo holding it back?

Maybe the next Tim Berners-Lee could use it to develop new protocols + a browser and create a safer, more secure WWW alternative?


The short answer is it's different and unfamiliar to most newcomers:

1. The image-based paradigm [1] holds it back. Using Smalltalk, there's quite a change in development style, the existing development tools (text editors, version control, etc.) mostly don't work [2], and deployment is at least slightly different.

2. The UI (mostly Morphic) looks unfamiliar and very different.

3. The different (but incredibly simple) syntax looks strange to newcomers (a small taste below).
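
On point 3, a minimal sketch (not tied to any particular dialect): unary, binary and keyword messages read left to right with no operator precedence, and blocks go in square brackets:

  3 + 4 * 2.                                            "answers 14: strictly left to right"
  #(1 2 3 4) inject: 0 into: [:sum :each | sum + each]. "keyword message; answers 10"
  Transcript show: 'Hello, world'; cr.                  "a cascade: two messages, one receiver"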

There are paid tools that fix some of these, but they never quite caught on with lots of momentum, and they were used mostly in bigger enterprise shops. The biggest might have been IBM VisualAge (for Smalltalk), which later morphed into IBM VisualAge for Java and then Eclipse.

[1] The image-based paradigm is really, really great. If you haven't experienced it before, give it a go. It's worth it even if you are never going to use Smalltalk.

[2] There's been an effort to get git to work. I haven't been tracking it, but this happened only in recent years.


Apparently (not an expert) [1] Smalltalk was the leader in basically the Java niche until Java happened, whereupon the proprietary and rather expensive Smalltalks were wiped out, since not only was Java free as in beer and reassuring to C++ programmers, it also came in riding a truly preposterous hypewave. (For those who weren't there and haven't heard: it was roughly cover-of-Time-magazine scale. It was visible from space.) Then HotSpot—retrofitted from the Self VM, ironically—gave Java a performance advantage which (iirc) the existing ST VMs couldn't match. And while Smalltalk was almost the paradigm dynamic language, the proprietary STs had no impact in the scripting-language niche thanks to their licensing costs and inward-looking, image-based nature. (GNU Smalltalk was another story: I don't know what happened or failed to happen there.)

[1] There was in fact an article written by someone who was an expert which did the rounds of /r/programming back in the day. But good luck finding it now of course.


You missed the part where IBM was a big Smalltalk vendor and they rebooted their product offering onto Java, including Eclipse, which has its roots in Visual Age for Smalltalk.

Eclipse to this day still has a Smalltalk-like code browser for Java code.

Also, in the early days, although the JDK was free as in beer, all good IDEs were commercial; Eclipse and NetBeans only came a few years later.


There were also Inferno[1] and Limbo[2] from Bell Labs, touted as the future, with some very good ideas.

[1] https://en.wikipedia.org/wiki/Inferno_(operating_system) [2] https://en.wikipedia.org/wiki/Limbo_(programming_language)


I think it is called "Why Smalltalk failed" and is easy to find with a search.

Java was web, Perl was web, and ST wasn't ready. Java and Perl were free as in beer, ST was expensive. I think ST hardware was expensive too. ST is very different from C & C++. ST had multiple vendors too.


"why smalltalk failed" may afaicr have been first thing I tried: https://www.google.com/search?q="Why+Smalltalk+failed" . https://medium.com/smalltalk-talk/why-smalltalk-failed-to-do... isn't the article I was looking for. It's much more recent, and much less in-depth. However http://wiki.c2.com/?WhyIsSmalltalkDead covers a lot of the same ground, unsurprisingly.

I'm pretty sure that late-'80s/early-'90s ST was largely run on commodity (or fairly-commodity) hardware: PCs, Macs, Unix workstations. It was commercial Lisp that tended to be tied to dedicated hardware. IIUC Smalltalk did have a then-expensive appetite for RAM though.

The commercial STs of the time weren't good as CGI languages like Perl and PHP, for the same reasons they weren't suitable as Perl-style scripting languages in general. Meanwhile, behind the hype, it soon turned out that Java applets were not ready for prime time either, while servlets were something that could probably have easily been replicated in ST (and for all I know, likely were).


googling

   site:www.reddit.com smalltalk leoc
one of the results is:

Bambi Meets Godzilla - "I watched Smalltalk die."

https://web.archive.org/web/20080706101400/http://www.oreill...


> The image-based paradigm [1] holds it back. Using Smalltalk, there's quite a change in development style, the existing development tools (text editors, version control, etc.) mostly don't work [2], and deployment is at least slightly different.

It holds it back because it's a bad idea, otherwise Access/VBA apps would rule the world by now. It's a great environment for single developer RAD tools but it's horrible when you're trying to scale the team up because tools like git don't work well and you often don't want data to be shared between instances.


git didn't work with it (maybe it does now). But there have been native version control tools in Smalltalk. There was ENVY, Squeak and Pharo have Monticello, and Dolphin had something (I can't remember its name).

git is great in many ways, especially in how it lets multiple developers work more effectively on the same code base. But the native version control tools in Smalltalk, while they might not work as effectively for large groups of developers, can support things like versioning at the method level.

Rather than arguing that Smalltalk is the best or that it is bad, I'd rather focus on what I think — it has many great features, it's just that it's so different that it can't ever be popular, at least not for the foreseeable future.


Well, git was made for text/file-based programming languages. If Smalltalk had taken over instead, we would have the equivalent for image-based languages.


How exactly does the image-based approach mesh with version control like git? I'm trying to picture a possible approach, but failing.


Smalltalk has a concept of changesets. They can approximate diffs/patches.

And while the code is exposed in code browsers (a kind of IDE/text-editor window) in the image, and not as text files, it certainly can be. So I'd imagine it's just a matter of exposing those into files (or keeping classes and methods as separate "entities") and committing those into git with a bit of markup.
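
For a flavor of what that text form looks like, here's a sketch of the traditional "chunk" fileOut format (the method shown is the classic Point addition; exact framing varies between dialects):

  !Point methodsFor: 'arithmetic'!
  + aPoint
      "Answer a new Point that is the sum of the receiver and aPoint."
      ^ (x + aPoint x) @ (y + aPoint y)! !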


But the whole point of the image approach is that it's not just code, but also data, no? It would seem that it'd require some multiple-file text-based serialization format for images that would include both code and data, and would split it into files in a way reflecting the typical change patterns...


Code + data + live instances of the code/objects. Like I said, I haven't been tracking how they do it, but I'm guessing that they only track code. If you want to be able to deploy an image directly, you either have to write scripts (and check those in too, either as standalone "scripts" or as class-side initialisation) or save the entire image and deploy that.

Saving the development image and deploying it directly is also not done that often, because the development image contains things that can be stripped off, e.g. development tools or object instances you use during development. Squeak and Pharo are usually distributed in various flavours — loosely, full and minimal images (I might not get the names right). So you take the former and install packages for development, and the latter + packages for deployment, or you can build your own from the minimal image.

But through this you can see how different it is and while it's great in many ways, some of the differences make it harder for newcomers to start and harder for everyone to use it in production.

I forgot to mention Dolphin Smalltalk http://www.object-arts.com. It used to be a paid product, running on Windows. It has been open source for a few years now; I think product sales weren't sustainable for the owners. But for several years it was the best Windows thick-client development tool. It's a great example of Smalltalk done well. You have the same image-based approach, but the windows are all native and familiar-looking, so basically a multi-window development IDE. There's a deployment tool that helps you strip down your image and build it into an exe (which was basically the Dolphin Smalltalk runtime + the shrunk image). It might still run; it may look a bit outdated, as it was actively developed until 2004 or so. If it still works, it's a wonderful way to experience Smalltalk.


At a place I worked, they checked the changed/added code[1] into conventional version control with all our other code. There are quite a few adapters available. The final image deployed to production was a generic starting image to which the build team applied all the code from version control. It worked just fine. If the image-based approach is the deciding factor, it really shouldn't be. There are tools.

1) Smalltalks have a text output format for code chunks
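
A minimal sketch of what that build step could look like, assuming a classic Squeak-style API and hypothetical file paths, evaluated headless against the generic image:

  "File in the code exported from version control, then save the image and quit."
  (FileStream readOnlyFileNamed: 'src/Core.st') fileIn.
  (FileStream readOnlyFileNamed: 'src/App.st') fileIn.
  Smalltalk snapshot: true andQuit: true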


I believe Monticello has been the mainstream Smalltalk VCS for some time, but I don't know much about it: http://wiki.squeak.org/squeak/1287


GNU Smalltalk isn't image based.



An image could be serialized to anything, including edits...

Aside from the code, all other user data is created through message sending, so those messages could also be serialized (if needed).


For a take on what happens when smalltalk and the web browser and js meet, see:

https://lively-kernel.org/

For a more "traditional" smalltalk for js, see:

https://www.amber-lang.net/


Rails was about the first time a functional language (probably other types too) had a pragmatic proponent; it leveraged Ruby to create an easy way to build web apps, and people/companies wanted web apps. Just about every other language was full of smug proponents who just wanted to show how little code you needed to generate Fibonacci numbers, as if that were at all relevant to most developers. They spent so much time telling everyone why $lang was great and very little time showing them how it could be used for great things. Just looking at the Pharo homepage, this jumps out:

>Live, immersive environment: Immediate feedback at any moment of your development: Developing, testing, debugging. Even in production environments, you will never be stuck in compiling and deploying steps again!

That doesn't appeal to me at all. I don't want to be debugging in a live environment, I don't want users or testers having that power, and I don't get stuck in compiling/deploying steps.


>Rails was about the first time a functional language (probably other types too) had a pragmatic proponent; it leveraged Ruby to create an easy way to build web apps, and people/companies wanted web apps. Just about every other language was full of smug proponents who just wanted to show how little code you needed to generate Fibonacci numbers, as if that were at all relevant to most developers.

Yeah, that wasn't it. Maybe you weren't there? This sounds like a third-hand account of the era.

Functional languages had been used in major programming projects (and had popular apps, from Eclipse to AutoCAD) for decades before Ruby, not to mention AI work.

Besides, PG's company used CL to build one of the first web stores way before Rails was even a thing.

>That doesn't appeal to me at all. I don't want to be debugging in a live environment, I don't want users or testers having that power, and I don't get stuck in compiling/deploying steps.

That's because you haven't used it.


I installed Pharo to see what this Smalltalk was all about. At first, I was confused by what I saw. I was expecting to set up a compiler build pipeline and an Emacs mode, but it booted into some kind of operating-system graphical shell. I realised then that Smalltalk is more than a programming language; it's a full IDE + OS.

A shame that I couldn't continue toying with it (that was before my computer got reformatted)


> "A shame that I couldn't continue toying with it (that was before my computer got reformatted)"

It looks like it's still available for download for a variety of OSes, if you'd like to continue:

http://pharo.org/download


See also:

(lively kernel, amber smalltalk for js) : https://news.ycombinator.com/item?id=17915402


I had the same reaction. Once I realized what it was, I had no idea what to do with it though. Anyone know of something that shows you how to get started or shows off its capabilities?


http://books.pharo.org/ has a bunch of books. Pharo By Example is a nice starting point.


Interestingly, I was just looking at this last night!

Here's their MOOC course: http://mooc.pharo.org/#week1

This is a real-time physics engine demo: https://www.youtube.com/watch?v=eYuhlixBaPI

This is a "Meta-IDE"! https://www.youtube.com/watch?v=anZoWeA5vd0


My company's front end is all in Smalltalk. My boss designed his entire site with Smalltalk.

http://onsmalltalk.com/


If Seaside is still a thing, you could start with a simple website.


To get another taste of the kinds of systems possible in the 70s, in spite of the constraints, see Kay's demo of a revived Smalltalk-78 system: https://youtu.be/AnrlSqtpOkw?t=3m3s (I believe this system is about 10-20K LOC software in total)

Notes about the revival are here: https://freudenbergs.de/bert/publications/Ingalls-2014-Small...


Smalltalk is one of those things I want to learn, but resources seem sparse and everyone seems to be pushing their own particular flavor of Smalltalk. Do I go with Amber Smalltalk? Squeak? Pharo? Are they all sufficiently similar that I can move from one to the other without re-learning everything?

I'm honestly considering just going through the GNU Smalltalk manual in the terminal and worrying about all the graphical VMs later.


I'd just go with Pharo. Pharo by Example is a good intro book, and it's still under development. Also, I would strongly suggest that you go with something like Pharo where you run the image on your desktop. Part of the reason I really, really like Smalltalk is how working in the image changes your workflow. As I understand it, GNU Smalltalk is still write-file, compile, run, which is not at all how you work in Smalltalk (or at least not how I or my colleagues do); instead, I find that Smalltalk encourages a very iterative style, since you're generally working directly on a running instance of your program.

It takes a bit of getting used to the development environment in the image (and setting breakpoints by inserting `self halt` directly into a method feels really dirty initially), but now that I've been working in Smalltalk for a while, I find I prefer it to "normal" development, which is still very much a "hand in your stack of punch cards and wait for the result to come back" way of developing.
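
For the curious, a minimal sketch of that halt-based workflow (the method and its class are hypothetical): when running code hits the halt, the debugger opens on the live process, and you can inspect state, edit the method, and proceed.

  totalFor: anOrder
      "Hypothetical method: 'self halt' drops you into the debugger at run time."
      self halt.
      ^ anOrder items inject: 0 into: [:sum :item | sum + item price]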


I would go with Pharo if you are interested in what the Smalltalk referenced in the article was like. Pharo has a modern and friendly look, and it also provides the GUI environment and integrated tools that (in my opinion) seem in line with the original goals of the Smalltalk project.

If your interest is more in Smalltalk the language, I think that using GNU Smalltalk in the terminal is certainly reasonable.


There is a Pharo MOOC: http://mooc.pharo.org/

I took it a few years ago; it was great to get a feel for the lifestyle, and ST deserves all the praise. It's a tangible, dynamic, visual REPL; the object system is sensible and humane (unlike Java's). They have creative and interesting ways to make software (the web engine felt super low on boilerplate).


Never realized how much Smalltalk influenced Objective-C until I looked a bit more into it.

See e.g. object initialization here: https://www.gnu.org/software/smalltalk/manual/html_node/Inst...

And the notion of passing messages to an object.
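
As a small sketch of the resemblance (the receiver and values here are hypothetical), a Smalltalk keyword message and its Objective-C spelling line up almost one to one:

  "Smalltalk: a single keyword message whose full selector is at:put:"
  aDictionary at: 'name' put: 'Alan'.

  "Objective-C kept the keyword-selector idea, wrapped in brackets:
   [aDictionary setObject: alan forKey: name];"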


Plenty of submissions but only a few short discussions:

https://hn.algolia.com/?query=The%20Early%20History%20Of%20S...


I get the feeling it slips off the front page by the time folks have read the whole thing.


Maybe the front page algorithm should be adjusted to take this into account?



