If Smalltalk Is So Good Why Does Nobody Use It (c2.com)
60 points by CapitalistCartr on Nov 10, 2014 | 70 comments



I worked on two Smalltalk projects in the early 90's at two different companies. Both efforts were abandoned for C++. Some of the issues we encountered at the time:

* Numerical computation was too slow. Numbers were objects, and every numerical computation went through object message dispatch. Ouch! This was eventually remediated, but not before we'd dropped Smalltalk.

* Garbage collection was S-L-O-W. Your workstation would randomly freeze for 30-60 seconds. It was that bad.

* The built-in windowing environment was an albatross in the early days of Windows and OS/2. Customers wanted a native look and feel, not the weird (to them) Smalltalk window manager.

* Not everyone had yet bought into object-oriented programming, much less making everything an object. This was really bizarre at the time - even code was an object. Makes sense now, but few people were thinking that way then.

* It was strange. C++ seemed safer since everybody at the time knew and was comfortable with C programming. Little did we know the pitfalls awaiting us when using C++! :)

In the end I think the Smalltalk experience left such a bad taste in everyone's mouth that when these issues were resolved no one was interested in going back and giving Smalltalk a second look. Nowadays Smalltalk lives on in Objective-C. In fact, one of the shops I was in abandoned Smalltalk for Objective-C, which is one of the few instances I know of people using Objective-C outside of the Apple ecosystem.


The windowing system depended on the flavor of Smalltalk; Smalltalk/V Windows, which I used circa 1991-1992, used the native GUI. I forget what I was using on our Unix boxes, but it used X (possibly with Motif).


The question is moot -- Smalltalk isn't so good.

Back in the 90s we were developing sales force automation software in VB3 (ugh!) which was by no means wonderful. Our stuff ran far faster on modest hardware than similar Smalltalk software running on (by the standards of the time) ludicrously powerful hardware. Delphi, when it came out, was far, far better still.

Smalltalk also seems to have accumulated annoying cruft faster than it got useful. Another large Smalltalk project I was involved in (as a UI designer) suffered from multiple incompatible class hierarchies. I wanted to make different controls that did similar things look similar, but it was impossible because some controls inherited from a 1-bit graphical library and couldn't be rendered in color, others from a 4-bit graphical library and couldn't be rendered in 256 colors, and others didn't respond to certain events in certain ways (e.g. action on mousedown instead of mouseup). It was an absolute nightmare.

The language may be fine, but the implementations of it were slow and bloated, and the class libraries were a disaster area. (Of course, the main company pushing Smalltalk at the time was IBM, which probably says something.)


Even IBMers didn't really like it. I heard similar stories of complexity piled on top of dynamism turning into a 'Message Not Understood' fest. Maybe they didn't really understand the language.


I'd say there are two currently popular languages that are significantly influenced by Smalltalk: ruby and ObjC.

They are both quite different languages, but both very heavily influenced by smalltalk.

ObjC is mainly popular because it's the language you have (or until recently had) to use to write for iOS -- although many developers come to like it for itself, and it certainly has been successful at Getting Things Done. (I believe iOS itself is written in it as well, if not OS X too?)

Ruby had no such vendor lock-in, so the only reason to use it originally was if you liked it, although at this point the large library of open source available for it encourages its use. (Of course people had to like it enough to write that open source for it; something got the ball rolling and kept it rolling.)

Both Ruby and ObjC take different things from Smalltalk (with some overlap), but there are some things that _neither_ takes. Looking at the things neither takes from Smalltalk (e.g. the "program as image" model, which means you can't just inspect the program with a text editor or manage it effectively in a generic version control system)... probably suggests some possible answers to why Smalltalk hasn't been more successful.

I am not familiar enough with Smalltalk to be confident in my suspicions, but comparing it to the more popularly successful languages Ruby and ObjC is probably a fruitful avenue to answer that question.


I don't use it because at this point it's alien in the world of modern computers. Back when Smalltalk was in its prime it wasn't just a programming language, it was essentially your operating system, complete with its own graphics system, window manager, etc. This hasn't changed, but instead of being the basis of your system it's essentially another OS running on top of whatever your poison of choice is.

I think today the concept of the Smalltalk image and workspace has pretty much killed it, not the language itself. I use git for version control, GNOME as my desktop environment, and IntelliJ as my IDE of choice; trying to do anything in Smalltalk requires that I transport myself back to the 90's.

GNU Smalltalk is the closest we have to a Smalltalk implementation that works well in modern computing environments, but there's a big lack of interest in the project.


[deleted]


I'm confused, where did the GP say anything about the GPL stopping him/her from using it?


I liked Smalltalk and saw it used well and used badly, but I don't think it is anything about Smalltalk that was the problem. It was the Smalltalk vendors that mostly screwed it up. Thousands of dollars per seat just wasn't worth it. It didn't help that some companies regarded it as a "secret weapon" and kept its use hush-hush (I've heard the same thing about APL derivatives, but have no idea if that is just urban legend).

I figured if someone had built a VB-like environment for Windows and priced it at $99, it would have done well. Closest I saw was Dolphin, but they did the old "database connection extra" crud. I wonder how many development environments have died due to that one pricing checkbox.

Smalltalk could be fast. Smalltalk/X was pretty speedy.

(PS - totally ignoring the native platform look and feel was foolish in the extreme)


I'm not sure whether native look and feel was ever an important point. At the moment I cannot think of any enterprise-ish application that does not ignore native platform look and feel.

And for both Delphi and now .NET, there is significant market for third party components that look purposefully non-native.


I think the native look and feel is important because the emulated interfaces always seemed to have flaws and added to the training cost of users. Each new version of Windows would bring problems for the emulated interfaces where the tools using the native widgets didn't have the same problem. Worse is something like Squeak, which is basically a foreign OS running in a window with garish colors.

I know a lot of enterprise apps go off into the weeds, but their success is driven by external factors. Successful tools really need to appeal to a broad group of developers.

I really think a native widget version of Smalltalk that included the database access could have done a fine job, but it always seemed with Smalltalk that the NIH syndrome was so strong that they thought they could build their own things better. It's like they didn't respect any platform they were running on.


I think Java was almost killed by not having native look and feel in Swing. There are probably still a lot more desktop applications written with native Windows look and feel than with Qt or Java. It is still a problem with mobile apps that users can feel confused by GUIs that are made with HTML+CSS (or something else) and do not follow the native look and feel for the platform in question. It will probably get better over time, but it has been an obstacle for many, many years.


I worked in a Smalltalk shop; Smalltalk is not that great. I've discussed my problems with it at length. As a language it is impressive what they built at the time. In practice it was a terrible pain in the ass with tons of "ravioli" code everywhere. The language has no qualms with letting you wreak havoc within the standard base classes, which makes integrating another's code almost impossible.


Oh, I do want to add that it is a language behind a paywall as well, which I find rather strange: being charged as a customer (1) to use their language and (2) per user we allowed into the environment.

Because Smalltalk is an enclosed runtime (similar to Java, but much worse). We had to tell them how many users would be using non-developer editions of the runtime and they charged us for that.


I don't find that strange, as I remember the days we had to pay for every single one of our tools, including distribution royalties.

Open source kind of killed the way the industry used to work, forcing companies to find other sources of revenue, except for big corporations, which kind of still work this way.


cough Borland cough :) I guess I just felt it was strange that Cincom still charges a significant amount of money to run their stuff, with per-seat and per-server pricing.

That's only when you look at the market of tools today; historically speaking, they were par for the course.


There is a big difference in pricing between Borland and Smalltalk vendors. I remember the Turbo Pascal / Turbo C pricing and how it really made quality compilers accessible. Smalltalk never had that revolution.

Anytime you need to "call for pricing", you know it's going to cost.


Correction: Smalltalk WAS a language behind a paywall... which, as others have suggested, was probably a significant contribution to its decline. Squeak http://www.squeak.org/ (kid-oriented) and Pharo http://pharo.org/ (professionally oriented) are both excellent and free.


And it still IS, depending on the distribution. Squeak and Pharo are great from what I understand, but are lacking some of the enterprise features present in Cincom's distribution. Though for 99.999% of uses, Squeak and Pharo are going to be more than enough.


Put simply: because the things that made Smalltalk great are now part of modern stacks. Dependency management, interactive development tools, object-browsing IDEs, plus virtual machines improved significantly. Why go back to Smalltalk when we have all of these things with the language of our choice today?


Because smalltalk integrates those things together better than any other language / environment that has them?


Don't get me wrong - I'm thankful that I got to use Smalltalk in my career. It was incredibly expressive and productive - maybe the most productive I've ever been. But, time marches on. There are simply more people working today on python / node / <pick_your_favorite_here> to maintain and improve those tools plus making it easier to find a <language> programmer over a Smalltalk programmer.


Out of interest, which modern stacks allow "live coding" your application? That is: no need to restart for various changes you make. You are playing/changing/creating with the objects that make up your application.

I am currently doing mostly Objective-C-programming in Xcode, which is a rather simple IDE in this respect.

In the past I used more Java, but at that time only simple changes were possible live (using Eclipse back then); for a lot of others I had to compile and restart anyway.


I like the idea of it.

I recently tried out Pharo with a simple, achievable goal in any language worth using these days: fetch a document from a website, parse it, and display all the hrefs with their titles. It was a very pleasing experience!

I basically evaluated the code I'd expect to use to fire up this application in a scratch area. Naturally it didn't work. It dropped me into an interactive debugging session. From there I just started writing and compiling the missing methods and objects. I kept doing this until my original evaluation just started working. No recompilations or context switches between editors and compilers and the like. Best of all I could just close down the image and take a break. When I booted it back up everything was right where I left it.
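
For anyone who wants a point of comparison, the same exercise in plain Python looks roughly like the sketch below (this is just an illustration, not the Pharo code I actually wrote, and example.com is a placeholder URL):

    # Rough sketch: fetch a page, parse it, and print each link's href
    # together with its link text.
    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []       # collected (href, text) pairs
            self._href = None     # href of the <a> tag we are currently inside
            self._text = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._href = dict(attrs).get("href")
                self._text = []

        def handle_data(self, data):
            if self._href is not None:
                self._text.append(data)

        def handle_endtag(self, tag):
            if tag == "a" and self._href is not None:
                self.links.append((self._href, "".join(self._text).strip()))
                self._href = None

    html = urllib.request.urlopen("http://example.com/").read().decode("utf-8", "replace")
    collector = LinkCollector()
    collector.feed(html)
    for href, text in collector.links:
        print(href, "-", text)

The difference, of course, is that in Pharo I built the equivalent up incrementally inside the running image rather than writing it all up front in a file.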

If given the opportunity to use it on a project I'd give it a shot. It was one of the most refreshing computing experiences I've had since I got hooked on Common Lisp.


Cost. Smalltalk was proprietary and very expensive. Had the Smalltalk houses made open source free versions and monetized with consulting instead of software sales, I suspect things would be different.

The technical issues people cite are not so different from those of the dynamic languages that were much more successful (Perl, Python, Ruby, Objective-C, etc).


A lot of people here point out that Smalltalk is not great. I personally don't know either way, because I haven't used it.

However, I do want to point out that adoption and quality need not be correlated. Haskell, for example, was relatively obscure till recently (and even now I would say very few people use it). But it is absolutely amazing compared to most mainstream languages.

The article linked makes some excellent points about business and other considerations and how it influences adoption of any language/tech (other than the quality itself).

The question itself can be reframed either as "Is Smalltalk good?" or "Why does no one use Smalltalk?"


I sense a similar thing happening with languages stealing functional features from Haskell, much as languages stole OO features from Smalltalk.

C++, Java, Objective C, Ruby borrowed OO ideas from Smalltalk to varying degrees. Eventually, you could get most of the benefit of Smalltalk OO ideas without actually using Smalltalk.

Swift, Rust, Scala borrow functional programming ideas from Haskell (and ML) to varying degrees. My hunch is many programmers will encounter and use functional programming constructs in those languages, without ever programming in Haskell.

To complete the comparison, Smalltalk is still arguably OO in its purest form, as Haskell is probably the purest embodiment of the functional programming paradigm. But in both cases, most programmers are happy to adopt some subset of those features in languages more similar to the ones they already know.


As a Haskell user, I would say that you're overstating how amazing it is. The language is certainly fantastic, but the tools are so awful that it almost makes me willing to use Go! GHC takes ages to compile code and easily eats a couple of GB of RAM. When you're longing for the days of C++ because your build time was so much faster, something is seriously messed up. Cabal mostly stinks; OCaml's Cabal-inspired packager is far better despite being much newer and having fewer resources invested into it.

The community is borderline insane when it comes to style, so you basically have to use a huge, complicated Emacs setup with a bunch of plugins just to be able to produce code that meets the bizarre style guides without manually messing with spaces all the time. That last point shouldn't matter, since for your own code you can just pretend it is a normal language, but as soon as you want to make a change to an open source library you're stuck.

Much of this is similar to Smalltalk, where the language itself was good, but trying to use it would drive you mad.


> GHC takes ages to compile code, as well as easily a couple of GB of RAM. When you're longing for the days of C++ because your build time was so much faster, something is seriously messed up.

IMO, that's only true if what you get out of the compiler provides the same or less value than what you get out of the compiler for competing options. OTOH, the whole reason people choose Haskell over other languages is the value-add from Haskell's compilation process. Spending some clock cycles at build time for that is, IMO, worth it.


You are assuming that it is necessary to take many minutes and GBs of RAM to compile Haskell code. There is no reason to believe that is so, especially since other Haskell compilers are much faster. It is 100% certain that GHC contains at least one memory leak that has been unaddressed for years. Simply put, absolutely nobody has put any effort into making GHC fast, only into making it produce fast output. OCaml gives you 90% of the benefits of Haskell with 10% of the compile time (actually faster than that, but close enough).


> It is 100% certain that GHC contains at least one memory leak that has been unaddressed for years.

That's clearly a problem.

> Simply put, absolutely nobody has put any effort into making GHC fast. Only into making it produce fast output.

OTOH, as annoying as it may be to developers, on the assumption that code will be run more frequently than it will be built, that seems like priorities being in the right place.


A lot of times people like the idea of a thing more than the thing.

Meanwhile, a lot of people hate on C++, which they're only able to do because they learned its nuanced spec, which they only did because it had already reached critical mass in the wild, because it worked everywhere and was, despite the complaints, comprehensible, if hard to get perfect in the language-lawyer way.


A lot of people also spent a lot of time hating on Windows, which they only were so familiar with because it had already reached such overwhelming "critical mass in the wild". Does it follow that all these complaints were ultimately misguided?

(I don't have any particularly strong opinions on Windows myself; it just seemed an apropos example for the Hacker News crowd. There are a million other examples of the same type, of course.)


I think the point is that the complaints may all very well be valid, but that there is also the question of what they did right. And Windows does get some important things (mostly) right, for example backwards compatibility.


This argument reduces to saying that anything popular and widely used is actually better than anything unpopular but beloved by language enthusiasts. This simply isn't the case. Many, many factors beyond merit contribute to language popularity and usage.


Even though the basis of your statement, "Many, many factors beyond merit contribute to language popularity and usage," is true, actually defining the merits of these lesser-used languages seems to be something that their fervent supporters are poor at.

To my (C-tainted) ears, the affirmations of support generally tend to sound something like, "It's better because it's (modern|buzzword|safer|buzzword|technobabble|compact)."

I am not suggesting that these languages do not have tangible benefits, but I do wish their supporters would be a bit more concrete when describing them.


I consider myself a language enthusiast, but I think that 'merit' in PL design is extremely poorly understood on a practical level. This idea that these languages became popular in spite of their apparent lack of merit is far too easy a way to stop thinking about what merits they might have, merits that the oft-touted 'better languages' lack and that drive people to use them. And this thought-stopper has to rest on an assumption that programmers don't want things to be easier/better/etc., which strikes me as unlikely.


You have to have a metric for 'better'. If it's 'utility', then yes, something popular is better than something unused.


Thus my motto for PHP: It's better than nothing at all. Usually ... FML.


Despite its warts PHP does have very valid merits that make it popular.


Even "utility" isn't necessarily identical with "popularity". I suspect, in many cases, by "X is better than Y", people mean enthymematically "X would be better than Y if we were to start using X instead of Y, even taking transition costs into account"; it would be the death of progress to dismiss out of hand every argument of this kind.


You're either saying something that is tautologically true (more popular things have more "utility" because they're used more because they're more popular), or is clearly false by any metric that a majority of experts in CS or language design, or even industry professionals, would agree on.


How about "Sum over purposes of (usefulness for a purpose × number of instances it's used for that purpose)"? Plenty of people would agree with that definition of utility.
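
Spelled out as a formula (my own ad hoc notation, just to make that definition concrete):

    U(X) = \sum_{p} u_{X,p} \cdot n_{X,p}

where u_{X,p} is the usefulness of X for purpose p and n_{X,p} is the number of instances X is used for that purpose.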

Something that isn't used at all (or much) has very low utility.


Now, as a follow-up: is anything holding back Rust?


Anyone who understands Rust's good points (and there are many) is probably already using Haskell. What compelling advantage does Rust have over Haskell to justify the cost of switching (and, more importantly, the cost of losing higher-kinded types, and of adding macros)? Manual memory management is unlikely to be enough.


There's a ton of people who aren't using Haskell and never will. It is a great language, but it isn't a language that will ever be really popular.

Rust bridges the gap between industry C++ programmers and the ideas of Haskell. Sure, you might not see a lot of codebases migrated from Haskell to Rust (not that there are a massive number of those anyway), but you will likely see C++ codebases migrated over to it, and Haskell programmers who want their library to be widely used may start greenfield projects in Rust instead of Haskell.

You are looking at it from a Haskell -> Rust conversion cycle, when that isn't really important from either a numbers standpoint or how Rust is positioning itself.


Let me put it this way then: why would someone using C++ who will be willing to migrate to Rust in the future not be willing to migrate to Haskell now?

Many of the advantages and disadvantages are the same: different syntax, weird sigils, an emphasis on immutability, a much more restrictive compiler, a strong type system, a new way of thinking about your programs. And it will take Rust a long time to reach the maturity and library ecosystem that Haskell currently has.

One might catch a few people who absolutely need manual memory management, I guess, but my experience is that this is often an excuse and those who are seriously interested in using another language find ways around the problem. For OS kernels and the like Rust has an advantage, but again that's a tiny niche. Rust's C FFI might be a bit better, but honestly Haskell's is rather good; I can't see that making the difference. And if you find the safety or the type system compelling, surely you've already moved beyond C++.

I'd like to be wrong - I'd like to see C++ programs moving into better languages. But I think a need for manual memory management is very rarely the real reason people are still using C++ - and if it's not that, what can Rust offer that wasn't already available?


The people sticking to C++ aren't using it for the sheer joy of manual memory management, they're using it because they either require or enjoy the ability to exert predictable control over programs in terms of both execution speed and memory usage. Haskell, being a lazily-evaluated garbage-collected language, offers little to these people.

Rust has emphatically never been "Haskell with manual memory management". That conception is entirely incorrect, typeclasses or no. If you must frame Rust in terms of prior languages, consider it to be an ML with an extreme focus on zero-cost abstractions and pervasive, memory-safe concurrency.


Many of those C++ developers are using it because C and C++ are the only languages with native compilation toolchains that all OS vendors have in their SDKs.

When will Rust be pushed by an OS vendor?


Anyone who understands Rust's good points wouldn't write a comment like yours. Rust's goal is to be a systems language that allows one to write programs with native performance (equivalent to C's) without sacrificing the safety of higher-level languages by leveraging an advanced type system. Haskell is an extremely poor candidate for that niche.


It is entirely possible to write native-performance (better than C++, and soft-realtime at the same time) programs in Haskell; I've seen it done in a professional environment.


I see. I was thinking generally about tools / libs / ecosystem, and platform presence. I actually didn't realize Haskell was already so established.


It is established among a very competent, smart, and vocal community. It is not established as a language heavily used in industry or most open source projects by any means.


I meant less that lots of people use Haskell and more that most C++ folks aren't interested in changing languages.


Predictable performance.

That seems to be the main selling point for Rust, anyhow. Zero cost abstractions, predictable data sizes, well defined memory scopes, etc.


Maybe Self was the best chance. We all know that Java has won, but Self could have been Java (and thus the success of Smalltalk). Maybe my understanding of this history is incomplete; if it is, please help me out.

Java should never have reached where it is today. I'm not hating on Java, btw. Just historically speaking, Java was an embedded language that was pushed for the internet. At the same time Sun had the well-advanced Self project in development. One of the major reasons that Java was selected was its small stack-based bytecode.

Self was a hotbed of compiler research and could run rings around Java. Self is a small language and you need fewer lines of code than Java. Self would probably not have been much heavier on the wire.

So why did the people at Sun push Java and not Self? I cannot explain it. The only explanation is that they just did not know what they had. Does anybody have a better idea? Is there something I completely misunderstand?


Java probably won because of its C-inspired syntax. At that point smalltalk was falling behind and C++ started to take off. To me Java was always meant as a better C++. At least that is my guess.

A lot of the people behind Self went on to work on the JVM JIT and later the Chrome V8 engine. Self produced many influential ideas in the JIT space like polymorphic inline caches and deoptimization and stack rewriting. Some of the papers published by the Self group at UCSB are fascinating.


The syntax is something, but they could easily have written another parser for Self and made it look very C-like. That would have been 100x less work.

Many of the people that worked on Self later worked on Strongtalk, which was bought and is the basis for the current JVM, and yes, now everything that was in Self is in the JVM.

So my point is, why not just start where they ended up anyway? In one of David Ungar's talks he said something like "they probably just didn't know what they had". That's about the only thing that makes sense, but maybe somebody knows more than me.


Self has reincarnated as Javascript.


Yes, but that's the point. We could have had what we have now 10 years ago, or even more.

Sure, hardware got faster as well, but I'm talking about the quality of the VM.


The reason I don't is that it takes its philosophy too seriously. My absolute top nitpick with that philosophy is that it doesn't compile to an .exe/ELF.

Using Squeak as an example, it is very much a virtual machine. If I want to work on a new project, I might have to create a new VM image or something. It's like an OS and a language boiled into one thing, which simply doesn't work.

There are more classical approaches to Smalltalk these days, but most of the literature on the language is oriented toward the Squeak approach, dedicating half the bloody book to working within the OS! Just let me use Vim; I don't want to use your shoddy editor!


Smalltalk is a little shrewd in some ways as outlined here. Self has already been mentioned as a great alternative (and the related papers are amazing IMO and influence most modern JITs).

But there are some other really interesting alternatives that IMO keep with the spirit: Io (io-lang.org), Ioke (defunct now, ioke-lang.org - some awesome ideas IMO), and Squeak, which is being actively developed.

I'd recommend checking out some of these; I think you can easily install Io via brew on Macs, and you used to be able to install Ioke as well. They improve syntax and/or add significant language features.


The Java HotSpot compiler was originally worked on for Smalltalk before Sun bought the company. It, along with the ideas from Self, certainly helped improve JIT systems and dynamic compilation solutions in the late 90's and all through the early 2000's.


I haven't checked others, but the publisher O'Reilly offers something like 8,000 titles related to IT. Open to correction, but I can't see a single one on Smalltalk or Pharo. Is this simply because no publisher can see any return on investment? What do the Pharo developers think about this?


The consultants who can sell it to gullible managers have moved on to selling something else.


That something else was Java.

One of the answers could be that almost every OO language borrowed some concepts from Smalltalk and implemented them in a less "cryptic" (or esoteric) way. Take a closer look at Objective-C. Erlang's (non-OO) modules and message-passing primitives were also influenced by Smalltalk.


Kinda reminds me of LaTeX. It's been my favorite way to write documents ever since I learned the language; it's so much more precise than word processors, but most people still default to them.


Smalltalk was and is used for many advanced, super complex, difficult-to-understand, requirements-based systems, much as Lisp and Scheme are. It sprang the ideas of full OO onto the programming community, it's guru's brought us Agile development techniques, Refactoring tools, ChangeSets (Git has this now), Everything is an Object, and many other ideas and concepts prevalent in today's languages and tools. Many languages freely admit the inspiration of Smalltalk on their ideas and have expanded on Smalltalk's ideas. Some people have continued to expand Smalltalk itself. Even its original creator (Alan Kay) hopes for a better system than Smalltalk, but even recently he said he hasn't seen it yet and has called Lisp "the greatest single programming language ever designed".

Companies like JPMorgan (Currency Trading), Booz Allen & Hamilton (Currency Trading), Sprint (Network Topology and Configuration), Digital Switch Corp (now Alcatel) (Digital Cross Connect), TenX Technologies (C source code maintenance and analysis tool), American Express (Full IDE for a custom in-house Credit processing language) and Nortel (Meridian PBX line Configurator as well as Configurator for the Nortel Wireless Modules used in cell towers) have used Smalltalk to great ends, gotten competitive advantages, and accomplished great things with it.

There were lots of other projects that used Smalltalk to create systems that they had failed at previously or did not want to tackle due to the enormous complexity. OO, and specifically Smalltalk, gave them a way to implement these systems in an Agile way, long before the Agile Manifesto was dropped onto the world. Remember, ST at the time embodied the most mature implementations of the OO paradigm, and its power was not matched for many years. No other language gave the bang for the buck overall. C++ was still being standardized through the early 90's. At the time, and up until Borland shipped their Turbo C++, most compilers were incompatible with one another and/or generated C code because they were based on AT&T's Cfront system, which itself was being fluidly developed.

I worked on those examples listed above for those companies and knew of many others working on other projects. I also happened to work with C++ from 1989-1993 while porting tools such as Sun ToolTalk and Sun Net License to IBM's AIX boxes, as well as porting the Mentor Graphics ECAD products and core libraries to AIX, DEC Ultrix, and OSF/1. We had 14-hour compile times, even with 128 Meg of memory (which at the time was huge) using the $150K box on my desk. I had also worked with Obj-C on the NeXT systems and the IBM RT system and several OO Lisps, including CLOS.

Each of those systems had good points and enabled different things, but Smalltalk, the tools it provided, and the productivity we were able to achieve were remarkable. Fixing bugs in the debugger, stepping back one call on the stack trace and continuing on (no stopping the whole program and starting again; Fix and Continue that worked every time, not just some of the time like the C++ and Java versions), adding capabilities to the system itself, being able to inspect and see anything at any time, and seeing all the code for the system and adding to it were amazing and still are.

The title is misleading in that "Nobody" is not the right word; it's safer to say "few" people use it. But those that do have reasons as diverse as the reasons there are so many languages in use now and the reasons the number of languages being created keeps going up. We as programmers use languages for many reasons, including their relative power, alignment with our own thought processes, libraries available, community, religion, syntax preference, etc.

Don't buy into groupthink when looking at any language. I also think that no one language is going to solve all your problems. Much like we have AWK, sed, etc., we can use the language that helps us in that moment. Smalltalk is just one of those languages that is available, and for many uses it is still very applicable.

Sam Griffith Jr.


"it's guru's brought us"? I think you mean "its gurus brought us".


It might just be waiting for the creation of the framework that will make it sexy and popular... like what Rails did to Ruby.


For the same reason more people use lottery tickets than index funds for retirement



