
So how would we go about switching the entire software industry to use LISP more? I've been struggling with this idea for a while. It seems that the best languages don't get adopted.

The only consistent explanation I've seen is that it's about 'easy'. The other languages have tools to make them easy, easy IDEs; the languages 'solve' one 'thing', and using them for that 'one thing' is easier than building your own in LISP.

You can do anything with LISP, sure, but there is a learning curve: a lot of 'ways of thinking' the brain has to adapt to in order to solve problems.

Personally, I do wish we could somehow revamp the CS education system to focus on LISP and ML-family languages, and train more for the thinking process, not just how to connect up some Java libraries.




I'm going to argue that Lisp already won.

That is, other programming languages have adopted many of the features that made Lisp special: garbage collection (Rustifarians are learning the hard way that garbage collection is the most important feature for building programs out of reusable modules), facile data structures (like the scalar/list/dict trinity), higher-order functions, dynamic typing, a REPL, etc.

People struggled to specify programming languages up until 1990 or so. Some standards were successful, such as FORTRAN's, but COBOL was a hot mess that people filed lawsuits over, PL/I was a failure, etc. C was a clear example of "worse is better", with some kind of topological defect in the design such that there's a circularity in the K&R book that makes it confusing if you read it all the way through. Ada was a heroic attempt to write a great language spec, but people didn't want it.

I see the Common Lisp spec as the first modern language spec written by adults which inspired the Java spec and the Python spec and pretty much all languages developed afterwards. Pedants will consistently deny that the spec is influenced by the implementation but that's absolutely silly: modern specifications are successful because somebody thinks through questions like "How do we make a Lisp that's going to perform well on the upcoming generation of 32 bit processors?"

In 1980 you had a choice of Lisp, BASIC, PASCAL, FORTRAN, FORTH, etc. C wound up taking PL/I's place. The gap between (say) Python and Lisp is much smaller than the gap between C and Lisp. I wouldn't feel that I could do the macro-heavy stuff in

https://www.amazon.com/Lisp-Advanced-Techniques-Common/dp/01...

in Python but I could write most of the examples in

https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...

pretty easily.


Similarly, Lisp has lost.

It's clear to me that S-exprs are not popular.

Other languages and environments have captured much of what Lisp has offered forever, but not S-exprs, and, to a lesser degree, not macros.

Any argument one has against Lisp is answered by modern implementations. But, specifically, by Clojure. Modern, fast, "works with everything", "lots of libraries", the JVM "lifts all boats", including Clojure.

But despite all that, it's still S-expr-based, and it's still a niche "geek" language. A popular one, for its space, but niche. It's not mainstream.

Folks have been pivoting away from S-expr all the way back to Dylan.

I'm a Lisp guy, I like Lisp, and by Lisp I mean Common Lisp. But I like CLOS more than macros, specifically multimethod dispatch. I'd rather have CLOS in a language than macros.
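For anyone who hasn't seen it, here's roughly what that looks like; a minimal CLOS sketch with made-up example classes, where dispatch considers the classes of all arguments:

    ;; Multimethod dispatch: the method is chosen by the classes of
    ;; *all* arguments, not just the first. Classes here are invented
    ;; for illustration.
    (defclass asteroid () ())
    (defclass spaceship () ())

    (defgeneric collide (a b))

    (defmethod collide ((a asteroid) (b asteroid))
      (format t "rocks bounce~%"))

    (defmethod collide ((a spaceship) (b asteroid))
      (format t "ship takes damage~%"))

    ;; (collide (make-instance 'spaceship) (make-instance 'asteroid))
    ;; prints "ship takes damage"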

I don't hate S-exprs, but I think their value diminishes rather quickly if you don't have macros. Structured static data is another plus of S-exprs, but most languages have that now.

I don't use many macros in my Lisp code. Mostly convenience macros (like (with-<some-scope> scope <body>)), things like that. Real obvious boilerplate stuff. Everyone else would just wrap a lambda, but that's not so much the Common Lisp way.
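For concreteness, a minimal sketch of that pattern (ENTER-SCOPE and LEAVE-SCOPE are hypothetical stand-ins, not real library functions):

    ;; The lambda-wrapping version everyone else would write:
    (defun call-with-scope (scope thunk)
      (enter-scope scope)                ; hypothetical setup
      (unwind-protect (funcall thunk)
        (leave-scope scope)))            ; hypothetical teardown

    ;; The macro just hides the lambda:
    (defmacro with-scope ((scope) &body body)
      `(call-with-scope ,scope (lambda () ,@body)))

    ;; (with-scope (s) (do-things)) expands to
    ;; (call-with-scope s (lambda () (do-things)))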

In my Not Lisp work, I don't really miss macros.

Anyway, S-exprs hinder adoption. They've had all the time in the world to "break through", and they haven't. "Wisdom of the crowds" says nay.


What did win is the scalar/list/dict trinity, which I first saw clearly articulated in Perl. It's core to dynamic languages like Python and JavaScript, and it's in the stdlib and heavily used in almost every static language, except for the one that puts the C in Cthulhu.


And a lot of lisps have clunky dicts.


I think you're absolutely right about macros being the main thing that makes s-expressions valuable, but allow me to inject my own opinion to drive it all the way home:

It's not that the homoiconicity of S-expressions makes it easier to write macros. Plenty of infix expression languages have macros that let you manipulate ASTs. We're programmers; mapping from lexical syntax to an AST is not hard for us.

What makes s-expressions so great in macro-heavy code is that the simplicity and regularity of the syntax lets you create DSLs without having to fuss with defining your own syntax extensions or figuring out how to integrate it into the "parent" language in a way that isn't completely terrible.

Racket demonstrates this rather nicely. They've got a fantastic system for creating macros that let you define your own syntax. It really does work well. But I'd generally rather not use it if I don't have to, because if I do then the job instantly gets 10x bigger because now I have to write a parser plus all the extensions to make sure it won't break syntax highlighting, ensure parse errors are indicated nicely in the editor, etc. And also all that work will be tightly coupled to DrRacket; people who prefer a different editor will not get a good editing experience if they use my macro.

I can avoid all of that headache if I just let it be s-expressions from top to bottom.
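To make that concrete, here's a minimal sketch in plain Common Lisp; a toy HTML-ish DSL standing in for any real one. The "syntax" is just s-expressions, so the reader, the editor support, and the error reporting all come for free:

    ;; A toy markup DSL. No parser, no grammar, no editor plugin:
    ;; the host language's reader does all the work.
    (defun render (form)
      (if (atom form)
          (princ-to-string form)
          (destructuring-bind (tag &rest children) form
            (format nil "<~(~a~)>~{~a~}</~(~a~)>"
                    tag (mapcar #'render children) tag))))

    ;; (render '(ul (li "one") (li "two")))
    ;; => "<ul><li>one</li><li>two</li></ul>"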


As an aside, Ron Rivest and Donald Eastlake last updated "SPKI S-Expressions" just a couple of months ago: https://datatracker.ietf.org/doc/draft-rivest-sexp/


This is kind of backwards. Languages which imitate everything from the Lisp family except S-exprs and macros are, because of that, other languages. Those that have S-exprs and macros are identified as in the Lisp family.

The appearance of new languages like this has not stopped.


This is kind of a no true Scotsman argument. The point is that most of the key ideas of lisp have been enthusiastically adopted by many very popular languages but lisp itself remains a small niche.


A language that enthusiastically adopts all key ideas is identified as a Lisp.

There aren't "many" popular languages; only a fairly small number. The vast majority of languages are destined for unpopularity, regardless of what they copy from where.


You'll need to say what this "lisp" and the "lisp features" are.

When we go back to the original idea of LISP, we have the following things:

idea -> what follows from those

1) programming with Symbolic Expressions -> supports the implementation of languages

2) "programming with Symbolic Expressions" applied to itself -> Lisp in itself, code as data, macros, self-hosting compiler, Lisp interpreter

3) linked lists and symbols as data structures for Symbolic Expressions

4) a programming system to build algorithms and programs based on the ideas of above -> symbols repository, late binding, recursive functions, symbols with properties, garbage collection, Lisp interpreter using s-expressions, Lisp compiler, reader, printer, read-eval-print-loop, a resident development environment, managed memory, saving/loading heap dumps, dynamic data structures,...

LISP means "List Processor" (not "List Processing"), which indicates that this is at its core, not a library or an add-on.

There aren't that many other languages which are based on these ideas.

We can define three levels of "feature adoption":

* enables a feature (the language provides mechanisms to provide a feature)

* supports a feature (the language actually includes those features, but use is optional)

* requires a feature (the language requires the use of that feature; it's not optional for the developer)

If we use a programming system which requires the four core ideas above (and many of the features that follow from them) and is based on them, it is still very (!) different from what most people use.

Example:

Smalltalk development started more than a decade after LISP. It adopted a huge number of features from Lisp: managed memory, a resident development environment, garbage collection, late binding, interactive use, runtime evaluation, dumping & loading heaps, etc., etc. For a decade Xerox PARC developed a Smalltalk system and a Lisp system side by side: Smalltalk 80 systems and Interlisp-D systems were the results.

Still: Smalltalk 80 did not use the first three core ideas from above and replaced them with another core idea: objects+classes communicating via message passing.

Thus mass feature adoption alone does not fundamentally make Smalltalk a LISP. Nor does feature adoption make a language popular.

So we can ask: will those core ideas of Lisp make it very popular? It seems they did not. Still, many of the useful features which followed from them could be replicated/adopted/morphed into other languages.

The original article was probably taking the view that a language and programming system which enables domain-specific embedded languages would be "better" than the diverse stack of tools and languages in the automotive domain. Lisp is such a tool for embedded languages (embedded into a hosting language, not in the sense of languages for embedded systems), but that does not make it the best for that domain, which has a lot more requirements than language embedding (reliability, robustness, etc.). In Germany, A-SPICE gives a lot of hints about what a development process needs to do to deliver reliable software for the automotive domain. I don't think Lisp would fit well into such a development process.


You can implement CLOS with macros, but not the other way round!
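In that spirit, a deliberately naive sketch of the forward direction (DEFGENERIC*/DEFMETHOD* are made-up names; this dispatches on exact class names only, with none of CLOS's inheritance or method combination). The reverse fails because macros transform code before it runs, which no runtime object system can replicate:

    ;; A toy dispatch table built from nothing but macros and a hash table.
    (defvar *methods* (make-hash-table :test #'equal))

    (defmacro defmethod* (name classes lambda-list &body body)
      `(setf (gethash (list ',name ',classes) *methods*)
             (lambda ,lambda-list ,@body)))

    (defmacro defgeneric* (name)
      `(defun ,name (&rest args)
         (let ((key (list ',name
                          (mapcar (lambda (a) (class-name (class-of a)))
                                  args))))
           (apply (or (gethash key *methods*)
                      (error "No applicable method: ~S" key))
                  args))))

    ;; (defgeneric* greet)
    ;; (defmethod* greet (symbol) (x) (format t "hello, ~a~%" x))
    ;; (greet 'world) prints "hello, world"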


Lisp is the original dynamic language; for decades more or less the only one. Dynamic languages have been incredibly successful.

The code-as-data, self-similar aspect of Lisp, which is at the core of Lisp's macro features, hasn't reached the same popularity.

I would say that Lisp has both won and lost.


As much as it sucked, BASIC had an important place as a dynamic language as early as 1964 at Dartmouth. GOTO wasn't all that different from a JMP in assembly.

BASIC was mature as a teaching language that you could run on a minicomputer by the early-1970s (see https://en.wikipedia.org/wiki/RSTS/E) and it came to dominate the market because there were implementations like Tiny BASIC and Microsoft BASIC that would run in machines with 4K of RAM.

There was endless handwringing at the time that we were exposing beginners to a language that would teach them terrible habits, but the alternatives weren't great: it was a struggle to fit a PASCAL (or FORTRAN or COBOL or ...) compiler into a 64k address space (often using virtual-machine techniques like UCSD Pascal's, which led to terrible performance). FORTH was a reasonable alternative but never caught on beyond enthusiasts.

There was a lot of hope among pedagogues that we'd switch to LOGO which was more LISP-like in many ways and you could buy LOGO interpreters for everything from the TI-99/4A and TRS-80 Color Computer to the Apple ][. There was also µLISP which was available on some architectures but again wasn't that popular. For serious coding, assembly language was popular.

In the larger computer space there were a lot of languages like APL and SNOBOL early on that were dynamic too.


I cut my teeth on BBC BASIC, with the occasional inline ARM2 assembly block, on an Acorn Archimedes A310.

It had its limitations, but it damn well worked and you could have total control of the machine if you needed it.

(also the Acorn Archimedes manual was about 40% "how to use the gui" and 60% a complete introduction and reference for BBC BASIC, which definitely helped; I had to buy a book to get an explanation of the ASM side of things but, I mean, fair enough)

Then again the second time I fell in love with a programming language was perl5 so I am perhaps an outlier here.


> C was a clear example of "worse is better" with some kind of topological defect in the design such that there's a circularity in the K&R book that makes it confusing if you read it all the way through.

Whaaaaat???

I read the K&R book (rev 1) all the way through. The only thing I found confusing (without a compiler to experiment with) was argc and argv. Other than that, I found it very clear.

What, specifically, are you referring to?


Oh how I wish I had kept my rev 1 K&R rather than donating it to my hometown library 20+ years ago.


> C was a clear example of "worse is better" with some kind of topological defect in the design such that there's a circularity in the K&R book that makes it confusing if you read it all the way through.

Aww you can't just leave us hanging like this. What's the paradox?


I really like that argument.

I've often thought Sun and Solaris also won, since so much of Linux is open-source reimaginings of what Solaris had in the mid-to-late 90s, essentially a few years' head start on Linux (which I used in the early 90s and still do, but alongside Solaris and NeXTstep back then).


20 years back I remember chatting to a sysadmin over beer and the conversation included "Linux mostly does either whatever Solaris does or whatever BSD does, you just have to check which it is before trying to write anything."

(this was not a complaint, we were both veterans of the "How Many UNICES?!?!?!" era)


> there's a circularity in the K&R book that makes it confusing if you read it all the way through

What do you mean?


IMO the issue is that undergrad CS programs have an identity crisis.

Are they the entry into the world of computer science? Should they teach set theory, computer organization, language theory, etc.?

OR

Are they trying to prepare the new wave of workers? Should they teach protocols like HTTP, industry tools like Java and Python, and good test practices?

A well rounded engineer should have a grasp on it all, but with only 4 years, what will attract more funding and more students?


I've been advocating for years that CS departments need to bifurcate. Actual CS classes, the theory-heavy kind, need to be reabsorbed into math departments. People who want to study computer science academically would be better served by getting a math degree. The "how to write software" courses need to be absorbed into engineering departments, and maybe the extra discipline gained from actual engineers teaching these courses can start turning software engineers from computer programmers with an inflated job title into actual engineers.

Students can then make a choice. Do I get a degree in computer science and be better prepared for academia, essentially being a mathematician who specializes in computation? Or do I get a degree in computer engineering and be better prepared to write reliable software in industry?

Of course this distinction exists today, and some universities do offer separate CS and CE degrees, but in practice it seems more often to be smashed together into a catch-all CS degree that may or may not be more theory or practice focused depending on the program.


First, "computer engineering" is already a name for an established discipline. And regarding academic computer scientists, a significant amount of them are on the "systems" side; it would be inaccurate to call their work a specialization of math.


At my university, "computer science" and "computer engineering" were under different departments, and the latter focused less on algorithms and more on embedded digital hardware.


At mine they were the same department and degree, but different course “tracks”


That's still the case for a lot of CS programs. At least mine is still part of the maths department. We also had a distinction between CS and Software engineering, and even between normal CS and a hybrid program between CS and Software engineering.


Just to add another angle to this: of course you can have CS classes and all that good stuff, but would businesses only employ those kinds of graduates? Or would they spread even thinner to grab market share or increase profits and hire non-experts as a result, for whom "easy" and intuitive tools have to be developed and employed? I mean, I see this problem with abstraction, maths, compilers, Lisp, etc., you know, the fundamental stuff. That is, the deeper you go, the harder it becomes to find people willing or able to dive deep. So eventually you run out of manpower, and then what? Use those "intuitive" tools, probably.


I mean, this is already a thing today. Programmers who can hack together a CRUD app in <insert popular web dev stack> are a dime a dozen. People who know more about compilers than "oh yeah I got a C in that class" are pretty hard to find.

At some point businesses who need "deep experts" have to hire non-experts and invest in training them. This is what I have seen in practice. You don't use the "intuitive" tools, you hire people who are willing to learn and teach them the other tools.


20 years ago I found CS students at my uni usually didn't know how to do version control, use an issue tracker, etc. Today they all use GitHub.

Remember, computer science professors and grad students get ahead in their careers by writing papers, not by writing programs. You do find some great programmers, but you also find a lot of awful code. There was the time that a well-known ML professor emailed me a C program which crashed before it got into main() because it allocated a 4GB array that it never used. He could get away with it because he had a 64-bit machine, but I was still on 32 bits.

Early in grad school for physics I had a job developing Java applets for education and got invited to a CS conference in Syracuse where I did a live demo. None of the computer scientists had a working demo and I was told I was very brave to have one.


When I was a student in Crete, the nearby CSD was a gorgeous example of doing both competently. They had rigorous theory courses (like mandatory DS, Algos, logic, compilers etc) as well as hardcore applied stuff.

Side knowledge like source control, Unix systems and utilities, and editors/IDEs was supposed to be picked up by students themselves with the help of lab TAs, because assignments were to be delivered over ssh through specific channels, etc. Sometimes a quirky teacher would not give precise instructions for certain projects, but tell the students to find the documentation in the departmental network directories for the class and decrypt it with their ID numbers. So the students would go crawl the Unix and Windows nodes for the relevant info.

A "good" (7.5+/10) graduate from the CSD of the University of Crete could do rigorous algorithmic analysis, hack system stuff in C and time in Verilog. OOP was hot at the time, so it also meant students were expected to produce significant amounts of code in Java and C++. Specializations abounded: networks and OSs, theory, arithmetic and analysis, hardware (including VLSI etc. at the undergraduate level, with labs). I won't even go into the graduate stuff.

And although this curriculum was quite demanding and onerous (the number of projects in each class was quite crazy), it was just something students dealt with. Heck, this was not even the most famous CS school in Greece: the elite hackers mostly went to the National Technical University of Athens.

I am not sure what the situation is now, but at least at the time, graduates were ready both for academic and serious industry careers. It is a 4 year curriculum, though many if not most students went on for 5 or 6 years. Of course, free.


In my traditional-engineering education, we spent a ton of time on the basic science and theory, with a super broad overview of actual methods used in practice.

The expectation was that you graduate and you're then equipped to go to a company and start learning to be an engineer that does some specific thing, and you're just barely useful enough to justify getting paid.


IMHO, it is the university vs. vocational-college difference. Industry should not require university-level training from workers. Academia should.

A plumber or electrician doesn't need a university degree, only professional training, until you're designing a whole sewer system or the power grid for a city.

So, yes, bifurcate CS into science and trade. And fight requirements for a Bachelor's degree in job offerings.


> Industry should not require university-level training from workers.

> A plumber or electrician doesn't need a university degree, only professional training [...]

Oh come on, this is ridiculous.

Sure you might be able to do good programming without a college degree.

The typical student? I give them very little chance.

Here's what I think you would see:

* Students trapped in small skill-set jobs, struggling to branch out.

* Poor ability to manage complexity. The larger the system, the worse an ad-hoc job is going to be.

* Lack of awareness of how much tools can help. "Make impossible states unrepresentable"? Write a DSL with a type system so another department stops making so many errors?

* Even more incompetence. Had to learn to recognize an O(N^2) algorithm on the job, but you didn't even know what asymptotic complexity was? People are going to end up believing ridiculous cargo cult things like "two nested loops = slow".

* Less rigorous thinking. Do you think they're even going to be able to pick up recursion? Have you watched a beginner try to understand something like mergesort or quicksort? Imagine that they never had a class where they programmed recursively. Is depth first search taught on the job? Sure, but... how deeply.

* Even less innovation. Who is going to be making the decisions about how to manage state? Would you ever see ideas like Effect?

I'm not saying that a university education is a cure-all, or that it is essential to the jobs that many of us do. I AM saying that if you look at the level of complexity of the work we do, it's obvious (to me) that there is something to study and something to gain from the study.


On one end, if you get good fundamentals, you'll understand the practical side too.

On the other end, most of engineering knowledge is from experience.

Switching undergrad courses to be about understanding THE DOM so we can drive browsers to render our printed-paper skeuomorphs isn't going to drive any kind of innovation. It's not going to put you in a good position to generalise to other work.

Furthermore, if you aspire to centering divs and tweaking webpack, you don't need a CS course for that.


Tbf, having more high-end engineers center divs did lead us to better div-centering methods.


I think using lisp vs other languages is pretty much completely orthogonal to whether the degree is theoretical or practical.


I strongly disagree.

Idk if Lisp is the perfect language to teach, but the choice of language absolutely flavors a student's understanding of computing, especially early on.

I think “lighter” languages with fewer built-in abstractions would allow students to better understand the theoretical backing of what they’re doing.

Like, what’s the difference between a class and a closure?

It doesn’t really matter for the practical day to day work, but certainly does to compiler designers and programming language theory folks.

You wouldn’t really get a sense for that in Java or Python, but you might in a Lisp, or maybe JS.
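To illustrate the class-vs-closure point in Lisp terms, a minimal sketch: the same counter built both ways.

    ;; A counter as a closure: state captured in a lexical variable.
    (defun make-counter ()
      (let ((n 0))
        (lambda () (incf n))))

    ;; The same counter as a CLOS class.
    (defclass counter ()
      ((n :initform 0 :accessor counter-n)))

    (defmethod bump ((c counter))
      (incf (counter-n c)))

    ;; (defparameter *a* (make-counter))           (funcall *a*) => 1, 2, ...
    ;; (defparameter *b* (make-instance 'counter)) (bump *b*)    => 1, 2, ...
    ;; Observationally the same; the difference is representation,
    ;; which is exactly what compiler and PL-theory people care about.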


> So how would we go about switching the entire software industry to use LISP more?

There are two answers to this, depending on how broadly you interpret the "Lisp" part of the question. The fundamentalist, and worse, answer is simply: you don't. The objections to Lisp are not mere surface-level syntactic quibbles; it's deeper than that. But also, the syntax is a barrier to adoption. Some people think in S-expressions quite naturally; I personally find them pleasant if a bit strange. But the majority simply do not. Let's not downplay the actual success of Common Lisp, Scheme, and particularly Clojure; all of them are actively used. But you've got your work cut out for you trying to get broader adoption for any of those: teams which want to be using them already are.

The better, more liberal answer is: use Julia! The language is consciously placed in the Lisp family in the broader sense; it resembles Dylan more than any other language. Critically, it has first-class macros, which one might argue are less elegant than macros in Common Lisp, but which are equally expressive.

It also has multiple dispatch, and is a JIT-compiled LLVM language, meaning that type-stable Julia code is as fast as a dynamic language can be, genuinely competitive with C++ for CPU-bound numeric computation.

Perhaps most important, it's simply a pleasure to work with. The language is typecast into a role in scientific computing, and the library ecosystem does reflect that, but the language itself is well suited to general server-side programming, and library support is a chicken-and-egg thing: if more people start choosing Julia over, say, Go, then it will naturally grow more libraries for those applications over time.


Elixir feels like that as well (in the same broader sense).

Their trick of 'use' being require+import (à la Perl) becomes brilliant when you realise import can be a macro, and that's how libraries inject things into your current namespace.

Consider also: https://github.com/elixir-lang/elixir/blob/main/lib/elixir/l...

Julia looks beautiful but I keep forgetting to play with it. I'd use the startup time as an excuse, but it really is just 'I keep forgetting' if I'm honest.


> The objections to Lisp are not mere surface-level syntactic quibbles; it's deeper than that.

I am interested in the specifics; if you have the time to write details I'd love that, otherwise I welcome some links!


My very rough, wet-finger take:

* Scheme is just too fragmented and lacking in arguments to seduce the current industry; its macros are unintuitive, continuations are hard to reason with (https://okmij.org/ftp/continuations/against-callcc.html), small stdlib, no concept of static typing, etc...

* CL is old, unfashionable, and full of scary warts (function names, eq/eql/equal/equalp, etc.; see the quick illustration after this list), and its typing story is also pretty janky even if better than the others' (no recursive deftype, meaning you can't statically type lists/trees; no parametric deftype, the only one you'll get is array). Few people value having such a solid ANSI standard when it doesn't include modern stuff like iterators/extensible sequences, regexps, or threads/atomics.

* Clojure is the most likely to succeed, but the word Java scares a lot of people (for both good and bad reasons) in my experience. As does the "FP means immutable data structs" ML cult. And its current state for static typing is pretty dire, from what I understand.

None of these work that well with VSCode and other popular IDEs (unless stuff like Alive and Calva has gotten way better and less dead than I remember, but even then, SLIME isn't the same as LSP). So, basically, that, plus static typing having a huge mindshare in the programming world.
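For anyone who hasn't hit the eq/eql/equal/equalp warts mentioned above, a quick REPL illustration (the bignum EQ case is implementation-dependent, and compiled code may coalesce string literals):

    (eq 'a 'a)                    ; => T   (object identity)
    (eq 3000000000 3000000000)    ; unspecified for numbers!
    (eql 3000000000 3000000000)   ; => T   (identity, plus same-type numbers/chars)
    (eql "abc" "abc")             ; => NIL (two distinct string objects)
    (equal "abc" "abc")           ; => T   (structural for lists and strings)
    (equal #(1 2) #(1 2))         ; => NIL (general vectors: back to identity)
    (equalp "ABC" "abc")          ; => T   (case-insensitive, descends arrays)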


I agree with your comment for the most part (particularly the IDE situation)

I do want to call out that while Racket started off from Scheme (and still maintains compatibility with a bunch of it), it should be considered a different platform at this point, and it solves a bunch of the problems you called out with Scheme: macros are apparently much better, continuations are usually delimited, it has user-mode threads and generators (so you very rarely need to reach for raw continuations), a very nice concurrency system, a large stdlib, and Typed Racket also adds static typing.

The DrRacket/SLIME experience is great for smaller projects. I do agree that the language server needs more love. However, I still think it gets a lot of stuff right, and it is much faster than Python/Ruby due to being powered by Chez Scheme.


> continuations are hard to reason with (https://okmij.org/ftp/continuations/against-callcc.html)

The link is an argument against call/cc, not continuations themselves.

That entire website is an exploration into the shift/reset paradigm of delimited continuations, and explores in detail why continuations are incredibly useful and powerful idioms to build all sorts of control structures.


Thank you so much for the info! I am mostly interested in CL.

> Few people value having such a solid ANSI standard when it doesn't include modern stuff like iterators/extensible sequences, regexps, or threads/atomics.

For everyone: are there plans to include such topics in the Standard, or are there canonical extensions that people are using?


There is little chance the Standard is gonna get updated again, especially since the CDR (Common Lisp Document Repository) process kinda stopped. That said, there are a few libraries that are kinda considered to be the default go-tos for various needs:

CL-PPCRE for regular expressions.

Bordeaux-threads for threads.

Alexandria for various miscellaneous stuff.

Trivia for pattern matching.

CFFI for, well, FFI.

And Series for functional iterative sequence-like data structures.

There's also ASDF and UIOP, but I'm not sure whether or not they're part of the Standard.
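For flavor, here's roughly what two of those look like in use (a sketch; assumes the libraries have already been loaded, e.g. via Quicklisp):

    ;; CL-PPCRE: Perl-compatible regexes.
    (cl-ppcre:regex-replace-all "\\d+" "order 42 and 7" "N")
    ;; => "order N and N"

    ;; Bordeaux-threads: portable threads.
    (bt:join-thread (bt:make-thread (lambda () (+ 1 2))))
    ;; => 3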


Too bad cl-ppcre is extremely slow in my experience, but that's what we have. ASDF and UIOP are the same as the rest, de facto standards, but not ANSI nor CLtL.

I would also add iterate to the "must haves". All noobs should constantly look at the CLHS (available for Dash, Zeal, and Emacs too!), https://lispcookbook.github.io/cl-cookbook/ and https://github.com/CodyReichert/awesome-cl, anyway.

Truly, someone could tie himself to SBCL and its extensions and get a much more modern environment, but I think it's worth targeting ECL and CCL in addition.


There's a much faster regex library: https://github.com/telekons/one-more-re-nightmare


It's not yet operational, sadly. Missing ^ and $ isn't a small thing.


ASDF is one of the symptoms of what is wrong with CL. You have this thing that is not really sure whether it is image-based or source-based and has a bunch of global state while reading the source. The solution to that is ASDF, which is a ridiculously hairy and complex thing.


Calva is way better than Alive, currently. It pretty much does everything you need.


Lisp is too powerful and flexible. That's an advantage and a disadvantage.

On the one hand, you can use that power to write elegant programs with great abstractions. On the other hand, it's really, really easy to create unmaintainable messes full of ad-hoc macros that no one, not even you next month, will be able to understand.

Even if the program is well-written, it can be a really steep learning curve for junior hires unless it's very well documented and there's support from the original writers available.

Compare it to other languages that limit your ability to express yourself while writing code.

The truth is that most programs are not really interesting, and having a code base that can be hacked on easily by any random programmer you can hire is a really valuable proposition from the business point of view; using less powerful languages is a good way to prevent non-experts from making a mess.


There was a great episode of Type Theory for All with Conal Elliott as a guest where he decried MIT's decision to switch to teaching Python instead of Scheme for introductory programming. He makes a very good case that CS education should aim to pursue the greatest truths, not just equip people with whatever employers want.


A podcast about type theory and other programming-language-design topics? Oh well, auto-subscribe :)


Not calling it LISP would help. And though you get used to it and even kind of appreciate it sometimes, the default behavior of most Common Lisp implementations to SHOUT its symbols at you is rather unfortunate.

Go back to 2005 and look at how 'easy' Python, Perl, PHP, Ruby, JavaScript, C++, and Java were. Look at them today. Why has their popularity hierarchy not remained the same? How about vs. Lisp then and now (or even Clojure later)? You'll find answers that explain Lisp's current popularity (which isn't all that bad) better than anything to do with initial or eventual easiness.


> The only consistent explanation I've seen is that it's about 'easy'. The other languages have tools to make them easy, easy IDEs; the languages 'solve' one 'thing', and using them for that 'one thing' is easier than building your own in LISP.

I have thought something similar, perhaps with a bit more emphasis on network effects and the initial luck of popularity, but the same idea. Then about a week ago, I was one of the lucky 10,000[0] who learned about The Lisp Curse[1]. It's as old as the hills, but I hadn't come across anything like it before. I think it better explains why "the best languages don't get adopted" than the above.

The TL;DR is: with unconstrained coding power comes dispersion. This dilutes effort across equally optimal candidate solutions instead of consolidating on a single (arbitrary) optimal one.

[0] https://xkcd.com/1053/

[1] http://winestockwebdesign.com/Essays/Lisp_Curse.html


Guess today I'm one of the lucky 10,000 to read this essay.


I'd say you were unlucky, because it's a rather terrible essay and doesn't actually get the diagnosis correct at all; indeed if some of its claims were true, you'd think they would apply just as well to other more popular languages, or rule out existing Lisp systems comprised of millions of lines of code with large teams. The author never even participated in Lisp, and is ignorant of most of Lisp history. I wish it'd stop being circulated.


Can Lisp work without a garbage collector? I think it is categorically excluded from some use cases if it can't.



