I've read this story a couple times as it's popped up on the internet in different places. I've enjoyed it each time because it promises that lisp is an uber-powerful and easy (?) language.
I'm always curious though: what's the other side of the story? Why is lisp always rejected?
I started programming seriously (for money) in 2009, and when we needed a language we went with python because it had better math/science libraries than Java and many more libraries in general than clojure/lisp/ocaml. I confess I like the functional languages and like using some of their features in python.
Was the no-libraries dealbreaker for the "pure" functional languages also true in 1999, leading Google to Java? Or maybe there are complex meta-programming bugs you can generate in the functional languages (and elsewhere) that are the dealbreaker? Or what?
Languages like Lisp, APL, and Haskell reward extraordinary skill with extraordinary power. If you're a researcher or an early-stage startup founder, that's great. You want your little band of geniuses to have the maximum possible leverage.
But if you're an engineering leader at a large company, the last thing you want is an asset in your portfolio that only extraordinarily skilled people can work on. What if they quit? What if your company's stature declines and you can't attract such people anymore? What if a more pressing problem comes along and you want to deploy them elsewhere?
So we get languages with short learning curves and low ceilings (Go!) instead.
Except in performance critical contexts, where it can’t be helped, but C++ already won that one.
This seems correct to me. The last two things I've learned were javascript/node/react and Haskell. I had never touched js or html before, but within two days I had a pretty decent web app up and running. It was something that could (and did) provide value for a business.
Two days after starting Haskell I had essentially nothing to show for it except a feeling that I had a huge amount to learn before I could be productive. Even a year later I still have a huge amount to learn.
No, that's not it. The world is not run by functional programming languages, despite their "leverage". The word you are looking for is "quirky".
Of course, the same is true for the programmers. They are not extraordinarily smart, they make quirky technical decisions. Interest in functional programming solves technical problems as much as your choice of editor.
I don't think that extraordinary skill and power are linked at the language level. Within a rounding error, an "extraordinarily skilled" developer can probably be just as much of an asset to a company with an "extraordinarily powerful" language as they could with an "ordinarily powerful" (read: mainstream) one. Now, super-skilled folks might like playing with Lisp or Haskell or what-have-you (and if it affects your ability to retain them and get value from them, you might want to let them), but their skilled-ness is usually transferable to other languages if the need arises.
It isn't always. I have a consulting gig for a chip design company that uses Common Lisp internally. It's one of their principal competitive advantages.
But mainly it's a vicious cycle: no one uses it because no one uses it. Employers don't use it because they think (mostly correctly) that they can't find CL coders, and no one learns CL because no one hires CL coders.
Always great to hear there are companies out there thriving with these more powerful languages. The tired example I always turn to is Ocaml being the language for all the code at janestreet (~900 employees).
I've always been annoyed by the "there are no X coders" argument. Coding skills are broadly transferable. My team's code base is all python, and we've had successful team members who'd previously worked only in javascript, Java, C#, etc. When we tell recruiters that any of these languages will do, they hate that. Oof.
From my point of view, Lisp gives programmers too much power.
No company wants to give a programmer that much concentrated power; they want it distributed around the workforce, and people disposable, replaceable whenever needed.
With lisp you can modify your development platform to suit your needs. In practice, over time, you create your own programming language.
You create your own DSL (domain-specific language) with your own systems. This is not standard. You become dependent on the programmer.
Companies prefer to buy "solutions" (packages, environments) from companies and depend on those solutions. They are standard and you can hire people trained on them, replace them if needed, and have new people continue the work that someone else left.
That is because "solutions" have open documentation, designed by people that are able to draw beautiful diagrams, lots of books, videos on the Internet, but lisp gurus have all the info in their heads.
Bugs are way easier (orders of magnitude) to find with meta-programming. In fact you can create a subsystem that finds all bugs so you don't have to.
The parent comment got downvoted, but I have definitely experienced this "too much power" issue at work in other languages. E.g. one programmer on the team chooses to use powerful abstractions, but then other folks have trouble modifying the code. Usually we give the respect that is due and ask for a simpler rewrite. When the abstractions or tools are powerful and also simpler, and we can all pick them up and use them, that's the happiest outcome.
In homoiconic languages like Lisp or Mathematica the program is just "data" and can be transformed arbitrarily, even at runtime.
This allows you to write utility functions that transform programs in all sorts of useful ways to find bugs.
A simple example is to just transform the program so that it traces whatever you want to a log, much like Visual Studio Intellitrace, but completely flexible.
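A minimal sketch of that idea in Common Lisp (TRACED and WRAP-TRACE are names made up for illustration, and it only handles plain function-call forms; special forms and macros would need a real code walker):

    ;; Walk a form and wrap every function call in logging.
    (eval-when (:compile-toplevel :load-toplevel :execute)
      (defun wrap-trace (form)
        (if (and (consp form) (symbolp (first form)))
            (let ((result (gensym "RESULT")))
              `(let ((,result (,(first form)
                               ,@(mapcar #'wrap-trace (rest form)))))
                 (format t "~&~S => ~S~%" ',form ,result)
                 ,result))
            form)))

    (defmacro traced (form)
      (wrap-trace form))

    ;; (traced (+ (* 2 3) 4)) prints
    ;;   (* 2 3) => 6
    ;;   (+ (* 2 3) 4) => 10
    ;; and returns 10.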
You can do things like replace all floating point numbers with a "dual number" (a 2-number tuple) where the first number is always rounded down when operated on, and the second number is always rounded up. This then lets you robustly determine if your numerical accuracy is guaranteed to be within a certain range (due to floating point error, at any rate).
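A toy version of that trick (a real implementation flips the FPU rounding mode, which standard CL doesn't expose portably; this sketch just widens every result outward by an epsilon, cruder but conservative, and all the names are made up):

    ;; A pair tracking a guaranteed lower and upper bound.
    (defstruct bounds lo hi)

    (defconstant +eps+ 1d-12)

    (defun b+ (a b)
      ;; Widen outward so the true value stays inside the interval.
      (make-bounds :lo (- (+ (bounds-lo a) (bounds-lo b)) +eps+)
                   :hi (+ (+ (bounds-hi a) (bounds-hi b)) +eps+)))

    (defun b* (a b)
      ;; The extreme products of the endpoints bound the result.
      (let ((products (list (* (bounds-lo a) (bounds-lo b))
                            (* (bounds-lo a) (bounds-hi b))
                            (* (bounds-hi a) (bounds-lo b))
                            (* (bounds-hi a) (bounds-hi b)))))
        (make-bounds :lo (- (reduce #'min products) +eps+)
                     :hi (+ (reduce #'max products) +eps+))))

    ;; If (- (bounds-hi x) (bounds-lo x)) stays small after a long
    ;; computation, the result is numerically trustworthy.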
You can symbolically evaluate programs involving random variables using their distributions instead, symbolically working out the products of functions instead of scalar values. This allows things like video poker or lottery programs to be robustly evaluated for average payout even if some of the plays are rare, such as jackpots. A friend of mine did this using Mathematica back in the early 2000s.
Whichever word you put before "gurus" in that sentence, I don't want to work somewhere which values that dynamic. Programming is very occasionally a solitary and purely technical task. The rest of the time it's a collaborative and human task. Eschewing documentation and presentation in favor of "fits in one genius's head" is not a good thing if the task at hand is collaborative/involves multiple humans.
Not to be pedantic, but since Clojure is a hosted JVM language (as one of a few platforms it's hosted on) and has full interoperability with Java, by definition it has more libraries than Java: it has all the Java libraries plus all the Clojure libraries. So it's definitely not library support in recent years, though maybe back in the day. Performance was an issue early on as well, but that's certainly not the case now. Clojure is extremely performant and easily outperforms pretty much all the popular interpreted languages.
Polyglot programming with Graal will eventually be awesome, but it isn't yet. If it was, I'd use Graal to use Python libraries from Clojure, but I'm not, because it still has some ways to go.
So for now, I use Pandas, Numpy, matplotlib, and all that from Clojure using libpython-clj. That's the biggest issue with graal-python currently: how to leverage all the C-based libraries. That's where libpython-clj excels over Graal.
Yep, and even then Clojure has Erlang and .net runtimes with interop built syntactically into the language; GraalVM has to reimplement those languages in its tooling, so they'll always be playing catch-up with the real implementations (unless they merge).
That said, we do like GraalVM; it has a cool ability to create native binaries out of Clojure apps, which is currently powering a resurgence in Clojure for scripting. The last wave had to use Clojure's JavaScript target (cljs).
>Not to be pedantic, but since Clojure is a hosted JVM language (as one of a few platforms it's hosted on) and has full interoperability with Java,
Armed Bear Common Lisp is an ANSI Common Lisp, thus can use all CL libs; it runs on the JVM, and it can easily call Java libraries. In fact, it's as easy as calling regular Lisp functions.
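For a flavor of what that looks like, a sketch from memory using ABCL's JAVA-package interop functions (check the ABCL docs for the exact details):

    ;; Create a java.util.ArrayList and call methods on it.
    (let ((list (jnew "java.util.ArrayList")))
      (jcall "add" list "hello")   ; => T
      (jcall "size" list))         ; => 1

    ;; Static methods work much the same way.
    (jstatic "currentTimeMillis" "java.lang.System")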
Interesting. We didn't look at this closely since we liked the numpy/scipy/sagemath tools, but we should have realized this put clojure ahead of java. Especially given Java's 2009 features and syntax.
Due to the fact that Clojure has its own classloader hierarchy, quite a few libraries that require AOP-like mechanisms (or even OSGi components) cannot be as easily used from Clojure.
I was at JPL on this Lisp project, and left for Silicon Valley. First place I interviewed was Peter Norvig's startup (that's how I think of it, though it had actual cofounders and he was an employee). He told me if they'd known they could hire Lispers, they wouldn't be using Java.
Just one datapoint on how things were trending then.
Sounds like the lisp programmer skillset doesn't have a strong overlap with the java/python/javascript programmer skillset. Makes me want to learn more lisp. :)
There is some overlap with javascript because it's the only one of the three that offers uncrippled lambdas. Or I may be wrong about that; did Java finally add them?
Even before Java 8, Java had inner classes that if you squinted at them right looked like very syntactically awkward lambdas (indeed, Java 8 lambdas compile down to something that looks an awful lot like an inner class).
One of the key selling points of lisp is that it allows you to easily build a language that more closely maps to the problem you're trying to solve. In the 1970s, this was powerful because parsing was an active area of research.
Today, parsing is one of the least interesting problems in language design (unless you're C++ or based on natural language). Most of the effort of spinning up a new language is in the backend (optimization and error checking), where lisp doesn't offer a compelling advantage (well, writing folding rules in C is a total pain, but this is a recursive problem... I can just write a code generator).
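For what it's worth, the "frontend is nearly free" half of that claim is easy to demonstrate: here's a sketch of a tiny state-machine DSL in a dozen lines of Lisp, with no parser at all (DEFSTATES is a made-up name):

    ;; Each transition is (from-state event to-state); an unknown
    ;; event leaves the state unchanged.
    (defmacro defstates (name &body transitions)
      `(defun ,name (state event)
         (or (third (find-if (lambda (tr)
                               (and (eq (first tr) state)
                                    (eq (second tr) event)))
                             ',transitions))
             state)))

    (defstates door
      (closed open  opened)
      (opened close closed))

    ;; (door 'closed 'open) => OPENED
    ;; (door 'closed 'close) => CLOSED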
> Most of the effort of spinning up a new language is in the backend (optimization and error checking), where lisp doesn't offer a compelling advantage
So maybe other languages have finally beat lisp at its own game? Makes sense from the wave of functional language features becoming popular in the 2010s.
It's not necessarily that they beat lisp at its own game. I still think the tight integration story of lisp is unmatched nearly anywhere else, but, when viewed holistically, lisp's advantages have become less compelling over time for people who have already written software outside of the ecosystem. It suffers from a lot of the same problems as smalltalk.
I think the key metric for programming language popularity is something like “when someone makes a thing work, how easy is it for someone else a) to find out about it b) understand what’s happening, and c) shoehorn the solution into their own janky application”.
I think I read this on HN recently: complaints that Clojure is a language that you develop on, but not a language that you develop. Rich Hickey develops it and lets everyone use it. May be good or bad, but that seems to have turned away some good developers. So anyway I don't know if that has any impact on its long-term future.
What led Google to Java is a good question - I'd guess they wanted to standardize on something likely to have longevity, because they knew they couldn't predict the success of good languages - the same old reason more or less all the other corporations usually have.
Good that the JVM eventually started to get nice languages like Clojure, Scala, etc. - despite being the platform of choice of corporate Vorlons.
In the 70s, computers were slow. When the decade started, the competing languages of the time were Fortran, Algol and Cobol. All were imperative languages with manual memory management that would compile directly to hardware and had semantics very close to that of the hardware. On the other hand, at that time, Lisp was this high level thing, garbage collected, dynamic types, semantics more removed from the hardware, etc. Meaning Lisp was much less performant.
Then in 1972, came C. A language inspired by Algol and Fortran, using very similar imperative semantics and constructs, but changing the syntax slightly to what is now the common syntax most programming languages use.
The whole point of C was to be crazy fast, and for compilers to be easy to write so it could be portable as well. C's biggest move was that Unix was re-written in it. Before that, operating system kernels - Unix included - were implemented in assembly language! Think about that: performance was critical in the 70s for a kernel.
Once Unix was re-written in C, and C proved to be performant enough to run a kernel, its momentum started growing. Unix won the OS war, and slowly everyone was using computers running Unix, meaning that the OS had a C API. When the most popular OS has bindings in a specific language, more people choose that language to write programs for that OS. Thus C became ever so popular. Lisp was there, but not leveraging Unix as much, and still much too slow.
And this was the beginning of the momentum for C; it only accelerated from then on. Linux came around, also in C; Windows, also in C; all major OSes were in C. So C-like syntax, semantics and programming constructs became the norm, and were what got taught in school and uni, etc.
Fast forward to today, and Lisp is finally "fast enough" to be used. We finally reached a point where you can have useful programs in slower languages, as Python, Java, JavaScript and C# all demonstrate. But how can you beat 50 years of momentum in C-like languages? That's what most everyone knows. That's why imperative, mutable, procedural, statement-based programming with C-like semantics and syntax is still the norm, and all other languages struggle to get a foothold.
In the 70s raw speed wasn't really the big problem, but performance was. Sounds similar, but it wasn't the same. The performance of Lisp on a minicomputer seriously degraded with the number of users on the machine and the lack of available RAM (one megabyte?). A single Lisp on the machine could be relatively fast, but Lisp had high memory demands and a simple GC could walk through all allocated (virtual) memory of a Lisp process. Thus the machine got slow when 50 users suddenly worked next to one or two Lisp users. For Lisp users this was especially a problem, since they often meant to use interactive Lisp systems, not batch computation.
The idea of Lisp computers was not so much about faster CPUs, but a performant machine for a single user with high memory demands.
In the 90s, memory sizes started to get bigger, but Lisp was then hit by the AI winter. Many projects at that time either moved to C++ for applications (because that was the new efficient industrial language) or tried to develop new languages which were simpler, cheaper and more deployment friendly (Dylan, Java, C#, scripting languages in a C runtime, reimplementations of special languages in C/C++, ...).
Cost of development systems and cost of deployment - we are talking about the cost of both software and hardware here - were a problem for a long time.
There were always also people who tried to provide solutions, Scott Fahlman for example directed the CMUCL project and his goal was to have an optimizing CL system for free - providing CMUCL as no-cost public domain software was a major goal for him.
C succeeded because of Unix, not the other way around. Unix was given away and ran on small, inexpensive minicomputers (originally the PDP-11). Lisp typically ran on more expensive equipment.
High level programming languages have been used to write operating systems, including the kernel: Lisp (on Lisp machines), Algol 60 (MCP on Burroughs mainframes) and PL/1 (MULTICS on General Electric and Honeywell machines). MCP and MULTICS (from which Unix gets its name) preceded Unix.
One difference between C and Lisp is that C was designed to run on the PDP-11, and is a close match for its instruction set. Lisp didn't have a native hardware architecture (the closest it had was the PDP-10) until Lisp machines were built, and there were economic reasons - largely economies of scale - why they're no longer built.
> High level programming languages have been used to write operating systems, including the kernel: Lisp (on Lisp machines), Algol 60 (MCP on Burroughs mainframes) and PL/1 (MULTICS on General Electric and Honeywell machines). MCP and MULTICS (from which Unix gets its name) preceded Unix.
IBM also has a long history of using (non-standard and semi-secret) PL/I dialects for operating system development. Much of MVS was written in a PL/I variant optimised for systems programming PL/S (which later evolved into PL/X) – older versions were mostly assembly, with successive versions more and more written in PL/S – and that remains so with contemporary z/OS, although brand new development has largely switched to other languages, especially C.
Similarly, most of the software for the AS/400 was originally written in a pair of PL/I dialects, PL/MP (for code beneath the virtual machine layer) and PL/MI (for code above it.) (Starting with the CISC to RISC transition, PL/MP and PL/MI were progressively replaced by C++.)
Apparently, AIX 1.x and 2.x were written in a mixture of C and PL/I, but IBM got rid of all the PL/I code in AIX 3.x onwards which are pure C.
> C succeeded because of Unix, not the other way around. Unix was given away and ran on small, inexpensive minicomputers (originally the PDP-11). Lisp typically ran on more expensive equipment.
Yes I agree with that, I thought I had made that point clear.
C won over the other C-likes due to Unix. But Lisp-likes lost because they weren't fast enough to run on commodity hardware. That meant that the hardware that became popular did not run Lisp, and didn't have an OS written in Lisp. Instead it ran C, and this created a massive momentum that still dictates to this day which languages are popular and which aren't, in my opinion. At least it's a strong contributor, I'd say.
How then to explain Python's popularity? Modern Common Lisp is dramatically faster than pure Python. CL is compiled down to the metal; Python is usually interpreted.
C functions called from Python (like Numpy) are faster than Common Lisp but pure Python is much slower.
I'm not too sure, but Python has always had great integration with C. It also follows the familiar statement-based, imperative procedural model pretty closely. And it has been bundled on unix-like systems for the last 25 years. In that sense, it's totally riding the same wave, leveraging the momentum behind those languages.
That's the crux of my argument. Not that Lisp is too slow today, it's quite fast considering today's programming language landscape. But that it was when it mattered, in the 70s and 80s.
Had Lisp been able to overcome that, we might be in a world where the most popular OS is implemented in Lisp with a Lisp API. Where the commonly taught paradigms in school would be expression-based lambda calculus, functional programming, dynamic runtime object systems, meta-programming and the almighty parenthesis syntax ;)
The 80s is already 10 years later than my timeline, C had already gained massive momentum by then.
> Lisp was used to program massively parallel supercomputers
I'm curious to learn more about this, can you be more specific or provide links?
But anyhow, a massively parallel computer is a very different beast. Most computers used by people and enterprise were not massively parallel, and Lisps had poor performance on them and their architectures compared to C.
> Lisp stopped being slow a long, long time ago.
How long ago is that? I'd bet it was too little too late already. In my opinion, it would be somewhere along when Java came around, in the mid 90s. And even then, I'd say it was still too slow for most user application, but started being good enough for server work. At that point though, the momentum of C was already unstoppable, and Java became dominant, not CL.
It's the same story. Lisp ran on expensive early massively-parallel hardware, like the Connection Machine. But, again: exotic, expensive, few commercial users at the time, etc. Also killed by the AI winter.
From http://www.softwarepreservation.org/projects/LISP/maclisp_fa... , "This reports the results of a test in which a compiled MacLisp floating-point program was faster than equivalent Fortran code. The numerical portion of the code was identical and McLisp used a faster subroutine-call protocol."
I think this reinforces my thesis here. You can tell from the article that at the time, Lisp needed to prove itself in the mind of people as "fast enough". That seems to be the intent of the article. Given a good compiler, it may be just as fast.
I'm not trying to bash Lisp; I love lisp, I use it all the time. And when I say "fast enough", I know lisp is manageably fast. Maybe custom hardware could have made it equally viable, maybe it only needed a more sophisticated compiler more widely available at an affordable price, but in practice, it had more hurdles to overcome than other languages to meet the bar of the time.
Imagine an alternate reality where Lisp was clearly fastest. Where its semantics mapped very closely to commodity hardware of the time, and where Fortran, Algol, C, etc. semantics were all further removed from those hardware instructions, making them require more sophisticated compilers or very expensive specialty hardware to perform as well as Lisp. In such a reality, I can't see why C would have become the language of choice and not Lisp.
> From http://www.softwarepreservation.org/projects/LISP/maclisp_fa.... , "This reports the results of a test in which a compiled MacLisp floating-point program was faster than equivalent Fortran code. The numerical portion of the code was identical and McLisp used a faster subroutine-call protocol."
To add some more detail, the public description of MACLISP floating-point code being faster than Fortran was published about 7 months before the public unveiling of UNIX, when C was still taking more solid shape within it.
Interesting. Choosing languages to get work done in the 2010s has insulated my team from a lot of these historical forces; interpreted and garbage collected has always been more than fast enough. I can see how earlier businesses would choose a different route.
I actually worked at Sun as a Java architect back in this era, and consulted with the JPL as they were adopting Java.
In the end, the JPL isn't a software shop. The glory is in the hardware end of things. As a consequence, you want something that has a broad pool of talent, tooling, vendors, etc. to draw upon for support. It also was a more natural transition from their extant Fortran & C code.
LISP's market share had already declined significantly at that point, and its primary remaining user base didn't have very similar needs to the JPL. When you're an outlier for a language with small market share, that means the tools, skills, vendors, etc. aren't going to be as well suited to what you're trying to achieve.
tl;dr: when you're in a domain where you aren't trying to be quite so innovative, it helps to run with the pack.
I really feel like they should teach LISP in elementary school. It has extremely simple rules to get started and helps people intuitively understand programming concepts. Maybe an updated The Little Schemer style curriculum:
You don't understand why kids would want to learn programming. At least those who do, want to learn because they can build cool things they can show off to friends.
It's not about whether a programming language is simple or not. (I actually think LISP is not as simple to understand as imperative languages, despite what most nerds think.) It's about what you can build with it.
For example, my first experience with programming was to build a game. If someone had tried to force-teach me LISP, it would have actually had the opposite effect on me, because then it would be no different from someone trying to force-teach me some boring subject I have no interest in. The best way to teach something is to provide gratification. LISP is only gratifying to nerds who marvel at its "elegance", but it has no ecosystem with tons of cool projects like a fancy graphics engine or a fancy game engine. It's just a "cool" language to nerds.
DON'T teach your kids LISP if you want them to love programming. Teach them JavaScript or Python, or anything they can build something tangible with which they can show off to their friends.
I'd argue that racket's ecosystem is ideal for getting young people interested. You can build gui tools and do graphics programming out of the box, very easily.
My go to language to refer people who want to learn is Processing though. Easy to use, and the documentation is fantastic (and Dan Shiffman is IMO the best programming educator out there for kids and adults alike, whether or not you like his high-energy style).
I absolutely agree. I watched many of his videos as a professional programmer just because it’s fun to watch him build cool visualizations. He’s super charismatic and looks like he’s having genuine fun in figuring things out.
This really is true. I learned Logo in middle school, didn't want to be a programmer, took a Java class in college because it was required for actuarial science, then read some PG essays.
Lisp should be forbidden to young programmers, precisely so they'll go seek it out.
This provides a 2D and 3D NetLogo installation. The 3D version blows my mind. You can spend days going through the samples. When you look at the code after you watch a simulation, it's a bit of a shock how compact and readable it is. It's definitely a whole lot more than what they taught you and me alike in middle school.
> Lisp should be forbidden to young programmers, precisely so they'll go seek it out.
Unless it comes with a turtle, of course. Then they might end up writing the next greatest flocking algorithm.
JS is popular because it runs in browsers and therefore has a thriving ecosystem with lots of educational resources, lots of opportunity to show your work to other people, and lots of jobs, not because it's taught in schools.
Examples of educational languages would be Alice and Logo, and they're both pretty bad at actually doing anything.
I learned Logo in elementary school. I thought it was required for everyone just because learning about line segments and angles are a part of the maths curriculum and Logo just fits so well.
A piece of paper, a pencil, a ruler, and a compass - these are the best tools to learn about geometry. (It’s like having a cat is the best way to learn about cats - vs., say, watching YouTube videos.)
I do wonder now how much the fact that I had to circumvent the system security to get to the starting line made me enjoy spending my lunch break writing QBASIC at high school.
This is because people who are really great at something have already forgotten what they used to be like when they first started out. They just tend to think everybody would be excited about the cool thing they themselves are excited about.
In most cases, people who are masters at something tend to be the worst teachers because by becoming the master they have forgotten how to sympathize with complete newbies.
Or they remember all too well. Another way of looking at it is that people who have gotten good at something can look back and see the things that hindered their progress. Like many here, my first exposure to programming languages was BASIC and 6502 assembly language. In retrospect, I can see that they were horrible languages to start with because they taught me some very bad habits that I had to unlearn.
It is possible to teach someone Lisp, or something like it, without teaching them all of it at once... if ever. There are lots of people who use CAD systems like AutoCAD, or Emacs, or the various Lisp based/inspired Expert Systems shells, etc. who wouldn't consider themselves 'Lisp programmers' who learned just what they needed to be productive. If people were exposed to it a bit at a time, rather than the fire hose with minimal direction, I suspect a lot of complaints could be addressed. For example, the reason people think all the parens are bad is because it's become a meme and it's different than what they're used to. If they had started with Lisp and never learned ALGOL-inspired syntax, semicolons would likely look very strange to them.
I credit my earlier exposure to BASIC as a very good introduction to programming, because of its bare-bones nature. Too much abstraction would have been overwhelming, to start with - I didn't even understand the purpose of GOSUB for a long while (I knew, mechanically, what it did, but I did not grasp why one would use such a construct instead of just putting all code inline). The simplicity made it an easy outlet for creativity, and pushing the language to its limits and finding the complexity more and more difficult to manage was key to my eventual enlightenment - advanced language constructs felt like gifts, rather than burdens.
Of course this process could be done in an advanced language that grows with the user, but most modern languages are not that - even Python expects the user to understand at least the rudiments of structured flow and objects almost immediately. Lisp (Common or Scheme) isn't it, either - lambda calculus is already far too abstracted from the "series of instructions" model which is a critical foundation for computing in the real world. C, my next step after BASIC, has quite a good balance of available abstraction and a tractable mental model; but the amount of boilerplate (the requirement for main() always seemed baroque to me) and hand holding of the computer (total lack of even trivial type inference) is not ideal for a beginner language.
A kid of John Carmack is likely to have a cognitive capacity that is a standard deviation or two to the right of even the "smart" kids in class. Genetics isn't everything but it's not nothing.
I don't know how this ended up in the USMLE but AFAIK, it has long been debunked... unless by "inherited" you mean something other than "genetically transmitted".
See "Not in our Genes" by Lewontin, Rose, Kamin, for a thorough discussion.
In practice, it is a result based on twin studies, but with very low standards: for one thing not double-blind, definitions were allowed to swing arbitrarily by the interviewers, and so on.
Definitions of intelligence are themselves questionable; techniques to measure it, even more so. The history of IQ has been tainted by plain fraud... And the whole field is too succulent a target for ideological warfare... truth is too easily a collateral casualty. This should not be considered science.
If he were given a chance to go back in time and teach his younger self whatever language he wanted, I'm pretty sure he would NOT have picked LISP, and would have picked exactly the same language he used to build his first game.
I would love it if he found it endlessly fascinating and spent all day programming on his own, but he does need a bit of a push from mom and dad. He enjoys it, but given the choice, he would still rather play games than make them :-)
You didn't pick very good evidence to support the idea that picking a LISP is a great way to get kids to love programming. This only supports the idea that Carmack thinks that learning a LISP is good for his son's development.
There is a LISP textbook now that teaches LISP by writing videogames. It's called Land Of Lisp. And there are LISP game dev communities operating on twitter or IRC, organizing game jams:
Lisp (especially Common Lisp) works just fine as an imperative language, even if it’s not the most common style. And there are entire books written about using Lisp for game development.
If the goal is for a beginner (whether child or adult) to be able to make cool things fast, Lisp is a great choice. The interactivity and the simple syntax makes it very easy to get up and running compared to more mainstream languages.
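For instance, here's a sketch of perfectly ordinary imperative Common Lisp - a loop, an assignment, nothing exotic (COUNT-EVENS is just an illustrative name):

    (defun count-evens (numbers)
      (let ((count 0))
        (dolist (n numbers count)   ; return COUNT when the loop ends
          (when (evenp n)
            (incf count)))))

    ;; (count-evens '(1 2 3 4)) => 2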
When was the last time you saw a kid who made a cool game or software and made money using LISP?
The best motivator is recognition. Kids don't care if a language has books and whether there "exists" a game development kit. They only care about how much satisfaction they will get by learning something and building something with it.
If you want your kid to learn programming, the best approach is to teach them Swift or Java, or Javascript, so they can build a mobile app or web app and show off to their friends. And they will get the joy of building something that is actually used by others. That's how you get people to be interested.
LISP won't get you anywhere in that sense. Sure you may learn some programming concepts, but the ROI is not worth it. Kids might as well spend a bit more effort to learn languages with more exposure and actually feel the satisfaction of having real users.
Java the language is shitty, but that's the language you use to build Android apps. And most kids own Android phones. Kids want to show off to their friends and the best way to do that is to build mobile apps nowadays.
> Do you want your kid to learn actual computer science, or do you want him/her to basically be able to create a cookie cutter mobile app?
This is exactly the point I was making in the ancestor thread. If you want your kid to learn computer science, you don't force feed him/her things they will have no interest in immediately. Instead you teach how to build a "cookie cutter mobile app", and they will get interested and go from there and teach themselves racket or lisp or whatever you originally wanted to force feed them.
That's not true at all: as a kid, I had to do inordinate amounts of reverse engineering precisely because I didn't have sufficient documentation, and what little I did have was in a foreign, non-English language I did not understand! I would have loved to have had books on how to program!
So on the one hand, I wrote a Tactix GUI in Java/AWT in my first high school programming class which covered mostly Java with turtle graphics. I can agree that kids who want to learn programming want to make games.
But that Tactix GUI took me two months of sustained effort, lasting longer than any of the actual class projects. Most of the students struggled to draw a circle. It took weeks for us to get to "implement a method that takes an argument".
If you're looking at teaching programming concepts to elementary school children, i.e. to most kids, building applications doesn't seem very likely to me. It takes a long time and small children have poor memories for long-running projects.
If the programming is supposed to support elementary math and logic for kids who are getting used to the notion of "variable" in both programming and math contexts, teaching the kids to write the sieve of Eratosthenes or Babylonian algorithm really does seem like the way to go. Not because it teaches them to be good programmers or to like programming, but because it teaches them to think, and they're mostly not going to grow up to be programmers.
If, however, you want to entertain the minority of schoolchildren who do like computers and want to learn to program, then using an easy graphics library with good deployment tooling makes sense. Those kids do want to spend two months or so making a game. But that doesn't, to me, seem to be in the spirit of 'proximitysauce's comment.
> ...my first experience with programming was to build a game
most of us!
we have made it difficult to show our games and stuff to friends. except for minecraft and game mods. that's why kids gravitate towards those.
as a side note, i was thinking last time that programming is difficult to teach nowadays because when you boot your computer or phone, it doesn't drop you into a shell or command prompt.
so few people nowadays know that you can actually give commands to your computer or phone. so the concepts is not familiar to most!
> At least those who do, want to learn because they can build cool things they can show off to friends.
I have no doubt about what you say. I had a really broken laptop that my 3rd grader happily took over as his laptop - a linux machine, mind you, about which he knew nothing. I showed him how to open a terminal and use some bash commands, and how to invoke the Python interpreter and import a turtle graphics module. He was hooked and had lots of fun moving the turtle around (along with the incentive to learn some math at his level). I remember the day he walked to school with this duct-taped laptop on a show and tell day, which ended up with the school's IT department making Python available on all computers in that elementary school lab. I don't think he could have pulled this off in lisp. Beautiful language, Lisp, but I don't see how I could have helped with graphics programming within an hour or so - which was all he was interested in.
Logo basically was Lisp -- a small parser on top of MACLISP.
There were some later small implementations (e.g. for the Apple II) that were written from scratch and didn't need to implement all of Lisp (e.g. lambda, plists, etc).
No, because m-expressions are syntax and Logo has a different syntax, and because Mathematica is a successful use of something much closer to m-expressions.
I recently picked up 'Turtle Geometry' by Abelson and diSessa, and it was a mind-blowing experience.
Really makes you think about what kind of fun programming was back then and its potential. Frankly speaking, you begin to feel programming has kind of lost its way.
But it is also an excellent concatenative (left-to-right) language with a brilliant procedure-definition mechanism. Here, as an example, is a one-liner to draw an ASCII Christmas tree that we made with the kid last year:
So, someone had lisp, thought "I want to teach people programming", and instead of just teaching lisp directly, chose to implement something quite different in syntax and spirit?
FWIW, I was taught Scheme as a kid in nerd camp (maybe seventh grade?). I had previously learned BASIC. I didn’t notice anything very different. I didn’t understand it really. I didn’t use it once the summer was over, probably just because of lack of learning material.
I went to a talk by cognitive psychologist Dan Ariely, and he very much agreed: said that he tried to get girls into computer science, in Israel they had a 6-week introductory course, and that class SUCKED at getting kids interested (actually negative effect).
Ariely said there are actually better ways to get people interested, which is to show them something meaningful.
I can totally imagine a LISP inspired course of learning in a Montessori classroom. Elements, lists, functions — all somehow through physical, tactile objects.
Also Clojure can use all of Java and JavaScript, so there are lots of tangible things you can build. That said, here I'd mostly agree with you: the barrier to entry should be as small as possible, and you just won't find the amount of tutorials, guides, and tooling that JS or Java have on their own.
>You don't understand why kids would want to learn programming. At least those who do, want to learn because they can build cool things they can show off to friends.
I mean, this isn't true for everyone of course, but I began learning to code at around 8 or 9 years old because I was heavily interested in math and numbers, and precisely because I didn't have many friends (or brothers or sisters) to occupy my time with when I was very young.
I agree with the premise that the majority of folks learn better with a practical goal in mind, but I disagree with you that there exists only one kind of curious child.
I had a second-hand IBM XT from my father's shop at first, and I wanted to learn how to do things with it.
That equipment being available, in my opinion, was the biggest motivator in learning how to use it.
It was there for me to use, the limit was simply my knowledge of how to use it, and that was a very enticing proposition.
My first project as an 8 or 9 year old was a prime number generator (a naive sieve). I had read a book about patterns, and primes had been mentioned. I took a trip to the city library (another rarity statistically?) and picked up a guide to Pascal that was written towards small business owners and non-CS college students to help them with simple automation, and got to work looking up terms.
A week or two later I had a CLI program that spat out primes as fast as my computer (or my poor implementation..) could make them.
I then tried to use those primes to draw trippy pictures, which began another foray into programming topic that were new to me, necessitating more work looking up terms and generally haunting the library.
Years later, when I was a teenager, when I ran into like-minded computer folks and we'd get to talking, that same prime generator got spun into a small benchmarking suite that my friends and I toyed with before the multi-core trends.
so.. I guess that the "show it off to friends" thing still happened, but knowing myself and how I learn, I doubt that I'd have ever been interested in a hobby that was pushed towards me by authority figures as being useful.
I have a hard time conceiving how I would become a 'computer person' now-a-days, in fact. I was so anti-social and actively hostile towards teachers that the idea that one of them could have tutored me in the hobby, quite counter-intuitively, may be the exact reason I would have avoided it as a child.
really, my grand tl;dr :
"There is no one teaching method."
I've been teaching myself Clojurescript for the past three months, and while this might, in theory, be a good idea someday, the state of the Clojurescript tooling and learning materials I've used means that it is certainly not a good idea today.
Why? Even as a professional developer, it’s extremely unusual that my cljs work requires any JS familiarity. I don’t see why a complete beginner would even need to know that JS exists.
I did teach my son Racket at age 8, as his first programming language. He was able to make a number guessing game in it. I think we used "Realm of Racket," but I don't recall, as it was some years ago. He's now midway through high school and not a single course at his STEM middle school or high school has come close to teaching him as much as he learned at age 8.
That difference is more because classes are designed to be extremely slow and low paced so the worst student can pass. Learning anything with a personal tutor is much more efficient.
In a way they did. In the 1980s, Logo was popularly taught in elementary schools, and while lacking sexps, it was clearly inspired by Lisp, being a functional language in a way that the other popular teaching language, BASIC, clearly wasn't.
Logo may have more potential than it seemed at the time, but as it was presented in my school, there was not much to it, and I never investigated further after turning to BASIC and then compiled languages like Pascal and C.
I certainly agree with the comment that what matters at an early age is what you can do with a language. I wanted to make GUI applications or games, and Java and Javascript hadn't been invented yet.
The point of Logo is not to teach programming but to teach geometry. At least that is what I took from Papert's Mindstorms book. I never had Logo in school myself.
However if you dig deeper, geometry is just the beginning; the eventual goal is to teach you to simulate the movement of insects around light, predator-prey scenarios, etc.
Yeah, I learned Logo in elementary school and it was something that stuck with me until I actually started programming later on in school. I particularly remember having to write a program to navigate a maze, it was a blast!
I've programmed for a very long time, and I've rather incidentally taught and tutored programmers at many levels of ability over the years including some friends that were trying to learn LISP. I see very little benefit in teaching elementary school kids to program in LISP.
My daughter took her first programming course in college and used the excellent book (HTDP), see [1], intended for beginners that used Racket as the programming language. By the end of the semester she was able to make a simple graphical game. I thought the course was very good and she liked it and still has a fondness for Racket although I believe that she has only used Java, C++, and Python since that class (she is now a CS major).
Learning Racket (which is Scheme and very closely related to LISP) worked out well for my daughter, so why would I say that LISP in elementary school isn't a good choice? Several reasons:
• Lisp isn't a popular language, ranking 28th on the TIOBE list of programming languages, so there are far fewer opportunities to make use of programming ability in LISP than in other languages.
• Lisp isn't easy. The Common Lisp Standard is well over 1000 pages long. It lists over 1000 functions. Take a look at the basic iteration constructs: the most general is LOOP, which gets a whole chapter, see [2], in Peter Seibel's excellent Practical Common Lisp, but even the simpler DO construct is no cake walk[3] (see the sketch after this list).
• Recursion for elementary students... LOL
• Smart people have tried introducing LISP-based programming to youngsters already and it didn't seem to take off. See Papert's LOGO language, widely taught in the 80's. I've actually written some programs in it, but I don't remember where I got to use the language. (Perhaps it ran on the PLATO system?) I agree with a number of his insights[4].
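To illustrate the iteration point above, here is the same small job written with LOOP and with DO; neither strikes me as elementary-school material:

    ;; LOOP: reads almost like English, but is its own sublanguage.
    (loop for i from 1 to 10
          when (evenp i)
            collect i)                       ; => (2 4 6 8 10)

    ;; DO: more primitive, and arguably harder for a beginner.
    (do ((i 1 (1+ i))
         (acc '() (if (evenp i) (cons i acc) acc)))
        ((> i 10) (nreverse acc)))           ; => (2 4 6 8 10)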
Elementary school children don't understand functions and they really don't even understand basic math. The problem with LISP is that while it has an abstract simplicity that I find appealing and really quite beautiful, its abstract nature is going to make it difficult for children. Less abstract systems like ALICE or SCRATCH seem a better fit if we insist on teaching kids to code.
I personally don't see the need to teach kids to code. Why not teach them math first. Why would programming come before understanding how to divide fractions or how polynomials work or what simple logical operations mean?
they’re practically two sides of the same coin. I keep thinking it should be possible to build a toy language that has both a forth-like mode and a lisp-like mode, that use the same standard library
I think you'd have to hobble Forth pretty badly to make it stand on the same level as Lisp. Lisp has too much hidden machinery: a garbage collector, environment structures, etc. You couldn't even represent forth's "/mod" without resorting to packing results into tuples. Forth is good because every piece of the compiler/interpreter is exposed, usable, and extensible. Lisp wants to keep its innards sealed off and abstracted from the hardware.
There was a language for calculators called Reverse Polish Lisp. (That's all I know about it.)
If I were to design a language matching that name, I guess it'd be basically Logo written backwards. This might be interesting for a live programming environment, because in Lisp you type a whole expression and nothing happens till you hit enter; but on e.g. an old HP RPN calculator you see the state of the data change after every keypress. You could design an "RPL" to work that way too, though after every word instead of every key.
Forth is more low-level, i.e. there is no object memory/GC at what one would describe as the implementation level. On the other hand, all the Lisp Machines essentially are hardware implementations of something vaguely Forth-like with some level of hardware support for Lisp-style tagged pointers.
A good way to jump over the low level implementation details is to look at Factor. Forth style but much better library and most of the low level constructs are baked in.
I can't say the documentation / tutorials that I've found are the best but it's definitely something.
This isn't at all implausible: one of you should take 'agumonkey up on this! I wrote my first Scheme interpreter at 11, and an older friend of mine taught me Forth a bit before that. I wrote my interpreter with a bootleg (and ancient) APL2 copy, though.
With parental guidance, a 7 year old could totally do this. Figuring out how to handle the stack will be the trickiest part, and there are a few examples online of this, so if you nudge them in the direction of others' work, they should be able to get it pretty quickly. They'd finish it by the time they're 8, at least.
I learned lisp (edscheme) in elementary school. Wrote a language interpreter in 4 weeks. Interestingly, 2 of my 3 favorite languages are very lisp-inspired (but of course, without the parentheses)
On the subject of education, pg links to a great story on Scheme (a Lisp 1) in education [1] from his essay "Beating the Averages" [2]. The story references a book, "A Schemer's Guide" that is a good Scheme/Lisp or programming learning reference [3].
If you know folks learning programming for the first time and maybe struggling with Java or whatever they're using, these references may help. Sometimes you just need to see things like recursion or currying in a different way for it to click.
I think being interested in programming or not is pretty much like sexual preferences: there's nothing you can do about it, and it is not their fault ;-).
The entire discussion has so far had no educational perspective.
There are only five memes to make educational curriculum decisions.
1) Most arguments for programming are from programmers who had fun learning to program, and if the purpose of K12 is fun, then all kids should have mandatory enforced fun time, which means learning to program. Which will not work for maybe 99% of society, although they will be subject to the curriculum. I had fun playing kickball; the purpose of school is to create an entertaining daycare-like environment; therefore all kids should be forced to spend lots of time playing kickball. What about learning, or kids that don't enjoy kickball (or lisp)?
2) We have to prep our 5-year-olds by teaching solely vocational skills, because nothing will change in programming for the next 12-16 years. I'm old enough to remember "vocational training" that amounted to memorizing ancient versions of already obsolete Excel menus and keystrokes. Utterly useless. We like to think "Clojure K12" would be teaching Clojure; we all know it's going to degenerate into drill-and-kill multiple choice tests about emacs keystrokes. A giant scantron test of multiple-guess questions asking what C-x C-c does in emacs, or what M-f does, isn't going to help kids in the real world 30 years later. Better off teaching them welding; industry trends seem to imply TIG is a forever skill?
3) We teach critical thinking skills and frankly kids that can't conceptually understand and solve X+1=2 for X are not going to get anything out of programming that they wouldn't get better out of algebra or geometry. And what do we drop from the curriculum to add Clojure? Sorry kids you're not going to learn Trig?
4) We'll do what the cool kids do. If the cool kids' district buys iPads, we need iPads. We've had millions invested with no further reasoning involved. Is the cool kids' district using LISP? No? Well, at least a third of districts won't even consider it, then.
5) The school board makes all curriculum decisions and they got and keep their job based on religion and/or politics and LISP isn't hitting the election issue hot buttons quite like a nice argument about creation science in biology class or sex ed or the new stadium for the football team.
Industry does not like LISP because it is very expressive and the programmer has the freedom to express any crazy problem-solving idea they have in their head. The problem is that the human condition of education for the last couple millennia shows it's really kinda hard to take a hyper-customized brain dump of ... anything ... from one head into another. So "it can do anything, any way you'd like" is too much anarchy for any shared work. How this applies to K-12: if you think grading essays is a PITA, imagine trying to grade large LISP projects.
Scratch is a better introduction and wildly popular. Scheme is good for middle school math classes - it's smart but doesn't get things done easily. Python is great for getting cool stuff built once kids can handle syntax.
LISP is weird - I used it a lot in the late 80s and enjoyed it a lot but then moved on and, looking back, never missed it. I think it has a hypnotizing effect on programmers brains, feeding the endorphin feedback cycle, but ultimately there are just too many damn parens.
I always got a rush from solving problems in LISP but then realized I could much more easily solve the problem in an imperative language without the baggage of thinking through multiple levels of recursion wrapped into a function. If that feeling of accomplishment is because you’re succeeding at doing something clever I question the utility of the language. Maybe just down to what I’m used to.
Clojurist here. I rarely write explicit recursion; it's almost always encapsulated in reduce, which is also idiomatic these days in javascript... but the big win with Clojure is first-class immutability without category theory.
I don’t hear a lot of fans of FP and friends admit this very often.
I think there are a lot of great uses of the patterns that come out of s-expressions, for example, but I would still choose Go, Python or JS / TypeScript as my daily driver.
>I think there are a lot of great uses of the patterns that come out of s-expressions, for example, but I would still choose Go, Python or JS / TypeScript as my daily driver.
In exchange for parens, you never ever have to worry about any _other_ syntax though. I always find this argument baffling. With every new version of most languages, there's new syntax to learn. New arrangements of keywords and glyphs to do stuff that is always and forever going to be (new-thing ...) in a lisp.
The LOOP CL macro would like to have a word with you. Moreover, there are also reader macros that you have to worry about. Macros in general can have whatever syntax the author chooses. No one actually writes lisp code that is truly homoiconic, for one simple reason: being constrained to a completely regular syntax is kind of annoying.
> Not really. It's true if you compare, like f(x) to (f x), but in general Lisp really does have more.
In general I think this is true, but an important caveat that your post brings up is that sometimes it also has to do with how we translate one language to another.
Consider treating a 3-tuple as a list, and assume you have the primitive functions zip, square, sum, and sub (inverse of summing a list). Then the idea of the distance between two points reduces to:
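Something like this sketch (taking zip, square, sum, and sub as the given primitives; they're not built-ins):

    ;; zip pairs up corresponding coordinates, sub takes each pair
    ;; to a difference, square and sum do the obvious things.
    (defun distance (p q)
      (sqrt (sum (mapcar #'square
                         (mapcar #'sub (zip p q))))))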
Of course there really are more parentheses here than in the C snippet you wrote, and it's arguably harder to understand, especially if you're unfamiliar with functional programming; it could be slower, too, and careless use of the zip function truncates at the shortest array. (I'm sure there are other objections I haven't anticipated.)
At the same time, if you want to generalize the C function to arbitrary vectors, you'll have to do a little more work with passing arrays, checking bounds, adding a loop. The Lisp function, by contrast, requires no such effort, and once the concept is understood requires little further explanation.
Even so, the point is that line-for-line translations may not always capture idiomatic usage of the language, which may in turn make it look or feel worse than it really is.
but for some reason the typical Lisp style looks like Python where someone vomited parentheses on it. Not sure why people choose to style it like that.
> Well, not quite -- I did write the billing system, including a pretty wizzy security system that keeps the credit card numbers secure even against dishonest employees.
Are details of this publicly available?
The topic of storing credit card numbers is one of those for which searching returns a lot less information than one might expect. If you need to do this yourself, as opposed to using some third party vault/tokenization system, there doesn't seem to be a lot out there.
Yes and no. No, there was never anything published about the Google biller encryption system. But yes, there is now a lot of information available about encryption. The state of the art has advanced a lot in 20 years.
Running a REPL on a spacecraft that is 100 million miles away? I've seen some slow response times, but that probably takes the cake. Very cool, but horrible latency.
Actually, the spacecraft being so far away was the reason to put a Lisp system on it; that was one goal of the mission. The software controlled the spacecraft and did not need detailed programs telling it what to do: it would figure that out from higher-level task descriptions and from what it sensed. Basically, a spacecraft as a semi-autonomous robot. Since it was running on top of Lisp (LispWorks, actually), it could also be debugged/reprogrammed at a lower level.
Java is the COBOL of our era. Soulless, business-y, a lot of typing and searching. You just keep on fixing it. It did advance a bit, but... sigh. Clojure was invented too late, I suppose.
Not sure why Java gets so much hate. I think it's some PTSD from bad corporate jobs.
It's quite good that we have a rather performant, cross-platform, GCed language.
I also didn't feel any epiphany while learning Clojure. I think Haskell is a much nicer language for some tasks, but going full-Haskell seems near impossible without a PhD in some obscure academic niche.
My perfect programming language would be a Jupyter Notebook-like thing where I can just add sections in whatever makes sense. Write some imperative Java, put a small Haskell block for a function that recurses nicely, put a Minizinc block to solve a constraint... Of course, nobody except the original author would feel comfortable with those exact choices... which explains why companies must standardise on something.
This sounds like a UNIX where you could write snippets in different languages, with enough wrapping to let each snippet read stdin and write to stdout, and plumb it all together with pipes, organized in a shell script. That was the original dream, anyway. Too bad the weak performance of "programs as functions" makes it all barely usable unless your programs are long-running.
It kinda sounds like using UNIX pipes but you would also need a context of global variables and running threads / processes. In my vision the blocks would share memory / variables.
That is what the shell script is for: managing the lifetime of said processes, and storing any globals you might need.
The real issue with this approach, in my opinion, is that a stream of bytes is just too generic, and the necessary (de)serialization is going to eat a lot of CPU doing no useful work.
I've seen Haskellers "learn" Clojure and produce nested ifs everywhere and an imperative mess, just expecting the language to turn their poor thinking into a great program.
Clojure wants to be data-oriented; I recommend watching a lot of Rich Hickey talks first to really "get" Clojure.
"We were not out to win over the Lisp programmers; we were after the C++ programmers. We managed to drag a lot of them about halfway to Lisp." -- Guy Steele
I think it is not so much a testament to Lisp as to the talents of the people who wrote the software. I've met some of them. Give them anything and they'll still produce stunning results. Those were real doers. Can't even call them programmers; they had something better: brains the size of f..ng Manhattan.
On the other hand, these talented people chose Lisp over other options. Maybe their talent is also what let them recognize the value of Lisp, and that they would have a much harder time solving those problems with anything else.
I'm the author of the OP so I can speak to this with some authority: I discovered Lisp very early in my career and I stuck with it mainly because I'm lazy. I can get things done in Lisp with 1/10th the effort it would take in any other language, and it just steams my clams to have to do extra work when I know there's a better way.
> I can get things done in Lisp with 1/10th the effort it would take in any other language
Should that be read as 1/10th the effort it would take you in any other language (because of your long history with lisp), or do you suppose it would apply to arbitrary other competent people?
Like let's say we could make a few clones of you and start them all out on identical lives except that their early choice of language is different: one gets Haskell, another gets Forth, another Python, Ruby, etc.
In this scenario you're saying you'd bet on the clone who went with Lisp being most productive?
That's a good question. The honest answer is I don't know. It's hard to do controlled experiments on this sort of thing.
I can say this though: the ability to meta-abstract the language is a HUGE lever that no other language has. And one can also observe subsets of Common Lisp being continually re-invented in other languages. So CL must have gotten something right.
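To make the lever concrete, here is the canonical toy example (my own sketch, not from the article): Common Lisp ships without a while loop, and a three-line macro adds one that reads like a native construct at every call site:

    ;; WHILE as a user-level macro:
    (defmacro while (test &body body)
      `(loop (unless ,test (return))
             ,@body))

    ;; Call sites are indistinguishable from a built-in:
    (let ((i 0))
      (while (< i 3)
        (print i)    ; prints 0, 1, 2
        (incf i)))

Languages without macros have to wait for the compiler team to add a construct like this; in Lisp it's an afternoon's work, and that difference compounds.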
Doesn't seem that special to me. Forth has a totally different (but at least as powerful) set of mechanisms for metaprogramming. REBOL, too. And hell, these days even languages like Rust and D give programmers first-class capabilities for compile-time metaprogramming.
It's disingenuous to imply that every other language is simply "reinventing Lisp". The Lisp community itself spent decades figuring out elements that are now considered basic, like lexical scope. Lisp is just a ball of mud that can absorb any idea in its vicinity, and Lisp programmers like to retcon history to make every idea theirs.
McCarthy invented conditionals. Seriously, if you've ever typed "if" into a program you owe a debt to Lisp. Likewise garbage collection. Lisp didn't absorb that; Lisp invented it. Likewise dynamic typing, first-class functions, recursion, etc etc.
It's not even slightly disingenuous to imply that every other language is simply "reinventing Lisp".
This kind of fallacious blustering is exactly what I'm talking about.
"if" statements existed in FORTRAN (hell, even FLOW-MATIC), which preceded Lisp. The earliest Lisp implementations didn't have garbage collection, and while mark-and-sweep was being developed for Lisp there were other high-level peers like APL experimenting with similar ideas. Same deal for dynamic typing. First-class functions were explored in the Lambda Calculus decades before Lisp programmers applied them, and mathematics has known about recursion since antiquity. Programmers were expressing recursive algorithms in machine language in the 50s.
None of this sprung fully-formed from the head of John McCarthy: Lisp borrowed from a rich preceding tradition of mathematics and computer science, and the rest of the world wasn't twiddling their thumbs while McCarthy worked on his machine-algebra system.
The first Fortran had arithmetic IF. Fortran didn't get logical IF (the structured if-then-else that every language uses today) until McCarthy proposed it.
Of course recursion and first-class functions were known to mathematics. But we were discussing programming languages, not mathematics. And specifically high-level languages, back when there was a lot of skepticism as to whether they were even a good idea. Fortran got there first. Lisp was second and provided the existence proof that abstract features Fortran didn't even attempt were nevertheless viable. And then it went on to become the standard language of AI.
Sure, there was a lot of creativity happening back then, and Lisp didn't cause that creativity to happen. But being the second-oldest programming language and the first viable implementation of so many high-level features made Lisp a crucial development in the history of computing that's still quite relevant today.
Possibly bizarrely, I find metaprogramming in perl to be the closest of the dynamic languages to a scheme (I'm not fluent in common lisp) - though if you can forgive the syntax, ES6 with decorators is surprisingly malleable.
I think it is rather obvious that one is most productive in the language he or she knows best. So the question rather is: among the languages one knows well, which is the more productive? Also, this is a very abstract question. In reality, the availability of libraries and development tools can tip the balance for a certain task a lot. For example, I can be very productive with tcl/tk, because the whole package lets me do certain tasks very efficiently, but I wouldn't call the language "tcl" especially productive.
But overall, I think Lisp is an extremely productive language. In recent decades, a lot of modern languages have implemented a lot of Lisp features, getting closer in productivity without quite reaching it. Consequently, the productivity factor depends on which language you compare Lisp with. But compared to languages like C++, the productivity gain is really large.
For me, beyond the overall language qualities, there are two special productivity boosters in Lisp: the ability to do incremental, interactive development, and the ability to create domain-specific abstractions. There are many dynamic languages, but so far I haven't seen one that allows interactive development as easily: that is, the ability to update single functions while your application is running. That speeds up the cycle of changing the code and testing it by orders of magnitude.
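Concretely, a session looks something like this (a sketch; ADJUST is an invented example function):

    ;; With the application still running:
    CL-USER> (defun adjust (x) (* x 2))
    ADJUST
    CL-USER> (adjust 100)
    200
    CL-USER> (defun adjust (x) (* x 3))  ; redefine on the fly, no restart
    ADJUST
    CL-USER> (adjust 100)
    300
    ;; Every live caller of ADJUST now picks up the new definition.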
I had around a decade of professional experience with Java before I started using Clojure. Within months I was able to easily tackle problems with Clojure that had been daunting for me in Java. Having used many languages during my career, I definitely find Lisps to be some of the most productive languages out there.
Well, and the corollary, that's why your typical engineer doesn't get it - and anyone who really insists on it is just a lunatic; adopting it would be a nightmare to the average workaday developer who just wants to have the compiler tell him/her what to do...
My team used to be a Java shop; we adopted Clojure around a decade ago and never looked back. Not really sure what this whole compiler telling the developer what to do is supposed to mean, to be honest. One of the advantages of using a Lisp is that you get immediate feedback from the compiler as you're writing code via the REPL-driven workflow. https://vvvvalvalval.github.io/posts/what-makes-a-good-repl....
Lisp aside, I found something tragic here, or at least a terribly tragic phase in their career: they stayed at JPL because it was a more meaningful place to work and abhorred the possibility of Java drudgery. Fine, good.
Then they went to Google, as it appeared different from the corporate bureaucratic programmer = widget approach, only to be charged with implementing a massive Java project. So, irony.
But the real tragedy was that it was all for Ad Tech. Presumably tired of the same JPL work, they not only ended up using a hated tool, but also working on something that seems to be on the opposite end of the "meaningful work" spectrum. Adding to the tragedy, they could have taken any number of development jobs that happened to use Java and yet pushed the needle forward on meaningful work.
I really regret the last 20 years of the best & brightest being drawn into the modern equivalent of Mad Men advertising firms. I do, however, take a fair bit of solace in the fact that these massive companies and massive systems have resulted, if only as a side effect, in excellent technology that allows corralling and orchestrating massive amounts of computational power in a way I thought a bit too "utopian" when "utility computing" was being bandied about in the 90's. Just kind of wish some other monetization model had emerged. Freemium looked promising for a while, but it turned out there was still an infinity of user-acquisition friction in between "free" and "extremely cheap". Oh well. I still have hope for the next decade or so.
Ugh. I understand exactly what you're talking about here.
I work in SoCal for a smaller aerospace firm. According to glassdoor, the pay is pretty decent (a little above average for a software engineer out here), but it really pales in comparison to the numbers I hear people mention when talking about "FAANG" lol.
The cost of living is not less. If you want a 30 minute commute, be prepared to pay through the arse for a 50 year old house in need of serious repairs.
The traffic is an atrocity. I grew up in the Bay Area and never in my life have I seen anything like this.
The benefits can vary from garbage to decent. Where I work, there's free lunch, free dinner, and free (really good) health insurance. OTOH, some other companies have "coffee clubs" and "water clubs" where groups of coworkers donate money to have a monthly supply of water/coffee in their office space.
Oh yeah, and no matter what, if you say you work on space stuff, everyone assumes you work at spacex or nasa. There are no exceptions. lol.
Edit: I realize I probably forgot the upshot. Yeah, this kinda work is pretty darn cool! I also really enjoy doing it, or I'd go work on something else.
Complete tangent but ... you grew up in the bay area, currently live in socal ... but you spelled it "pay through the arse" like I (a .uk resident) would.
Care to explain why? No big deal, but my linguistic curiosity is piqued.
I see "arse" used on occasion here in the US as a way to make "ass" sound milder. It's like using "darn" instead of "damn", munging the pronunciation. In either case, the cultural exchange goes both ways.
I think sometimes arse is used in a sort of ironic sounding way here.
In general though, outside of this one, a lot of things that sometimes get described as British usage turn out to be things that I would also say. Sometimes there are regional differences in the US, or sometimes it might even vary with the individual. The patterns of who says what aren't always so neat and organized as "this is always American, this is always British."
I myself have one parent who was born in England and immigrated here fairly young, and I sometimes wonder if or assume that's a factor for me.
That's funny, because calling someone an "ass" can be euphemistically explained away as just saying they're a donkey, while "arse" is unambiguously anatomical.
I guess it’s suicide to point this out on HN, but not everyone thinks Ad Tech is the opposite of meaningful work. I think it’s a very “limited imagination” point of view if you a) can’t think of anything less meaningful that people spend their time on, and b) can’t think of any upside that Ad Tech brings. I’m happy to elaborate, having worked in ad tech for over 3 years, in case this comment doesn’t get downvoted into oblivion.
1. Ad tech is a product which ~10 million businesses and people online are willing to pay for (over $100B/year), for several years now. It’s extremely unlikely that it provides no positive effect to all these customers.
2. Ad Tech is trying to make the advertising experience better (even if it might be failing at the moment). People working on it believe that more relevant ads are a better experience (or less annoying) than completely irrelevant ads (which you can easily find on traditional TV or in print media).
3. AdTech (which GP kinda mentioned) pays for a lot of things that are available for free and likely wouldn’t be otherwise, including open source software development (Angular, React) and technology such as video calling or video publishing.
> 1. Ad tech is a product which people are willing to pay for. It’s extremely unlikely that it provides no positive effect to all these customers.
I'm talking about positive effects to "the world" / society / the public at large. Obviously not to the people getting their pockets filled.
> 2. Ad Tech is trying to make the advertising experience better
No, they're trying to make more money. Whether they do so by making the advertising "experience" (I loathe that freaking word) better or worse is irrelevant.
> 3. AdTech (which GP kinda mentioned) pays for a lot of things that are available for free and likely wouldn’t be otherwise
Slavery pays for a lot of cheap or free things as well, so that's not a really good argument in and of itself, is it?
You don't sound like a person capable of being persuaded by anything (using strawman arguments about slavery is a pretty good sign of that), but I have in the past sold non-software products, and the simplest option to promote them is stuff like adwords/adsense.
The alternative would be dealing with magazine ads and the like, which is a whole different profession I would have to learn. Companies have whole departments to deal with that headache. And the 1000+ dollar budget involved would probably be beyond my scale.
How about lax environmental regulations in developing countries leading to cheaper products paid for by ecological wastelands? That should be enough to get over that particular rhetorical hangup, although you don't sound like you're capable of being convinced in this case either, because your convenience depends on a barrage of information aimed at everyone unlucky enough to pass by a computer screen (or go outside).
Regardless, the GGP's arguments are specious, circular, and delusional. The fact that the GP used slavery as an example doesn't take away from that. (And yes, I get the irony of this post; advertising is the hill I chose to die on.)
> 2. Ad Tech is trying to make the advertising experience better (even if it might be failing at the moment).
Oh my.
Youtube's current business model essentially breaks all ad-supported classical music content on the site. It does this by interrupting the music in the middle of a piece to insert an ad. That means a long piece like a Wagner opera is littered with tens or perhaps even a hundred tiny ads that cut jarringly into the piece in the middle of acts, sections, and even phrases. It's not possible to concentrate on a piece of classical music that way. Your ad tech industry has destroyed whatever segment of content ad-supported classical music on youtube represents.[1]
But that's not my complaint.
I'm watching television the other day and I come upon some classic movie that I hadn't seen in a while. I start to watch for about five minutes; then, in the middle of a scene, the movie switches to a loud, short commercial. In the middle of a scene! I thought maybe the channel got changed by accident. But no-- this was a youtube-sized commercial doing a youtube-sized jump-cut into a television program.
I kept watching and sure enough, every subsequent commercial used a jump cut in the very middle of dialogue, and every commercial was as short as the ones on Youtube. Now this movie was almost as unwatchable as youtube classical music is unlistenable.
So your ad tech has managed to do something I didn't think was possible-- become a big enough influence on commercial television programming so as to make it even more painful to watch. That's much worse than just "failing at the moment."
[1] Sometimes it appears that the ad-placement algo inserts ads at the next available silence. But that still destroys all kinds of music. For example, many long classical pieces are made up of short-- yet related-- variations separated by silence. Inserting loud, obnoxious sounds between every single variation is no more acceptable than blaring an air horn between each move in a chess game. Edit: on the Brahms Paganini variations I listened to, the algo seemed to insert ads every other variation. But that hardly matters.
> Youtube's current business model essentially breaks all ad-supported classical music content on the site.
that strikes me as a bit of a stretch, given that YouTube sells a premium subscription, which gives you a 100 percent ad-free experience.
don't like ads? (i don't!) pony up US$12.99 a month to get rid of them.
i have had a premium youtube account since the month they were announced, and boy do i ever believe i am getting my money's worth. so much stuff on there, by so many creators.
I do most of my YouTube viewing from a desktop so my ad blocker stops me from seeing ads. However I recently signed up for YouTube premium because I am very anti-advertising, and I want to vote with my wallet.
I hope that more services allow me to pay in return for removing ads.
> it appears that the ad-placement algo inserts ads at the next available silence
Silence is a part of music like darkness is a part of a painting. To assume that it's "space to fill" is testament to either enormous naivety about or complete disregard for the content itself.
People like all kinds of ads. Super Bowl? People watch for the ads. People cut out clever magazine ads and use them as posters. Instagram “influencer”? That means you’re doing a product placement ad and people love it. Jingles: we sing ‘em. MailChimp podcasts ads. All kinds of ads.
You know what kind of ads no one has ever ever ever liked? Web ads. Banner ads. Pop ups, pop unders, takeovers. They’re all shit, consumers hate them, and consumers love ads in general. It’s a failed business model that has never been viable for publishers.
Serious question: do you use an ad blocker when browsing the internet?
Edit: I ask that because I use an ad blocker, as does everyone that I know. And I can’t think of a way to reconcile blocking ads with truly believing that ads are good for me.
I am strongly anti-advertising, but I always like to hear from people with an opposite point of view.
I work in publishing. Multiple times I have seen developers who were running ad blockers and then had trouble figuring out why their own sites weren’t loading properly.
No, I have never used an ad blocker, and I know plenty of people who aren’t bothered to either.
I am pretty UX-focused, but I don’t frequent any sites where I would think: god, this is unusable because of the ads. (I do find sites where the UX is bad for other reasons.)
My current biggest annoyance is that I was trying to hack something on iOS, opened a bunch of Medium articles, and now I am being forced to pay for it, which I’m pretty sure would not be warranted given my usage. (I do pay for other services where the value provided outweighs the cost, like Spotify.) I’d take ad-supported Medium over pay-walled Medium any day (as a customer; maybe as a producer ads would not be a viable model).
1) An appeal to the popularity of ad tech is not a sufficient argument in this case. First, it is tautological: at issue in my original comment is the sheer volume of talent it has attracted. Implicit within that is popularity. As such, you cannot use popularity to justify the phenomenon.
2) Nor is the belief of people working in the field that it is useful a compelling argument. And of course it is trying to make advertising a better experience, etc. But that just says "if it has to exist, it might as well be good at its job" and doesn't offer a prima facie argument for it to exist in the first place.
For both 1 & 2, you could easily replace Ad Tech with military equipment and arms dealers. I'm not saying one is as bad as the other. I'm saying the flaws in the arguments become more apparent.
3) This, I think, is where we agree. But I think alternate business models, had they risen to prominence instead, might have been better for society. But counter-factuals are inherently difficult to prove, and it's certainly possible we'd be in a worse state had things worked out differently.
BITD, the idea was that ads would connect commercial offerings to people who would actually appreciate them... Hasn't really worked out that way, though.
The problem is with global brands that continue to advertise anyway. You can make the "discovery" argument for upstarts and niche businesses, but when Coke or Exxon are advertising to me, there is no pretense of giving me information I didn't already have. At some point it's purely about associating one's product with vaguely positive imagery and embedding it just a little bit deeper into everyone's consciousness. No end-consumers benefit from that.
I think this statement satisfies a working definition of irony: about half the audience is in on the joke, and the other half earnestly believes it. I guess that's also true of trolling. Kudos, I suppose.
"A truly great product sells itself" is one of the most persistent myths engineers believe.
I don't understand how people can still believe this when it's so obviously false. It's a nice story, that if you make something great you'll get what you deserve. That sounds fair, but there's no reason to believe the market works this way.
A "truly great product" that no one (or not enough people) know about will go out of business and disappear.
That's completely absurd. How can a product "sell itself" if nobody knows it exists, or knows how great it is? How do they find out those things? Well, in part, by advertising.
I heartily agree that some advertising is manipulative and anti-social... but I can't buy this "all advertising is just manipulation" stuff.
How do you think all of the startups on here really achieve the "viral" growth they all talk about? Heavily targeted ads have allowed niche businesses to get off the ground and become profitable in a way that was not previously possible. Agreed there are problems, and agreed they don't allow much (if any) real control over the targeting from the target's side of things, but to say they don't do any real good is unimaginative, as the previous comment said.
You present a false dichotomy, casting my view as the "ad tech bad" end against an "ad tech good" end, or something along those lines. However, when contrasting helping to explore the solar system with helping to build an ad-bidding platform, I think there's a clear winner as to which pushed humanity forward as a civilization.
You also attribute to me beliefs I don't state. Lamenting the tremendous talent that has gone into ad tech instead of what I would consider more worthwhile endeavors doesn't mean I don't value it at all. Ad tech has facilitated robust sets of nominally "free" tools that certainly have some value.
My opinion of the field, however, is that the "hard" problems it contains are neither as hard nor as meaningful as other areas that have attracted less talent. It seems you'd like to attribute to me opinions that are "black or white", but I'm afraid my views on things are not so simple, and as such not so easy a subject for simple rebuttal.
I’m glad that this is your view; thanks for the clarification. You did literally write that ad tech is on the “opposite end of the meaningful work spectrum”, and we both agree that space exploration is certainly on the positive end of it. This literally mirrors the definition of black and white (ends of the color spectrum).
I am sure you are right that there are areas that would deserve more talent given their impact. Care to elaborate on which ones stand out in your mind (besides space exploration)?
Thinking of things less meaningful doesn’t suddenly make something meaningful in the general sense. Both can still be pretty meaningless when it comes to advancing society long term.
Optimal auction algorithms for ad placement can be interesting technical work and there can be some interesting economics work in there for price discovery, but in the grand scheme of things it’s not very meaningful to the broader world. You won’t be getting a Nobel prize and your kids and grandkids won’t even be impressed with what you do.
Side note: don’t talk about downvotes coming for your post. It’s a sure way to ensure it and it’s boring.
I don't doubt their intelligence or talent in solving problems. But chasing the highest Total Compensation, or a glittering silicon jewel, or a comforting workplace that closely resembled the social & work dynamics they'd just graduated from in college... those were comfortable choices to make. Even chasing money-losing startups for the first decade of your career always held the prospect of an off-ramp to one of the established players. Chasing work that moves the needle forward on culture, society, and civilization often requires a higher tolerance for risk and discomfort.
I'm fascinated by two questions that both seem to suffer from a lack of data. One is: what's the best way to introduce programming? There should be tons of studies and data on this to end the discussion, no?
> imperative languages with poorly defined semantics like C and C++
What does he mean by this? Is this related to the C "undefined behavior" or is the reference here to something which affects imperative languages in general?
I guess it's mostly about many things not having any clear meaning and depending too much on convention. Error reporting is a celebrated example: a C function that encountered a problem may return null, or may return -1, or may write null to the pointer passed to it, or may return 0 if even the basic conventions are not followed, etc. None of these things has any built-in meaning. Compare this with the Go system of returning a value and an error together: comparing err to nil makes some intuitive sense, while comparing a function's return value with null is, by itself, a meaningless operation.
This just reminds me of how many times I've been told things like "you can't use that java code that is open source and working and does exactly what we want - we're a python shop!" and other such nonsense.
I'm always amazed at how people have strong feelings for the tools/languages, rather than what they do and how they can help you. Even in this example, the Google person didn't even ask what they wanted to use Lisp for.
People I guess just really like rewriting software in different languages?
Ugh, you haven't experienced pain until you've worked on a project which has a bunch of different languages because Johnny wanted to use Scala, Freddy loved Perl, and Sally was learning Elixir on the side. They all left, and the new team has to figure out how all these stupid languages even build.
I didn't say "let everyone go off and learn a different language and develop new code in it," obviously that is ridiculous, and would get out of control as you said.
What I did say was that there's a working piece of code out there that someone else has already spent time getting to work, but we can't use it because of the language it's written in. For example, I've had problems getting people to use Jenkins because it's written in java. Not that they were trying to write plugins or do anything internal to Jenkins; they just didn't like it because of the language it was written in.
As an engineering manager, you could support every team or even every individual using whatever they like - but you then have other tradeoffs to make.
It's now not possible to run security tooling over everything, or use a common artifact repository, or follow common coding guidelines or linter settings, and you'll get islands of language culture and endless internal reinvention and rewriting.
For me it makes lots of sense, it is not "nonsense".
Java is a gigantic dependency. Any project that depends on it drags all of that along with it. In many cases this dependency can be bad or toxic for the project.
It depends on what you do for a living. Personally, I will never let anyone include Java as a dependency in my company's projects, because for what we do, Java is slow and linked to Oracle.
Lisp is a different story. You can easily generate C, Python, Lua, C++, or whatever code from Lisp.
You will see it defended a lot on Hacker News, due to the disproportionate level of PL enthusiasts, and the Paul Graham thing. But yes, you're right that in reality, lisps have incredibly limited popularity. (You've met one more full-time Lisper than me and I've been in the industry for 10 years).
In my current and in my last 2 jobs I've been using a Lisp (Clojure). Since learning Clojure and functional programming in general, I made a conscious choice to seek it out for the last two positions (a small startup and a huge government project) and in this current one (university research) I'm starting two projects from scratch and get to decide myself which language to use, so of course I'm gonna choose the one I enjoy the most and will be more productive in. It's hard to go back to imperative/oop.
What for? I mean, I have relative freedom at work and could probably use Lisp on my own tools for my own job, but I’ve never really considered it (I use Haskell and Python instead).
zerr's post reads like Lisp is some kind of terminal disease that will cause spacecraft to slowly get cancer or something. It sounds like completely irrational hostility.
But the article reads like blatant fanboy-ism, as if Lisp turned spacecraft into unicorns that magically distributed rainbows around the solar system or something. It's completely oblivious to the organizational cost (except, oddly, at Google).
What's the organizational cost of Lisp at JPL? Somebody's got to maintain these tools for decades. Over the lifetime of a spacecraft mission, people leave. People even die. If nobody understands the software (or the tools the software depends on, including the build chain), then when someone other than the original magician has to fix something, you're stuck. JPL really doesn't want to be stuck when a probe is approaching a planet and a fix is needed to make the mission work. The article has zero recognition that such issues could even exist.
> JPL really doesn't want to be stuck when a probe is approaching a planet and a fix is needed
Ironically, the point of this JPL Lisp project was spacecraft autonomy. When you're coming up on a flyby, light-hours from Earth, the standard fault-tolerance strategy (go into safe mode, wait for instructions) makes it physically impossible to satisfy mission goals. You need something smarter and more adaptable, and that's pretty hard to accomplish in the standard style of software in C they were used to.
(I was there and read internal docs about the Remote Agent project this post was about.)
Years later in the Pluto mission this nightmare almost happened: the probe went into safe mode ten days before the flyby, and they heroically fixed it over a few days of crunch time.
> If nobody understands the software (or the tools the software depends on, including the build chain), then when someone other than the original magician has to fix something, you're stuck.
Speaking from experience, this is independent of the language chosen. I've seen several pieces of software over a variety of languages that are a nightmare to maintain.
What really ends up happening is a kind of oral lore. People learn how to use the parts of the software that are known to work, and the business processes ossify around those known behaviors. The current developers and maintainers walk on eggshells to avoid changing that behavior because of the risk involved in changing those processes. And then nobody remembers why the processes are as convoluted as they are, and it just becomes the way things have always been done. When there's a series of successful missions backing that assertion, it's hard to argue the point.
But because this idea -- that perfectly reasonable yet "not mainstream" languages produce software that's hard to understand and maintain -- is so pervasive, we get pressure coming back from the other side. Perfectly reasonable choices of language are blocked, and everything starts regressing to the mean. The old adage goes something like: nobody ever got fired for choosing IBM. Well, nobody ever got fired for choosing Java, either. Even Clojure or Scala, which are perfectly productive, excellent languages that also run on the JVM, are a nervous choice.
(Also, Makefiles are absolutely a dark art, and many modern languages have thankfully produced better abstractions for their purposes, but C and C++ are still seen as totally reasonable first choice languages for a program. =/ )
> JPL really doesn't want to be stuck when a probe is approaching a planet and a fix is needed to make the mission work. The article has zero recognition that such issues could even exist.
>> Spacecraft activities during the Jupiter Approach phase include spacecraft subsystem calibrations and maintenance activities, as well as flight software updates and hardware checks in preparation for Jupiter Orbit Insertion.
If I'm at JPL, doing C++, they probably have several more C++ programmers. I probably work with some of them. If I'm gone, they probably have someone who can take over, people who maybe use the same coding standard I use (they almost certainly have one for C++). But if I'm the first person there using Lisp, there probably isn't a standard (unless I wrote it, and if so, it's a standard nobody else there uses). There may not be anyone else there who knows the language. So while Lisp may be a perfectly reasonable language in terms of functionality, it's not a perfectly reasonable language in terms of the organizational issues. I, the programmer who wants to use Lisp, may not have to think about those issues, but managers do.
Yes, I agree! I merely think you overvalue C++ in this case. There's lots of C/C++ code that most people can't understand either -- and not for lack of knowing the language.
Apropos of nothing, I found Peter Naur's "Programming as Theory Building" [1] an enjoyable read on the topic of maintaining software after the original developers depart.
> If nobody understands the software (or the tools the software depends on, including the build chain), then when someone other than the original magician has to fix something, you're stuck.
Well, the great news is that it's Lisp: the language is simple. You have the same problem with other languages, except that if you used C++, everything just got an order of magnitude more complicated, both because the whole thing is more complex and because the ecosystem is changing and developing more rapidly.
With Lisp, the language is simple. But everyone writes their own DSL on top of the language because the language is simple. It's easy to do in Lisp, but it's also easy to make it incomprehensible to anyone else. I've heard this called "the curse of Lisp".
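A toy illustration of how that happens (invented for this comment, not from any real codebase):

    ;; A homegrown DEFUN variant that quietly adds tracing. Call sites
    ;; look like ordinary definitions, so a newcomer can't tell what is
    ;; standard CL and what is house dialect without reading the macro.
    (defmacro defun/traced (name args &body body)
      `(defun ,name ,args
         (format t "~&calling ~a~%" ',name)
         ,@body))

    (defun/traced add2 (x) (+ x 2))
    ;; (add2 40) prints "calling ADD2" and returns 42

Harmless at this size, but multiply it by a few hundred in-house forms and the codebase is effectively its own language.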
I've seen lots of programs written in lower-level languages that are incomprehensible because the patterns were macro-expanded by hand 1000 times and it's impossible to get the big picture.
I can't say I've ever seen a Lisp program where abstraction made it incomprehensible. That seems to be a common meme, though. I'd love to see an example.
I don't know how bad it is for Lisp, but I get the argument and I think it's valid. I've seen too much bad, non-idiomatic, homegrown code to know this is a big problem. A big part of the value of using e.g. Spring Framework is that it is standardized, easy to understand, and devoid of a lone hacker's "brilliant" ideas.
I think the part about the toolchain/dependencies still stands.
I think it is obvious that such an extremely dynamic language with a non-deterministic GC is not suitable for certain tasks. Unless they were using some dialect with static guarantees and provably correct constructs.
I think C and C++ are the most popular languages for hard real time development, so you'll have better OS support, better best practices, more experience if you choose one of those.
But I don't do things in hard real time, so please correct me if I'm wrong.
A lot of aerospace real-time environments are still geared for Ada, and while C/C++ are available, they are not the same as "vanilla" C/C++. Also, is there really C-specific support in a good RTOS? Probably not, beyond inertia or the presence of header files if they expose them.