> Given that languages used broadly in the industry are lagging behind research 20+ years

This is obviously true in the abstract, but the real breakthrough happens when you make those concepts ergonomic for the working developer. The theory behind dependent types is well established, but I can't really write my next project in Idris, can I?
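
To make that concrete: the appeal of dependent types is that values can show up in types, so whole classes of errors die at compile time. A toy sketch of the flavor below, in Haskell with GADTs rather than actual Idris, and with names that are all mine:

    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

    -- Track a vector's length in its type.
    data Nat = Z | S Nat

    data Vec (n :: Nat) a where
      VNil  :: Vec 'Z a
      VCons :: a -> Vec n a -> Vec ('S n) a

    -- Only accepts non-empty vectors: calling this on VNil is a
    -- compile-time type error, not a runtime crash.
    safeHead :: Vec ('S n) a -> a
    safeHead (VCons x _) = x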

Similarly, there was a time when C was the only sensible choice to write anything non-academic, non-toy. It's not like people didn't know about OOP or functional programming back then, but it wasn't a realistic possibility.

Or parametric polymorphism, better known by its common name, "generics". The concept has been around since at least the 70s; C++ templates weren't widely used before what, 1990?
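
A throwaway Haskell illustration of the concept (my own toy code, nothing canonical): one definition, type-checked once, usable at every element type.

    -- Parametric polymorphism: 'a' stands for any type at all.
    firstOf :: [a] -> Maybe a
    firstOf []      = Nothing
    firstOf (x : _) = Just x

    -- Works unchanged for any element type:
    --   firstOf [1, 2, 3]  ==> Just 1
    --   firstOf "abc"      ==> Just 'a'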

> My guess for the future is more that we will see a kind of split between a big group of people using advanced low-code-like tools for programming day-to-day things

This has arguably already happened; we call them data scientists. Many of them are technical and have some light scripting skills, but they couldn't be put to work on, say, your average backend project. Obviously this is a gross generalization; titles mean literally nothing, and I'm pretty sure there exist data scientists who kick ass at coding.




> > Given that languages used broadly in the industry are lagging behind research 20+ years

> This is obviously true in the abstract, but the real breakthrough happens when you make those concepts ergonomic for the working developer.

That's of course correct. That's actually why we're lagging behind research by such a long distance. It's not only type systems. The "20-year lag behind research" seems to be a quite general phenomenon in IT. I'm not judging. It's an observation.

> This has arguably already happened; we call them data scientists.

I think this is only a facet of what I had in mind. I guess it will become more ubiquitous to use "coding-related" tools in a lot of places! But what those people will be doing certainly won't be software engineering. I was thinking more in the direction of, e.g., MS PowerApps, or something on the spectrum between such a thing and Jupyter notebooks.


> Similarly, there was a time when C was the only sensible choice to write anything non-academic, non-toy. It's not like people didn't know about OOP or functional programming back then, but it wasn't a realistic possibility.

But the elephant in the room? What about Lisp?

To try to answer that myself:

My guess is that (too?) "advanced" technology (at some point in time) doesn't get any traction on the mass market. That's another reason mainstream languages and tools lag significantly behind academia, in my opinion. If something is called "academic", that implies "not pragmatic enough for use" to a lot of people, I suspect. (The Lisp story has more to it, but that would be largely off-topic, so I'm not going into it.)


> But the elephant in the room? What about Lisp?

My input on Lisp:

I think as far as "practical" programming is concerned, Lisp kind of missed the critical window. There were three phenomena occurring simultaneously:

1. Computers become so powerful that it's now justifiable to use inefficient languages. Ecosystems develop around said languages (Perl, Ruby, Python, ...).

2. The web becomes the primary way to ship software. The OS's C API isn't something you necessarily have to keep in mind at all times.

3. Mainstream languages incorporate "academic" features.

The combination of 1 and 2 meant that, up until the turn of the century (and realistically well into the '00s), C, and to an extent C++, were first-class citizens, and everything else was at most ancillary. The most efficient Lisp compilers out there produce code that's within a factor of 3 of C++. That's amazing by today's standards, but if you're programming Pentiums (and, earlier, 486s and 386s) you don't really have a factor of 3 to spare.

As for the capabilities of the language itself, even around 2000 Lisp really was secret alien technology. But by the time you could actually either afford the performance penalty on the desktop or be web-based, the "inefficient" bunch was also an option, offering more expressivity than C/C++ and better libraries/batteries than Lisps.

> My guess is that (too?) "advanced" technology (at some point in time) doesn't get any traction on the mass market.

The point here is, IMO, that IT in general, and programming languages in particular, are very hard to innovate in. PL innovations by definition don't offer more features than the status quo ante. Once you have a programming language that's sufficiently high-level and has a sufficiently efficient compiler (in our timeline that was C), programming languages are effectively "feature-complete": there's never going to be a language with a feature that can't be replicated in C. In other words, the leap from ASM to C (or Fortran, Cobol, whatever; I'm using C for the sake of argument) is a 10x improvement, but everything after that has significantly diminishing returns.

What you're really doing is improving developer productivity: not every project needs C, so maybe write Python and be done in a tenth of the time, with a tenth of the bugs and without buffer overruns.

However, what the market is after is "total" productivity: if my codebase is already in C, it's a huge productivity loss to move to $NEW_LANGUAGE, regardless of the new features, and because language innovation has diminishing returns, it becomes increasingly difficult to justify switching.

> If something is called "academic", that implies "not pragmatic enough for use" to a lot of people, I suspect.

With that said, you are correct that there is sometimes a knee-jerk reaction. See, for example, the unreasonable notion that "monad" is some kind of black magic.
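
For what it's worth, here is roughly all there is to the Maybe monad, in a toy Haskell sketch of my own (the names are made up): bind (>>=) just means "run the next step only if the previous one produced a value".

    -- A fallible operation: division that guards against zero.
    safeDiv :: Int -> Int -> Maybe Int
    safeDiv _ 0 = Nothing
    safeDiv x y = Just (x `div` y)

    -- Chain two fallible steps; any Nothing short-circuits the rest.
    example :: Maybe Int
    example = safeDiv 100 5 >>= \a -> safeDiv a 2   -- Just 10

No black magic: a data type, a combinator, and a convention.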


Your analysis is quite astute. Lisp systems became big and resource-hungry before the newer, more affordable, less powerful machines were able to keep up.

Lisps were developed on departmental and corporate "big iron" computers from the beginning. Initially out of necessity, because those were the only viable computers that existed. Later, also out of necessity, because those were the only computers on which they ran well.

Very few languages (or other software!) that were popular on expensive big iron in the 1960-1985 era transitioned into the microcomputer era with their popularity intact, or at all.

For a window of time, microcomputer enthusiasts were simply not able to make any use of big-iron software at all. Those who programmed big iron at work and micros on the side did not pass the knowledge from the big-iron side on to the newcomers who only knew consumer microcomputers; they just passed on stories and folklore. You can't pass on the actual knowledge without giving people the hands-on experience. And so the microcomputer culture came up with its own, now iconic, software.

Today we run descendants of Unix because Unix was developed pretty late in that big-iron era, on smaller iron. Like a clumsy but workable ballerina, Unix readily made the hops to workstations with a few megabytes of memory, like the early Suns, and then to microcomputers like better-equipped 386 boxes.

There are stories from the 1980s of people developing some technology in Lisp, then crystallizing it and rewriting it in something else, like C, to make it run on inexpensive hardware so they could market it. CLIPS is one example of this; there are others.

I don't think it's true that people had no cycles to spare on 286 and 386 boxes, because even languages much slower than Lisp were used for real programming. People used programs written in BASIC on 1 MHz 8-bit micros for doing real work. By and large, most of those people had no exposure to Lisp. BASIC was slow, but it fit resource-wise. Not fitting well into memory is the deal-breaker; performance is not always one.

The Ashton-Tate dBase languages were another example: slow, yet widely used. They ran well on business microcomputers and carried the selling story of being domain-specific languages for database programming, something tremendously useful to a business.

All that said, our thinking today shouldn't be shackled by historical events from 1980 to 1990.



