If you are stuck in a boring language and want something more concise, you don’t have to give up legibility by going to K. There are plenty of languages where you are not expected to use “for” loops.
Python/numpy is most accessible and probably closest to “instant gratification”. Mathematica is great for, you know, mathematics, but you’ll need a license. And there is always Haskell.
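For a taste of what "no for loops" looks like in numpy (toy data, names made up):

    import numpy as np

    prices = np.array([101.2, 101.9, 100.7, 102.3, 103.1])

    returns = np.diff(prices) / prices[:-1]                         # day-over-day returns, no loop
    moving_avg = np.convolve(prices, np.ones(3) / 3, mode="valid")  # 3-period moving average
    above_mean = prices[prices > prices.mean()]                     # boolean-mask selection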
k is perfectly readable. That's like, one of the biggest points of the post. The post mocks the mindset featured within your comment.
None of the languages you listed are quite as readable as k to someone who knows it.
All of the languages you listed are significantly slower and feature none of the benefits of a concise notation, even with fast.ai's Python style guide made to imitate Arthur Whitney.
> k is perfectly readable. That's like, one of the biggest points of the post. The post mocks the mindset featured within your comment.
It mocks it, but it doesn't present a convincing argument. Switching from an imperative language to K is making several changes at once, so we can't really conclude anything about the details of which changes are good or bad. The closest I've seen to a controlled experiment is something like: on a Haskell codebase, switch between using symbolic names like "^>>" and alphabetic names like "promap". And my experience has been, in a real-world mixed-ability team, that overall the alphabetic names had the advantage; if this isn't so for K, then we should try to understand the reasons why. The OP gestures at this, but unconvincingly: why yes, "ordinal" is much clearer than "<<", thank you very much. "Ordinal" may mean different things in different contexts, but so too does "<<" for anyone who knows other programming languages; a name that can help with intuition is a good thing even when it isn't perfect.
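(For readers who haven't touched K: < is grade-up, the permutation that sorts a vector, and grading the grade gives each element's rank. Roughly, in numpy terms:)

    import numpy as np

    x = np.array([30, 10, 20])
    grade = np.argsort(x)        # <x  -> [1, 2, 0], the permutation that sorts x
    ordinal = np.argsort(grade)  # <<x -> [2, 0, 1], the rank of each element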
I'm very much a believer in concise notation (more than most people I've worked with), but the advantages of having a name that can be unambiguously spoken are large enough to outweigh a small constant factor. I sketched a mainstream-FP implementation of the famous 1-liner APL game of life, which became something like three lines (not counting implementing the equivalents of the APL operators, which require slightly fancy polymorphism to do the right thing on vectors and matrices but are otherwise not that special) that could actually be read and talked about. That seemed like a good tradeoff.
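(Not my original sketch, and it leans on numpy/scipy rather than the polymorphic operators I described, but to give a rough idea of the shape - one life step really is only a few readable lines:)

    import numpy as np
    from scipy.signal import convolve2d

    def life_step(board):
        # Count each cell's 8 neighbours, wrapping around the edges (toroidal board).
        kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
        n = convolve2d(board, kernel, mode="same", boundary="wrap")
        # Birth on exactly 3 neighbours, survival on 2 or 3.
        return (n == 3) | ((board == 1) & (n == 2))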
One can - and should - write code that's much denser than the current mainstream style. I do think the industry massively underestimates the value that comes from fitting a function or class on a single page. But I remain unconvinced that it's worth going all the way to code that can't be communicated verbally.
I'm very much a believer in concise notation (more than most people I've worked with), but the advantages of having a name that can be unambiguously spoken are large enough to outweigh a small constant factor.
I don't believe this holds up to scrutiny. APL as a language was made to be spoken, and it seems to have succeeded in that. Alan Perlis briefly touched on that in a great paper,[1] and just about everyone who's worked with the language and its derivatives agrees that your statement isn't true.
The idea that APL and its derivatives can't be "read and talked about" is frankly ignoring reality.
For example, here are some videos [2] [3] [4] disproving your statement, the first two being APL, the last being k itself.
For extra bonus points, see Aaron Hsu's talks [5]; pick any of them, they're all pretty good, and most have a segment in them explaining how that viewpoint is wrong.
> I don't believe this holds up to scrutiny. APL as a language was made to be spoken, and it seems to have succeeded in that. Alan Perlis briefly touched on that in a great paper,[1] and just about everyone who's worked with the language and its derivatives agrees that your statement isn't true.
The people who have worked with the language are a small and highly specialised minority of programmers, unusual in many respects. I haven't worked with APL, but I have worked with symbol-heavy Haskell-style code, and I've found that even if these operators theoretically had standardised, pronounceable names, in practice being able to talk about them was a genuine problem. If there's some special difference between APL and symbolic Haskell libraries, what is that difference?
Alan Perlis was a Turing Award winner who helped to standardize the most influential language of all time, ALGOL, from which Go, Python, C and most other languages descend. In many respects, he was closer to a Go programmer than a Haskell programmer.
Haskell's symbols aren't notation, they're just scribbles. Miranda is a better comparison, but Miranda too doesn't lean hard enough.
Ken Iverson literally developed APL as a way to give better lectures at Harvard. It was a verbal and handwritten language long before it was typed. It excels at it, because it was designed for it.
> But I remain unconvinced that it's worth going all the way to code that can't be communicated verbally.
Symbols can be communicated verbally. You can make up new symbols and give them a verbal meaning.
I think the big issue here is that programmers are lazy and whine about anything they aren't able to pick up within a weekend with their prior experience and some hard staring. "I looked and I didn't get it so it's unreadable." By the same token, Greek, Russian, Japanese, Korean, and many many other languages must be unreadable, along with like mathematics.
Meanwhile, I'm happy that we have concise syntax for mathematics, even if you have to go out of your way to learn to read all these fancy symbols. Notation as a tool of thought and all that.
> I think the big issue here is that programmers are lazy and whine about anything they aren't able to pick up within a weekend with their prior experience and some hard staring. "I looked and I didn't get it so it's unreadable." By the same token, Greek, Russian, Japanese, Korean, and many many other languages must be unreadable, along with like mathematics.
Language design would be easy if it weren't for those pesky programmers :). I don't think you're wrong, exactly, but I'd note that few companies fully support people learning any of those (rather they rely on hiring people who studied them in university). Most programmers, quite rightly, aren't willing to commit a significant chunk of personal time to learning something with little immediate benefit, and most employers won't adopt a language that requires a significant training programme, and I'm not convinced they're wrong either.
> Language design would be easy if it weren't for those pesky programmers :).
What kind of a language would you make if time-to-learn-it were not a factor in it?
I've been thinking about language design for years but I still haven't arrived at anything specific. There are ideas floating in the air, but I haven't had the time to try them out in practice.
I think it'd be nice to see efforts like this, where language designers drop on the floor every concern about familiarity or similarity to existing languages and just find out what results in the most powerful language (in general, or in a particular domain).
It's a very difficult problem.
> Most programmers, quite rightly, aren't willing to commit a significant chunk of personal time to learning something with little immediate benefit, and most employers won't adopt a language that requires a significant training programme, and I'm not convinced they're wrong either.
They might be pragmatic, but I feel like we're kinda stuck in local minima on too many fronts, thanks to such shortsightedness.
But don't you think it's mind-boggling, and perhaps even a little hypocritical, that we spend up to around two decades of our lives in education (with a bunch of fluff that somebody always argues might be useful some day), and then after that, learning a new tool that would require more than a few weekends is a no-no? I find it quite absurd.
But yeah I'm not really expecting companies to care, even though I think they should. To be honest I don't care too much about companies; from what I've seen, the vast majority of them are just crap ;) Thankfully there's open source.
Long term, I think something about the system needs to change. We should have more flexibility to train people on demand without putting all the burden and cost on the individual alone (or their company). For all the talk about lifetime learning, I don't think we've done much yet.
> I think it'd be nice to see efforts like this, where language designers drop on the floor every concern about familiarity or similarity to existing languages and just find out what results in the most powerful language (in general, or in a particular domain).
To my mind the purpose of a programming language is to bridge the gap between what a human can understand and what the computer can do. So I don't think you can draw a clear line between ease of learning and the value of the language; there will always be cases where you need to talk to a domain expert, so it needs to be easy to translate between the programming language and the human domain, which means there's a lot of value in familiarity.
I think the ideal language would look much like the way people explain what they do to other people (kind of an extension of the idea that the best programming languages look like pseudocode). Most fields of human endeavour rely on a limited amount of domain-specific jargon, but not a completely different alphabet, mathematics being a notable exception - if notation were truly a powerful tool for thought in general, wouldn't we expect to see more of it in other complex professions?
Indeed I think a lot of existing language design has gone wrong by sticking far too closely to mathematics; for example, operator precedence massively complicates a language, but is really only used to support arithmetic, which is really quite a niche case. (And so it makes sense that APL-family languages might be successful as domain languages for mathematics, but I don't ever expect them to break out beyond that niche).
Python rightly has a good reputation syntax-wise, though it persists with mathematical-style function application, overly cumbersome keyword constructs, and awkward special-casing of operators; the code I've seen that's looked most like English sentences (in shape) is probably Haskell or Scala when written in an operator-light style, with lines formed of space-separated words (even if the words themselves are unfamiliar) - Scala supports more of a "subject verb object" style, but sometimes requires brackets.
In terms of what's missing, I think a lot of the tools we use to structure text on a larger scale aren't there - imagine having to write an instruction manual using only plain text with no headings or chapters or the like. But on the small scale, yogothos has got to me; I don't think we should quite go all the way to lisp, but we do need less syntax rather than more. I'm thinking Python-style grouping (indentation and colons), Haskell-style function application (spaces), Smalltalk-style control flow (none). Most syntactic constructs take expressions (including e.g. method bodies), a block acts as an expression (evaluating to its last line). A few syntactic shortcuts, only where they're generally applicable enough to be worth the overhead: Scala's _ for lambdas, maybe Haskell's $, but not a lot else; I think it's worth using symbols for syntax, stuff that breaks the normal rules of the language, precisely to distinguish those things from regular well-behaved functions.
Sounds pretty uninspired, middle-of-the-road? Yeah, it is; I really don't think the current state of the art in syntax is that far away from optimal. If I were trying to design a custom language I'd be a lot more concerned about implementing a good record system, about effect systems, about stage polymorphism.
> But don't you think it's mind-boggling, and perhaps even a little hypocritical, that we spend up to around two decades of our lives in education (with a bunch of fluff that somebody always argues might be useful some day), and then after that, learning a new tool that would require more than a few weekends is a no-no? I find it quite absurd.
Yeah. TBH our whole industry's attitude towards experience and career history is a mess; I highly doubt that an unbroken chain of programming jobs is the best possible way to gain skills, but your CV will get the side-eye if it shows anything else. I wonder how much of this is just path dependence, accidents of history, and no-one wanting to be the first employer to do something radical.
> But yeah I'm not really expecting companies to care, even though I think they should. To be honest I don't care too much about companies; from what I've seen, the vast majority of them are just crap ;) Thankfully there's open source.
Open source relies pretty heavily on companies, in my experience - either because a company decides to outright work on open source as a part of its primary business, or because it tolerates an employee who wants to contribute in doing so. You get some contributions from students, which are, uh, variable, and very occasionally a government grant, but most programming is being funded by corporations, by one route or another.
> The post mocks the mindset featured within your comment.
Actually, the post pretends you know nothing about functional programming and then presents K as the solution–but, as the grandparent points out, it’s not the only one. The post actually has no comment on this.
(I should also note that smugly mocking a mindset by reducing it to a caricature might help explain why people are exasperated by the post’s patronizing tone.)
The post sort of comments on that by claiming that K's somewhat cryptic-looking operator-style functions are more legible than named counterparts.
> Does giving a K idiom a name make it clearer, or does it obscure what is actually happening?
sum: +/
raze: ,/
ordinal: <<
> The word “ordinal” can mean anything, but the composition << has exactly one meaning. (That one took a while to click, but it was satisfying when it did.)
(Didn’t click for me but I also didn’t spend a while on it, whatever.)
However, one should note that it's not like other functional languages don't have their share of cryptic operators, intuitiveness up to debate of course. Here's a list for Mathematica: https://mathematica.stackexchange.com/a/25616
At the end of the day this approach doesn't scale very well, and it looks nicest when you stick to very simple examples like +/!100.
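(For reference, !100 is the list 0 1 2 ... 99 and +/ sums it; the numpy spelling would be something like:)

    import numpy as np
    np.arange(100).sum()  # +/!100 -> 4950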
Lots of OOP boilerplate makes absolutely no sense to me because I have never had to write such code. I would consider Java and C# unreadable compared to an array language.
I would compare trying to pick up K as trying to figure out one of the clicking languages when you know Spanish, while going between most languages is like going between Spanish and Italian. You can kind of guess your way through in the latter case, while it's significantly harder to do the former unless you actually sit down and learn the language formally.
Chomsky has an idea called "The poverty of the stimulus", where he attempts to prove his ideas about deep syntax (or whatever he calls it) by handwaving that toddlers can't possibly learn language by merely being immersed in it for a few years.
I'm not impressed with the notion as applied to toddlers, but I think it applies fairly well to learning what variables mean in a codebase. Especially one written to be terser-than-terse.
Having used C++ for 25+ years, I know it better than I wish I did. I have also used Clojure professionally for 5 years now, and I find it much more readable, as in there's much less cognitive load when figuring out what something does.
It is not just about abstraction level either: Python gives me intense discomfort.
I have not used K, but quite like the way it looks.
That's not true. Languages vary tremendously in how readable they are to people that know them. Especially if you set the bar for "knowing" at a consistent number of hours invested.
A language that's total noise at hour zero could also be the easiest to read at hour 20, and 40, and 100, and 1000...
When a language is something you're supposed to be doing real work in, as a core part of your job, it's perfectly reasonable to require some amount of training before you're set loose.
If you don't know the language, you won't be writing, editing or improving code in it anyway. Your test is therefore irrelevant for programming languages.
If a language is readable to outsiders beforehand, but brings a 30% productivity decrease to those proficient in it compared to what they could have with one that wasn't, even the best programmer will be mediocre.
Further, the best have had a long history of finding APL and derivatives appealing: Ken Thompson? Liked APL. Alan Perlis? Loved APL. Rob Pike? Has spent years on an APL interpreter.
APL isn't lacking fantastic programmers.
k has shipped with a Python-like sugar layer for years now (q). Readable to most, but without the notational value, much is lost.
Q'Nial (one of my favorite programming languages) is readable by most. It's very verbose, but very much an APL derivative. It's not as useful a tool as one with notational benefit, and it never attracted much interest.
Anyone close-minded enough to ignore something entirely because they don't understand it at first isn't likely "top talent."
With decent identifiers you can kind of guess the purpose of things without understanding anything else about it. A file with words such as "latitude" and "coordinate" and "distance" in it is probably something to do with maps, for example.
Non-programmers making sense of code in this way is extremely rare. I don't think this is a priority in programming language design, the same way that making dashboards accessible to non-pilots is not a priority in airplane design. That would certainly be a poor reason to sacrifice major improvements in airplane efficiency or safety. Designing simpler dashboards for student pilots as a learning device is of course a different matter.
It really doesn't. k's ancestors were taught to high school students and admin in incredibly short spans of time. Here's an anecdote from Kenneth Iverson about a man who learned APL in two weeks, completely alone, and the results after he taught his students it:
My daughter Janet attended Swarthmore High School, and recommended Rudy Amann (head of the math department) as an excellent teacher. I therefore approached him with a proposal that we put an APL terminal in his school as a tool for teaching mathematics, suggesting that he first spend the summer with the APL group to assess the matter.
Rudy responded that he could spend only two weeks, which he did. I gave him an office with a terminal (and the Calculus text in APL that I had written after our earlier experiment with high school teachers), and invited him to come to me or anyone in the group with questions. Since he never stirred from his office, I despaired, but at the end of the two weeks he announced that he wished to go ahead with the project.
Rudy was pleased with the results, and told me of canvassing those of his students who went on to college, finding that they were pleased with the preparation he had given them. One thing he had done was to use some of the final two “review” weeks to show them the translation from things like +/ to the sigma notation they would encounter in college.
Here's another where they taught it to high school teachers, with a direct explanation as to how students responded:
I believed that APL could be used in teaching, and Adin said that to test the point we must take a text used in the State school system, and try to teach the material in it. He further proposed that we invite active high school teachers.
We hired six for the summer, with the plan that two (nuns from a local school, who could provide a classroom in which we supplied a computer [typewriter] terminal) would do the teaching, while the other four (with a two-week head start) would write material.
To our surprise, the two teachers worked at the blackboard in their accustomed manner, except that they used a mixture of APL and conventional notation. Only when they and the class had worked out a program for some matter in the text would they call on some (eager) volunteer to use the terminal. The printed result was then examined; if it did not give the expected result, they returned to the blackboard to refine it.
There were also surprises in the writing. Although the great utility of matrices was recognized (as in a 3-by-2 to represent a triangle), there was a great reluctance to use them because the concept was considered to be too difficult.
Linda Alvord said to introduce the matrix as an outer product — an idea that the rest of us thought outrageous, until Linda pointed out that the kids already knew the idea from familiar addition and multiplication tables.
Finally, it was this interest in teaching that led us to recruit Paul Berry, after seeing his Pretending to Have (or to Be) a Computer as a Strategy in Teaching when it appeared in Harvard Educational Review, 34 (1964), pp. 383-401.
So then, why is K less popular than pretty much any other language? Because people are stupid? Uninformed? There's a lack of PR-minded people working with K? What exactly is your theory? It's funny that a language this flawless is slightly less popular than S or D or pretty much any other one-letter language. There must be something that makes it unappealing for the masses of programmers - what is it?
Javascript must've done _something_ right, no? Yes I know it's bad from "language design" perspective, but we all know what it did right: it was embedded out-of-the-box on a very popular platform (the browser). So, there is an explanation.
Do note that I didn't claim "popularity == quality". I'm not even claiming K is bad!!! Just that it's strange for something so clearly superior to everything else to be such a niche language. Surely it must have downsides...
Why is it strange? Think of languages like Lisp, Smalltalk, Haskell. You may not find them on top of TIOBE, but their innovations do trickle down to mainstream, indicating that the designers of mainstream languages find them worthwhile.
Array languages are very much poised to do the same, if they did not already: numpy is essentially a poor (and verbose) man's array language embedded in python. Or consider Matlab.
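A rough illustration of the embedding (the K spellings in the comments are from memory, so treat them as approximate):

    import numpy as np

    x = np.array([1.0, 2.0, 3.0])
    y = np.array([4.0, 5.0, 6.0])

    np.sum(x * y)     # K: +/x*y  - dot product as sum of elementwise products
    np.cumsum(x)      # K: +\x    - running sum
    x[np.argsort(x)]  # K: x@<x   - sort x by indexing with its grade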
> "A LISP programmer knows the value of everything and the cost of nothing."
Here's a very old quote that tells you directly what was wrong with FP - the performance on old hardware sucked. For a very long while, compilers/interpreters were not good enough.
The cost/benefit equation changed in recent years, and sure enough, FP ideas are becoming mainstream.
Again, I'm not saying that array languages don't have fundamentally great ideas. I'm saying that articles like this one do them no favor - they're just smug rants. Show the world you understand the downsides, and you get a better chance of promoting the upsides.
k costs $20,000 per-core (less accurate now than it used to be but I'm pretty sure it's still true for commercial uses). It has a billion-dollar company based entirely around it (Kx/FD) and a smaller multimillion one, so it worked out well, I guess.
J is free, but J has never had an advertising budget, and was only freed recently.
APL's current leading implementation is really bad in comparison to how nice the language used to be, and just as proprietary. It was also significantly nicer to input back when APL keyboards and typeballs existed.
Very few people strike gold and then have the willingness to not sell it at market-rate. Unfortunately, most people implementing APL are very aware of the rate. And it's high.
A language doesn't cost... the environment might. Why did nobody pick up on the ideas? C# is not inspired by K, Go is not, Swift is not, Rust is not, Clojure is not... I mean, take any big or small company that decided "we need a new programming language" - I'm not aware of many taking inspiration from K.
I get it, there are people who love K, and are productive in it. And I'm not even claiming the ideas of K are inherently bad! But you know, when the rest of the world "doesn't get it", _maybe_ the reason is not that the rest of the world is dumb? Articles like this one do K no favor, IMO... they might "feel good", but they don't prove that K programmers are superior beings who reached enlightenment - they prove the exact opposite, a lack of understanding of the rest of the world. It's fine to say "I'm quirky and it's fine, this makes me special". It's wrong to say "The rest of the world is quirky, they don't share my niche preferences".
Go was, by the author's own admission, a language for the lowest common denominator (the computer science undergrad). In many ways, the computer science undergrad is less competent than someone with no formal backing at all: it takes less time to teach a new skillset to a blank slate than to repair the damage done to undergrads. Rob Pike's hobby is working on an APL interpreter.
C# was Microsoft's answer to Java after getting sued for cloning Java.
Rust is by C++ devs, for C++ devs.
Swift was Apple's successor to Objective-C.
Clojure is another language doing the "Lisp but with a bootstrapped ecosystem" thing, by a guy who had been active in the Java, .net, and CL communities.
All languages build on something, and when the mainstay languages are all ripping off ALGOL-60, ALGOL-60 clone after ALGOL-60 clone is what you will get.
Programming languages are just now starting to be influenced by Prolog, despite its benefits for certain tasks being apparent; roughly everything happening in that area right now can be traced back to UT Austin.
The point of the article was not to claim anything about the superiority of k programmers. It was to point out, in a comical way, that the rest of the world is extremely close-minded towards anything with different syntax, and to demonstrate that a common criticism (of which there are many, 99% of them unfounded, as you can see by checking any thread on Hacker News that even briefly touches upon k or APL) is itself hypocritical. I think it did a great job at both.
It was a reference to the Kübler-Ross model, which does seem to apply to people who find APL/J/K at some point.
> But you know, when the rest of the world "doesn't get it", _maybe_ the reason is not that the rest of the world is dumb?
Look at the kind of replies you get on every hn thread on APL/J/K - the majority of the negative comments around readability are emotional/gut level responses.
People aren’t trying to understand K examples for a couple of hours and failing... they are making snap judgements based on their experiences in other languages.
Uh, no? I read a lot of code, much of which is in languages I cannot write. But I can often get the gist of what it's doing, and this in itself is quite a useful thing to be able to do.
For you, not for anyone actually working with the language. Which again points to the test being entirely irrelevant for the stated purpose.
Some random high school student probably can't read your Swift. Sure, it might help them out somehow with the Excel functions they're needing, but it doesn't change whether or not Swift is a good or bad language, or whether the test actually means anything significant.
I think a random high school student who has just finished AP Computer Science should be able to read my Swift, at least when the task itself is not complicated. Maybe not all of it, but I think it's a failure if they can't get the gist of it. (Interesting anecdote: I have gotten emails from people who translated some of my Swift code, which talked to an API endpoint I reverse-engineered, to their platform of choice. They knew what a GET request is, and what they were familiar with was JavaScript or Kotlin or whatever–but they were able to make use of code written in a language they couldn't write. I think that's really great.)
You added more criteria to this high school student. Almost every one of them will have to use and program spreadsheets; few learn to program using an ALGOL-like language.
The idea you put forth disqualifies any programming language that isn't derivative of ALGOL: it's exactly the sort of thing used against Lisp dialects, and many other languages that have shown themselves to be quite wonderful ("What's cdr? That's stupid!" "You're telling me you use color instead of operators? Malarkey!" "What in Hell is Reverse Polish Notation?! This is America!").
APL (and derivatives, but using APL as it's the chief example of the paradigm and because if you know APL, k is readable) doesn't require a semester in a classroom, it requires a week or two and a book.
(Question: when did they start teaching AP Computer Science?)
Somewhat; there are a number of things about K which make it uniquely difficult to understand. One is of course that it is unpopular and nobody has really seen it before: while this isn't an inherent issue with the language, it is a practical one, since K exists in a world where other programming languages exist and people with experience in those languages exist. Something that can be understood by more of them is generally a nice feature to have. K's style of short identifiers makes it even harder to be accessible: while many non-programmers can kind of guess the purpose of some programs from identifiers, K makes this essentially impossible.
I would have to disagree with your assertion that APL's derivatives are fundamentally better and/or faster for learning actual computer science (note: AP Computer Science–which I think started in the '80s?–does not actually teach this very well, either). I think they're probably comparable, although it might be easier to get some of the numerical stuff because many students might be familiar with the math.
What does "legibility" really mean? Presumably it means something like the ease of going from looking at code to understanding what it does and how, and having a mental model to reason about the code and what modifying it will do.
How many times have you opened a Java/C#/C++ project and opened up class file after class file trying to understand how the whole thing is pieced together, faced with a monstrous system and, days later, still no understanding of how the system works at a macro level and how everything is connected.
How many times have you started browsing a GitHub repository looking for where something is implemented, opening file after file only to give up an hour later with absolutely no understanding of the organization of the system or being able to predict where any particular functionality is implemented.
Most people would say Python is a fairly legible language. You can write a class or function and most programmers can understand how it works fairly easily. However, with Python (or any mainstream language) you are only looking "down". You see what the function uses and how. You cannot see how the function itself is used. The other part of a function's meaning is how it's used idiomatically and what problems are solved by it. This is why, when figuring out how to use a library, most people prefer to look for an example than to browse the code. I believe any codebase of sufficient size effectively becomes a sort of library for solving its specific problems in its specific ways. There is a sort of hierarchy of increasing domain-specificity from language to library or framework to codebase, where each level has its own idioms and preferred ways of doing things.
Even ignoring the seemingly unnecessarily obtuse notation (which you really do get used to reading, but I recognize it's a hard sell to say "use it regularly for a few weeks/months and it'll all be perfectly clear trust me"), K (and J, and APL which I'm more familiar with) are legible in a way that other languages are not. When the entire system fits in a page of code you can understand everything about it from the top down. APL allows you to look "up". From any function you can see, instantly, without endless scrolling and switching between dozens of files, everywhere the function is used and how and what functions it's used in conjunction with. You do not need to press "switch to header file" ever. You do not need to press "go to definition" ever. The definition is right in front of you. Often since APL prefers idioms to building giant towers of abstraction, the definition is the name.
Not using "for" loops is not really the point of APL. The point of APL is that being really, incredibly, seemingly unnecessarily concise fundamentally changes how you read code.