arcfide's answer is excellent; there's another aspect I like to present in response to this kind of rant (which invariably comes up in discussions of APL-like languages):

When people write code in Pascal/Java/C/C++, it is considered bad style if it is not indented properly (where "properly" is the subject of religious wars, but there is near-universal agreement that deeper scopes should get more indentation). That's not an issue with any one language per se; it is pervasive among all the languages that did not eliminate it a priori (as Python and Nim did).

The compiler doesn't care, but programmers do, because we read code many more times than we write it, and the visual structure of the code informs us of (some of) its semantics, e.g. "this indented block is dominated by a condition specified in the lines around it; now let's see what that condition is, and whether it's an if or a loop of some kind".

And the real reason programmers care is that it makes the code easier to read "at a glance", using our efficient visual system to recognize patterns immediately.

APL/J/K take this principle to the extreme: because they allow whole (useful!) functions to be compressed down to a few characters, they cut out at least one, and often two or more, of the bottom levels of abstraction. One doesn't need to abstract the computation of an average into a routine called "average" when the entire implementation (e.g. in J) " +/ % # " is shorter than the word "average".
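
For concreteness, a minimal J sketch (the name "avg" is mine, just for illustration):

    avg =: +/ % #          NB. tacit fork: (sum of y) divided by (count of y)
    avg 1 2 3 4            NB. result: 2.5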

One might argue that it is still better to name this; however, +/%# has the distinct advantage that it also conveys the behavior on an array of length 0 (what does your "average" function do? you have to read the source code or the often-incomplete documentation), the behavior on a 2-d matrix (it gives a column-by-column average), etc.
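
Continuing the sketch above, those behaviors fall right out of the definition:

    avg i. 0                     NB. empty vector: result is 0 (0 % 0 is 0 in J)
    avg 3 2 $ 1 2 3 4 5 6        NB. 3x2 matrix: result is 3 4, the per-column averages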

There is a learning curve, for sure; but just like mathematical notation, once you have learned it and used it, it is concise, readable, and pattern-matchable by eye, and no one goes back to "let us multiply x by itself y times" once they are familiar with the notation "x^y".

Also, C++ is significantly more popular than the programming language "ADD ONE TO COBOL GIVING COBOL" for the same reason.




You have somewhat of a point there: if a calculation is made up of a four-symbol idiom, it bears repeating in the code rather than hiding behind a definition. However, that doesn't extend to arbitrary length. There is a threshold at which it's better to have a definition. I wouldn't copy and paste a sequence of 15 symbols, say. Somewhere between 4 and 15, there is a "knee".

Fact is, large software is made up of definitions. Vast numbers of them.

You can copy and paste everything only in tiny programs for your own use.

Programs are written across multiple lines not just because of indentation, but because we reference code by line. Version control tools currently, by and large, work at line granularity. In code reviews, we refer to lines: "everyone, please look at line 72".

> using our efficient visual system to recognize patterns immediately.

Not everyone has this; maybe only a few otherwise unemployable code-golfing idiot savants.

Maybe I could recognize +/%# if it is on its own. In real code, it will be in the middle of other such cuneiform, because it has to obtain inputs from somewhere, and there is other code expecting its outputs. I will not easily spot +/%# in a ream of similar $%#={|: type stuff.
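
For instance, here is an illustrative textbook-style J definition of population variance (the name "var" is hypothetical); the same mean idiom appears in it twice, buried among other symbols:

    var =: (+/ % #) @: *: @: (] - +/ % #)   NB. mean of the squared deviations from the mean
    var 1 2 3                               NB. result: 0.666667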

> what does your "average" function do? you have to read the source code or the often-incomplete documentation

That just shows you're coming from an environment where you expect user-defined code to be poorly documented.

Of course, people don't overly document their page-long, code-golfed line noise that is just for their personal use.



