The APL Source Code (2012) (computerhistory.org)
106 points by pncnmnp on Dec 20, 2022 | 58 comments



A modern version of APL here: BQN [1] [2].

Its introduction says: "Although it maintains the concept of array-driven computing and much of APL's array functionality, BQN discards all compatibility with other array languages and changes many fundamental concepts. It uses the based array model with dedicated array notation, distinguishes between data types and expression roles to give the language a context-free grammar with first-class functions, and uses a new set of glyphs with different primitive pairings."

[1] https://mlochbaum.github.io/BQN/doc/quick.html

[2] https://mlochbaum.github.io/BQN/tutorial/index.html


If you’re interested in learning modern APL, check out https://xpqz.github.io/learnapl

Disclaimer: author


I always loved APL. As a student around 1980, I had access to the little IBM "transportable" that supported APL and BASIC; I think it was the 5100 or 5110. I remember writing a little table tennis game in APL, which was likely unreadable to me a day later.


Sounds silly, but do you have the source code for your table tennis game? If not, do you remember the logic you used? Was it using Graphpak? I am curious as to how one would write a table tennis game without any conditionals.


APL can branch conditionally to a specific line or label, or fall through to the next line.


I like APL (and k/J). The problem is, outside trading (and even there it's a niche), it is just not popular enough. That is partly because most people dismiss it as line noise / a write-only programming language; it's usually only people who never used it, or used it only briefly, who say that. But it's hard to promote anything with a reputation like that.

A fairly interesting observation (more people have made it, including Arthur Whitney): being able to reason about a program whose entire code you can see on one A4 page, without having to click through into functions, search, browse through files or classes, or even scroll, is incredibly powerful. All you need is the cheat sheet (provided in your article), which you know by heart in no time flat, and the program, which often fits on one A4 page or less. A modern example is Aaron Hsu's 17-LoC parallel APL compiler[0] (from his thesis).

[0] https://www.bonfire.com/co-dfns-thesis-edition/


The reason you give (write-only programming language) is indeed ONE of the reasons people give when discussing array languages, and it is actually a serious one to consider. Having spent much time on APL as well as on J, I hope I am not one of those "people who never used it or used it briefly", because I would also add another reason, one more important to me.

APL emphasizes a certain "formal external beauty" of the code over the true beauty of an algorithm. Of course you can do such and such a thing in 2½ characters of APL rather than 60 boring lines of whatever language you like, but the result is generally full of unneeded hidden loops (or loops uselessly over whole arrays when only a small part is required). The actual complexity of these "elegant" APL solutions is generally insane.
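The "hidden loops" point can be made concrete with a NumPy analogy (an assumption on my part; the same tradeoff applies to APL primitives). Finding the first element above a threshold in whole-array style materializes a full boolean mask and scans all of it, while an explicit loop exits as soon as the answer is found:

```python
import numpy as np

a = np.arange(1_000_000)

# Array-at-once style: builds a full boolean mask over a million
# elements and scans it, even though the answer is at index 11.
idx_array_style = int(np.argmax(a > 10))

# Explicit loop: boring, but stops the moment the answer is found.
def first_above(xs, t):
    for i, x in enumerate(xs):
        if x > t:
            return i
    return -1

idx_loop_style = first_above(a, 10)
assert idx_array_style == idx_loop_style == 11
```

Both give the same answer; only the loop has the asymptotic behavior you would naively expect.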

On the other hand, the J language has a set of "optimized" idioms, if I remember correctly. I am not sure that is really better: "write it this way, pretending you follow the array way of life; of course it would be insane to actually execute it that way, but my engine will detect the very specific thing you want to do and will use a state-of-the-art algorithm for that purpose". I am not sure it is a good thing, because the task becomes over-complicated: you must know the true (boring) algorithm, plus the specific array syntax encoding that algorithm, which comes down to looking up some "list" of what is optimized and what is not.

While I agree these languages are fascinating for thinking about computing, learning new things, etc., I finally came to more or less agree with Dijkstra (not the worst reference?) that these languages are "a mistake, carried through to perfection".

What is wrong about writing well commented, well indented, 30 lines of code if I can implement the exact algorithm that way?


> What is wrong about writing well commented, well indented, 30 lines of code if I can implement the exact algorithm that way?

Nothing, it's just interesting and I wish I could do a larger project with one of them outside just playing around.


Ironically, APL has won as a programming language. Look at the SIMD instruction set - it's APL!


Vector instructions came around roughly at the same time as APL, though.


Can you elaborate on that?


They're both array operations, even comparisons. No branching.
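A rough sketch of the idea in NumPy (used here just as a stand-in for both APL and SIMD): a comparison yields a whole array of results at once, the way a SIMD compare fills a mask register, and conditionals become per-element selection rather than branches.

```python
import numpy as np

v = np.array([3, -1, 4, -1, 5])

# Elementwise comparison: one "instruction", a whole array of results,
# no branch anywhere.
mask = v > 0

# Branchless conditional: select per element using the mask,
# like a SIMD blend/select instruction.
clipped = np.where(mask, v, 0)

assert clipped.tolist() == [3, 0, 4, 0, 5]
```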


> This also because most people dismiss it as line noise / write only programming language

Not so. Anyone who says these things simply does not know APL. It's like saying musical notation is write-only. Sure, if you don't know it reading the darn thing is excruciating. Yet, if you do, you can see music on the page.

What APL is not is a low-skill, low-knowledge language. A casual glance at the language isn't enough to know it. Just as you can't internalize musical notation without dedication, work and experience.

Back in the '80s I used APL professionally for about a decade. In that context, it reads like music. To this day I can read it just fine (though not as quickly). Interestingly enough, the same is the case with musical notation, which I could read without thinking twenty years ago, when I was studying and playing classical guitar and piano with great regularity.

That said, I would not recommend anyone get serious about APL today. Study with some degree of dedication? Yes, I think that would be of value. Real projects? No. Definitely not.


Lots of programming isn't math or logic; it's shovelling data and manipulating strings. Is APL good at that?


If a string (or a file or whatever) is "just" an array (of bytes; characters, again: whatever), then yes.


If that means "you don't get much more than what C offers", then I'll take that as a no. I18N, Unicode, Regex, built-in string manipulation functions, character set conversions and efficient data structures are table stakes at this point.


Sigh.

No, it’s quite a bit more than C offers, which has a paltry twenty-eight “operators”, none of which “work” on arrays. Most C and C-like languages have huge libraries because getting those twenty-eight operators to do anything can be a lot of work if you don’t think about it much.

In contrast, Iverson has more than 50 excellent operators, sensibly extended to arrays. Not all APLish languages have all the operators, but the choice of what they do have typically considers how all of Iverson's operators can be/are used and how/when they can be implemented in terms of each other. This is largely a matter of taste.

And better symbols means better ideas. Some things are just more obvious in APL. And once you see it, you can take it with you.

When using C, I don’t typically use the C library, or libraries for any of those things you mentioned, because they just aren’t that complicated once you have seen APLish solutions to those problems. And that’s a good thing, because less code means fewer bugs.


So if I understand you right, APL treats strings as arrays of whatever. This makes sense and isn't particularly hard to understand. Is there a way to treat Unicode strings as arrays of graphemes, instead of bytes or codepoints?


Dyalog APL does this by default.


It does code points, not graphemes. Graphemes are a pretty niche thing, I believe? Raku's the only language I've heard of having built-in support rather than using a library.
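For what it's worth, the code point vs. grapheme distinction is easy to see in Python (used here purely as an illustration; Python strings, like Dyalog's, are sequences of code points):

```python
import unicodedata

# One user-perceived character ("é"), two different encodings:
composed   = "\u00e9"    # precomposed: a single code point
decomposed = "e\u0301"   # 'e' plus combining acute: two code points

# len() counts code points, not graphemes, so the two differ.
assert len(composed) == 1
assert len(decomposed) == 2

# NFC normalization merges the decomposed form back into one code point.
assert unicodedata.normalize("NFC", decomposed) == composed
```

Full grapheme-cluster segmentation (per Unicode UAX #29) needs a third-party library in most languages, which is part of why built-in support is rare.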


Ah oops, I misremembered what they were.


APL is very decent at string manipulation. I could try to explain why, but I am afraid that you'll have to see for yourself to understand why what you wrote is both true and not very applicable to APL.


This is indeed part of what I said earlier in this thread: it is very hard to convince people who are already biased by the syntax and the 'old age' of the language to just try it out for a while. It's the same issue with Lisp/Scheme: if you wrestle through Common Lisp: A Gentle Introduction and/or SICP, you'll see things in a different light, probably forever and for the better. But it takes (a lot of) effort, and if you are biased against something from the start, that's not going to motivate you much.


> All you need is the cheatsheet [...] and the program

Or an interactive cheatsheet website[1], where you can paste APL code, and mousing over symbols gives their definitions?

[1] http://I-wish-this-existed-but ...



APL prevented me from getting into software for many years.

Back in the 80s I was a mainframe engineer at IBM. One of our customers (a bank?) was heavily into APL, and occasionally I would fix one of their keyboards.

(Sidenote: these were beautiful keyboards, and their keys were individually repairable. Also pleased to share my first mastodon link! https://mastodon.social/@hanshuebner/109412183054578487)

Watching people program on these looked so intense and intimidating that I decided software wasn't for me; it took years to find out that actually it was.


I believe Einstein’s only notational innovation was the summation convention, which remains very popular. Am I wrong?

By the way, APL was the first language I learned, on the APL/360 system described here, using paper terminals.


The a programming language programming language


T a programming language PLSC


^ the title was (and the original title is) the APL programming language source code, but they edited it for whatever reason.


"ChatGPT explains Arthur Whitney’s J Incunabulum"[1] was linked in a recent comment[2].

Reading APL code, at least for a novice me, includes puzzling out "ok, now what's going on there?". Being able to highlight a bit of code, and get an explanation, as if sitting with the author, or reading annotated code, might usefully lower the barrier to reading APL?

I've seen several "here is an APLish program, and I'll go through and explain it" posts over the years, though I'm failing to find one now. If that could be somewhat automated...

[1] https://medium.com/@solarbreeze69/chatgpt-explains-arthur-wh... [2] In "J one-page interpreter fragment (1992)" https://news.ycombinator.com/item?id=34050715 , the brief comment https://news.ycombinator.com/item?id=34053932 .


Interesting observation. In general this would allow writing code in formats designed for writing (concise shorthand like APL), and then separately expanded to a format optimized for readers that are less familiar.


There is The Array Cast podcast, which covers mainly APL and its related languages (k, q, J, etc.) and is helpful for learning what is happening in array-focused languages.

https://podcasts.apple.com/us/podcast/the-array-cast/id15688...


I used APL in university and later during Y2K conversions (fun stuff), but found out much later that APL was the de facto language and interface of one of the first microcomputers, the MCM/70 [0].

[0] https://www.historyofinformation.com/detail.php?id=4340


> But APL programs are often cryptic and hard to decode. Some have joked that it is a “write-only language” because even the author of a program might have trouble understanding it later.

So the original Perl?


I actually find it way easier and quicker to review code I've written in J and APL, because it is so succinct and the symbols are well defined. Reviewing something that is a paragraph or one page of code is much easier than following pages of code in other programming languages. You can comment APL as you can in most PLs, and with the REPL-like dev environment you can test pieces of code for verification immediately and easily. I love APL and J. I am currently using April [1]: APL in Lisp for the best of both worlds, the expressivity of APL (and Lisp) plus the CL ecosystem for batteries included. The mix is always surprising.

  [1]  https://github.com/phantomics/april


We just need the correct expectations. APL is like mathematics for me: I need to read dense, unfamiliar terms and formulas several times to understand them, and the same goes for APL. It is not the typical language that I assume can be read while driving.


I'd argue you could view 2 lines of APL while driving with less distraction than flipping the screen or page to read your 2 pages of code in most other PLs.


What are you using April for?


I have always liked Lisp and array languages, so I am finding my footing with April, since it is very flexible. You can write predominantly in one or the other, but I am finding it easier to do the math and array stuff in APL, and the glue in Lisp. Right now I am experimenting with creating some generative art and music with CM (Common Music, a Lisp music composition system that lends itself well to creating algorithmic music) and using APL to create the patterns for CM as arrays and composed functions. I want to add graphics and animations to the code once I get CM working with April. I just started learning April, and I am having so much fun with it, I need to stay focused on my original task!


Another parallel is Rust and BQN. BQN is a new array language, and you can run it within Rust, sort of like what April does for Lisp and APL.

https://detegr.github.io/cbqn-rs/cbqn/


Way harder to read than Perl..

You can create obscure Perl just as you can create obfuscated C, as in any language, but if you think about the reader you can write an 'easy to read' program in Perl, even as a beginner.

But creating 'not obfuscated' APL? That's really hard!


Nope, not even close. I’ve written/read lots of Perl.

APL is a completely different paradigm.

Every now and then I get the urge to take a deep dive into APL or J but I really need a use case.

Since they are array languages they seem like they might be good for machine learning?


Image processing is a fun use case. For example, here's a bit of Common Lisp using April and the Opticl library to create a mirrored meme image:

  (opticl:write-png-file 
   "~/out.png" (april-c "2 0∘{(x s)←⍺←2 0 ⋄ (⌽[x],[x]⊢)⌽[x]⍣s⊢⍵↓[x]⍨(1-2×s)×⌊2÷⍨x⊃⍴⍵}" 
                        (opticl:read-png-file "~/in.png")))
The 2 0 at the start of the APL line above controls the mirroring behavior. The second number can be set to 0 or 1 to choose which side of the image to mirror, while the 2 sets the axis along which to mirror. This will be 1 or 2 for a raster image but this function can mirror any rank of array on any axis.

April was used to teach image filtering in a programming class for middle-schoolers, you can see a summary in this video: https://vimeo.com/504928819

For more APL-driven graphics, April's repo includes an ncurses demo featuring a convolution kernel powered by ⌺, the stencil operator: https://github.com/phantomics/april/tree/master/demos/ncurse...


Here's one paper on CNN's in APL: https://dl.acm.org/doi/10.1145/3315454.3329960


The use case for J or K is to do Advent of Code or Project Euler :p

APL is a tool for thought more than a real programming language, so in my opinion it's best suited to boosting your puzzle-solving ability.


Mandatory APL demonstration from 1975

https://youtu.be/_DTpQ4Kk2wA


> Expressions in APL are evaluated right-to-left

I find it more natural to evaluate left-to-right, because that's the order we read things in. For example, to read a file and sort its lines (in D):

    auto sortedLines = File("file.txt").byLineCopy.array.sort;


Isn't the assignment operator evaluated last, though? I don't know of any language that would support "true" left to right evaluation, though it would be neat! I suppose it would look like:

File("file.txt").byLineCopy.array.sort = sortedLines;

...which at first blush seems bananas, but if you were to read it as English it would make sense: "Load the file 'file.txt' into an array of lines, sort them, then assign the result to sortedLines"


> Isn't the assignment operator evaluated last, though?

Yes. Though one could use the . operator and a sink to make it completely left-to-right.


It's the symmetry of the '=' symbol that creates the difficulty. Replace it with an arrow in the direction of the assignment and it no longer looks strange: File("file.txt").byLineCopy.array.sort -> sortedLines;

Especially if you add the |> operator (from F#/Elixir, I believe) to 'pipe' the data:

File("file.txt") |> readlines |> sort -> sortedLines;
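A left-to-right pipe is easy to sketch in Python (the `pipe` helper here is hypothetical, not a standard function; it just threads a value through functions in reading order):

```python
from functools import reduce

def pipe(value, *fns):
    """Thread value left to right through fns, like x |> f |> g."""
    return reduce(lambda acc, fn: fn(acc), fns, value)

# Stand-in for readlines: raw lines with trailing newlines.
lines = ["banana\n", "apple\n", "cherry\n"]

sorted_lines = pipe(
    lines,
    lambda ls: [l.rstrip("\n") for l in ls],  # clean up line endings
    sorted,                                    # sort
)
assert sorted_lines == ["apple", "banana", "cherry"]
```

The assignment still happens last, but everything to its left now reads strictly left to right.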


I think it would be roughly like this:

  ↑ nget "file.txt"
Anyway I wouldn't want to munge strings in it, but that is not really its niche. Right-to-left can be nicer in different contexts, one is not really better or worse than the other.


My very first programming experience was with APL\360, using a time share IBM terminal with the APL type ball. We wrote code in a Geodetic Sciences class to compute Least Squares solutions to survey networks. I really liked APL...


> To access this material, you must agree to the terms of the license displayed here, which permits only non-commercial use and does not give you the right to license it to third parties by posting copies elsewhere on the web.

We have to agree to these kinds of terms about something that is more than half a century old and has zero commercial value. Something in the copyright system is deeply broken.


> We have to agree to these kinds of terms about something that is more than half a century old and has zero commercial value. Something in the copyright system is deeply broken.

I don't disagree.

The trouble is that, while almost everything 50 years old has almost no commercial value, a few things that old have tremendous commercial value.

My preferred solution is to have a very short automatic copyright period combined with a fee for registered works that increases exponentially based on the date of registration. Not that it really matters - I'm sure such a solution would never be enacted into law.


The current IBM zSeries is still backwards compatible with System/360 (and later) models, so there could still be important APL\360 code running in the wild.



I'm skimming the source code, noting a few things that might help others do so. The "ADD NAME=" lines separate what would have originated as separate decks of punched cards. Each card is an assembler statement; each has an 8-digit sequence number in columns 72-80. You might note that sequence numbers are sometimes not consecutive. This reflects changes due to editing. As long as the deck could be sorted in correct order on the last 8 columns it was ok.
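The deck-recovery trick is easy to mimic (a toy Python sketch with made-up card contents; only the last-8-columns sort key comes from the description above):

```python
# Each card is an 80-column string with an 8-digit sequence number in
# the last 8 columns. A dropped (shuffled) deck is restored by sorting
# on that field alone.
deck = [
    "         L     R1,SAVE".ljust(72) + "00000030",
    "MAIN     CSECT".ljust(72)         + "00000010",
    "         USING *,15".ljust(72)    + "00000020",
]

restored = sorted(deck, key=lambda card: card[-8:])
assert [card[-8:] for card in restored] == ["00000010", "00000020", "00000030"]
```

Numbering in steps of 10 is why edits could insert cards without renumbering the whole deck.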

Each such source deck would have been assembled to make an object deck. There are assembler commands for formatting the printed assembly listing: TITLE to set a running title on the listing pages, PRINT OFF to avoid listing repetitive macros, PRINT ON, PRINT NOGEN to not print macro-generated statements, EJECT to start a new listing page.

A DSECT line introduces a dummy section, essentially a record layout. The pseudo-instruction USING tells the assembler which register contains the base address of a DSECT, so USING MKLOCLS,14 says: assume R14 has the address of the DSECT MKLOCLS. A reference to a name defined within that DSECT will generate an instruction using R14 plus the offset to that name.

A CSECT names a "control section" or named block of instructions. The linker handles CSECTs and DSECTs as units. A CSECT typically begins with USING *,15 or some other register, to inform the assembler that at run-time R15 will have the base address of this code. Local jump addresses will assemble as offsets from that register. A DROP <reg> statement tells the assembler to no longer assume that register is in use.

There are inside jokes: the code with TITLE 'THE AGORONOMIC ROUTINES' is memory management.

APL\360 managed memory on two levels. First, as a time-sharing system, it created a separate "workspace" for each logged-on user. A workspace was a contiguous block of memory (typically 32K as a practical size) that contained all of the user's current code definitions. There was not room in the typical 360 RAM for more than a few user workspaces at one time, so the system swapped whole workspaces in and out as it serviced each user. The code to do this operated the disk drives directly on a head and track level and was highly optimized. You'll see a number of references to "one-track workspace" as a special case; if a workspace fit on one track of the (2311 or 2315) disk, it could be read in or written out in minimum time.

On the user level, the interpreter while executing APL code is constantly creating and trashing values which are stored as memory objects within the current workspace. So there was memory allocation and garbage collection. Garbage collection was a simple sweep across the workspace moving all active objects to one end.
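The sweep described is essentially a compacting collector; a minimal Python sketch (list slots standing in for workspace memory, None marking garbage, both my own simplifications):

```python
def compact(workspace):
    """One-pass compacting sweep: copy live objects toward the low end
    of the workspace, leaving all free space contiguous at the high end.
    Returns the new allocation pointer."""
    dst = 0
    for src in range(len(workspace)):
        if workspace[src] is not None:       # live object: slide it down
            workspace[dst] = workspace[src]
            dst += 1
    for i in range(dst, len(workspace)):     # everything past dst is free
        workspace[i] = None
    return dst

ws = ["a", None, "b", None, None, "c", None, None]
free_from = compact(ws)
assert ws[:free_from] == ["a", "b", "c"]
```

A real collector would also fix up references to the moved objects, but the one-directional sweep is the core of it.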

APL would be initiated as a normal job under OS/MFT and would immediately take control of the entire system, kicking the OS out and managing all aspects of the 360 hardware itself, including interrupt handling, storage protection and so on. This was before the day of hardware-supported virtual memory, so it didn't have that to deal with. See NAME=APLSAPLM and forward. I'm pretty sure in that code, TCB stands for Task Control Block, a dispatching object. APL had multiple internal tasks being time-shared, as necessitated by the fact that it had to handle asynchronous interrupts from both terminals and disk drives.



