Nice to see this getting some attention again. I hope some people venture out to learn about the actual J language:

https://code.jsoftware.com/wiki/Guides/GettingStarted

For what it's worth, I've also studied this code a bit. This repo has an annotated and (somewhat) reformatted version of the code:

https://github.com/tangentstorm/j-incunabulum




J is really great. I spent some time last year playing around with it, Dyalog APL, and some other new array languages like BQN.

I was extremely impressed by the breadth of integrations into different ecosystems that the J community had created (like R and the web tech).

Using the language reminds me of using Common Lisp. There are a lot of things that seem odd now, like how you define new words (i.e. functions), how namespaces work, or how the FFI/system calls work (i.e. !: ) [1]. Kind of like how in CL things are named "mapc", "mapcar", "mapcan", etc. Both kinds of quirks come from the fact that these people were really innovating in new frontiers, and Ken Iverson and Roger Hui just kept on developing their ideas.

[1]: https://code.jsoftware.com/wiki/Vocabulary/bangco for how it works and https://code.jsoftware.com/wiki/Vocabulary/Foreigns for what you do with it.


Since we are on the topic, I've thought about APLs a decent amount so here are some other resources/notes. I'm not an expert on this topic - I don't work with or research the language or anything. These probably are not good getting-started resources.

There is a VM model for APL languages[1] which can make optimizations comparable to those made by CLP(FD). If you read about CLP(FD) implementations[2], you'll see operations similar to what the "An APL Machine" paper calls beating. I'm not sure if any APL-like languages actually implement such optimizations.
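To make "beating" concrete, here is a toy Python sketch (the representation and names are mine, not from the paper): an array is described by an index descriptor (offset, stride, length), and structural operations like reverse and take are "beaten" into arithmetic on the descriptor instead of touching the data.

```python
# Toy sketch of "beating": structural operations rewrite an index
# descriptor (offset, stride, length) rather than copying array data.
# The descriptor layout here is illustrative, not the paper's actual one.

class View:
    def __init__(self, data, offset=0, stride=1, length=None):
        self.data = data
        self.offset = offset
        self.stride = stride
        self.length = len(data) if length is None else length

    def reverse(self):
        # point at the last element and walk backwards: O(1), no copy
        return View(self.data,
                    self.offset + (self.length - 1) * self.stride,
                    -self.stride,
                    self.length)

    def take(self, n):
        # first n elements: just shrink the extent
        return View(self.data, self.offset, self.stride, n)

    def materialize(self):
        return [self.data[self.offset + i * self.stride]
                for i in range(self.length)]

v = View([0, 1, 2, 3, 4, 5])
print(v.reverse().take(3).materialize())  # [5, 4, 3], with no copies made
```

The point is that a chain like "take 3 of the reverse" never allocates an intermediate reversed array; the work collapses into offset/stride arithmetic.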

There are different models of arrays (and their types) used by APL-like languages[3]. Also array frame agreement can be statically typed[4], though it usually isn't.

Some other OSS implementations of similar languages include Nial[5], ngn/k[6], and GNU APL[7]. My favorite is ngn/k. If you use a K-like language, a great source of inspiration is nsl[8].

There is an unusual and fun calculus book that uses J, by Iverson, but it moves somewhat quickly and loosely[9]. It perhaps gives a good example of what APL was intended to be(?). On that note, his paper "Notation as a Tool of Thought" is interesting[10]. There is also a podcast interview with Robert Kowalski, one of the creators of Prolog, who says - if I remember correctly - that he was looking for a better way of thinking when he came up with SLD resolution[11]. It's interesting how these languages came out of different paths towards a similar goal.

Also beware the reverence of Arthur Whitney. His work is definitely inspired, but the community around K can seem schizoid-like[12], in a way comparable to Wolfram's projects[13].

That said, J is an exceptionally fun language to use. My favorite insight from an APL-like language that generalizes is how K encourages writing functions that converge: the easiest-to-use loop operator is one that applies a function to its argument repeatedly until the output stops changing.
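A minimal Python sketch of that converge idiom (J spells it f^:_; the helper names here are mine):

```python
def converge(f, x):
    """Apply f repeatedly until the result stops changing (like J's f^:_)."""
    while (y := f(x)) != x:
        x = y
    return x

# trivial example: repeatedly halving converges to the fixed point 0
print(converge(lambda n: n // 2, 100))  # 0

# a more idiomatic use: transitive closure of a relation by iterating
# "add all pairs reachable in one more step" until nothing new appears
def step(pairs):
    return pairs | {(a, d) for (a, b) in pairs for (c, d) in pairs if b == c}

print(sorted(converge(step, frozenset({(1, 2), (2, 3)}))))
# [(1, 2), (1, 3), (2, 3)]
```

The appeal is that you never pick a loop count; termination is defined by the data reaching a fixed point.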

---

[1]: https://www.softwarepreservation.org/projects/apl/Papers/197...

[2]: http://cri-dist.univ-paris1.fr/diaz/publications/GNU-PROLOG/... (there are probably papers more to the point; this is just the one I read when I noticed the similarities).

[3]: https://aplwiki.com/wiki/Array_model

[4]: https://www.khoury.northeastern.edu/home/jrslepak/typed-j.pd... (implemented in Racket, IIRC)

[5]: https://www.nial-array-language.org/

[6]: https://codeberg.org/ngn/k (honestly it is a miracle this exists)

[7]: https://www.gnu.org/software/apl/

[8]: https://nsl.com

[10]: https://www.eecg.utoronto.ca/~jzhu/csc326/readings/iverson.p...

[11]: https://thesearch.space/episodes/1-the-poet-of-logic-program...

[12]: https://www.ijpsy.com/volumen3/num2/63/the-schizoid-personal...

[13]: http://genius.cat-v.org/richard-feynman/writtings/letters/wo...


Here are my two cents on array compilation. I think a lot of the research goes in the direction of immediately fixing types and breaking array operations into scalar components, because that's easy to compile, but this ignores some advantages of dynamic typing and immutable arrays. When you can implement most operations with SIMD, a smaller type always means faster code, so dynamic types with overflow checking can be very powerful on code that deals with a lot of small integers.

https://mlochbaum.github.io/BQN/implementation/compile/intro...
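A toy Python sketch of the dynamic-width idea (the widths and promotion rule here are illustrative, not BQN's actual scheme): store integers at the narrowest width that fits, detect overflow during an operation, and retry one width up.

```python
# Illustrative sketch: pick the narrowest signed integer width that holds
# the data, and widen only when an operation actually overflows.

WIDTHS = [8, 16, 32, 64]

def min_width(xs):
    """Narrowest signed width (in bits) that represents every element."""
    return next(w for w in WIDTHS
                if all(-(1 << (w - 1)) <= x < (1 << (w - 1)) for x in xs))

def add(xs, ys, w):
    """Elementwise add at width w; report overflow so the caller can retry."""
    lo, hi = -(1 << (w - 1)), (1 << (w - 1)) - 1
    out = [x + y for x, y in zip(xs, ys)]
    overflowed = any(not (lo <= v <= hi) for v in out)
    return out, overflowed

xs = [100, 100]
w = min_width(xs)               # fits in 8 bits
out, ovf = add(xs, xs, w)       # but 200 overflows int8
if ovf:
    w = WIDTHS[WIDTHS.index(w) + 1]  # retry at the next width up
print(w, out)  # 16 [200, 200]
```

In a real implementation the elementwise loop would be a SIMD kernel, where the narrow width is exactly what buys the speedup; the overflow check is the price of admission.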

I'm somewhat skeptical of the virtual optimizations on indices, "beating" and similar. They sound nice because you get to eliminate some operations completely! But if you end up with non-contiguous indices then you'll pay for it later when you can't do vector loads. Slicing seems fine and is implemented in J and BQN. Virtual subarrays, reverse, and so on could be okay, I don't know. I'm pretty sure virtual transpose is a bad idea and wrote about it here:

https://mlochbaum.github.io/BQN/implementation/primitive/tra...
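A small Python illustration of the stride mechanics behind that concern (the descriptor layout is illustrative): a virtual transpose just swaps the strides, which is free, but rows of the result stop being unit-stride, so later contiguous vector loads are off the table.

```python
# A 2-D array as (data, strides, shape), row-major. Virtual transpose
# swaps strides and shape without copying -- but look at the strides.

def row(data, strides, shape, i):
    """Materialize row i of a strided 2-D view."""
    rs, cs = strides
    return [data[i * rs + j * cs] for j in range(shape[1])]

data = list(range(6))            # stores [[0, 1, 2], [3, 4, 5]] row-major
shape, strides = (2, 3), (3, 1)
print(row(data, strides, shape, 0))      # [0, 1, 2] -- column stride 1, contiguous

# virtual transpose: swap shape and strides, copy nothing
t_shape, t_strides = (3, 2), (1, 3)
print(row(data, t_strides, t_shape, 0))  # [0, 3] -- column stride 3, not contiguous
```

The transpose itself was free, but every subsequent row access now hops through memory with stride 3, which is the "pay for it later" part.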


Blind virtual transpose (as seen in numpy) is a bad idea. A principled, locality-aware version would be fine and good.


> VM model for APL languages

It's cute, but from my skimming a while ago it's fairly primitive. We can do much better with less effort using more general mechanisms. (Not a knock; it's a product of its time. A lot of old compiler tech was not very good, and even so it remains unsurpassed.)

> statically typed

In principle I espouse a much more nuanced view than this, but in short: just don't.


> statically typed APL

That'd be a curious case indeed. Why not? Only it wouldn't be even remotely APL :)


> applies a function to an argument repeatedly until the output stops changing

In other words: instead of worrying about which n to use for "loop n times", it just always loops (effectively) an infinite number of times...


What is reference [9] ?


Ah, mea culpa, it is here: https://www.jsoftware.com/books/pdf/calculus.pdf

The preface says:

The scope is broader than is usual in an introduction, embracing not only the differential and integral calculus, but also the difference calculus so useful in approximations, and the partial derivatives and the fractional calculus usually met only in advanced courses. Such breadth is achievable in small compass not only because of the adoption of informality, but also because of the executable notation employed. In particular, the array character of the notation makes possible an elementary treatment of partial derivatives in the manner used in tensor analysis. The text is paced for a reader familiar with polynomials, matrix products, linear functions, and other notions of elementary algebra; nevertheless, full definitions of such matters are also provided.



