In one line of J (both APL and J were created by Kenneth E. Iverson):
life=: (3 3 (+/ e. 3+0,4&{)@,;._3 ])@(0,0,~0,.0,.~])
Given a life instance, the next generation can be found by:
1. adding extra empty cells, surrounding the life instance,
2. tessellating the result, finding every overlapping 3 by 3 subinstance,
3. totaling the number of live cells in each subinstance,
4. treating a subinstance as a live cell iff that total is a member of the sequence 3,x, where x is 3 if the center cell was previously dead and 4 if it was previously alive (note that 4 is also the index of the center cell, with the subinstance arranged as a flat list).
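For example, here is an illustrative session stepping a glider confined to a 3 by 3 field (the result keeps the input's shape, since the tessellation consumes the padding):

   ] G =: 3 3 $ 0 1 0  0 0 1  1 1 1
0 1 0
0 0 1
1 1 1
   life G
0 0 0
1 0 1
0 1 1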
Please, people, stop using, promoting or even placing J on the same footing as APL. It is an absolute abomination of a language. Its ONLY motivation was that doing non-ASCII characters was really difficult way back when. That's it. And in creating J they absolutely butchered the very principles Iverson and others were vocal and passionate about promoting and teaching.
I learned APL about thirty years ago. Yes, we had computers thirty years ago. I learned it from what I'll call "first generation APL'ers". It's like saying "the first inner circle who started to dabble in Linux as soon as Linus made it public". The first disciples of Iverson, if you will. So, I am two levels away from Iverson. I met him multiple times at conferences, seminars and APL university meets, along with other APL pioneers like Ray Polivka, Sandra Pakin, James Brown, Adrian Smith and Paul Berry. All of these people wrote books about APL and taught it.
Big deal, right? Well, yeah. Because I was in the room, a very small room, with maybe 200 people, if not 100, when this paper was presented by Iverson himself (in fact, I have a picture with him at the event):
Notation as a Tool of Thought was an incredible, eye-opening paradigm for me back then. I sat there listening to this presentation that pulled together all I had learned about APL during the prior year or two of using it. And it was a revelation. Notation made the APL concept about much more than easy array manipulation. It brought computational problem solving to a level of expression matching mathematical notation. This was way ahead of its time and I felt privileged to be among the first generation of people using it.
Yet, there was a problem. Machines of the time made it difficult to work with non-ASCII character sets. We had to use specialized terminals (IBM and Tektronix made some).
When the IBM PC came out we had to literally burn a custom character set ROM for the graphics card and replace the line drawing characters with APL characters. That was great, but it sure didn't make for easily opening up APL to the masses. You had to be very, very interested in APL to not only buy an expensive APL interpreter but also engage in swapping chips out of your computer. The language was definitely ahead of its time. In some cases you had to buy very specific hardware or it would not work.
As PCs exploded and portables like the early Compaqs came out, it became more and more difficult to deal with the character set.
And so, I remember going to APL conferences where the concept of transliterating the language started to surface. The idea being to use a set of ASCII characters to represent APL characters and, therefore, enable this franken-APL to be accessible without hardware mods to your computer.
This was incredibly short-sighted. Many of us knew it was a matter of time before graphical display of symbols (like math and music notation) would be easy.
Eventually this evolved into J. A notation that completely abandoned the paper that earned Iverson the ACM Turing Award. It's APL without perhaps the most powerful thing about APL. It looks like a terminal shitting characters because it is operating at the wrong baud rate. Iverson made a huge mistake with J and he abandoned EVERYTHING he was so right to champion years earlier.
If we are ever going to go in that direction again (I think we have to, in some form) we need to understand that notation, more than almost anything else, was the key.
Computational problem solving needs to evolve beyond a state where the best we can do is type English words into a text editor and into something that is far more expressive. For that you need a notation, not an abomination like J.
I use both J and Dyalog APL. It is my understanding that Arthur Whitney, Roger Hui and Ken Iverson himself worked on J. The goal was not just to address the difficulty of using non-ASCII characters, but to add functional programming, hook and fork, easier tacit programming and fix certain other issues that even Ken Iverson himself acknowledged [1][2].
PS: I use Dyalog APL because I do personally like the once-alien-to-me characters!
> to add functional programming, hook and fork, easier tacit programming and fix certain other issues that even Ken Iverson himself acknowledged
Whitney and Iverson disagreed about a lot of these things - e.g. Whitney prefers scan and fold to go left-to-right, rather than Iverson's right-to-left. Iverson's choice is consistent with the rest of the language; Whitney's is not (though it is much more practical). Also, Iverson added hook and fork; Whitney preferred the much simpler multi-monadic or dyadic-multi-monadic juxtaposition.
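For reference, a J fork (f g h) applied to y computes (f y) g (h y); the canonical illustration is the arithmetic mean (a minimal illustrative sketch):

   mean =: +/ % #   NB. fork: the sum of y, divided by the count of y
   mean 1 2 3 4
2.5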
I don't remember where I read this, but I do remember reading Iverson conceding that Whitney's practical approach is probably a better idea than his own pure approach.
What's interesting is that in reference [1] you provided, Iverson himself confirms exactly what I said about the reasons J abandoned APL's symbology. Here's an excerpt:
"When we came to implement APL, the alphabet then widely available on computers was extremely limited, and we decided to exploit a feature of our company’s Selectric* typewriter, whose changeable typing element allowed us to design our own alphabet of 88 characters. By limiting the English alphabet to one case (majuscules), and by using the backspace key to produce composite characters, we were able to design a spelling scheme that used only one-character words for primitives.
Moreover, the spelling scheme was quite mnemonic in an international sense, relying on the appearance of the symbols rather than on names of the functions in any national language. Thus the phrase k↑x takes k elements from x, and ↓ denotes drop.
Because the use of the APL alphabet was relatively limited, it was not included in the standard ASCII alphabet now widely adopted. As a consequence, it was not available on most printers, and the printing and publication of APL material became onerous. Nevertheless, in spite of some experiments with “reserved words” in the manner of other programming languages, the original APL alphabet has remained the standard for APL systems."
I actually had one of those IBM Selectric printers with a custom APL print head.
The thinking behind this approach was completely flawed. APL never competed with BASIC or FORTRAN. It was a language ahead of its time and one that used symbols associated with meaning, much like math does. The integral symbol is understood by US, Russian, French and Pakistani scientists and engineers without necessarily having to know each other's spoken languages. That's part of the power of notation as a tool of thought: ideas are reduced to symbols that are filled with meaning. For example, ⌊ and ⌈ represent minimum and maximum, respectively. You only have to see these once to get it and, likely, never forget it.
Why is notation a tool of thought? Well, the paper I referenced in my prior post covers this well, so I won't try to redefine it here. Using the minimum and maximum symbols I just introduced, we can show a simple case of this:
A ← B ⌊ C
Means "set A to the lesser of B and C". Which would be something like this:
if (B < C)
{
    A = B;
}
else
{
    A = C;
}
Or
A = (B < C) ? B : C;
The latter feels more "expressive" and concise.
So, what's the big deal? The second form (or a range of other forms) is easy to read and understand. And that's true. Yet that isn't notation; it's a bunch of ASCII text and unnecessary stuff that ONE SYMBOL can replace.
What happens if A, B and C are arrays of 100 elements? Or, matrices of 10 by 10 elements? Or tensors having 10 by 10 by 50 elements?
Well, in APL the expression might still be relevant:
A ← B ⌊ C
This will result in A containing the result of comparing B and C element by element and picking the smaller of the two. Now we are expressing ideas without piles of code. A C-like language would require two or more nested loops, indices, etc. Your mind quickly starts to separate from living in the problem space to dealing with the mechanics of expressing ideas. A single symbol becomes ten lines of code.
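For a sense of scale, here is a rough C sketch of that element-wise minimum for two 10 by 10 matrices (the function name and fixed size are mine, purely illustrative):

#include <stddef.h>

#define N 10

/* A[i][j] gets the lesser of B[i][j] and C[i][j], element by element. */
void elementwise_min(double A[N][N], const double B[N][N], const double C[N][N])
{
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            A[i][j] = (B[i][j] < C[i][j]) ? B[i][j] : C[i][j];
}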
BTW, I said "might still be relevant" because it depends on what we are after if we are comparing two vectors for a "minimum of B and C" condition. I won't get into all the permutations. I'll just keep it to the simple case of an element-by-element comparison.
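With two small illustrative vectors (the values are mine), the element-by-element case looks like this:

      B ← 3 1 4 1 5
      C ← 2 7 1 8 2
      B ⌊ C
2 1 1 1 2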
When we introduce the reduce operator "/" things get more interesting. This operator inserts the function between the elements of a vector (or along the last axis of a multidimensional array):
A ← ⌊/ B
This will take array B and return the smallest value. If B is a matrix, it will do this for each row.
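A short illustrative session (again, the values are mine):

      ⌊/ 3 1 4 1 5
1
      M ← 2 3 ⍴ 3 1 4 9 5 6
      ⌊/ M    ⍝ one minimum per row
1 5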
Again, we are expressing ideas with simple symbols, just as we do in math.
J departed from this in order to solve the weird-symbols "problem" when, in reality, the weird symbols were the source of APL's power and, in my opinion, could have become the future of humans telling computers what we want them to do and how.
Think about it. Why do we need so many ASCII-english-based languages? A loop is a loop is a loop. A conditional is a conditional is a conditional. Objects, classes, functions, etc. Same crap. And, in most cases, in the end, very similar machine code.
We need to eventually make a departure from this and create a "CS language", one where doing something like finding the smallest element of an array boils down to a symbol and not much more than that. One where, perhaps, a genetic binary solver or just the genetic step of mating individuals of a population can be represented by a single symbol or a set of relevant symbols. Perhaps sorting can have its own symbology.
I'm just riffing here, but I hope you get the point. Over time, CS will no longer be about developing more and more languages that repeat 95% of what every other language has done and will ever do, but rather about adding to the symbolic alphabet of CS, which, when executed by any machine, will produce the same results, today or in 500 years, just like mathematics.
This is interesting to hear since I came to J without having worked on previous APL flavors and am still a big fan of my brush with it. I will go read that paper in depth though so I can at least grasp what I am missing. What is the most tractable example of what you feel J lost from APL?
I wish I had the time to devote to thinking and doing research on this front. Running three businesses makes that pretty hard. At the same time, I realize it might take someone who has the historical frame of reference to fully grok what could be possible. Having used APL professionally for about ten years on solutions ranging from business to DNA sequencing software does give me that perspective.
Having said that, there are plenty of people out there who've had the same or an even deeper experience path in APL. The question is whether or not they have the interest in using some of the concepts (not all; APL is far from perfect) to try to understand how we might tell computers what we want them to do 100 years from now and beyond.
Again, if I could focus my efforts and devote all my time to it I'd do it. I just can't.
Having dealt with APL in the past, I've always been a bit skeptical of the "Blub Paradox" and other forms of Lisp triumphalism. APL is clearly far more powerful than any other language in some respects and far more limited in others.