Wirth Evolution (alopex.li)
38 points by todsacerdoti on Sept 12, 2022 | 14 comments



PL360, Algol W, and EULER are additional languages created by Wirth. PL360 provided low-level access to IBM System/360 mainframe hardware. Algol W made a number of improvements to Algol 60 to make it more practical; I think it was a reaction to Algol 68, which was very complex for its time and consequently ended up with very few compilers[1]. I don't know anything about EULER.

I had the pleasure of meeting Niklaus Wirth around 1977 at Texas Instruments. I worked in a department that had developed a Pascal compiler for TI processors; Wirth had been invited to give us a demonstration of his Pascal/Modula-based system. I don't remember the details of the impressive presentation, but I do remember him telling the joke about his name: in Europe they call him by name (it sounds like "Virt"), while Americans call him by value, pronouncing it "Worth".

I enjoyed his books a great deal and read every published paper of his that I could get my hands on. I learned a great deal from his clear writing during formative years in my programming career.

--- below this line, I stray off topic ---

[1] For example, the Algol 68 Report used a two-level grammar (a van Wijngaarden grammar, or W-grammar) to describe the syntax of Algol 68. Such a grammar has one set of rules that expands into an arbitrarily large set of context-free productions describing the syntax of an Algol 68 program. W-grammars are capable of defining recursively enumerable languages. The hope in the Algol 68 report was that more language features could be described precisely by such a formal grammar, rather than by prose requirements like "a variable must be declared before use".

While admirable, such a goal isn't practical, because the set of recursively enumerable languages is essentially equivalent in power to the set of Turing machine programs, or Chomsky's type-0 grammars. We don't want compilers burdened with solving the unsolvable halting problem.
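To make the two-level idea concrete, here is the standard textbook toy example (my rendering, not taken from the Report): a W-grammar for the non-context-free language aⁿbⁿcⁿ, in roughly the Report's notation.

```
N :: one ; N one.            Metarule: the metanotion N may stand for
                             one, one one, one one one, ...

start : N as, N bs, N cs.    Hyper-rules: each is a schema standing for
one as : a symbol.           infinitely many context-free productions,
N one as : N as, a symbol.   one per substitution for N.
one bs : b symbol.
N one bs : N bs, b symbol.
one cs : c symbol.
N one cs : N cs, c symbol.
```

The crucial rule is consistent substitution: within one application of a hyper-rule, every occurrence of N must be replaced by the same string, so "start" forces equal counts of a's, b's, and c's. If I recall the Revised Report correctly, it plays the same trick with a metanotion (NEST) carrying the entire declaration context, which is how "declared before use" becomes a purely syntactic property.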


As always, the off-topic invitation cannot be passed up.

The two-level grammar of Algol 68 gave the semantics of Algol 68 in a denotational style via the second level. As such, it is no more complicated than other languages with defined semantics, such as those in the ML family. To my knowledge, people used that second level to implement a real virtual (ho hum) machine, resulting in a working interpreter. ML aficionados may be surprised to learn that their language was not the first with fully defined semantics. Algol 68 was ahead of its time.

As for Pascal, the history as I know it says that Wirth invented Pascal as a training language for a course in which a one-pass compiler was to be built. Only later, when the Algol 68 project became complicated, did Wirth start to market it as a superior design - something it never was.

And yes, I enjoyed Turbo Pascal.


Is there a source on van Wijngaarden grammars that’s even barely readable to the modern eye (and more detailed than Wikipedia)? I tried a number of contemporary papers, and they’re impressively impenetrable when they try to define the whole thing inside a page and a half—which is entirely unsurprising even if you set aside the prehistoric terminology (“notions” are ... nonterminals? or maybe all symbols? I can’t tell).

In any case, there’s something to be said for defining a language in a series of layers: tokens, then trees, then perhaps elaboration / desugaring, then one or more layers of semantics (typing, binding, execution). In an actual frontend, you probably even want a looser tree syntax followed by a layer of “semantic” checks that could technically be folded into the syntax, in order to be able to tell the user things like “you can’t use an array type like that” rather than “unexpected bracket”. Compare Lisp s-expressions, the Dylan idea of skeleton syntax trees, and the class of visibly pushdown languages as used in one recent structural editor[1]. Contrast the refusal of the Glasgow Haskell Compiler to desugar before typechecking[2].

[1] https://news.ycombinator.com/item?id=31384528

[2] https://www.aosabook.org/en/ghc.html
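The "looser tree syntax plus semantic checks" point above can be sketched in a few lines of Pascal. This is a toy of my own, not from any real frontend: the "parser" accepts any name-bracket-name form uniformly, and a separate check rejects indexing a non-array with a targeted message instead of a parse error.

```pascal
program LooseThenCheck;
{ Toy: syntax accepts "base[idx]" for any base; a later semantic
  pass produces "not an array" rather than "unexpected bracket". }
type
  Kind = (ArrayVar, ScalarVar);
  Node = record
    Base, Idx : string;      { what the permissive parse produced }
  end;

function KindOf(const Name: string): Kind;
begin
  { stand-in for a symbol table: only 'a' is an array here }
  if Name = 'a' then KindOf := ArrayVar else KindOf := ScalarVar
end;

procedure CheckIndex(const N: Node);
begin
  { the grammar already accepted N; only now do we complain meaningfully }
  if KindOf(N.Base) <> ArrayVar then
    WriteLn('error: ''', N.Base, ''' is not an array, cannot index it')
  else
    WriteLn('ok: ', N.Base, '[', N.Idx, ']')
end;

var N: Node;
begin
  N.Base := 'a'; N.Idx := 'i'; CheckIndex(N);   { accepted }
  N.Base := 'x'; N.Idx := 'i'; CheckIndex(N)    { targeted error }
end.
```

Folding the check into the grammar instead would turn the second case into a syntax error at the bracket, which is exactly the worse diagnostic.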


I learned Algol 68 by studying the Revised Report on Algol 68, which was published in 1977 as an issue of ACM SIGPLAN (Special Interest Group on Programming Languages) Notices. The publication can be downloaded for free from the ACM website, see [1]. The first author is van Wijngaarden.

Page 6 of the report presents the ideas of the W-grammar. Protonotions, notions, metanotions, hypernotions, paranotions, symbols, productions, and production trees are all introduced just to explain the grammar! I encourage readers to take a peek at almost any paragraph of the 70-page Algol 68 report and compare it to The Programming Language Pascal Revised Report by Wirth [2],[3]. It's easy to see why Pascal became quite popular and Algol 68 never did.

Despite the fact that Algol 68 was too difficult to compile efficiently at the time, it introduced a number of features we take for granted today. See Richard Hamlet's Ignorance of Algol 68 Considered Harmful, [4].

On a personal note, I was a young Computer Scientist at the time these reports were published, and I miss how easy it was back then to keep up with the research that was going on at the time. Today it's impossible to be a generalist. So many independent and important branches have sprouted off the trunk of CS that everyone is a novice in major parts of the field.

[1] ACM SIGPLAN Notices, Volume 12, Issue 5, May 1977 pp 1–70, https://doi.org/10.1145/954652.1781176

[2] http://standardpascaline.org/The_Programming_Language_Pascal...

[3] https://www.amazon.com/Pascal-User-Manual-Report-Standard/dp...

[4] ACM SIGPLAN Notices, Volume 12, Issue 4, April 1977 pp 51-56, https://doi.org/10.1145/954654.954659


Is the "Learning Algol 68 Genie 3.0" guide [1] of some help?

[1] https://jmvdveer.home.xs4all.nl/en.download.learning-algol-6...


First off, thank you for finally getting me to read that, it’s been sitting around in my to-read list since forever (2017? but the version is 2.0 from 2010? a long time ago, anyway).

Second... OK, I got through the first part (the language description), and it basically confirms my earlier brief glances at various Algol 68 documents: it’s a fundamentally pretty nice and consistent language with some warts, some hilariously archaic points, and godawful one-of-a-kind terminology, and its age excuses basically all of those. (No surprise I liked it though, I like both Standard ML and METAFONT, and those are both transparently Algolish in different ways.) Not looking forward to the ALLCAPS operators, but then I wrote some all-caps Forth some time ago and it was okay.

Third, though—that description (unlike the report) doesn’t seem to be using W-grammars at all? It uses what is pretty pedestrian ABNF in funny notation. It’s very helpful in getting me to understand how Algol 68 works, but doesn’t really relate to my original question of how those two-level W-things work. Except by maybe reverse engineering them by comparing the report with the guide, but that sounds like a last-resort option.


I think the author meant this to be a bit tongue in cheek, but if one were to take it seriously, there would be much to criticize about the approach. One big concern I'd have, for instance, is relying on the standards as a reflection of Wirth's state of mind in the design of a language.

This is a questionable idea for almost any programming language, as a standard represents an endpoint of the design process rather than the starting point. It's even more questionable for Wirth-designed languages, as he was not very involved in the standardization efforts. By the time Pascal was standardized, he had moved on to Modula-2, which he thought had superior solutions to many of Pascal's design flaws, so he had little use for standardized Pascal.

Furthermore, the languages themselves had quite a bit of evolution over their lifetimes. Wirth and his team were very involved in systems design and compiler construction, Wirth was a big advocate for simplicity, and that philosophy included co-evolving the programming language along with the system. Every now and then he decided that some particular feature was more trouble to implement in a new compiler than it really was worth in practice, so he simply changed the language spec.


> "Such a module is called a monitor…” Not a mutex? It literally has the words “mutual exclusion” in it. Is this where Java’s bogus synchronization comes from?

> Can’t really blame him, literally nobody knew how to do multithreading well before Rust. Even in 2013 your choices were limited to “Erlang” and “suffering”. And in 1978 I’m not sure that anyone even had done enough multiprogramming for practical concerns like that to come up.

To borrow from the author himself, "omg it's adorable". Per Brinch Hansen invented, named, and implemented the "monitor" concept in the early 1970s and published it in his 1973 book "Operating System Principles" (C.A.R. Hoare would popularize the concept further in his 1974 paper "Monitors: An Operating System Structuring Concept"). That book also describes the RC 4000 operating system, which, by the way, inspired the whole microkernel research program of the '70s and '80s. By 1978, Brinch Hansen had published another book, "The Architecture of Concurrent Programs", in which he describes both a new language, Concurrent Pascal, and a new OS called "Solo", implemented entirely in Concurrent Pascal.

So believe it or not, people did know how to do multithreading long before Rust graced us all with its existence.


It's called a procedural language for a reason :)

Something that doesn't return a value shouldn't be called a "function", that's just a C-ism. And curly braces are ugly! </troll>

Integers, enums, and chars are all so-called "ordinal" types. They have consecutive values with a lower and an upper bound, which is why they can all be used as array indexes or in sets (floating-point numbers can't be, nor could enums with gaps, if those existed).

Ranges aren't used just for array bounds. I think it's really elegant how the usual "0 .. len-1" is actually an anonymous subtype of integer, not some special syntax for declaring arrays.
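Both points in one small sketch, in Free Pascal syntax (the names here are mine, not from any report):

```pascal
program Ordinals;
type
  Day   = (Mon, Tue, Wed, Thu, Fri, Sat, Sun);  { enum: an ordinal type }
  Index = 0 .. 9;                               { subrange: an ordinal subtype of integer }
var
  hits   : array[Index] of Integer;  { the index type is just a subrange }
  perDay : array[Day] of Integer;    { enums index arrays directly }
  open   : set of Day;               { sets require an ordinal base type }
  i      : Index;
begin
  open := [Mon .. Fri];
  for i := Low(hits) to High(hits) do
    hits[i] := 0;
  perDay[Wed] := 1;
  if Wed in open then
    WriteLn('open midweek')
end.
```

Writing `array[Index]` and `array[0 .. 9]` mean the same thing; the range in the array declaration is an ordinary (anonymous) subrange type, not array-specific syntax.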


> Something that doesn't return a value shouldn't be called a "function", that's just a C-ism.

Something that does not seem to return a value in fact returns a value of the type void, whose only value is "empty". The idea comes from Algol 68 and was a contribution by a certain John McCarthy.


Ahh, I finally get why C calls the unit / top type “void”—it’s empty as in it requires no storage! (It’s not the usual empty / bottom / contradiction type, which has an empty set of possible values and would imply the function does not return at all.)


>All of these languages follow the same block syntax style designed for quite dumb one-pass compilers.

Recursive descent single-pass compilers are elegant and damned fast tools. I've always had compiles that just happen in a blink. As I pick up more C projects, I know it's going to suck waiting for compiles.
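For flavor, here is a toy in the style Wirth championed: one recursive routine per grammar rule, acting (here: evaluating) while it parses, with no AST and no second pass. All names are mine; this is not code from any Wirth compiler, and error handling is omitted.

```pascal
program Calc;
{ Single-pass recursive-descent evaluator for single digits, +, *, and
  parentheses. Each routine mirrors one grammar rule and computes its
  value immediately, the way a one-pass compiler would emit code. }
var
  Src : string;
  P   : Integer;

function Peek: Char;
begin
  if P <= Length(Src) then Peek := Src[P] else Peek := #0
end;

procedure Next; begin Inc(P) end;

function Expression: Integer; forward;

function Factor: Integer;          { Factor = digit | "(" Expression ")" }
begin
  if Peek = '(' then
  begin
    Next;                          { consume '(' }
    Factor := Expression;
    Next                           { consume ')' (no error checking) }
  end
  else
  begin
    Factor := Ord(Peek) - Ord('0');
    Next
  end
end;

function Term: Integer;            { Term = Factor { "*" Factor } }
var R: Integer;
begin
  R := Factor;
  while Peek = '*' do begin Next; R := R * Factor end;
  Term := R
end;

function Expression: Integer;      { Expression = Term { "+" Term } }
var R: Integer;
begin
  R := Term;
  while Peek = '+' do begin Next; R := R + Term end;
  Expression := R
end;

begin
  Src := '2*(3+4)';
  P := 1;
  WriteLn(Expression)              { prints 14 }
end.
```

The whole thing is one left-to-right pass over the input with no intermediate representation at all, which is why this style compiles in a blink.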


Wirth designed two languages before Pascal: EULER and Algol W.


Actually, it was Wirth and C.A.R. Hoare who did Algol W (“A Contribution to the Development of ALGOL”, CACM, June 1966). Decades ago, I did a fair bit of programming in Algol W; it was quite pleasant.



