I did a Fortran 74 course in '76 and we touched on Algol 68, and it was a giant, colossal mind fuck. Call by name, call by value, call by reference... Give me back my giant global section and begone!! What is this code-as-arguments stuff? Recursion? That's the devil's music!!!
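Since all three passing modes get name-checked here (and Go comes up later in the thread), here's a rough sketch of what they mean; it's not ALGOL syntax, and call by name is only approximated with a thunk, which is the usual way to fake it in a language that doesn't have it:

```go
package main

import "fmt"

// Call by value: the callee works on a copy, so the mutation is invisible outside.
func byValue(x int) { x = x + 1 }

// Call by reference: the callee gets the variable's location and can update it.
func byReference(x *int) { *x = *x + 1 }

// Call by name (the ALGOL oddity): the argument expression is re-evaluated
// on every use of the parameter. A closure ("thunk") approximates it here.
func byName(x func() int) int { return x() + x() }

func main() {
	n := 1

	byValue(n)
	fmt.Println(n) // still 1

	byReference(&n)
	fmt.Println(n) // now 2

	// The expression runs twice, once per use of the parameter,
	// so its side effect also happens twice.
	fmt.Println(byName(func() int { n++; return n })) // 3 + 4 = 7
	fmt.Println(n)                                     // 4
}
```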
They astounded the attendees – who had made estimates of up to 100 man-years to implement the language, using up to 7 pass compilers – when they described how they had already implemented a one-pass compiler which was in production use in engineering and scientific applications.
It reminds me of the story of how Donald Knuth wrote an early ALGOL compiler by himself, in 3.5 months:
An amazing feat, though that was ALGOL 58, a rather different beast than ALGOL 68 (though hardware was also significantly less capable then, of course).
From FreeBSD's ports collection: "The Algol 68 Genie project preserves Algol 68 out of educational as well as scientific-historical interest, by making available Algol 68 Genie; a recent, well-featured implementation written from scratch." It's GPL.
I strongly recommend this talk by software consultant and author Kevlin Henney called "Procedural Programming: It's Back? It Never Went Away".
He talks a bit about Algol 68 and how influential it was on programming language design. Many language keywords we're familiar with today originated in the 50-year-old language specification for Algol 68: int, bool, skip, void, struct.
> They astounded the attendees – who had made estimates of up to 100 man-years to implement the language, using up to 7 pass compilers – when they described how they had already implemented a one-pass compiler which was in production use in engineering and scientific applications... The ALGOL 68-R compiler was initially written using the RRE ALGOL 60 compiler, including extensions for address manipulation and list processing but stripped of features such as real number handling, which made it more suitable and more focused for compiling a compiler.
Dropping any kind of real or float support sounds like a major issue for scientific use; did users simply have to express everything as scaled, fixed-point integers? What else did they drop to get their subset/prototype working?
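For what it's worth, here's a toy sketch of what "everything as fixed-point integers" looks like in practice; the representation and names are made up for illustration, not anything the 68-R users are documented as doing:

```go
package main

import "fmt"

// Toy fixed-point type: values are stored as integer thousandths,
// so 3.142 is held as 3142. Hypothetical, for illustration only.
const scale = 1000

type fixed int64

// fix builds a fixed-point value from a whole part and thousandths.
func fix(whole, thousandths int64) fixed { return fixed(whole*scale + thousandths) }

// mul multiplies two fixed-point values, dividing the extra scale factor
// back out (and silently truncating, which is exactly the kind of thing
// users of such a scheme would have had to worry about).
func (a fixed) mul(b fixed) fixed { return fixed(int64(a) * int64(b) / scale) }

// String prints non-negative values as d.ddd.
func (a fixed) String() string { return fmt.Sprintf("%d.%03d", int64(a)/scale, int64(a)%scale) }

func main() {
	pi := fix(3, 142) // roughly 3.142
	r := fix(2, 0)    // 2.000
	area := pi.mul(r).mul(r)
	fmt.Println(area) // 12.568, i.e. pi * r * r with truncation error
}
```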
My informatics teacher was a big fan of Niklaus Wirth. Unexpectedly, ~15 years down the line, remembering her functional programming maxims came in handy when dealing with Golang. I found the languages share some of the same vibe.
I'm not sure myself after all those years, but it felt like functional programming.
From my personal impression, Golang is not much more about functional programming than, say, JS, but it felt like there were some attempts to follow it in its foundations.
In the end, it failed to realise Wirth's original thought: "with clear separation of functions and inputs, every program turns into a spreadsheet".
That does not really sound like Wirth, who was never involved in functional programming as far as I can tell and was not a big fan of spreadsheets.
Wirth was always insistent that programs should strive to be transparent about the computational effort involved in an action, so I can't imagine him being fond of, e.g., Haskell.
Then my memory fails me. Maybe I am mistaking him for somebody else. But I do remember my teacher obsessing over the virtues of obscure languages like Algol, Oberon and Modula.
I still have that Algol68-R User's Guide on my shelf and remember to this day the excitement that gripped me when I learned from it about REFs (there's a rough sketch of the idea at the end of this comment for anyone who hasn't met them).
I went on to implement my final year project in it, which was considered by all concerned to be a crazily risky thing to do.
The only thing that matched it for excitement was connecting for the first time to a "live" remote computer through a teletype and that thing actually typing things back at me.
I wish I'd stuck with it.
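For anyone who hasn't met REFs: ALGOL 68 treats a reference to a storage location as an ordinary value, distinct from the value stored there, and separates binding a name to a value from assigning into a location. The nearest everyday analogue I can offer is a loose Go comparison of my own, nothing taken from the User's Guide:

```go
package main

import "fmt"

func main() {
	// ALGOL 68's "INT n = 5" binds a name to a value once and for all;
	// its "INT m" is shorthand for "REF INT m = LOC INT", a name for a
	// location you can keep assigning into. Loosely, in Go terms:
	const n = 5  // like INT n = 5: a value, never assignable

	var cell int // a storage location
	m := &cell   // like a REF INT: a value that refers to the location
	*m = 7       // assignment goes through the reference
	fmt.Println(n, cell) // 5 7

	// Because the reference is itself just a value, it can be passed
	// around, stored, and shared.
	bump := func(r *int) { *r++ }
	bump(m)
	fmt.Println(cell) // 8
}
```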