Niklaus Wirth, or the Importance of Being Simple (acm.org)
304 points by madmax108 on Jan 15, 2024 | hide | past | favorite | 155 comments



The article mentions Philippe Kahn (Borland co-founder) as a student of Wirth, which I'd never heard before. The podcast [1] confirms it. Borland's decision to buy Compas Pascal may well have been influenced by Kahn's early impressions.

[00:07:12] ... I remember you had a choice between two programming languages, and on one side they taught Fortran, and Fortran is the language of science, or it was the language of scientists.

[00:07:40] And then there was this new class that was started by this Professor Niklaus Wirth about Pascal. And it was, I think the first or second year it was taught. There were a lot of people in the Fortran class and not that many people in the Pascal class. So I said, oh, I'll go to the Pascal class.

[00:07:59] And that's how I met Professor Wirth. And that was great. That was my favorite class from that moment because he's such a, such an enlightened person and a clear thinker that it was a great, great experience for me.

[1] https://ethz.ch/en/news-and-events/eth-news/news/2022/05/we-...


That's a very cool fact!

Whenever I read these stories, I kinda wish I had grown up one generation earlier in computing. It seems like the industry was smaller and more exciting.

Although Microsoft probably made the industry worse for a lot of people (definitely for Borland) -- I think many people forget that these days.


> in the sense that the limitations and exclusions of the language design were precisely what made compact implementations possible and widely successful.

All of the Pascals that were widely successful extended the language in key ways. I was initially a fan of Pascal, until I discovered that a large amount of my time was spent trying to work around limitations of the language.

With C, those workarounds disappeared and I was far more productive.

(I know C compilers always have numerous extensions, most of them of dubious value, but still, plain C remains a far, far more useful language than the Pascal of "Pascal User Manual and Report". Which is why C buried Pascal.)


As I said in another forum earlier this week,

"A lot of the hate centered around how it wasn’t a systems programming language, when it wasn’t supposed to be one. It’s like complaining your driving instructor’s car can’t be used to dig a trench without extensive modification."

After BASIC, I learned on Pascal (Apple and IBM). It was invaluable for clarifying in my mind how programming structures work, as opposed to the (fun) chaos of early BASICs. I really didn't need much more than the I/O available in standard Pascal at the time. And it hid details, like endianness, that I was not yet ready to handle.

Were there problems? Of course. Among others, the vendors should have done a lot more to standardize extensions, the P1 parameterized array bounds should have been in the initial spec, and while the P-machines had many virtues, performance was not one of them. Far too many of the early implementations just ran the P-code in an interpreter.


While there may be general agreement that Pascal is a great teaching language and a weak systems language, I believe that early versions of the Mac operating system (System N, not OS X) relied heavily on a modified version of Pascal. Perhaps Steve did not get the memo.


Because Clascal/Object Pascal had an enormous number of non-standard extensions to make it a much more systems- and apps-oriented language than the 1973 version. I believe Bill Atkinson was involved in the decision. Same story with Turbo Pascal and Delphi.

Walter's complaint is partially that it needed extensions to do anything non-educational on 1973 hardware. This is true, but to me it is vaguely nonsensical, as the language clearly had not been designed for that and was labelled as such.


It's been well over 40 years since I wrote Wirth Pascal code, but one of the problems was that one couldn't write a line without a line ending. That's not an activity confined to system programming!


Sure. But for the output of the tiny calculator class project we did, it just didn't matter.


Yes, Pascal is good for that purpose.


Many of the decisions for Pascal seem aimed for a teaching language as opposed to a production language. For instance, in a production language, optimisation of generated code is worth a longer compile cycle, but in a teaching language (where programs are repeatedly resubmitted until they barely run, and then are never touched again) short* compile cycles are everything and quality of generated code is an afterthought.

* and cheap: I remember in the days of 30 engineers sharing a VAX that if one person was compiling at a time everything was snappy, but (especially near the end of quarters!) if ten people tried compiling at once interactive latency suffered greatly. Classroom use must've been even worse.


> Many of the decisions for Pascal seem aimed for a teaching language as opposed to a production language.

Because it was. The fact that it could be extended in so many ways to be a production language shows it had 'good bones', but many practical issues, like I/O, were left as an 'exercise for the student'.

There seems to have been a lot of revisionist history around Wirth's passing, with people using Pascal's limitations as an indictment of his PLT creds, virtually all of which ignores that he was an academic working in an academic environment on topics that interested him, at the virtual beginning of programming, on very, very limited machines. It's like calling Watt a hack because he didn't also add a supercharger and emissions controls to the steam engine.


Here's what Wirth said about it in retrospect, and yes, it was explicitly designed as a teaching language with a syntax suitable for a one-pass recursive descent compiler: http://youtu.be/5RyU50qbvzQ


> For instance, in a production language, optimisation of generated code is worth a longer compile cycle, but in a teaching language (...) short* compile cycles are everything and quality of generated code is an afterthought.

I don't think this is a valid take. It sounds like an attempt to rationalize away the awful build times of some systems language. C++ has atrocious build times, but in contrast, C compiles almost instantly. Other production languages such as C# and Java also have fast build times.

I don't think long build times are correlated with optimization or production-ready code. They are, however, linked with attempts to get compilers to do a lot of work just to turn our code into machine code, and some of that work exists because certain aspects of a language's design require extra effort just to pull off an optimization.


> but in contrast C compiles almost instantly

was that the case 30 years ago?


For some compilers, yes.

Also, code was a lot shorter in those days.


Haha, so true! The Caltech PDP-10 slowed to an agonizing crawl the week before finals. Even though playing games on it was banned for that week.


Original Pascal was a teaching language. That's all.


Are you saying that Pascal dialect had WriteLn but not Write?


No, it had write(), but some versions on some OSes needed an EOL to flush the buffers. In general the file ops were deliberately underspecified [1], but it was 1973 and the variety of OSes was much weirder than today's.

[1] https://www.standardpascaline.org/The_Programming_Language_P...


> It’s like complaining your driving instructor’s car can’t be used to dig a trench without extensive modification.

Correct. But then you get hired for a trench-digging job, and told that you have to use the car, because reasons. And so you wind up really hating the car, when the problem was the other people who decided that it was the right tool for that job.


> But then you get hired for a trench-digging job, and told that you have to use the car, because reasons.

That's not a problem caused by the car. That's a problem caused by someone looking at a worksheet, seeing "dig a trench" in it, and promptly saying "this looks like a job for my trusty car."


> It’s like complaining your driving instructor’s car can’t be used to dig a trench without extensive modification.

It's more the other way around: learning to drive on a trench digger. Then you find out that what you really need on the places people actually drive, like a freeway, is a car.


Agreed. Pascal was simple to understand quickly, and I was in a productive flow.

Then suddenly I realized there was stuff I couldn't do with it. That was the last day I used Pascal -- and I remember it as my dearest experience in programming!


Pascal also has had numerous extensions.

No one used the Pascal of the 1976 "Pascal User Manual and Report".

Strangely, compiler extensions are only cool when talking about C; in fact, most C developers have no idea what ISO C is actually all about.


I did, in 1986 through 88. The only extension it had was separate compilation. (This was on PDOS running on a 68000, for the morbidly curious.)

And I agree with WalterBright. Unextended C was far more usable than unextended Pascal.


From the looks of Small-C and RatC, not really that usable.

During the 1980s there were already several usable Pascal alternatives; plus, Modula-2 came out in 1978 exactly to fix the shortcomings of Pascal without extensions, and the largely ignored ISO Extended Pascal came in 1991, ratifying many of the common 1980s extensions.

Using original Pascal was really for when there was no alternative, like myself trying to fit some form of Pascal compiler into 48 KB on top of the Timex 2068 ROM, which naturally wouldn't fit.

Neither did the Dartmouth BASIC JIT compiler, for that matter, or one of those Small-C / RatC alternatives.


I remember the Modula-2 crowd being rather bitter that the advent of C++ sunk M-2. One of the M-2 compiler guys ruefully said to me "I backed the wrong horse." (Zortech C++ had taken the PC market by storm then.) The success of ZTC++ is why Borland changed direction from Pascal to C++. I heard rumors that Microsoft had been internally developing its own OOP language, but abandoned it and switched to C++ also after the success of ZTC++.


Borland did keep supporting Pascal in addition to C++ then, no? They introduced Object Pascal with Turbo Pascal 5.5, IIRC, and later kept extending that with Delphi. As far as I know, their strategy was playing both Pascal and C++ up through the bitter end with Delphi and C++ Builder. I don't know which of them sold better, C++ or Pascal/Delphi, but Delphi was certainly popular enough in its heyday.

Obviously, Object Pascal was a very far cry from the Pascal User Manual and Report, and even in its extended version it was strictly less powerful than C++, so your point still stands. But it was more beginner-friendly. In my opinion, the moment you got beyond the basics (no pun intended), it was friendlier than BASIC or Visual Basic, while far more powerful. That was quite a sweet spot.

Once I felt comfortable with C++, I couldn't go back to Pascal. Even with the extensions, there were some things that were too painful for me like lack of RAII, lack of Generics, short strings by default. Perhaps some of these are addressed by Free Pascal or modern versions of Delphi nowadays, but that ship has sailed. I feel like Pascal with its extensions worked great in the 1980s and 1990s, and I have very fond memories of my time using it, but it just doesn't have much to offer anymore. There are other beginner friendly languages out there that are powerful enough and have better tooling and far larger communities. And it was all about the tooling and community all along.


To this day, many of the C++ Builder libraries are actually written in Borland's Object Pascal (Delphi) dialect.


Do you ever wonder what would have happened if you had not written a C++ compiler?


However, Wirth himself realized the problems of Pascal, and his later languages are basically improved versions of it: Modula, Modula-2, and Oberon. But these languages didn't even really displace Pascal itself, let alone C. Maybe if he had named them in a way that made it clear to outsiders that they were Pascal improvements, they would have had more uptake.


Object Pascal got most of those improvements, hence it was hard for them to be adopted; maybe if a big OS vendor had picked them instead of C++, or later Java, it would have helped.

At least we have some of Oberon's ideas living on in Go, which, despite my usual rants, is preferable to plain C.


One could argue that C's success is largely because it was even simpler than Pascal and more generic --- a notable example is that Pascal has I/O built-in to the language, while C defined them as part of the standard library (which could even not be present.)


From a compiler writer's perspective, C is much more complex than Wirth's Pascal.

Pascal's builtin I/O was a major impediment to its usability.

However, one really great feature of Wirth's Pascal is nested functions with access to outer scopes, which D wholeheartedly embraces. They really are slick. I use them heavily.


> one really great feature of Wirth's Pascal is nested functions with access to outer scopes, which D wholeheartedly embraces

Can you give an example? What does the function access in the outer scope? Is it like an environment-capturing closure?


It can access the variables in an outer scope:

    int moon(int s)
    {
        int sum(int x) { return s + x; }
        return sum(3);
    }
An extra parameter is passed to the nested function, called a "static link", which is a pointer to the stack frame of the statically enclosing function. This is in addition to the usual "dynamic link" which is a pointer to the stack frame that called the nested function.

Nested functions can also be nested, and can access variables in enclosing functions by walking the static links back.

The neato thing about this is it makes stack frames work exactly like members of a struct. In fact, a pointer to a nested function is analogous to (and binary interchangeable with) a pointer to a stack object.


My first (and only) serious compiler had such nested functions, though without stack frames. Instead my VM had two stacks: one for function arguments and locals, and the other for return addresses, same as Forth.

I had no stack frame at all. Instead, the compiler kept track of the stack offset of every accessible local (relative to the top of the stack). That way the implementation of my nested functions was kind of trivial: there was no difference between true locals like `x` and locals from an outer scope like `s`, except of course for the possibility of shadowing.

One reason this was not special is that internally, it was nested scopes all the way down: there was one scope per local variable, so merely declaring two variables already meant dealing with nested scopes. Once that was out of the way, adding nested functions was really easy (I believe it added less than 20 lines to my compiler).

Nowadays I think I would use a single stack instead, but if my language is simple enough I’ll probably skip the frame pointer and just keep track of offsets like I did this first time.


if your outer function f called a function 'quicksort' which called a function 'partition' which called your nested function 'compare', how did 'compare' get access to the variables of f from its statically enclosing scope? how did it know how many locals 'quicksort' and 'partition' had pushed onto the operand stack?


Err… I think recursive inner functions would just blow up… oops.

I guess it’s a good thing I didn’t advertise inner functions and only used them in the standard library…


well, that's the problem the static link solves; it's not such a difficult thing to implement if you let procedure values (function pointers) be two words instead of one. gcc emits trampoline code onto the stack at runtime to support this without breaking abi compatibility, so the pointer to the nested function actually points to the trampoline, which just supplies the static link and calls the real nested function


Do you know if that’s how C++ lambda functions work?

    auto sum = [=](int x) { return s + x; };


Almost, the nested function goes out of scope too if the outer function finishes.


True. D handles that complication by allocating such scopes on the GC heap, if the nested function survives the termination of the outer function.


I get that the C syntax is very awkward compared to Pascal's. Are there more reasons that make C compilers more complex than Pascal ones, and would you tell me which, please? I'm curious.


> One could argue that C's success is largely because[...]

Why do so many ignore the obvious? Ergonomics matters. At a time when most programmers didn't touch-type (even the big stars), the economy of being able to type `int foo(int bar) { ... }` is a major productivity boost compared to `FUNCTION foo(bar: INTEGER): INTEGER; BEGIN ... END;`. Hell, that even bothers modern programmers. I mean, I know people whose entire reason for switching to Chrome when it came out was that the toolbars at the top of the screen were 18 pixels shorter or whatever. That's it. People are incredibly capricious.

Factor in the environment—Unix has a widely acknowledged beauty/elegance in its design—and it's not hard to see why people gravitated to C, even at a time when Pascal was anointed as the official development language by Apple for a platform where people were known to be able to make money writing apps and selling them to businesses and consumers alike.


Recent and related:

Closing word at Zürich Colloquium (1968) - https://news.ycombinator.com/item?id=38883652 - Jan 2024 (28 comments)

Niklaus Wirth, 1934-2024: Geek For Life - https://news.ycombinator.com/item?id=38871086 - Jan 2024 (61 comments)

Niklaus Wirth has died - https://news.ycombinator.com/item?id=38858012 - Jan 2024 (403 comments)


Nice tribute to Wirth. I just have some feedback :-)

> ...modular languages offer one implementation for each interface.

Unfortunately, this is totally incorrect. Modular languages absolutely allow any interface to have multiple implementations which can be chosen amongst. This corresponds to how a Java 'interface' can have multiple implementations. In fact programming to the interface and swapping the implementation is one of the main selling points of interfaces.

> Some of those constraints remained long after advances in hardware and software made the insistence on one-pass compilation seem obsolete.

With compile speeds nowadays we can only wish that this insistence was obsolete. It's needed now more than ever with the slow-as-molasses compilers (apart from a notable few) of today.


He is talking in the context of modules as in Modula-2 and Oberon, not language types or the interface/implementation split, which goes toward your point, as Modula-3 and Ada allowed various interface module/package declarations for the same implementation.

Still not the same as the Objective-C/Java interfaces that later became more widely known, or Standard ML modules and CLU clusters, all of them type-system based.


It's not clear to me that he means only Modula-2 and Oberon when he says 'modular languages', especially as just in the previous paragraph he says: '...stopped caring much for purely modular languages, including Ada as it was then'


Modular languages are languages that support modules.

https://en.wikipedia.org/wiki/Modular_programming

See "Languages that formally support the module concept include..."


what he says is correct about languages like modula-2, oberon, ada, ocaml, c, and c++, at least within a single compilation. you can have multiple implementations of the interface described in a single .h file, but you have to choose at most one of them every time you go to compile. this is distinct from how things like java work; java doesn't have this problem

c++ compilation is mostly slow not because c++ compilers use lalr parsers (they don't) but because c++ programs commonly wedge lots of the implementation into the "interface", i.e., the .h file, and use recursive textual #include

non-c++ compilers are super fast


Nowadays,

    import std;
    import something_with_templates;

    int main () {
      std::println ("Hello World");
      something_with_templates::my_func("data");
    }
is also super fast.


thanks! this is interesting!

for me, with the something_with_templates commented out, that gives these errors:

    $ g++ -Wall -Werror -O -std=c++23    pjmlp-hello.cc   -o pjmlp-hello
    pjmlp-hello.cc:1:1: error: ‘import’ does not name a type
        1 | import std;
          | ^~~~~~
    pjmlp-hello.cc:1:1: note: C++20 ‘import’ only available with ‘-fmodules-ts’, which is not yet enabled with ‘-std=c++20’
    pjmlp-hello.cc: In function ‘int main()’:
    pjmlp-hello.cc:5:8: error: ‘println’ is not a member of ‘std’
        5 |   std::println ("Hello World");
          |        ^~~~~~~
    $ g++ -Wall -Werror -O -std=c++23 -fmodules-ts    pjmlp-hello.cc   -o pjmlp-hello
    In module imported at pjmlp-hello.cc:1:1:
    std: error: failed to read compiled module: No such file or directory
    std: note: compiled module file is ‘gcm.cache/std.gcm’
    std: note: imports must be built before being imported
    std: fatal error: returning to the gate for a mechanical issue
    compilation terminated.
so it looks like it could conceivably be super fast if it worked. perhaps std.gcm should be included in the libstdc++ package or built as part of apt install?

still, though i wasn't able to test, i anticipate some slowness due to the semantic model of template expansion, which is one step removed from preprocessor textual expansion


You have chosen the compiler that currently has the worst support for modules among the big three.

Try the same with Visual C++, currently the leader in ISO C++ support.

Even while std still isn't available as a precompiled module,

    Build started at 12:21...
    1>------ Build started: Project: ConsoleApplication1, Configuration: Debug x64 ------
    1>Scanning sources for module dependencies...
    1>std.ixx
    1>std.compat.ixx
    1>Compiling...
    1>std.ixx
    1>std.compat.ixx
    1>ConsoleApplication1.cpp
    1>ConsoleApplication1.vcxproj -> C:\.......\ConsoleApplication1\x64\Debug\ConsoleApplication1.exe
    ========== Build: 1 succeeded, 0 failed, 0 up-to-date, 0 skipped ==========
    ========== Build completed at 12:21 and took 08,078 seconds ==========
After changing the message contents and recompiling,

    Build started at 12:22...
    1>------ Build started: Project: ConsoleApplication1, Configuration: Debug x64 ------
    1>Scanning sources for module dependencies...
    1>ConsoleApplication1.cpp
    1>ConsoleApplication1.vcxproj -> C:\.....\ConsoleApplication1\x64\Debug\ConsoleApplication1.exe
    ========== Build: 1 succeeded, 0 failed, 0 up-to-date, 0 skipped ==========
    ========== Build completed at 12:22 and took 00,567 seconds ==========


it took 567 milliseconds to compile six lines of code?

i used to use visual c++ last millennium when people paid me to, but i recall it compiling more than eleven lines of code per second, even on the hardware of the time


Cynics would say that C++ (and in some sense) C took almost 50 years to catch up with Pascal.


As far as fast compilation times and type-safe linkage are concerned, there were other languages doing it before Pascal.

The language extensions for units/modules in Pascal were influenced by other languages from the 1960s and 1970s: ESPOL/NEWP, PL/I dialects, ALGOL dialects, Mesa, Modula-2.

Linker technology that the C authors decided to ignore,

"Although we entertained occasional thoughts about implementing one of the major languages of the time like Fortran, PL/I, or Algol 68, such a project seemed hopelessly large for our resources: much simpler and smaller tools were called for. All these languages influenced our work, but it was more fun to do things on our own."

-- https://www.bell-labs.com/usr/dmr/www/chist.html


I used Borland's Pascal. And I know this: it had insanely fast compile/build times, even with the TurboVision or OWL UI libraries. Borland's C++ compiler of the same age wasn't even close. And despite all the language extensions, Pascal was quick to parse, and generated not great, but okay code. Over 20 years later, GCC without -O1 generates countless coping for very simple C code. Not to mention all the brainpower and money behind C and C++ versus behind Pascal. This proves that the computer industry often goes after the worst option and is very fond of reinventing the wheel.


countless copies?


Yes, instead of copying from memory to memory using registers, it was something like memory-registers-stack-registers-memory. And this is not some ancient GCC from the early 1990s; it was around 2010 or so.


yeah, i've had that experience too, which is why i guessed you meant to say 'countless copies'


Perhaps part of the issue with the "evolution of the programming language field in recent years" is that 'simplicity' is a high cost optimization. Most every 'simple' system I've been a part of building started life as an oversized beast, laboriously wrestled into a more streamlined form.

Making complicated things is cheaper, easier, and lets you say 'yes' more often to smart, persuasive, people. Simple takes a lot of time, effort, and "getting to no" with people that have good reasons for the things they want to add.


Making complicated things may only be cheaper initially.

I think the problem is that to learn to make simple things you first need to learn to make complicated things. This is my story, and pretty much the story of every person I have ever talked to who learned to make simple things. Some people get it faster, some slower, and some never learn to appreciate simplicity, but everybody first had to learn to make complicated things before they learned to make simple ones.

Now realise that we went through decades of exponential growth in the number of developers, so at any point in time people with experience were greatly outnumbered by people with little experience, and the answer becomes easier to formulate.

Simplicity isn't something that is easy to teach. The best way to do it that I've found is by mentoring and working with people, but there are only so many people you can meaningfully mentor on a daily basis. People with experience are so outnumbered (and also promoted up, if they are any good) that it is very hard for an average apprentice programmer Joe to be able to find his master.


  A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system. - Gall's Law


No, that's not true. Easily disproven by example: I have many times seen smart engineers with little experience design and immediately implement an extremely complex solution to a very simple problem.

Don't believe everything people say. Just because somebody called something "a law" does not mean it is true.


They might have designed and implemented it, but does it work? :)


Sometimes it does. Depending on the definition of "work" :)


A working "complex solution to a very simple problem" is not a contradiction, as it can be "simpler" than a simple working solution to a complex problem.


I think this is true, and in agreement with the parent comment. A working complex system evolves out of a smaller working simple system. You have to climb the hill from the one side. Getting to a complex, working system is the 'learning' part. If you want to be able to do the same things with a simpler system, then you have to do the work to collapse that back down to its fundamentals.

I don't think it's something that can be 'taught' directly. I suspect it requires the experience of building the simple and then complicated things in order to understand the problem enough to make the complicated thing simpler.


The Go programming language is not only partly inspired by Wirth's work (Robert Griesemer, one of the "founding fathers" of Go, studied at ETH Zürich), it also has this goal of simplicity: it started out with fewer (but powerful and unfortunately often underestimated) features than other languages, and while it has added more complexity over the years (modules, generics, ...), this has happened at a much slower rate than with other languages.


If being simple gives us go, perhaps we should try to drop simplicity for simplicity's sake and accept that complicated things might be needed occasionally?

Go is a modern programming language, which suffers from a lot of issues that were well known at the time of its creation.


> Go is a modern programming language

Is it? What in particular makes it "modern"?


Its limited age compared to the other common ones.


;-)


And yet a lot of modern software is written in it and people seem to be happy using it. Let the music do the talking.


This exact comment could have been written 10 years ago about C and 15 years ago about Visual Basic.


And before that FORTRAN, COBOL and Assembly.


That's not even the same ballpark. C is timelessness itself. The comment could have been written 30 years ago, or 30 years hence. Nice try!


Had it not been for UNIX being free beer in its early decade, C would hardly have made a mark on the computing world.


Many of us use Go because we have to.

Besides being a saner alternative to C for userspace stuff, and also bare-metal stuff (for the believers in automatic memory management), there isn't much else in its language features to write home about.


> The Go programming language is not only partly inspired by Wirth's work

There are very few similarities between Go and Wirth's languages. R. Griesemer's doctoral supervisor was H. Mössenböck, not N. Wirth.


Look at this document by Mössenböck on Component Pascal, which is essentially a modified Oberon-2 (a Wirth language): https://www.astrobe.com/CPIde/CP-Lang.pdf

> Except for some minor points, Component Pascal is a superset of Oberon-2. Compared to Oberon-2, it provides several clarifications and improvements.

The similarities are hard to overlook; replace the curly braces with BEGIN / END Pascal-style syntax and you are more than halfway there.

Of course Go has added features that were not present in Oberon / Component Pascal; the implicit interfaces come to mind, Go channels also. But inherently it's a Wirthian language.

Also, Griesemer is not Go's only Oberon connection. Rob Pike studied the Oberon system when developing the Acme editor for Plan 9 (source: https://wiki.c2.com/?OberonOperatingSystem).

I honestly think Oberon didn't get named because, at the time Go launched, the Pascal / C conflict was a lot more present in people's memories, so there was nothing to gain from pointing out these roots.

It is not uncommon for programming languages not to talk about their roots at launch due to politics (Java, for example, traces its object-oriented roots to https://en.wikipedia.org/wiki/Strongtalk, a typed Smalltalk descendant, but was advertised as a cleaned-up C++).


I don't see what this all has to do with Component Pascal (the language report is not by Mössenböck, but by Pfister et al., btw.). You should instead have a look at the Newsqueak publications. There are more similarities of Go with Algol than with Oberon. Newsqueak reused a few features of Pascal, some of which are still present in Go.

The Acme editor has similarities with the original, text based Oberon system user interface, which unfortunately shares the name with the language, but is not the same thing. The user interface of the Oberon system again looks like a stripped down version of the Cedar environment developed at PARC (where Wirth had his sabbatical).

The object model of Java directly traces back to Simula 67, not to Smalltalk or its descendants (see e.g. https://www.youtube.com/watch?v=ccRtIdlTqlU&t=2444).


You don't have to take my word for it, here's a graph from the Go Programming Language book: https://www.oreilly.com/api/v2/epubs/9780134190570/files/gra...

The most visible Pascal-family influence is the variable declaration syntax ("i int" instead of "int i" as in C et al).


Unfortunately, this graph is at least misleading, if not wrong. The middle path is overly prominent; the detour to Object Oberon (probably so that Griesemer's name is represented) is amusing. Actually there should be a direct line from Newsqueak to Go. Alef's brief intermezzo is mostly historically relevant for Go, not technically. The path from C should actually point primarily to Newsqueak (in addition to Pascal). Technically, there would be a very thick arrow from Newsqueak to Go, a slightly thinner one from C to Go, and a very thin one from Oberon-2 and a few other languages. But as a highly simplified representation of the languages and people directly and indirectly involved, and of the chronological sequence, the graph is ok.


Here's the accompanying text from The Go Programming Language for that line in the graph:

> One major stream of influence comes from languages by Niklaus Wirth, beginning with Pascal. Modula-2 inspired the package concept, Oberon eliminated the distinction between module interface files and module implementation files. Oberon-2 influenced the syntax for packages, imports, and declarations, and Object Oberon provided the syntax for method declarations.

So mostly packages and syntax - not the most exciting stuff, but the package concept contributes to enabling Go's compilation speed, so it shouldn't be disregarded. Ok, you can still argue that many or most of the C/Pascal-family influences were already present in Newsqueak (I'm only familiar with Pascal, C and Go, so I can't judge that), but I think it's understandable that they wanted to have the language the book is about at the focal point of the graph, and not one of its less successful and more obscure ancestors.


Thanks, I have the book. Pascal had the major influence on Newsqueak. I don't see any specific similarity between Modula-2 and Go. There are other languages which are at least as close or even closer. The last sentence is even wrong: Object Oberon is a completely different language in terms of OO. Go uses the receiver syntax of Oberon-2, which is a completely different concept than in Object Oberon.


it's true that golang is basically newsqueak with structural interface types, but newsqueak is pretty obviously strongly pascal-influenced. i agree that newsqueak shows significant c influence, but in most of the ways that c and pascal differ, it follows pascal. and it seems unavoidable that griesemer spending his grad school at wirth's school hacking on wirth's language oberon would have influenced the taste he applied to golang


In case you're interested I'm looking for feedback for my Oberon concurrency proposal: https://github.com/oberon-lang/oberon-lang.github.io/blob/ma...


i'll take a look! but i don't know much about oberon, though i have the book


Great, thanks!

There are books online for free, e.g.

https://people.inf.ethz.ch/wirth/ProgInOberonWR.pdf

and https://ssw.jku.at/Research/Books/Oberon2.pdf

Oberon+ is a superset of Oberon 90 and Oberon-2. Here is more information: https://oberon-lang.github.io/, and here is the current language specification: https://github.com/oberon-lang/specification/blob/master/The.... I already had valuable feedback here on HN concerning the channel extensions. Further research brought me to the conclusion that Oberon+ should support both channels and monitors, because even in Go, the sync package primitives are used twice as much as channels. Mutexes and condition variables can be emulated with channels (I tried my luck here: https://www.quora.com/How-can-we-emulate-mutexes-and-conditi...), but for efficiency reasons I think monitors should be directly supported in the language as well, even if that might collide with the goal of simplicity.

Feel free to comment here or e.g. in https://news.ycombinator.com/item?id=38903351 or in https://github.com/rochus-keller/Oberon/discussions/45.


sorry for the delay

i don't know that much about concurrency in general, so take this with a grain of salt

i feel like monitors are potentially an improvement over bare locks, but they tend to perform poorly in manycore settings and are still vulnerable to deadlock, perhaps more so due to their implicit nature. (i see that you point this out with respect to java, and also that it doesn't really have monitors.) fundamentally, blocking synchronization of any kind is incompatible with modularity

you misspelled brinch (rip)

there are lots of versions of actors, and the most influential one effectively didn't have any concurrency at all, so it's important to disambiguate in the text that you're talking about the one in agha's 01985 dissertation, which is also very influential. also your footnote says it's from "1986"; is that a different document by agha with the same title?

objects with their own threads (like concurrent actors or like processes in csp or erlang) are not the same thing as monitors, contrary to your assertion in the 'concurrency in oberon' section. there is a well-known duality between message passing between threads and control passing between monitors, but the mechanisms have quite different ergonomics and common failure modes

it's surprising that in a discussion that surveys original-modula, modula-2, csp, erlang, active oberon, c, c++, concurrent pascal, object pascal, superpascal, joyce, newsqueak, old squeak, alef, limbo, golang, java, and ada, you didn't mention javascript at all. in browsers and node, javascript's concurrency model was originally that of original oberon — each event handler runs to completion in succession — which strongly suggests that the concurrency facilities it has added more recently might be worth a look. those are python-like async/await, sometimes caricatured as 'colored functions', and web workers. also it might be worthwhile to at least mention the π-calculus

my uninformed viewpoint is that js, erlang, haskell with its stm, and (non-async!) rust are the languages with the most promising approaches to the concurrency problem. all four of them work hard to be safe (unlike the entire rest of the js language), though i think their concurrency safety guarantees are all to some degree fragile. except maybe haskell, which i haven't tried

your reference [31] https://www.jtolio.com/2016/03/go-channels-are-bad-and-you-s... doesn't purport to show that no single concurrency construct is adequate, as you claim, or even that communication channels are inadequate; rather, it argues that the implementation in golang of that concept is inadequate, because golang implements them inefficiently, because it can't garbage-collect processes, because there are some design errors, and that because golang also offers other concurrency constructs (as you propose for oberon+), it's hard to avoid concurrency bugs. though certainly not conclusive evidence, this runs strongly against the conclusion you are using it to argue for

also your bibliography incorrectly abbreviates 'per brinch hansen' as 'hansen, p. b.' rather than 'brinch hansen, p.' a few times

i hope this is helpful, and i'm glad to see you're exploring how to solve this problem! feel free to copy or excerpt my comments elsewhere if you think it would be useful


Thank you very much for your time and your feedback which I really appreciate.

> but they tend to perform poorly in manycore settings and are still vulnerable to deadlock [..] fundamentally, blocking synchronization of any kind is incompatible with modularity

Do you have references for this? With the selected version of signals (only an await procedure, no explicit signal required), the only way for a deadlock from my point of view is when a protected procedure calls directly or indirectly another protected procedure of the same monitor, thus trying to acquire the same lock twice. But this can be detected by the compiler. Or did I miss something?

> you're talking about the one in agha's 01985 dissertation

My reference is this book: https://direct.mit.edu/books/monograph/4794/ActorsA-Model-of.... It's essentially his PhD thesis released as an MIT Press book (at least I didn't see significant differences). But I prefer to reference books over theses if possible.

> objects with their own threads [..] are not the same thing as monitors

I'm not aware I claimed this. Will check whether the formulation is misleading. Actually I wanted to point out that the "objects with threads" in Active Oberon give little added value and are not "active objects" at all.

> you didn't mention javascript at all

Well, the intention was not to have a general discussion about concurrency in all programming languages; I'm primarily interested in the relatives of Oberon, plus a selection of languages based on CSP. I just mentioned Erlang because it is a prominent example of an Actor language. But I will have a look at more recent JS developments for ideas compatible with Oberon.

> javascript's concurrency model was originally that of original oberon

Just because of the message delegation/handler concept (which existed before Oberon)? I wasn't aware that there is a direct connection of JS and Oberon (it is not even mentioned in the 2020 JS HOPL paper).

> are the languages with the most promising approaches to the concurrency problem

Do you think that JS concurrency is better than Go concurrency, i.e. channels aren't worthwhile at all? Why?

> it might be worthwhile to at least mention the π-calculus

From my perspective, that's sufficiently covered by [13], as far as the concepts might be useful for Oberon+. Which relevant idea do you think is specifically missing?

> your reference [31] doesn't purport to show that no single concurrency construct is adequate

The paper explicitly states "Unfortunately, as it stands, there are simply a surprising amount of problems that are solved better with traditional synchronization primitives than with Go’s version of CSP. [..] The summary here is that you will be using traditional synchronization primitives in addition to channels if you want to do anything real", from which I conclude that channels alone are not good enough; this conclusion is also supported by [32], where one of the tables demonstrates that the analyzed projects use low-level primitives twice as much as channels. I agree with the author that there are problems which can more elegantly be solved with classic synchronisation (including monitors), regardless of possible design issues of Go; we could use channels to emulate those (as demonstrated), but for efficiency reasons I preferred to directly add monitors (as Go did with adding low-level primitives instead). I think monitors are still better than the Go sync package, and if everything else fails Oberon+ could access pthreads by FFI. But that's actually the main point where more thinking and discussions are required.


R. Griesemer's thesis was based on creating an Oberon dialect.


He implemented Oberon-V for the Cray YMP vector computer.


From where I am standing, Oberon-V is an Oberon dialect.


I assume that's why it is also called "Oberon" ;-)

But this fact doesn't make Go more similar to Oberon.


It makes sense in that Robert Griesemer's ETHZ experience working in the Oberon ecosystem influenced Go's design.


Maybe he contributed the receiver syntax to Go; we can only assume. But anyway, there's not that much Oberon in Go, much less than pop culture assumes. At the end of the day, the two languages mostly share the concepts of static typing (with Go even stricter than Oberon), modularity and garbage collection; otherwise they have a different focus (concurrency in Go, type extension and minimalism in Oberon).


Only if you consider the Oberon from 1987 and nothing else in the Oberon ecosystem.

Nothing else in the ETHZ group doing Oberon-related research counts from that point of view, if it didn't have the touch of Niklaus Wirth himself adding lines to the compiler.


> Only if consider the Oberon from 1987 and nothing else in the Oberon ecosystem counts [..] if it didn't had the touch of Niklaus Wirth himself

I considered all languages where Wirth was at least a co-author, because that was the claim made in this thread. I assume you mean Active Oberon; but that version is even further away from Go (closer to Ada 95, Java and Concurrent Pascal). And I'm neither a "Wirthian" nor do I believe in personality cults, if that is what you assume.


I mean every single Oberon variant produced by ETHZ students during their PhD, all of them.

For me that is always what I cared about in the Oberon universe.

Original Oberon was interesting for 1987, and Oberon-07 is kind of interesting from the point of view of Niklaus Wirth's pursuit of minimalism.

Much more interesting are all the works produced in the context of languages and operating systems research at ETHZ around Oberon derivatives.

One of those students brought his education to another programming language.

From my point of view, we will keep on disagreeing, regarding Oberon.


> From my point of view, we will keep on disagreeing, regarding Oberon.

I don't even know in what respect we disagree; maybe it's also due to the language barrier. If you have specific questions about my projects, feel free to ask.


Its "simplicity" is a reason why it depends so heavily on compile-time code generation across the ecosystem.


One might argue that a lot of languages depend heavily on compile time code generation. Just because Rust's macros, or C++'s template metaprograms are sweeping their generated code under a stylish rug doesn't mean it's not happening. Slowly.


I write fiction as a hobby and "Most every 'simple' system I've been a part of building started life as an oversized beast, laboriously wrestled into a more streamlined form" describes my day to day life in both that and my programming job


Being simple...

And then there comes Rust in all its glory with "my string constant".to_string() awkwardness, and Golang with datetime-to-string formatting using "2006-01-02 15:04:05.999999999 -0700 MST"...

Modern languages are full of idiosyncrasies that work like putting your left hand in the right pocket, from behind.


You are confusing "simple" with "easy". When you try to solve hard problems the "easy" way, you are going to have bigger problems later than just awkwardness.


I'm not sure I would call Rust's solution "simple" either.

My sense is that fine-grained, per-object, deterministic memory management simply cannot be made simple or easy, because the very thing you're trying to do is inherently complex and difficult.

I realize it doesn't provide the same level of safety guarantees that you get out of Rust, but I am very sympathetic to Zig's approach of having no implicit default allocator, so that I can instead just use something simple and easy to reason about like an arena allocator in the 90% of cases where that's good enough for my purposes.


> And then there comes Rust in all its glory with "my string constant".to_string() awkwardness

Would you feel better if it was `"my string constant".to_owned_string()`?


You could use the constants that everyone uses like time.RFC3339 or ISO 8601.

Why you're allowed to define your own time format under the hood is quite obviously legacy. At least it's human-readable.


Huh, the article mentioned a fact that I could never have expected: Logitech was an indirect offspring of Wirth's work, since people from ETH had wanted to commercialize Modula-2. [1] Their first product: a Modula-2 development system bundled with a 3-button mouse.

[1] https://web.archive.org/web/20210324044010/http://www.edm2.c...


> Everywhere the company showed their Modula-2 development system people started making inquiries about the mouse and its availability as a separate item, Logitech scrambled to put together a developers kit for DOS and started to offer the mouse for sale

Brilliant. It's very important to notice early enough that clients prefer buying something other than exactly what you're selling...


I never liked this platonic view of there being one "perfect language", perhaps not yet invented, that stands above all others. Instead I am more of a "classist" (I didn't find a better word for it). I believe there is one class of languages, not that hard to reach, whose members are for all intents and purposes equally good. A bit like the notion of "Turing completeness", except Turing completeness is way too broad and measures something else. But I'm betting that the "best" language has already been achieved, and by quite a few of them. Which ones, that's up for debate.


> Like a Renaissance man, or one of those 18-th century "philosophers" who knew no discipline boundaries, Wirth straddled many subjects. It was in particular still possible (and perhaps necessary) in his generation to pay attention to both hardware and software. Wirth is most remembered for his software work but he was also a hardware builder

> One of his maxims was indeed that the field remains driven by hardware advances, which make software progress possible.

> One of his maxims for a successful career was that there are a few things that you don't want to do because they are boring or feel useless, but if you don't take care of them right away they will come back and take even more of your time, so you should devote 10% of that time to discharge them promptly.

> Wirth seems to have decided in reaction to the enormity of Algol 68 that simplicity and small size were the cardinal virtues of a language design, leading to Pascal


I can't believe I missed the news of his death. Pascal was my language of choice for many years. RIP, Niklaus.


The litmus test for the simplicity of a programming language design is its compilation speed: if the language compiles fast, it is simple; if it compiles slowly, it is overly complex. Modern programming languages like Go and D have fast compilation, but C++ and Rust compile much more slowly. Go is a direct descendant of Wirth's languages, namely Modula and Oberon, while D is not, albeit some of its features, like nested functions, are taken directly from Pascal [1]. Interestingly, both were designed by authors with engineering backgrounds, and personally I think the simplicity is no coincidence, since typical engineers loathe to embrace any form of complexity.

[1]Nested function:

https://en.wikipedia.org/wiki/Nested_function


I think that’s an oversimplification. By that measure any assembler would be simple, yet assembly is not simple at all. Most esoteric languages compile super fast, but are usually complicated by design.

Also, it’s not even a sound measure: There are C compilers that are extremely fast and some that are a lot slower but achieve better performance. Java compiles to byte code and performs compilation to machine code during execution. Interpreted languages sometimes only use an AST for execution and don’t compile to machine code at all.


Assembler is very simple as a language family. It is not simple to use.

The real challenge is combining the two.

I also think focusing on different compilers misses the point, which would perhaps be better expressed as: to what extent would a naive compiler implementation for the language be likely to be fast?

E.g. Wirth's languages can be compiled single pass, and are fairly easy to compile without even building an AST. Some other languages can fit that too, but for many it's not that you need an AST and multiple passes just to generate good code, but that in some cases it gets increasingly impractical or impossible to compile them at all without much more machinery than Pascal or Oberon needs.


Simple depends on the context. You may say programming in assembly language is simple, but it is only simple from the context of writing processor instructions; if you think high-level, like accessing fields in a struct, then programming in assembly complects (or weaves) field access with processor instructions and it turns into a complex thing (from the point of view of accessing fields in a struct).


> The litmus test for simplicity of a programming language design is its compilation speed, if the language compile fast it is simple but if the language compile slow it is overly complex

No. OCaml for example has a really fast compiler (like Go), but I would not call it simple. It does have PPX (PreProcessor Extensions) which are like macros, so you can't "blame" the lack of them either.

And everything that uses LLVM is _always_ going to be _slow_, no matter the language.


More to the point, OCaml is one of the ecosystems that takes the golden route, offering multiple implementations and allowing you to pick the best one for each workflow.

If Rust had an interpreter in the box alongside the compilers, like OCaml (where there are multiple of each as well), it would already make a big difference in development workflows.


Slightly irrelevant fun fact: the Rust compiler was implemented in OCaml for a long time during the R&D phase of the project.


Is there any particular reason for LLVM being slow? Does it do a lot of complicated optimizations when generating code, or is it designed in a way that makes it slow?


I’ve heard it used to be lean and fast. Then new developers came in, new features were implemented, and it bloated over time. Thus it wasn’t designed in a way that makes it slow. It grew in a way that makes it slow.

Source: hearsay.


> I’ve heard it used to be lean and fast. Then new developers came in, new features were implemented, and it bloated over time.

That's basically the story of every single software project ever made.


Not only that, at the end of ISO C++ Standards Committee Panel Discussion,

https://www.youtube.com/watch?v=sf_3Vfh6yRA

Someone from the panel clearly mentions that clang and GCC have become slower as some big names have removed their support from the projects (most likely meaning Apple and Google).


A possible reason is the use of Static Single Assignment (SSA). While it makes many optimizations easier, the IR has to be translated out of SSA form to generate code. This is very expensive, as it needs computation of dominance on the control flow graph, insertion of copies, and much more. But it's just a guess.


I could take an existing language and throw in a lot of rules that would make compilation marginally faster:

1. No variable names etc. with a length of more than 8

2. The order of functions matters now; don’t forward-reference anything

3. You can only nest scopes 16 times, max

Would this make the language simpler for me in a way that I would care about?


This is a personal question related to taste. An impossible question for strangers to answer.


Simple is the antonym of complex, and complex means that a lot of things are tangled—in principle it can be investigated in objective ways.


> The litmus test for simplicity of a programming language design is its compilation speed

Simple for programmer to understand and use (small, orthogonal, consistent) isn't always going to be the same as simple for the compiler writer to implement.


> Go is a direct descendent of Wirth's languages namely Modula and Oberon

This assumption comes up again and again, but the evidence is rather poor. There are very few and only marginal similarities. The most obvious is the receiver syntax of the bound procedures. But this can be found in Mössenböck's Oberon-2, not in Wirth's Oberon. Although Wirth was a co-author, he ignored any innovations in Oberon-2 in his subsequent work. Go has a completely different focus and is essentially a further development of Newsqueak; it's definitely not a "direct descendant" of Modula or Oberon.


Part of the reason it persistently comes up is that Robert Griesemer got his PhD under Mössenböck and Wirth, and people not paying very close attention would probably also see the Oberon-2 connection as a confirmation via an indirect step rather than as an argument against the claim.


Yes, Mössenböck was his PhD supervisor (Wirth was only co-examiner). Personally I consider Oberon-2 a better language, but there are hardly any applications of bound procedures; particularly not in the Oberon systems developed at ETH, and surprisingly few in Linz Oberon either. And Active Oberon followed the more conventional Object Pascal approach.


To take it to the extreme:

Writing binary directly is simple because the compilation time is zero.


It would be nice to have actual benchmarks of compilation speed of equivalent programs in different languages, rather than just the runtime performance as is typical in language shootouts. Go takes, to me, a surprisingly long time to compile a simple hello-world program (about 200 ms, and it generates a 2 MB binary). But I suppose that is fixed overhead, and perhaps it scales well.

Generally though, I'm disappointed if hello, world takes more than 20ms to compile -- which is of course true of pretty much every popular language.


The problem is that a hello world isn't sufficient to identify compilation speed; you'd need a program with thousands of lines that does barely anything. And then you're fighting IO as well, although that could be fixed by putting the program in /dev/shm first and running it from there.


> The litmus test for simplicity of a programming language design is its compilation speed

From a compiler's perspective, sure. Not being a compiler, I find that metric not very relevant.

Simplicity of a language is better gauged by how easy it is to express complex things in it, and how difficult it is for one person to comprehend what another wrote.


> From a compiler's perspective, sure. Not being a compiler, I find that metric not very relevant.

I'm afraid you're just showing a bit of survivorship bias. I can assure you that compilation speed is a critical trait of any programming language, and the only people who don't understand this are those who benefit from all the work invested by predecessors who did.

Think about it for a moment: how would your turnaround speed be if every single time you had to build your project to test a change you had to brace yourself to wait over one hour to get things to work? After a week of enduring that workflow, how much would you be willing to pay to drive down that build time to 10 minutes, and how much more would you fork to only require 1 minute to pull off an end-to-end build?


Compilation speed can be an important trait of a programming language (or more precisely, a dev env / buildchain). I remember writing code in M68000 assembly, the compile step was lightning fast because you didn't need one. I do also remember going near cross-eyed tracing code flow down narrow columns of vaguely varied yet similar-looking statements -- hours upon hours!

If your daily task build is taking over an hour on modern hardware, it's likely you have organizational problems masquerading as tech debt. No language choice will prevent that; good technical leadership can help mitigate it.


Thankfully C++ modules are on the right path to improve the story on the C++ side.

Using C++23 standard library, alongside modules, and cargo/vcpkg binary caches for 3rd party libs is quite fast.

Rust, well, until cargo offers the option to use binary libraries, it will always lag behind what C++ tooling is capable of. Maybe if sccache becomes part of it.


> Using C++23 standard library, alongside modules, and cargo/vcpkg binary caches for 3rd party libs is quite fast.

I don't think your assertion makes sense. The only thing that conan/vcpkg brings to C++ is precompiled dependencies which you don't have to build yourself. You get the same result if you build those libs yourself, packaged them in a zip file somewhere, and unpacked that zip file in your project tree whenever you had to bootstrap a build environment. The problems that conan/vcpkg solve are not build times or even C++, they are tied to the way you chose to structure your project.

With C++, you get a far greater speedup if you onboard a compiler cache tool like ccache and organize your project around modules that don't needlessly cause others to recompile whenever they are touched.


> The only thing that conan/vcpkg brings to C++ is precompiled dependencies which you don't have to build yourself.

Which is a big deal for many folks.


Naturally I meant conan/vcpkg.


I wouldn’t call D simple. The list of features seems to be endless.


The Go compiler is fast because it doesn't allow circular imports.


Turbo Pascal was already blazing fast in MS-DOS computers, and it allows circular dependencies.


As a nice example, Malbolge compiles instantly.


I'm shocked by the passing of Niklaus Wirth; I just found out now. I've been in "bunker mode" the last few months, porting C++ code to Modula-3 (M3).

I'm using the cm3 Modula-3 distribution at https://github.com/modula3/cm3

I have been thinking a lot about the Pascal/Modula origins of M3 (re: Wirth), as well as about the DEC/Olivetti/Critical Mass/etc. people who crafted M3, due to the ELEGANCE and ADDICTIVENESS of the Modula-3 programming language.

Been coding C++ for nearly three decades, and it's obvious the designers of M3, as well as Wirth et al., were "prophetic": they had an idea of the {bloat, complexity} problems that would afflict an ever-growing C++ spec/standard; Scott Meyers' experiences concerning C++'s "issues" are notable.

A programming language does not have to be a "complexity beast" in order to be productive for encoding knowledge into code. Note that a large-scale codebase makes C++'s "cracks" more obvious. M3 provides a good cognitive-load paradigm during design/implementation, thanks to the principles of, most notably, modular programming, revelations, and opaque types.

M3 gets to the point in a very elegant manner and made me realise early in the porting process that the "C++ experiment" has gone on for too long. This is how good M3 is.

One book on M3, i.e. "Nelson's" book, is essentially the M3 spec; a spec from the late 1980's which feature-packed the language in a concise/logical manner. No multi-decadal evolving specs/committees as in C++. No compiler that does not support a language-feature. As in Wirth's paradigm, M3 has a relatively lean/simple language-core that can be used to create useful libraries/programs.

Harbison's book can help flesh out some detail due to the former book sometimes being a bit less verbose; may be due to my applied-science background where I selected the "C", not "Pascal", route when starting computer programming decades ago.

The tutorial links, etc., at the github site are useful.

Anyway, once doing enough C++ (even at pre-C++-11 level), M3 will be an understandable surprise.

Yes, it is important for the core of a programming language to be SIMPLE as well as being PRACTICAL and COGNITIVE-LOAD-EFFICIENT. My recent experiences lead me to believe that Modula-3 achieves this well.

Gees ... Wirth is gone.


Makes me think of Rich Hickey and Clojure when I think of simplicity in this context! But, good article!


Sometimes a picture is wirth a thousand words.


Niklaus Wirth could afford to be simple; he lived in simpler times, when demand was much lower than today and was being chased by much less investment.

Change was measured in years, compute and storage options were limited. I wonder how many of his OSes / programming languages spanned multiple heterogeneous compute architectures.

Don't get me wrong, I love the guy... but I wonder what kind of impact he would have if he were in his prime today...


Don't need to "wonder". Just use a Wirthian-inspired language and see where it takes you.

For me: I've been coding C++ for nearly 30 years, and for the last few months I've been taking Modula-3 (M3) for a "test drive". M3 is "complete", and early on it was very obvious to me that the "C++ experiment" has gone on for too many decades. The amazing thing is that M3 has been "complete" since the late 1980s, thanks to the programming-language gurus at DEC/Olivetti/etc. You can sense in the M3 literature that these and other gurus were aware of the "issues" that C++ would develop in the future. As Wirth would imply, all you need is a lightweight/simple language core that can be used to create great libraries.


it's unclear whether this comment is describing the extreme opposite of the truth out of ignorance or for satirical purposes



