
It is worth re-posting this, and re-reading Wirth's works every couple of years. The module concept of Oberon's predecessor Modula-2 is still unrivalled today: .def modules specify the interface and can be compiled separately from their .mod implementations; the implementation may not even exist while a client application is already being coded against the compiled interface in a type-safe way.

Also, Wirth's book Compilerbau (in German; I'm not sure whether it was translated) is a piece of pristine clarity: at just ~100 pages in pocket-paperback form, everyone who reads it immediately feels that writing a compiler or byte-code interpreter is simple and that they can do it.




I love Wirth's work, but that has been available in Ada, Mesa, Mesa/Cedar and plenty of other languages as well.

And since it is available in Ada, it is rivalled today.

More to the point, Ada allows for multiple interface packages, an idea also copied by Modula-3, where the same package can be exposed in different ways to multiple clients.

For example, the same package can have a public interface for in-house consumption that is wider than the official public interface for third parties.


Ada came to mind when the parent mentioned separate interface and implementation files. Making methods in the implementation private unless they appear in the interface was an inspired design decision. I use only Oracle's PL/SQL dialect but I appreciate the design of Ada more the longer I code. Honestly, I'd consider using full-blown Ada in modern software development. It gives you the ability to write really clean code.


When I wrote Ada code long ago, it was only used on military projects.

To me, it was hard to write code in Ada. Lots of niceties from other languages were unavailable in Ada, by design. For example there were no variable argument lists.

It grew on me though, and several years later I worked on a commercial project that used Ada. I was surprised, because I had expected adopting Ada to be like adopting the tax code.

Then I realized one thing: although Ada is harder to write, it is nice to have an existing Ada project. And people who have done Ada for a while learn to think in Ada, and it's not as hard to be expressive.

It's also possible to be pretty precise in Ada. You can know exactly what the largest or smallest integer is. Moreover, you can define integers of a specific range, like -11 to 219.
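Ada enforces such range constraints in the type system (e.g. `type Small is range -11 .. 219;`); a rough runtime analog, sketched in Python with invented names:

```python
class RangedInt:
    """Hypothetical sketch of an Ada-style ranged integer.

    Ada checks the range statically where possible and at runtime
    otherwise; this Python analog can only check at construction time.
    """

    def __init__(self, value, lo=-11, hi=219):
        if not (lo <= value <= hi):
            raise ValueError(f"{value} outside range {lo} .. {hi}")
        self.value = value

# In range: fine.
x = RangedInt(100)

# Out of range: rejected, like Ada's Constraint_Error.
try:
    RangedInt(220)
except ValueError:
    pass
```

This is only an approximation; the real benefit in Ada is that the compiler knows the range and can reject many violations before the program ever runs.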

Nowadays all of that has matured, and I think Ada is a viable commercial language, and interesting things like SPARK have happened.

Too bad in the intervening years other languages haven't changed much.

For example, C could have added modules. I guess nobody cares about C.


I think that by Ada 95 they were already available, were you still using Ada 83?

Yeah, just check the list of features for C2X; WG14 isn't that keen on innovating, nor on fixing C's flaws.


I notice that Ada 95 included inheritance. My experience in other languages with inheritance is that the feature creates a lot of complexity. Have you used inheritance in Ada and, if so, has it created any issues?


The problem with many other languages is that they do everything through inheritance. In most popular languages, inheritance is set up such that it does anything you want it to. This is what creates complexity, not the inheritance in and of itself.

Ada does things slightly differently. It manages to separate out the various parts of OOP into different language constructs, and this makes it possible to pick and choose what you need, and not get everything including the kitchen sink when you try to use one thing (like inheritance.)


I have been using OOP languages since I learned OOP with Turbo Pascal 5.5 back in 1991.

So no, I never had any big problem with inheritance in any language, and as for Ada, its tag-based dispatch is also quite an interesting idea.


I'd use Ada if there was a really good open source version with an MIT or similar license. It does "feel" nice.


Nim has a type system that is heavily inspired by Modula/Oberon/Ada.


Did you have a look at GNAT?

Your generated code is free:

https://en.m.wikipedia.org/wiki/GNAT#License


OCaml (and probably Standard ML) also have powerful module systems that I would argue rival Modula-2's module system.


Don't forget C, which has separate compilation as well ;-). And you can trivially make multiple interfaces, too.


Separate compilation without type safety is hardly much better than macro assemblers.

So yeah, you can go that route, and I have done so; poor man's modules helped keep me sane with C, but they require discipline.


> but it requires discipline.

That pretty much defines software engineering.

For as long as I have been coding, I’ve watched people and corporations chase the will-o’-the-wisp of the “undisciplined coder,” where we can hire naive, barely-educated children straight out of school and mold them into our personal “coding automatons,” or, even better, let people who are experts in other domains create software without having to deal with “annoying” engineers.

So...how’s that working out?

Even when we have AI creating software (and we will), the AI will still need disciplined requirements, which, I suspect, will look a lot like...code.


Pretty well: every solution that expects discipline is a source of revenue for security consulting, code-quality automation products and conferences.

The outcome of Azure Sphere having chosen C as its main SDK language is proving itself without surprises:

https://techcommunity.microsoft.com/t5/internet-of-things/az...


> That pretty much defines software engineering.

Sure, but by moving things that require discipline into type systems/tools, it makes working with others easier.

Not to mention that no matter how disciplined you are, you will make mistakes, and having the compiler catch those for you is valuable.

It also means that the discipline applied by the programmer can be focused on areas that can't be checked or enforced by a compiler.


> So...how’s that working out?

Fantastically if the goal is to set up recurring revenue to maintain the produced systems.

> Even when we have AI creating software (and we will), the AI will still need disciplined requirements, which, I suspect, will look a lot like...code.

https://github.com/webyrd/quines is an interesting example of writing code to create code based on a specification. Perhaps not the AI code generator of some people's dreams, but it exists today.


There’s just a certain amount of effort that you can spend in a certain amount of time, and discipline takes effort. If you need less discipline, you can spend your effort somewhere else. For instance, you can put effort into fitting your code into an ownership model as in Rust, or prove the code with Coq. The difference is that with C you can never know whether there was enough discipline (usually there isn’t).


It’s my experience that discipline is “front-loaded.” It takes conscious effort for some time, while establishing a habit, then, it becomes pretty much “free.”

For example, when I was writing ObjC and PHP, I got used to using Whitesmiths indenting. Once I started writing Swift, it was more appropriate to use KNF style.

It took a couple of months of having to remember to not use the old indenting style, but I haven’t given indenting any thought in years.

”We are what we repeatedly do. Excellence, then, is not an act, but a habit.” -Attributed to Aristotle


> So...how’s that working out?

So well that it was a large part of the reason I accepted an offer (today in fact) somewhere else, life is too short for that mess.


How are modules related to type safety?


Because if you program in a type-safe language and call something that is compiled separately from you, you'd still like to maintain type safety across that call boundary.


You absolutely get type safety across module boundaries with C, in that if provider and user both compile against the same interface, this will be type-safe.

You could even have this type safety at the linker level as far as C is concerned. You just need an object file format that exports C types for symbols. This is not done on any of the (few) systems I know, probably for practical reasons.

Some other languages give you this link time safety, but I assume at the cost of less interoperable object files.


> For example, the same package can have a public interface for in-house consume that is wider than the official public interface for third parties.

Isn’t this similar to package private in Java or internal in C#?


Not really, because the interface is separate from the implementation and you can provide multiple interface packages for the same implementation package.

So client A sees mylib-public-A, client B sees mylib-public-B, but both link to the same mylib so to speak.
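As a rough illustration of the idea (a hedged Python sketch; Ada and Modula-3 do this with separate interface packages, which Python can only approximate with facade classes; all names are invented):

```python
class _MyLib:
    """The single shared implementation ("mylib")."""

    def public_op(self):
        return "public"

    def internal_op(self):
        # Wider, in-house-only operation.
        return "internal"

_impl = _MyLib()  # both facades delegate to the same instance

class MyLibPublicA:
    """What third-party client A sees: the narrow official interface."""

    def public_op(self):
        return _impl.public_op()

class MyLibPublicB:
    """What in-house client B sees: public plus internal operations."""

    def public_op(self):
        return _impl.public_op()

    def internal_op(self):
        return _impl.internal_op()
```

The crucial difference is that in Ada the compiler enforces which interface a client may see, whereas the Python facades are mere convention.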


Ahhh, neat!


Yes, except that still limits you to two consumers.


That would be "Compiler Construction", the last version being freely available[1].

Given that Oberon is a simpler language than pretty much all of its predecessors, and that the latest revision went even further, I'd be interested in what Wirth thinks about contemporary strongly typed systems languages like Rust or Go (the latter being quite, erm, influenced by Oberon). Or heck, Eiffel, being the language of his successor at ETH Zürich.

IIRC he didn't have a high opinion of functional programming.

[1]: https://people.inf.ethz.ch/wirth/CompilerConstruction/index....


Ah, found it here[1]:

"To postulate a state-less model of computation on top of a machinery whose most eminent characteristic is state, seems to be an odd idea, to say the least. The gap between model and machinery is wide, and therefore costly to bridge."

[1]: https://people.inf.ethz.ch/wirth/Articles/GoodIdeas_origFig....


Those sentences form the opening of a rather peculiar paragraph. You don't have to read past the abstract of the seminal paper on functional programming, Backus's "Can Programming Be Liberated from the von Neumann Style?", to see that the goal isn't to eradicate state, just to tame it. "Unlike von Neumann languages, these systems have semantics loosely coupled to state." (emphasis mine) Loose coupling is not the same thing as elimination.

In the next paragraph, Wirth further indicates that he has chosen to argue against a caricature of functional programming when he suggests that "[Functional programming] probably also considers the nesting of procedures as undesirable." That's another strange thing to insinuate against a programming style that is noted for its use of closures.

(For that matter, where would closures be without state?)


This is a weird phenomenon. There seem to be two conflicting philosophies with regard to reducing accidental complexity in software.

The "less is more" crowd adheres to avoiding and reducing feature bloat and writes lower level, often efficient, very consistent code that is easy to grok.

And the "correctness by concept" crowd, with many variants thereof. Expressive type systems, functional programming, abstraction and general "higher order-ness" are dominating themes here.

Languages and paradigms often land on the spectrum of these two. I wonder if these concepts can be married in some way and what we would have to give up to do so.


I'm not sure that they're all that different in the first place. At the one "less is more" place I worked, they also relied pretty heavily on abstraction and general higher-order-ness. They just had a different way of doing it: Service boundaries and protocols. Arguably the Unix philosophy is similar: A bunch of small programs that do one thing and do it well, which you can chain together with pipes.

The CTO's official reason for the "less is more" philosophy was not that he thought more powerful language features weren't useful; it was that sticking to less powerful features discouraged the growth of individual modules into large, complicated tangles by making it actively painful to do that.

My one, somewhat guarded, criticism of that approach is that I think it may have depended critically on the company being in a position to maintain some very selective hiring practices. When you limit yourself to only hiring people who can really appreciate Chuck Moore's style, well, you've limited your hiring quite a bit. I could be convinced that the "correctness by concept" approach is less fragile and dependent on having a rigid corporate monoculture in order to work out properly.


> That's another strange thing to insinuate against a programming style that is noted for its use of closures.

Let's be honest: there are two completely opposite meanings of "functional programming" that we're burdened with having to put up with, and most of what passes for "functional programming" straight up isn't. Somehow the people making heavy use of closures manage to pass themselves off as doing FP, even though that style is decidedly unfunctional. I wish there were wide recognition of the distinction between this pseudo-functional (PF) style and actual FP, and that we'd call it out as appropriate.

Frustratingly, the inhabitants of this bizarro world who program in the PF style still tend to hold, undeservedly, the same smug expressions on their faces as the ones FP folks do with regard to OO, with lifted noses about OO being unclean, even though the PF folks' closure-heavy style is no better: PF is equivalent to OO, except that the former is less syntaxy, which only makes the trickery employed in PF folks' programs harder to spot. This is fairly annoying.


Huh? I don't know if you are aware of this, but higher-order functions make heavy use of closures. I don't think there is even a purpose to higher-order functions without closures. I don't know how Haskell is implemented internally, but conceptually it makes use of closures quite heavily. It uses closures for currying, it uses closures for monads, and for everything else. If you assert that heavy use of closures alone is decidedly unfunctional, then Haskell must be unfunctional too, which is obviously false.

I don't know what definition of functional programming you are using[f], but you don't have to be arrogant about it. Your comment is fairly annoying.

[f] Let me guess, only immutable variables and pure functions? How PF.


> higher-order functions make heavy use of closures

No, you're conflating higher-order functions with closures. Higher-order functions that use closures make use of closures. Higher-order functions that don't use closures do not.
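The distinction being drawn here can be made concrete with a small Python sketch (invented names): `apply_twice` is higher-order but involves no closure, while `make_adder` genuinely returns one.

```python
def apply_twice(f, x):
    """Higher-order: takes a function as an argument.

    Nothing here captures a surrounding environment, so no closure
    is involved when called with a plain function like abs.
    """
    return f(f(x))

assert apply_twice(abs, -3) == 3  # abs closes over nothing

def make_adder(n):
    """Higher-order AND closure-producing: the returned lambda
    captures n from make_adder's environment."""
    return lambda x: x + n

add5 = make_adder(5)
assert add5(10) == 15
```

So "higher-order function" describes a signature, while "closure" describes how a particular function value captures its environment; the two often coincide, but neither implies the other.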

> I don't think there is even a purpose to higher-order functions without closures.

I'm not sure how anyone can say this with a straight face, let alone someone who considers themselves to be in a position to challenge somebody about whether or not they grok functional programming.


>No, you're conflating higher-order functions with closures

Well, if you are actually doing the real hardcore FP™, and not just the lame, pretentious PF, then yes, higher-order functions will indeed very much make heavy use of closures. Did you miss the part where I gave Haskell as an example? And note that I didn't say that closures and higher-order functions are the same thing.

> Higher-order functions that use closures make use of closures. Higher-order functions that don't use closures do not

How are these tautologies even an argument? This does not say anything meaningful; it's like saying water is wet. Don't worry, I'm not even trying to keep a straight face while reading what you have said so far.

> I don't think there is even a purpose to higher-order functions without closures.

Heh, I did qualify my statement with "I don't think", since I didn't really give that one much thought.

But okay, I admit that statement is dumb and invalid, since the usual map, filter and reduce functions are good examples of higher-order functions that are not closures. But more often than not, you really do need closures to do anything beyond simple cases like map(array, x => x*x).

My overall point still holds. I'm still in a very good position to challenge your dogmatic beliefs that: heavy-usage of closures is pseudofunctional and unfunctional.


> How are these tautologies even an argument?

They're not, and that was exactly the point of my comment: it's a circular argument that you have to take responsibility for, not me. You seem to have missed that—it's your nonsense claims that are in focus when the tautology is being spelled out.

Higher-order functions and closures are different things.

> Well, if you are actually doing the real hardcore FP™, and not just the lame pretentious PF

I wouldn't call the pseudo-functional style "hardcore"—any more than OO is hardcore, given that they're equivalent. It's frequently portrayed as the naive/easy way out. Actual FP, on the other hand, is hardcore. (And pretentious—which is an odd attempt to try to stir me up; do you think I'm an advocate of FP or something? I suggest re-reading.)

> But more often than not, you really do need to use closures to do anything

Yes, which is why I'm not an FP advocate.

I was very clear in my original comment. The pseudo-functional style is a preference for how to write programs, and therefore immediately defensible as valid. What's not defensible, though, is equivocating on the meaning of "function" while simultaneously trying to lump the pseudo-functional style in with FP. The moment one starts making heavy use of closures and carrying around state is the moment one forfeits the right to be smug about how unclean OO is, given the equivalence of objects and closures and given that one is no longer actually practicing FP.
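The objects-and-closures equivalence invoked in this thread is easy to demonstrate; here is a minimal Python sketch (names invented) of the same stateful counter written both ways.

```python
def make_counter():
    """A counter as a closure: `count` lives in the captured environment."""
    count = 0
    def inc():
        nonlocal count
        count += 1
        return count
    return inc

class Counter:
    """The same counter as an object: the state lives in a field instead."""
    def __init__(self):
        self.count = 0
    def inc(self):
        self.count += 1
        return self.count

c = make_counter()
o = Counter()
# Both carry hidden mutable state and behave identically:
assert (c(), c()) == (o.inc(), o.inc()) == (1, 2)
```

This is the classic "closures are a poor man's objects, and vice versa" observation: both bundle behavior with captured mutable state, just with different syntax.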

> I'm still in a very good position to challenge your dogmatic beliefs that: heavy-usage of closures is pseudofunctional and unfunctional

No, you're not. It's unfunctional by definition.


It sure is easy to move the goalposts around when you have no grounds to stand on. Please: you have said this much and still haven't once defined what true functional programming™ is.

> It's unfunctional by definition.

There you go, more self-fulfilling tautologies. And for some magical reason, it's me who's making nonsense claims? How is "higher-order functions make heavy use of closures" a nonsense claim?

I have provided a very clear and direct counter-example that falsifies your core argument. On the other hand, you have provided zero actual rebuttals. In case it isn't clear, calling mine "nonsense, circular and tautological" and yours "by definition" doesn't count as an argument.

> The pseudo-functional style is a preference for how to write programs, and therefore immediately defensible as valid

Is the word style even relevant here? You can call it style, paradigm, or computational model, it doesn't change your point.

> What's not defensible, though, is equivocating on the meaning of "function" while simultaneously trying to lump the pseudo-functional style in with FP.

Ugh, I'm guessing your definition of "function" is a special amorphous one that changes meaning to conveniently support your claims.

> The moment one starts making heavy use of closures and carrying around state is the moment one forfeits the right to be smug about how unclean OO is, given the equivalence of objects and closures and given that one is no longer actually practicing FP.

No, repeating your statements doesn't make them true. Once again, see my original counter-example with Haskell. If you insist on ignoring it, then fine with me. I'm done here.


> It sure is easy moving the goalposts around

If I've moved the goalposts, you should be able to show where it happened. So do—point to it or fuck off.

As for the rest of your comment and being "done", that's fine. There's zero chance that I'm going to waste my time on a point-by-point rebuttal for anyone who's acting in this much bad faith, ignoring the points I've already made, and trying to pawn off the flaws in your arguments as mine.


Internally, Haskell's intermediate representation is a version of the lambda calculus. Which would mean that, practically speaking, Haskell is largely just one big pile of closures.

Which really shouldn't be a surprise. After all, you can't curry if you can't close.
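"Can't curry if you can't close" can be shown in a few lines; a hedged Python sketch (Python has no auto-currying, so the currying is spelled out manually; names invented), where each partial application returns a closure over the arguments supplied so far:

```python
def curry_add(a):
    # The inner lambdas are closures: the first captures a,
    # the second captures both a and b.
    return lambda b: lambda c: a + b + c

add1 = curry_add(1)      # closure over a=1
add1_2 = add1(2)         # closure over a=1, b=2
assert add1_2(3) == 6    # finally applies all three
```

Without closures there would be nowhere to store `a` and `b` between the partial applications, which is exactly the point being made about Haskell above.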


In the case of the former, I think you're misunderstanding Wirth. His statement isn't predicated on the idea that functional programming necessarily requires the language to eliminate state; just that functional programming discourages state in favor of primitives less aligned with the machine.

For instance, functional programmers would almost all tell you that `map (x => ...) xs` is "better" than `for i from 0..len(xs): xs[i] = ...`. But the former, implemented trivially, is very slow: from the allocation of the closure to the allocation of the new list to the function calls on each iteration and the lack of tail recursion in `map`'s implementation (this is a trivial implementation, remember?)
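To make the "trivial implementation" point concrete, here is a deliberately naive `map` sketch in Python (hypothetical, not Wirth's or Backus's example): it allocates a fresh list on every call, makes a function call per element, and is not tail-recursive, so it even exhausts the stack on long inputs.

```python
def trivial_map(f, xs):
    """A deliberately trivial map: one list allocation and one
    recursive call per element, with no tail-call optimization."""
    if not xs:
        return []
    return [f(xs[0])] + trivial_map(f, xs[1:])

assert trivial_map(lambda x: x * x, [1, 2, 3]) == [1, 4, 9]
```

A production implementation would of course loop and preallocate, which is exactly the gap between the model and the machinery that the thread is arguing about.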

Of course, the functional programmer would tell you, "Well, it's easy to optimize that, the performance issues are just because your implementation is too trivial", and Wirth would rejoin, "Too trivial? What's that?"


I don't think this is a fair point, since the two snippets do different things. One creates a new list, and the other does not. If, as a functional programmer, I was actually interested in mutating the existing sequence (e.g. for performance reasons), I would definitely write the loop.

If you're interested in maximum constness (which I tend to be, because I find it's almost always easier to read code where values don't unexpectedly change in a branch somewhere) then you'd be comparing

    let ys = map f xs
to

    let ys = []
    for i from 0 .. len(xs):
        ys.push(f(xs[i]))
where the former obviously makes it much more clear what's going on.

Sure, it's using "primitives further from the physical machine" but that is exactly what programming is about! You create a new layer of primitives on top of the old ones, where the new layer makes it slightly easier to express the solution to the problem you're solving. You do this incrementally.

When someone has built a more easily handled set of primitives for you, it would be silly not to use them, all else equal.

----

In other words: the only real reason to mutate values is to improve performance at the cost of readability, and at the cost of losing the ability to safely share that data across concurrent threads.

If, indeed, that is a cost you're willing to pay for the additional performance, no functional programmer I know would shy away from the imperative mutating loop.


They do two different things, but they do different things the way the style they're written in encourages. Performance-considerations aside, the functional programmer would rather create a new list (or, as you said, they're "interested in maximum constness" - the precise preference for statelessness Wirth is calling out); the imperative programmer would mutate the existing list.

Wirth is not talking about clarity, not in the sense of "can I look at the code and understand the high-level intent of the programmer"; Wirth is interested in clarity in the sense of "can I look at the code and understand exactly what it's doing, at every level"?

For Wirth, programming is not about using an endless stack of primitives that get you further and further from the physical machine, so much that they start to obfuscate what's happening at lower layers. It's about building the smallest, simplest stack of primitives such that you can express yourself effectively while still understanding the entirety of the system. The Oberon system includes everything from the HDL for the silicon all the way up to the OS and compiler in around 10,000 lines of code because you're supposed to be able to keep all of it in your head.

I'm not saying that any of this is correct, per se, nor am I arguing for it - I'm sympathetic to it in some ways and disagree with it in others (I am, in fact, very much into FP). I'm just trying to give a charitable and clear interpretation of his perspective. FP may not want to get rid of state in one sense, as you've pointed out; but it wants to get rid of state in another, and Wirth doesn't like that because it necessitates complexity - and Wirth hates that.


> For Wirth, programming is not about using an endless stack of primitives that get you further and further from the physical machine, so much that they start to obfuscate what's happening at lower layers. It's about building the smallest, simplest stack of primitives such that you can express yourself effectively while still understanding the entirety of the system.

Now that is an interesting perspective I hadn't even considered. I'm also not sure I would agree, but if I were interested in finding out more and found the OP unconvincing, where would I go?



Easy: he thinks they are full of bloat, including Go.

Each revision of Oberon-07 drops features; it is reduced to a C-like language with GC, with a single form of loop construct.


Got any specific citations? His general opinion seems pretty clear, but I would like to hear him go into some details.

I think I remember him saying that if one would want to design a language, starting with Oberon would be his recommendation. In that regard Go at least does something right.

And it does at least have a specification, too, which is another item that Wirth is pretty adamant about.

I'd pay good money to have him and Meyer argue about design, syntax and semantics.


Easy, compare 1992's Oberon with Oberon-07 revisions from 2011, 2013, 2014, 2015 and 2016.

Each Oberon-07 revision, as mentioned, drops language features.

Also note that, as far as I know, he wasn't too keen on the offspring of Oberon, namely Active Oberon, Oberon.NET, Component Pascal and Zonnon.

Oberon-2 was his last collaborative work in the context of Oberon language family.

And while for me Active Oberon is the best one for systems programming (still in use in ETHZ OS classes), with support for several low-level features for which the original Oberon requires assembly, I doubt Wirth would appreciate it, given that it is Modula-3-like in size and features.

http://cas.inf.ethz.ch/projects/a2/repository/raw/trunk/Lang...


Oberon has three (not one) loop statements, namely WHILE, REPEAT and FOR:

https://www.miasap.se/obnc/oberon-report.html

If anyone is interested in using the language outside of the Oberon operating system, here is a freestanding compiler:

https://www.miasap.se/obnc/


Oberon yes, but I guess you missed the Oberon-07 part of the comment.


I'm referring to Oberon-07 which is the latest version of Oberon, last updated in 2016.


I stand corrected, what was dropped was LOOP and EXIT, and I somehow mixed it up.

Sorry about that.


Given how many languages you know, and how many revisions of languages, you might be forgiven for having mixed up one detail on one revision...


Though this is why we shouldn't be snarky when replying to others. Aside from just being nice, we might be the one in error and not know it.


Exactly; Oberon is now a purely structured language in which each statement sequence is fully executed. There are no goto-like constructs.


> The module concept of Oberon's predecessor Modula-2 is still unrivalled today

I'm not familiar with Modula-2's module system. What does it provide that the module system of OCaml does not?

> .def modules that specify the interface and that can be compiled separately from their .mod implementations, which may not even exist when a client application can already be coded against the compiled interface in a type-safe way.

I believe .mli files can be compiled separately from the matching .ml files, and client modules can be compiled against an .mli that does not have a corresponding .ml file.

And OCaml also supports functors, so modules can be parameterized.

Sorry, I'm not arguing that Modula-2's module system isn't good; I'm just not convinced that it's unrivalled today. And for all I know, ML's module system was probably influenced by Modula-2.


> The module concept of Oberon's predecessor Modula-2 is still unrivalled today

The module concept of Oberon is also very good (leaner than Modula's). There are also other languages with good module concepts, e.g. Ada, or the CLR based languages.

> Wirth's book Compilerbau ... is a piece of pristine clarity

For a certain type of compiler (rarely used today).


Sounds like COM type libraries.



