50 Years of Pascal (acm.org)
202 points by matt_d on Feb 24, 2021 | 147 comments



As a student I learned Turbo Pascal around 1991. Most of the problems with the original language were fixed and Borland provided enough extensions to use the language to program hardware drivers under DOS.

Then on my first job I learned C++, again in its Borland incarnation. The slow compilation speed was a big issue. We were forced to use carefully structured precompiled headers and tried not to touch headers included in the core compilation set. And all that after very fast compilation and linking even on big projects with Turbo Pascal. There was even discussion of switching the project to Pascal to be more productive. But C++ was considered cooler and we stayed with it.

And at my current work almost 30 years later I still have to wait hours for full recompilation to finish. C++ modules, which do roughly what Turbo Pascal had with its unit files, are supposed to fix that, but we are still a few years from using that in production.
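
For reference, a Turbo Pascal unit looked roughly like this (hypothetical names, just a sketch): the interface part was compiled once into a symbol file, and client code only recompiled when the interface changed:

    unit MathUtil;

    interface                { the only part client code ever depends on }

    function Square(x: Integer): Integer;

    implementation           { edits here don't force clients to recompile }

    function Square(x: Integer): Integer;
    begin
      Square := x * x
    end;

    end.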


Turbo Pascal (and perhaps its imitators like Think C) really seems like it was a milestone in IDE usability and responsiveness that has rarely (if ever) been exceeded or even equaled.

I know CPUs are not GPUs, but it seems embarrassing that we can render beautiful, highly responsive, ray-traced games at 4K resolution and 120FPS but we still have to wait for code to compile, and code editors often feel clunky and unresponsive.

I wish Xcode, Visual Studio, IntelliJ IDEA, etc. would apply the monstrous computing power we have in 2021 a bit more effectively.


Every aspect of the system was fast. I particularly liked the lightning-fast "hypertext" online help. A single keypress would instantly bring up reference documentation for the language and libraries.


Wow ... once every couple of years I try to add that basic feature to my workflow IDEs, but give up each time. The function signature is often not enough.

Tooling has really degraded.


It is largely because, unlike Borland's products, nowadays there isn't a single source that provides everything; tools are really made up of barely (if at all) related components.

In some cases where tools are made for each other you do have better integration - e.g. Free Pascal's own IDE does have online help like Turbo Pascal (though a downside is that because the help is in CHM, i.e. HTML, the formatting in the text mode IDE sometimes doesn't look that good - but the documentation is there anyway).


I bought a copy of Turbo Pascal from the Borland booth at the West Coast Computer Faire in 1983 (or '84?). It was an amazing product to use; maybe that's just the view from so many years hence, but it still feels like no programming system ever could match the speed. While I do Swift today, I still remember how much fun it was to see Pascal compile almost instantly.


Nim, despite having Python-like syntax, inherits much from Pascal. It was even originally written in FreePascal [1]!

I've come to appreciate `var` and `let` blocks in Nim that I believe come from Pascal syntax - e.g. this snippet from Arraymancer [2]:

    let
      mnist = load_mnist(cache = true)
      # Training data is 60k 28x28 greyscale images with values from 0-255
      # Change shape from [N, H, W] to [N, C, H, W], with C = 1 (unsqueeze). Convolutions expect 4D tensors
      X_train = ctx.variable x_train.unsqueeze(1)
      ...

However, unlike the Pascal syntaxes (and Ada), Nim remains less verbose and feels less straitjacketed for it. It's odd, but while it encourages safer code than C, it doesn't feel quite as stifling. You can do most any pointer manipulation you might need, but using builtin functions (e.g. `addr` or `unsafeAddr`) rather than symbol operators.

1: https://news.ycombinator.com/item?id=25242926

2: https://github.com/mratsim/Arraymancer/blob/master/examples/...


That's more likely copied from the functional programming world (ML / Miranda / Haskell). And they got it from Lisp's "let" form.

Wirth's languages always had a very strict separation between variable declarations and code. Your example might look something like this in Pascal:

    PROCEDURE foo;
    VAR mnist: MNistRecord;
        x_train: XTrainRecord;
    BEGIN
      mnist := load_mnist(true);
      ...


Was hoping someone would mention Ada. It's the most Pascal-y of all Pascal's children. I enjoyed my brief time doing Ada. I should look at Nim.


If it's been a while, SPARK/Ada has been interesting to play with. No real objective for me since work won't touch it, but it's been interesting playing with the prover and design-by-contract elements of it. It's more properly integrated (with Ada 2012) than the prior versions.


Nim is adding proof engine integration [1] { though it may not be for everyone/all circumstances [2] ;-) }.

[1] https://nim-lang.github.io/Nim/drnim.html

[2] https://blag.cedeela.fr/curry-howard-scam


Interesting, though that only seems to be a small subset of what SPARK does.

> DrNim currently only tries to prove array indexing or subrange checks, overflow errors are not prevented. Overflows will be checked for in the future.

Those are the very basics of what you might use SPARK for, but it can be used for deeper proofs of your program invariants as well.


Many Nim things are works in progress. If passionate & capable people like yourself contribute, the finish line will seem that much closer. :-)


If Niklaus Wirth is reading this: Thank you for all your work on Pascal, Modula and Oberon! I would especially like to thank you for your work on "drastic simplifications and consolidations" - to reduce complexity and make it easy to use!


Pascal was the first language I learned in college. It is an excellent language for CS students, because it has clear syntax and it is easy to figure out when something goes wrong (unlike C). Unfortunately it lost ground to C and similar languages.


Wirth's Pascal as described in User Manual and Report is nearly impossible to write non-trivial programs in. I know, I tried. It required a lot of extensions to become usable.

In 1982 or so a friend loaned me his copy of K&R. I read it and was instantly hooked. Never wrote another line of Pascal. I had written a lot of PDP-11 assembler at the time, and C's PDP-11 heritage made it very easy to pick up. I saw how the PDP-11's semantics mapped right onto C.

But some good came out of my experience with Pascal. D's nested functions are inspired by Pascal's nested functions. Simple and easy, including the ability to reach back up the stack and access the caller's variables (all the way up to main()).

C should have had nested functions.
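
For those who never used them, a minimal sketch of what that looks like in Pascal:

    procedure outer;
    var count: Integer;          { local to outer }

      procedure inner;           { nested procedure }
      begin
        count := count + 1       { reaches back into outer's stack frame }
      end;

    begin
      count := 0;
      inner;
      inner                      { count is now 2 }
    end;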


And typesafe enums and sets of said enums and ranges of ints and enums and arrays addressable by said ranges and the ability to pass values by reference to functions :-P
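
Roughly, for anyone who never saw them (a hypothetical sketch):

    type
      Day  = (Mon, Tue, Wed, Thu, Fri, Sat, Sun);   { typesafe enum }
      Days = set of Day;                            { set of said enum }
      Hour = 0..23;                                 { range of ints }
    var
      Opens: array [Day] of Hour;                   { array addressed by an enum }

    procedure Toggle(var d: Days; x: Day);          { pass by reference }
    begin
      if x in d then d := d - [x] else d := d + [x]
    end;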


Agree on nested functions. GCC has them as an extension, but the last time I looked, Clang did not. A pity.


There are also some pitfalls in gcc's implementation. On one project we found that nested functions used as callbacks didn't play well with green threads, and it was a pain to deal with.


The only thing I disliked was the `begin ... end` and the `begin ... end;` (notice the semicolon) structure on if-then-else

C (and many others) simplified this by employing braces.


So you never tried to write a function that accepts strings of different lengths?


This was 'solved' with pointers (the ^ symbol), wasn't it? Well, we're talking about at least 25 years ago, so I honestly don't remember this hiccup at all.


Turbo Pascal didn't have that hiccup, only the original language.


You can work around this - Turbo Pascal's strings are essentially syntax sugar for

    type string = record Length: Byte; Chars: array [1..255] of Char end;
You can use this to make your own string handling procedures; the hardest part would be passing arbitrary string literals, but that could probably be worked around too with conversion functions.
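
For instance, a hypothetical append helper over that record layout might look like:

    procedure Append(var dest: string; src: string);
    var i: Integer;
    begin
      i := 1;
      while (i <= src.Length) and (dest.Length < 255) do
      begin                                   { silently truncates at 255 }
        dest.Length := dest.Length + 1;
        dest.Chars[dest.Length] := src.Chars[i];
        i := i + 1
      end
    end;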


My first too. The first time I looked at a C example it was so cryptic! I wonder how the first-time impressions of C and Pascal would compare, though.


I once reverse-engineered a C program by compiling it and disassembling it, as I found it easier to read M68000 assembler.


C and C++ were free. Free crap often wins over purchased quality.


C was ported to every platform. Free crap (when it comes to programming languages) wins, because it can be used everywhere and is therefore a better thing to learn. For example, Borland never opened the sources of the compiler, and Turbo Pascal was never ported to platforms other than CP/M, DOS, and Windows.

Free crap in other areas (such as computer graphics) did not win over purchased quality. For example, graphics designers even choose their operating system based on the available (non-free) quality software.


Delphi was ported to Linux with Kylix (using Qt as a backend for the GUI) but didn't see much use.

However, the issue with Delphi was really that Borland got greedy around the turn of the century, started chasing enterprises, and jacked up the prices considerably. Turbo Pascal originally cost less than a game, yet you got both one of the best compilers and a book to learn from, and Delphi until around version 5 or 6 (I do not remember the exact one) could be bought for around $99 in its cheapest edition (AFAIK the upgrade was cheaper), which was still affordable even by high school students and provided a very high quality package. And during the 90s Borland Pascal and Delphi had a TON of mindshare - something they lost as they moved away from being affordable to everyone.

People wouldn't have minded much that Delphi wasn't free, as long as it was still affordable to everyone and of good quality. Sadly, not only did it become unaffordable to the most important segment of programmers (the new ones who want to experiment with stuff), but it also degraded in quality.


C compilers weren't always free. When I started my career on the original Mac, there were multiple commercial Pascal and C compilers but nothing free for either. For some years after that I worked on multiple UNIX machines and variants that each shipped with their own C compiler (often developed by one of several companies that specialized in this stuff) because gcc was no easier to port and produced lower-quality code. So it was free for users, but for the people who actually bought the systems it was bundled, which is not the same.


Crap that works wins over stuff that sounds good in theory. If you ever had to work with an as-defined-by-Wirth Pascal, it was painful.

Another data point: Turbo C wasn't free, either. How did it sell compared to Turbo Pascal? (I don't actually know; does anyone have data?)


Countless generations of Polish high school students were taught Pascal as their first programming language, either in Turbo Pascal or Free Pascal.

As for how useful it is to teach a language that has been niche for 20+ years, I can't tell.

But the other common alternative was C++ in Dev-C++ or Code::Blocks, although the most-used books focus on ancient practices, not even in line with C++11. People learn from Grębosz's "Symfonia C++" to this day, which is way outdated.

Currently, I guess the easiest choice to teach would be Python (and actually a useful skill for the students), but it not being a statically typed language kind of takes away educational value you can milk from it without introducing another language.


Pascal, as defined by Wirth, is just horrible.

No "fopen" to open files of a given name in the filesystem.

In fact, no "open/creat", nor any other system calls at all.

No "extern", no "#include", no user-created libraries, no ability to create a .a file, or even a .o file without main().

Actually, no system libraries, either; just a few built-in functions (sin and cos, but no tan; ln but no log).

No "default" on case statements.

No "break" or "continue" statements; only "goto".

No bitwise operators for and/or/xor.

No equivalent of C's && and || operators, where the second argument is guaranteed not to be evaluated if the first one determines the result.
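
The classic casualty is the guarded linear search; in Pascal you need an explicit flag, roughly:

    { 'and' evaluates both operands, so this can index a[n+1]:  }
    {   while (i <= n) and (a[i] <> x) do i := i + 1;           }
    { the usual safe workaround:                                }
    found := false;
    while (i <= n) and not found do
      if a[i] = x then found := true else i := i + 1;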

And, for everyone who complains about C strings, all Wirth Pascal strings are fixed-length arrays of characters. So, you can't pass the constant string 'foo' to a function that is expecting an array of 4 characters.

Finally, in TeX, Knuth says he found the fact that when you get around to defining a forward function, you must NOT repeat the parameter list (names, types) again, to be so objectionable that he chose to use global variables to communicate with forward functions instead.
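
Concretely, the rule looks like this (a minimal sketch):

    procedure emit(n: Integer); forward;

    procedure driver;
    begin
      emit(42)
    end;

    procedure emit;    { repeating '(n: Integer)' here is an error }
    begin
      writeln(n)
    end;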

All of people's fond Pascal memories are actually due to extensions in Turbo Pascal, etc.


You are looking at the original Pascal from a modern perspective. At its time it was very nice which is why it exploded in popularity.

When Pascal was designed there wasn't an idea of a 'file' like what you have today - files often had their own data types and were series of defined records, and different systems had different ideas of their structure. AFAIK in original Pascal you were supposed to pass any files as part of the program's environment, declaring them in the program header itself (`program Foo(a, b)`) - it was up to the system to provide those (Turbo Pascal ignored them).

The other stuff you mention about extern, include, etc follow from that. Many systems didn't think of files like Unix where they were just series of bytes.

The bits about default, continue, break, etc were intentional, as having structure in the program's flow was a big thing at the time and those would be seen as hidden gotos (the addition of 'goto' itself was probably seen as a necessary evil for its time - but even then you had to explicitly declare the targets).
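
For example, a minimal sketch of what a declared goto target looked like:

    label 99;                        { targets declared up front, and numeric only }
    var i: Integer;
    begin
      i := 1;
      while i <= 10 do
      begin
        if i = 5 then goto 99;       { no 'break', so this is the escape hatch }
        i := i + 1
      end;
    99: writeln(i)
    end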


And Pascal was basically just an improved ALGOL which in part was designed to come up with a common notation to communicate algorithms in journal articles. The negative issues aren’t so bad in that light.


No, I'm looking at Pascal from a 1978 perspective. If there had been a good C compiler for the PDP-10, TeX would not have been written in what was, actually, a macro language on top of Pascal (to help work around its shortcomings in both string handling and memory management; oh, and the fact that statement labels have to be numeric).


From a 1978 perspective, Pascal was indeed already becoming dated, sure. But by then Wirth had already developed Modula, and started Modula-2.

But even from a 1978 perspective, most of your criticism of Pascal is for being bad at things it was never designed to be good at, and never pretended to be good at. It was a teaching language trimmed down (in terms of concepts) from Algol W for ease of implementation over feature-completeness, not designed to be a systems language.

That it became the basis for so many language implementations that opted to solve the limitations of Pascal over starting from scratch is a testament to its success.


But at the time, it was claimed that it was a systems language! No less than C. A. R. Hoare claimed in 1977 that it was "the best language in the public domain for purposes of system programming and software implementation."

Source: Reference 18 of "Why Pascal is Not My Favorite Programming Language" (http://www.cs.virginia.edu/~evans/cs655/readings/bwk-on-pasc...).

Having had to use a "Pascal as defined by Wirth", by the way, everything in that criticism is dead on. Almost every one was an actual pain point for me when trying to write actual programs. (Not xor, but just about everything else.)


Hoare may have claimed that, but Wirth did not to my knowledge, and judging Pascal by the standards of praise given by someone who didn't design it is a bit weird.

(It's also worth considering that Hoare did posit e.g. variants of Pascal, and it's unclear if he was writing of unextended Pascal, in the abstract, or about a variant.)

I don't think the criticism is wrong with the caveats Kernighan gave, because he went to lengths to address its use outside of the educational domain it was designed for.

Just not very relevant unless you happen to be dealing with the truly rare persons that were promoting an unextended Pascal as a systems language by 1981.


I am that rare person. I was using it as a systems language in 1986. Without the Turbo extensions. (We did have separate compilation.) Why yes, I am still bitter, does it show?


If you had separate compilation you were not using unextended Pascal. And it does not sound like you thought it was suitable, so you don't seem to fit the description?

Who in the world pushed a limited Pascal on you that late?

It does seem like something Wirth himself would have seen as an awful decision given that he'd worked on replacements for Pascal for a decade at that point, and wrapped up his work on Modula-2 to move on to Oberon.


I don't know who made the decision. It certainly wasn't me. But I only had a year of experience, and I had been laid off for three months, and I didn't realize the impact the language choice would have.

This was on a medical instrument, so maybe people thought of it as a "safe" language - one that couldn't buffer overflow and corrupt memory. But doing an embedded system with memory-mapped I/O in a Pascal without the Turbo extensions... yeah, that was a less than ideal choice.


It is a testament to its huge success, I'd say.


Perhaps you should compare Pascal with BASIC rather than C; I do think Pascal comes out better in that comparison.


> You are looking at the original Pascal from a modern perspective.

"Modern" as in 1981?

http://www.lysator.liu.se/c/bwk-on-pascal.html


By 1981, Wirth had long since moved on - Pascal was superseded by both Modula ('73-'76) and Modula-2 (first compiler released '79) in terms of Wirth's own efforts by then.

So, sure, by 1981 the original Pascal was already old, and Pascal as a class of languages was still on a growth trajectory not because of "pure" Pascal, but because it was a very simple base to build something more advanced on top of.

You'll note that Kernighan very specifically recognises and addresses the distinction between the purpose of Pascal as a teaching language vs. his criticism of it for "writing real programs". That was somewhat relevant at the time exactly because a lot of people were using Pascal to "write real programs" and opted to extend Pascal rather than use more advanced languages as their basis. But it was of a lot less relevance with respect to the origins and purpose of Pascal.

Many of the lessons Kernighan describes are lessons Wirth was aware of as well, in designing Modula, Modula-2, and the various versions of Oberon. Though his approach was always to look for ways of solving these issues through careful simplification over ever more complex languages.

By 1981, when considering writing "real programs" for production use, Kernighan gave valid criticism. In 1970, when considering Pascal's aim as a tool to teach structured programming, it would not have been.

So, yes, "modern", even then.


Conjecture: both patterns you outline are the same phenomenon, and stem from Pascal being a wonderful Algol canvas to create new languages from.

Pascal was the LLVM of its day, but with the added benefit that, because of its simplicity, it could spread as a pure idea - simple enough to implement from a single mind - which meant it would be used as the basis for language and system design.

The only way it could be more useful is if Wirth had spec'd a stable s-expr syntax as well.


It was too early for that at the time, unfortunately. p-code was an attempt at making it machine independent (but maybe a serialisation / s-expression would have done better), but the performance loss was too great for it to get wide traction.

Interestingly, one of Wirth's PhD students, Michael Franz, wrote his dissertation on Semantic Dictionary Encoding, a method reminiscent of Lempel-Ziv-encoding an intermediate representation in a way that let the de-compressor / code-generator re-use templated generated code. Unfortunately at that point Java was underway (Franz's thesis was published in '94) and Franz moved on to work on various code generation and verification for Java [I think; I haven't followed his newer work that closely].

But it doesn't end there. One of Franz' PhD students at UC Irvine was Andreas Gal, who did his thesis on tracing just-in-time compilation and went on to do TraceMonkey, and became CTO of Mozilla.

[I still think there's interesting stuff to be done with Franz' SDE work; I've periodically looked at it ever since I read it first time in '94, but have never had the time to devote to it, and keep hoping someone else will revisit it]


I was speaking metaphorically of Pascal being the LLVM of its day, more that it was a known base, portable, easy to explain and the first tool someone would reach for.

Templated IR code that has been LZ-compressed sounds excellent - Thumb++, or a uop buffer engine. It sounds, from the post below, like the instructions could even pathologically be in uop or VLIW instruction form, and the SDE would effectively discover the compact CISC for your specific workload. Why not push the decompression engine into the instruction decoder?

I found this awesome post that covers SDE, http://hokstad.com/semantic-dictionary-encoding

I think one way to explore this would be to implement compressed binary loaders, like done for demos etc, but target wasm, wrap it in a decompressor and have the wasm program decompress itself as it executed perhaps on a method by method basis.


> I was speaking metaphorically of Pascal being the LLVM of its day, more that it was a known base, portable, easy to explain and the first tool someone would reach for.

Yeah, I got that. My point was that they did actually try to take the next step as well, but it was too early for that part.

> I found this awesome post that covers SDE, http://hokstad.com/semantic-dictionary-encoding

That's mine, in fact :)

SDE doesn't necessarily "discover" an instruction set per se, but certainly the templates the decoder would put into the dictionary effectively do recover a lot of structure, and you'd be able to do simple stats and dependency analysis to identify constructs that could be turned into dedicated instructions.

I also think e.g. Gal's work on tracing has a lot of potential to be combined with this to be used as a profiling/feedback stage to allow the decoder to make better code generation choices over time if you maintain enough information to map optimised traced sequences back to the template fragments they came from.

> I think one way to explore this would be to implement compressed binary loaders, like done for demos etc, but target wasm, wrap it in a decompressor and have the wasm program decompress itself as it executed perhaps on a method by method basis.

That's an interesting idea. It would save having to build the final native instruction emission stage while testing. In fact, it'd be possible to build tools that simply demonstrates round-tripping the s-expression syntax as a proof-of-concept.


So glad I met you :)

> decoder to make better code generation choices

Understand what you are saying, but there are lots of layers between the decoder and code generation and the resulting uops. But if we are sending a compressed instruction stream, why not go up a level of abstraction and send the IR, MIR, HLIR, s-expression, then s-expression with types and effects? OK, I am just gonna send the LaTeX of the paper. ;)

speaking of round tripping the s-expr syntax

https://www.reddit.com/r/WebAssembly/comments/lil841/very_fa...


Pascal, even Wirth's Pascal, was a huge improvement over the other languages commonly used at the time. In Pascal's heyday in the 1970s and early 1980s, most programmers came to Pascal from BASIC or old FORTRAN (FORTRAN II or IV, not 77). Compared to these, Pascal offered: long identifiers (not 1 or 2 characters like BASIC or 6 like old FORTRAN), block structure and structured programming (not line numbers and GOTOs), record data types, pointers, and memory management with new and dispose (instead of statically allocated arrays and COMMON blocks), nested structure and lexical scope (instead of everything defined at top level). All this was a revelation and made whole new programming techniques and styles available.
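
For instance, records, pointers, and new/dispose gave you real dynamic data structures; a minimal sketch:

    program ListDemo(output);
    type
      NodePtr = ^Node;
      Node = record                  { record data type }
        value: Integer;
        next: NodePtr                { pointer field }
      end;
    var
      head, p: NodePtr;
    begin
      new(p);                        { heap allocation, not a static array }
      p^.value := 1;
      p^.next := nil;
      head := p;
      new(p);                        { build a list by prepending }
      p^.value := 2;
      p^.next := head;
      head := p;
      writeln(head^.value, head^.next^.value);
      dispose(head^.next);           { explicit deallocation }
      dispose(head)
    end.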


... and a few more:

You can convert enum->int, but you can't go the other way.

All text input files must have eoln() defined at all times, so if and when you manage to open stdin, the program must halt until the user types a line of input (and eoln() will be true iff it's empty). Then, when your program reads the line and the \n, everything halts until the user types the next line of input. It's virtually impossible to make a program that interacts with the user.

And if I had a nickel for every time a student got stymied by the fact that the syntax insists you must not have a semi-colon in front of an "else"....
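
The perennial trap, for anyone who hasn't seen it:

    if x > 0 then
      writeln('positive');    { this semicolon terminates the if statement... }
    else                      { ...making this 'else' a syntax error }
      writeln('not positive')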


About the choice of Pascal for TeX in 1980… was it the case that Pascal was the most widely available language at the time (at least at universities and the like where TeX would be used)? Is that why Pascal was chosen as the language for the "portable" TeX rewrite? Or was it just that Pascal was available for PDP-10 (was that a common computer?) and reasonably easy to translate from, so it was a good base?

About some of the other issues (system calls, a default for "case", etc), isn't it the case that most Pascal installations did have their own extensions that resolved this (even before Turbo Pascal)? (I'm actually surprised how many distinct Pascal compilers there seem to have been, with their own extensions… it must have been a language easy to write compilers for, which must explain some of its rapid popularity.) So some of this was more a failure of standardization (Wirth just didn't think about these factors enough to put them in the original standard, or update the standard fast enough) rather than non-availability in the Pascal that anyone used at a particular place. (Of course they'd become issues for porting programs over to another system.)


At the time, Knuth was exclusively using a PDP-10, "SAIL", running the WAITS time-share operating system. If you look at early ARPAnet maps, you'll see that maybe half of the machines were DEC PDP-10s and DEC-20s; certainly at Stanford, MIT, CMU, etc., those were the first machines to be connected to (what was to become) the Internet.

One main goal of the 1980s TeX rewrite was to make it as portable as possible, to support other popular architectures of the day. C wasn't available for DEC PDP mainframes; IBM 360s didn't have a C compiler at the time, either. PL/I, Algol, etc. were also not universally available. So, based on the fact that the Hedrick Pascal compiler existed for the PDPs, and a general notion that a reasonable Pascal was, or would soon be, available for VAX/VMS, IBM, and Unix (the gpc front-end for gcc), the choice was made. By the way, at the time it didn't seem like PCs and Macs would be viable platforms, being so memory-constrained (TeX wanted a whole megabyte to run in!), so that wasn't a big part of the consideration. And Unix workstations (Sun, HP, etc.) weren't really a thing yet.

It wasn't a super-comfortable decision, but seemed like the best choice at the time. Partly because of this, TeX is written to use just a subset of the Pascal language, so as to keep things simple, and thus ultimately translatable into other languages if necessary. It turned out that DEC eventually made a very good Pascal compiler for VMS, and similarly for IBM; but GPC lagged, and the popular Unix ports of TeX are based on C translations of TeX's Pascal code. Ditto for (all? some?) of the PC and Mac versions.

But, to your point, yes, all the various compilers of the time had their own work-arounds to the "default", "fopen", etc. problems. The fact that they chose different syntaxes wasn't a big issue, being easily addressable via the "WEB" macro language Knuth wrote on top of. Much more of a pain was Pascal's virtually useless string functionality and poor memory-management that required laborious work-arounds, resulting in code that is difficult to modify and debug. It's a real shame that standardization of the features needed to make Pascal a language that was really suitable for cross-platform production code didn't happen in time to make a difference. All of the code Knuth writes has been in C (via CWEB) for quite some time now.

[Edit: tried to say "star"-nix, but that made everything \it, so switched to "Unix".]


How Pascal critics love to ignore ISO Extended Pascal, introduced to fix those issues.


That's because my scars of trying to use Pascal before the ISO extensions are still there.


So no access to either VMS Pascal nor UCSD Pascal then.


If I recall correctly, no. The only upgrade from Jensen & Wirth was separate compilation.


But then ISO Pascal is not 50 years old, far from it.

The Pascal from 50 years ago was not that great in practice.


It was released in 1990, so a bit younger: 30 years.

More than enough to be aware of its existence, especially when ISO C's first release is from 1989.

Also, all the ISO Pascal issues were fixed in Modula-2, released in 1978, 42 years ago.


A huge, two-decade gap. Way too late, because in the meantime Turbo Pascal had been developed and made the language practical (and btw I'm still impressed by the quality of TP's "IDE" for the time), and by 1990 you had several competing standards that split the community between the original "academic" Pascal, Turbo Pascal, Object Pascal, ISO Pascal, and a little later Delphi. Add to that the fact that Wirth himself had long since lost interest in the language and had developed several better ones (Modula-2 indeed, Oberon too).


Well, to be fair, only clueless developers were using ISO Pascal without extensions - VMS Pascal and UCSD Pascal being two well known extended dialects.

Just like no one was using proper K&R C outside UNIX, rather dialects like Small-C and BDS C.


Thanks. What also ticks me off is that he isn't eating his own dog food: you cannot write the writeln function in Pascal (even as a function doing nothing), since you would need a variadic function which accepts strings of different lengths - both big no-nos.


FWIW Wirth's Pascal does not have a writeln, only write and you are supposed to use a special EOL character to indicate end of line.

But yes it is meant to be special (like read) and also has its own special syntax for formatting. However this isn't anything weird as the compiler was already implementing all other standard procedures and functions as part of the language itself anyway - there isn't a distinction between 'language' and 'runtime library' like you'd see in C for example.


Yes, but what I mean is: he saw that the language was not expressive enough to handle a common and useful case. One should start to think about that, and not say "oh well, since I am the language designer I'll solve it at the compiler level". I mean, what if I want to write a function which formats things like writeln does, into a string (to allow for different decimal separators, or whatever)?


Wirth did "start to think about it": He developed Modula, Modula-2, Oberon, Oberon-2, Oberon-07.

It was not addressed in his Pascal versions because he moved on from Pascal, not because he didn't recognise that Pascal was too limited for many use-cases.


Well, I can't speak for Wirth so I don't know why that didn't happen, but my guess is that he either didn't think about it (I mean, Pascal was already very expressive for its time) or it would considerably complicate both the language and memory handling for an arguably niche use case (remember that write not only accepts a variable number of arguments but also different data types - and that doesn't even take into account the special syntax for padding, number of digits for reals, etc).

It is better to think read/write as akin to 'while', 'if', etc than as regular procedures. Making your own 'if' might be neat, but not something you'd see often outside of Lisp :-P


... and Forth :)


When people say there are two kinds of languages, functional or procedural, then Forth is the third kind.


You are incorrect. Wirth Pascal has writeln, and in fact that is the only well-defined way to output a “newline”. There is no notion of \n or \r. (Gory details: when eoln(f) is true, the file variable f^ is supposed to be the space character!)


AFAIK this is Wirth's last document/version of Pascal:

http://pascal-central.com/docs/pascal1973.pdf

It doesn't mention writeln anywhere. In section 13 (page 39), "Input and Output", it only mentions read and write for text input/output, and at the end of it mentions that the end of each line must be indicated with the EOL character. A few pages later (page 44) there is a table of standard identifiers and there is no mention of writeln either (but there is of write).


Writeln is certainly in Wirth’s “Silver Book” Pascal standard, hot off his horrible line printer in 1974: https://dl101.zlibcdn.com/dtoken/78641b52ce049f6f7825c9e8f81...


This looks like the manual for the Pascal 6000 compiler they had at ETH Zurich which extended the language that Wirth described. In fact in the preface it mentions that the standard is the "The Programming Language Pascal (Revised Report)" which in footnotes mentions is the 1973 version - ie. the document i linked above.

Confusingly enough though it does describe readln and writeln in an attached revised report (mentioning that the reason for their addition is that they can't rely on an EOL character being available) but this is not the text they cite as the standard Pascal nor the latest version of the report available from ETH themselves (which is basically a different scan of the same text i linked above): https://www.research-collection.ethz.ch/handle/20.500.11850/...

My guess is that they added these at some point later but didn't make a newer version of the standalone report (which is what I've seen other places, e.g. http://pascal-central.com/standards.html, refer to as Wirth's last standard). I guess in practice it was simpler to have a single book act both as a tutorial and a reference, though I always thought the 1973 report to be the last "standard" and didn't pay much attention to the other stuff released later.


Quoting the "Preface to the Revised Report" in my link: "...the language defined in THIS Revised Report is called STANDARD PASCAL." Two paragraphs later: "... the new procedures READLN, WRITELN, and EOLN have been included in the set of standard procedures..." Sorry that ETH has outdated stuff, but this is an accurate image of the actual, physical book authored by Wirth, as published and sold by Springer-Verlag, in which he says it describes STANDARD PASCAL. I have it in my possession if you don't believe the scans that are on the internet.


Uh, yes, I believe you. I just meant that it is confusing that the first part of the manual refers to the 1973 report as the standard Pascal while the report in the second part is not the exact same text - and the fact that the last revision available from ETH is not the latest one (most likely because there wasn't a standalone version made) doesn't help.


Interesting history, from Wirth's perspective. A little strange though that he mentioned Java and C# as inheriting the legacy of Pascal, but left out Delphi, through which C# indirectly inherits from Pascal. FreePascal is also worth mentioning but perhaps not academically very interesting.

Other, more modern, languages that come to mind as inheriting from Pascal are Go and OCaml. OCaml more directly inherits from Modula-2's notion of separate interface and implementation, and gains similar parallel compilation speed benefits from that: https://dev.to/yawaramin/ocaml-interface-files-hero-or-menac...

If you look at the imperative and some other parts of OCaml, the Pascal heritage is clear, e.g. comments (* ... *), for loop, while loop, mutable assignment with x := y, etc.


Delphi and Free Pascal are a bit too far removed from Wirth's idea of simple languages which is probably why he didn't mention them. Delphi 1 (let alone the modern variants) added more stuff to Turbo Pascal than the entirety of Wirth's Pascal and it is more of a new language that has his Pascal as roots - after all programming languages affect how you think about the solutions of the problems you are trying to solve and the way you'd write a Delphi/Free Pascal program would be very different than the way you'd write a Wirth Pascal program.

Notice that he sees Oberon as the successor to Pascal and Oberon is an even simpler language than Pascal was.


My respect for Wirth only grows over time. He has this philosophy that strikes me as Bauhaus, a focus on futurism w/o complexity, clean lines and modern technology.


> A little strange though that he mentioned Java and C# as inheriting the legacy of Pascal...

Java? That boggles my mind. Pascal was about simplicity, and Java is... very much not in the spirit of Pascal in that regard. Or so it seems to me. I would have expected Wirth to shudder when he looked at Java.


Delphi is a transition point between Pascal and C#, so.

In any case, I bet he wasn't that fond of Delphi. You will also not find much love from him for the Oberon descendants, namely Oberon-2, Active Oberon, Oberon.NET, Zonnon, Component Pascal.

Modula-2 isn't that big an influence on OCaml; its modular ideas were already present in CLU, ML and Mesa.

Also, OCaml is actually Objective Caml; back when I was in university, Caml Light was still their stable ML implementation, with Objective Caml being released a couple of years later.


> Modula-2 isn't that big an influence on OCaml; its modular ideas were already present in CLU, ML and Mesa.

Modula inspired ML in this regard: <https://dl.acm.org/doi/10.1145/773473.178245>

ML modules alone don't provide separate compilation.


I stand corrected, thanks for pointing out the paper.


I'd argue that Turbo Pascal was that transition point, not Delphi. The last incarnation of Turbo Pascal was Borland Pascal and it had much of what was in Delphi, though OOP was improved on in Delphi. Delphi was just Turbo Pascal for Windows.


Delphi had a ton of features that you'd find in C# but were absent from Borland Pascal 7, like:

1. The 'class' defined objects which were always allocated on the heap

2. Classes themselves being values with their own metaclass types

3. Classes having their own class procedures which could also be virtual (belonging to their metaclass)

4. Being able to instantiate an object instance dynamically using a class value (metaclass instance)

5. Classes having properties with getters and setters

6. A rich (for its time) RTTI, with classes exposing properties to it that can be altered at runtime (you can query a class for the exposed properties, their types, etc, and there are utilities for setting them by their name as a string) - this was the basis for automatic form and component serialization ("streaming")

7. "Fat" pointers for method callbacks that could be used to associate event handlers

And a bunch of other stuff I certainly forget. Even at version 1, Delphi's Object Pascal was a much more powerful language than any Pascal dialect before it (the Object Pascal in BP7 was a very small improvement on the one introduced in TP5.5, which AFAIK was mostly the same as Apple's Object Pascal - though I think one difference is that in Apple's, objects must be allocated on the heap like in Delphi, whereas in TP5.5/6/7 they can either be allocated on the heap or live on the stack).
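
A hypothetical Delphi-style sketch tying a few of those together (numbers refer to the list above):

    type
      TShape = class
      private
        FName: string;
      public
        class function Kind: string; virtual;           { 3: virtual class method }
        property Name: string read FName write FName;   { 5: property }
      end;
      TShapeClass = class of TShape;                    { 2: metaclass type }

    class function TShape.Kind: string;
    begin
      Result := 'shape'
    end;

    procedure Demo(c: TShapeClass);
    var s: TShape;
    begin
      s := c.Create;           { 4: instantiate through a class value }
      s.Name := 'demo';        { 1: class instances always live on the heap }
      writeln(c.Kind, ': ', s.Name);
      s.Free
    end;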


> Even at version 1 Delphi's Object Pascal was a much more powerful language than anything before

Only if we ignore that many of its new features were already present in Eiffel, including those 7.

Here are another two Eiffel features that are still not as widespread in modern languages as they should be: design by contract and non-nullable types.

Language archeology is an interesting subject, unless you are comparing Delphi with TP.

By the way, I really hated that with Delphi I had to deal with class and legacy object to express parallel concepts, and that OWL was thrown out of the Window and I had to rewrite everything back into VCL.


Sorry, I meant anything when it came to Pascal, not in general. Many of the features found in Delphi 1 can also be found in other languages like Smalltalk (in fact sometimes I wonder if the reason for the massive changes between BP7 and Delphi 1 is Anders learning Smalltalk in between :-P).

About objects, I agree that the two systems are weird, and IMO it is made even worse with the introduction of 'advanced records' in Free Pascal and recent Delphis, which also act as objects-except-not-really.

IMO the 'object' types should have been extended with the new stuff and have 'class' just be syntax sugar for 'pointer to object' and 'record' be just a synonym for 'object' for backwards compatibility (so that everything is really an 'object' - it is "Object Pascal" after all :-P).

I do not know how Delphi handles this nowadays but in Free Pascal 'class' has the most advanced features but must always be heap allocated (meaning you can't embed them in other compound types or allocate them on the stack), 'object' has a lot of the 'class' and/or 'record' functionality but arbitrarily missing some (e.g. support for management operators, so you can't have RAII-like objects) and 'record' (after explicitly enabling support for advanced records) does have more functionality available (e.g. management operators) but does not support inheritance.

You can work around it (e.g. put all your data of an object inside an advanced record with management operators and then put that record inside the object so that you can write code that does any cleanup when the object is declared on the stack and goes out of scope) but it can be annoying - and maddening when you work on the compiler itself and notice that all these compound data structures are implemented internally the same way and the compiler explicitly disables support for some!


Except you are forgetting the little detail that Object Pascal was born out of a collaboration between Apple and Niklaus Wirth for their Lisa OS, was later picked up for Mac OS, and Borland picked it up from there and kept adding C++-like features to it.

So Wirth was influential in Object Pascal's birth, but cared very little with what happened later on.


> Modula-2 isn't that big an influence on OCaml;

It is ... check out https://dev.to/yawaramin/ocaml-interface-files-hero-or-menac... ... it quotes the OCaml creator (Xavier Leroy) directly on this.


Delphi is the usable version of Pascal.


Well, different purposes and target audiences. Pascal was very much intended to be used in education, including as first programming language, while Delphi clearly targeted small-shop / individual professional software developers.


Honestly, the concept of a 'teaching' language (except maybe for primary school) is such an arrogant one. It draws a distinction between the master (teacher), who knows real languages, and the student. It also wastes a lot of the student's time.


I think the three greatest living language designers are Niklaus Wirth, Anders Hejlsberg, and Bjarne Stroustrup.

Pascal definitely had a huge impact on programming, especially through Borland Turbo Pascal on the PC. I think Pascal's biggest weakness was that there was not a central figure helping guide evolution in a backwards compatible way. Wirth did a great job with Pascal, but moved on to what were to him greener pastures such as Modula-2/3 and Oberon, but which did not have nearly the impact of Pascal.


> I think the three greatest living language designers are Niklaus Wirth, Anders Hejlsberg, and Bjarne Stroustrup.

Anders Hejlsberg is an interesting choice because of how uniquely excellent and influential he has been at his particular niche of what I'd call "last mile language design".

How anyone not suffering from Stockholm Syndrome could think that Stroustrup would come before all of, say: Kay, Goldberg, Ungar, Steele, Kelley, Hoare, Moore, Kernighan, Ierusalimschy, Figueiredo, Hewitt, Kowalski, Virding, Williams, van Rossum, Liskov, Omohundro, Dolstra and on and on is a bit hard for me to understand personally though -- but maybe these are just my biases speaking. There is no doubt his work has been highly impactful and (unlike say, php or perl) has been and continues to be the tool of choice for many of the world's top programmers.


It is not a Stockholm Syndrome at all.

I think Bjarne Stroustrup is a very good language designer; his problem is that his vision is not understood by those teaching C++ (who keep teaching C-with-classes variations, including using DOSBox with Turbo C++!), by those who use C++ but would rather be using Haskell at work, and by the dozens of people working on WG21.

You see this in his talks, blog posts and articles trying to steer the community onto the golden path. Unfortunately they do not always listen, and his vote counts for just one person like everyone else's.

Also note that C++ is the only ISO language whose original language author is around contributing to its development.

There are names on your list that I wouldn't consider for my list, for example.


You are an interesting person to have this discussion with, because you clearly have wide and deep exposure to programming languages. And although plenty of computing luminaries are not shy about their distaste for C++, you are in the good company of the likes of David Ungar, who also seems to esteem both Stroustrup and C++.

> I think Bjarne Stroustrup is a very good language designer, his problem is that his vision is not understood by those teaching C++ [...]

I really do not think that this excuse cuts it; in my opinion, to qualify as a very good language designer, you should either 1. really be able to lay claim to some interesting innovation (Iverson, McCarthy, Kay and Moore clearly qualify) or at least 2. have good design sense and avoid severe unforced errors (I'd put van Rossum in this category, but others will disagree). This goes doubly if you decide, as Stroustrup did, to start off with a programming language already full of gratuitous and fiendish footguns -- you should probably be extra careful in that case. Do you agree? I think it's very hard to argue that C++ doesn't add plenty of major additional, unnecessary footguns on top of those it already inherited from C, and it's not completely clear to me what major innovation C++ pioneered. I think there was real marketing genius in laying claim to being the zero-cost/pay-as-you-go abstraction language, and even C++ haters such as the Go authors seem to like concepts enough to basically adopt them for Go - but what would you say the major breakthroughs C++ brought were?


For starters, C++ is the language that made the RAII concept mainstream.

It was also the language that brought the exceptions and generics pioneered by CLU and ML into the mainstream computing of 1990s 16-bit computers.

Stepanov's ideas for generic programming, initially prototyped in Ada83, grew into his vision in C++, thus bringing these ideas to a wider public, back when SGI became the keeper of the initial STL documentation.

Bjarne Stroustrup gave up control of C++ in 1990 and became just one person's vote; whatever happened afterwards was no longer under his control.

If you want his vision for the language before that event took place, you can refer to C++ARM and The Design and Evolution of C++ books.

He was the very first to call the attention of WG21 to the wrong direction of too many footguns with the "Remember the Vasa" paper, but again, one person's vote.


I never understood why C++ programmers like to make a big deal out of RAII (one of these purple acronyms only the C++ community can come up with). Object-lifetime-tied resource management seems like an utterly obvious idea; I mean, how else would you do it? There is a natural synergy between deterministic destruction (via stack allocation or reference counting) and resource lifetime management, and I suspect the only reason C++ gets credit is that it has long been the only popular language with both objects and stack allocation¹. But Lisps have had exactly the same concept via with-* macros (which explicitly introduce a scoped resource and perform cleanup on normal or exceptional exit) for ages, and I assume this precedes C++ by quite a bit. Also, RAII in C++ is subtly broken: C++ programmers like to use it to relinquish system resources (such as file handles) where relinquishment can fail, and C++ really has no way to deal with that.

> Bjarne Stroustrup gave up control of C++ in 1990 and became just one person's vote; whatever happened afterwards was no longer under his control.

Well, it's not like "C++90" was this beautiful and elegant language (MVP, fun interactions between overloading and C's implicit integer conversion rules etc) corrupted by later design by committee. Also, he pretty strongly endorsed at least C++11 ("Surprisingly, C++11 feels like a new language: The pieces just fit together better than they used to and I find a higher-level style of programming more natural than before and as efficient as ever.").

> If you want his vision for the language before that event took place, you can refer to C++ARM and The Design and Evolution of C++ books.

Any chapters/page-ranges in either (or elsewhere) that you specifically recommend? I've read a bunch of Stroustrups writings or interviews over the years, but I don't remember anything that struck me as particularly insightful or interesting.

> He was the very first to call the attention of WG21 to the wrong direction of too many footguns with the "Remember the Vasa" paper,

True, and he also said that Boost, whilst being high quality and something you should totally use, sometimes goes a bit overboard in terms of complexity (no kidding) - but given the footguns and convolutedness he is personally responsible for, it rings a bit hollow to me.

Concerning the STL: that seems mostly Stepanov's baby (as you noted) and I'm not fully convinced of its merits; there are good things about it -- it is generic, it tries to separate algorithms from concrete data structures, and to make it possible to efficiently operate on a subset of a data structure. Lastly, it made complexity bounds part of the API, which I believe may have been an innovation. But it's also quite cumbersome and error-prone to use.

¹ Some older GC'ed languages expose stack allocation as an opt-in optimization (e.g. some lisps), but not as a true first class concept.


> But lisps have had exactly the same concept via with-* macros (which explicitly introduce a scoped resource and perform cleanup on normal or exceptional exit) for ages, and I assume this precedes C++ by quite a bit.

Indeed, but if you forget to use them no one is going to come around to remind you of that.

In Java and .NET there are static analysis checks that validate possible missing uses of try or using.

In RAII aware languages (it is not only C++ that has it), you don't need to forget to call anything.

> Well, it's not like "C++90" was this beautiful and elegant language (MVP, fun interactions between overloading and C's implicit integer conversion rules etc) corrupted by later design by committee. Also, he pretty strongly endorsed at least C++11 ("Surprisingly, C++11 feels like a new language: The pieces just fit together better than they used to and I find a higher-level style of programming more natural than before and as efficient as ever.").

It definitely was, hence why everyone was replacing C with C++ (Apple, Microsoft, IBM, Be) until the FOSS freedom fighters decided to push C as their main language.

As for the books, my copies are 2070 km away from my current location, so not much I can advise.

Anyway I am not here to change your mind regarding him.


> In RAII aware languages (it is not only C++ that has it), you don't need to forget to call anything.

Uhm, how about delete? C++ only introduced working smart pointers a quarter century in with C++11.

In the case that the lifetime coincides with the lexical scope C++ is robust, but so is, IMO, the with-* macro approach even without static validation (because unlike Go-style defer the construct itself introduces the object you want, so there is nothing to forget; if you really only care about the lexically scoped case you can also make this the only public API, and rule out errors that way).

I think the most compelling argument that can be made for RAII is that GC is a bad abstraction, because it special cases memory deallocation whereas "true" RAII can deal with fairly arbitrary resources (with the big caveat that de-allocation must never fail), but to really make a strong form of this argument I think you need something like Rust's ownership model; even the weak form needs more than C++<11 had.

> It definitely was, hence why everyone was replacing C with C++ (Apple, Microsoft, IBM, Be) until the FOSS freedom fighters decided to push C as their main language.

Beauty had nothing to do with it. Would I write serious software in C over C++? No, just for the less terrible strings and resource management alone. But would you have picked C++ in the early '90s over Turbo Pascal because of its greater beauty, or even for substantial technical reasons?

The case of the enlightened companies vs unwashed FOSS masses is also a bit more complicated, and it's not that obvious to me the FOSS people made the wrong choice.

Microsoft: their software at the time was of notoriously low quality; early linux was significantly more stable (and performant). Be: I think I recall an ex-employee claiming on HN that their OS (despite innovative ideas) also had major quality issues. Apple: they sure did a lot of work over the years to avoid having to commit to C++. IBM: what software did they even write during that time? OS/2?

There are also major interfacing issues to other languages. And massive conceptual complexity, and difficulty agreeing on some "sane subset" over a huge community. Finally modern C++ these days takes, what, 100x longer to compile than C? I think a big part of the success of open source is due to the development of new languages with vibrant eco-systems, I suspect a massive move to C++ would have made that much more difficult.


I think I can understand what you are aiming at, but it seems like a stretch.

I guess one can make a case that modern C++ is Stroustrup's idea, and one might even say "C with classes" (implemented by Stroustrup in the '80s) is Stroustrup taking Simula's ideas and putting them on top of C, so it wouldn't really be his doing. One might also add that he was still really young and frankly lacked experience. Following this, one might see the later additions to C++ in a nicer light, and frankly it was a challenging environment to improve upon pre-template C++.

The problem I have with this view is that I think he is responsible for strange choices in pre-template C++, syntax-wise, etc. The fact that `X a();` is a syntax conflict between the declaration of a function `X a(void)` and the definition `X a = X();` is puzzling to me, and the list goes on and on; it almost feels like he was unable to extrapolate the problems programmers might have when applying it.

Now, I am not sure how much of modern C++ is actually his contribution or joint work, or even contributions by the large group of smart people that drive the standardization of C++.

Maybe for C++, Rust, D and Haskell one might open up a "Team-Category" in the best language designer competition and C++ would indeed get a very favorable mention for starting out from a bad state and trying their best at lifting the language to a better place.


Stroustrup's ideas are quite clearly defined in the "Design and Evolution of C++" book; everything else is WG21's own work.

> Maybe for C++, Rust, D and Haskell one might open up a "Team-Category" in the best language designer competition and C++ would indeed get a very favorable mention for starting out from a bad state and trying their best at lifting the language to a better place.

I doubt many fervent C users are jumping for joy at what WG14 has done since C89.


I doubt that it is very useful to put forward a top-3 list in general, without clear metrics by which to rank people. More or less, I'd rather talk about influential people who brought programming language design forward. I would also add "Dahl" for Simula, McCarthy for Lisp, Grace Hopper for Cobol for bringing in innovative ideas, and the Standard ML folks as another example of tailoring a great mix of concepts.

Maybe that would also be a good strategy for talking about this: some people did revolutionize language design, others had a talent for remixing existing concepts into something that was really useful and a programming language that's comfortable (Rossum and Wirth would fall under that category, for example).


> I would also add "Dahl" for Simula, McCarthy for Lisp, Grace Hopper for Cobol for bringing in innovative ideas, and the Standard ML folks as another example of tailoring a great mix of concepts.

I agree (maybe excepting Cobol), but sadly McCarthy, Milner, Dahl and Hopper no longer fall under the category of great living programming language designers.

> some people revolutionized language design, while others had a talent for remixing existing concepts into something really useful and a programming language that's comfortable (Rossum and Wirth would fall under that category, for example)

Yes, I think to qualify as a great language designer you either have to be really good at "UX", without necessarily contributing major conceptual breakthroughs (like van Rossum or Hejlsberg) or come up with something original and valuable. I'm pretty sure Stroustrup is not that good at UX (the C++ edit-compile-debug cycle is almost singularly miserable) and I'm not quite sure about the original and valuable part either.


> Ierusalimschy

Interesting to see that name appear in that list


Lua has problems, but it is an under-appreciated language, IMO. And the only one of these problems that is an essential characteristic of the language (the table data structure) is an original enough idea that it can probably only be judged a failure in hindsight.


Wirth regrets this himself; he thought, like many of us, that developers care about quality and that it would be obvious for them to adopt such languages.

The problem is that this kind of programming language goes against the UNIX culture of move fast and break things, which considers them programming with a straitjacket.

So here we are fixing memory corruption issues, with hardware and OS vendors bringing back hardware tagged memory ideas from Lisp Machines to fix C machines.

"A Brief History of Software Engineering"

https://people.inf.ethz.ch/wirth/Miscellaneous/IEEE-Annals.p...


Programming philosophy is an interesting thing, and I don't think it's just a dichotomy. Wirth does have some things in common with the Bell Labs crowd and doesn't quite fit in the same niche as Dijkstra-esque Ivory Tower designs or very sound and secure computational mega-structures. Not a fan of functional programming, for one.

I think one aspect of software engineering he would disagree with is the "move fast and add things" we're so enamored of. Nothing is ever removed, and because everything is treated like a product, you have to add new features to "sell" something, including programming languages.

The "straight jacket" comment is interesting. I remember reading the "jargon file" that came with my slackware CD in the 90s and being quite pumped about the "punk" nature of the Unix crowd opposing "bondage and discipline" languages. A few years later, the same crowd of people forbids tabs in their C-clone or even has official formatting rules enforced via mandatory pretty printing.


Hmm, I've always appreciated Hejlsberg's implementation taste (at least in the first versions of his languages), but did he design that much from scratch? TP was a great compiler, but the language was pretty much taken from previous Wirth designs. C# was quite okay in the start, and I'll remain silent about C++.

I do agree with Wirth, but would probably nominate Kay, Goldberg, Wall and Meyer.

AFAIK Wirth didn't have much to do with Modula-3, by the way.


It depends on your scoping of languages. Hejlsberg wasn't so much concerned with syntax, but core API building isn't any less important.


I do wonder how big his influence was there. Or in general. Given that the company bet heavily on this and it involved the whole Windows platform, there were quite a lot of cooks in the kitchen. So a "designer" probably acts more like a "language manager" there, as opposed to a clean-room academic setting. Do this, don't do this, have a deep look at Cω…


All relevant work, even if it is more coordination. Knowing Delphi, I think one can see a stylistic fingerprint that shows up again in C#.



Pascal was the first language I learned. I don't remember much of it, but I always liked := for variable assignment.


I disliked it. Assignment happens more often than comparison, so it makes sense for assignment to have the shorter symbol (Shannon would agree). := may make more sense for students, but maybe not for professional programmers.

Then I injured the little finger on my left hand. And then I loathed :=, because I had to get to the left shift key to type the :, over and over and over, and I couldn't use my left little finger to do it. (If you're a professional programmer, you may not think of yourself as a person who works with your hands, but you do.)


At least it is not “equal”, which is conceptually confusing in many other languages.


Same as me. I really liked that too. I always read it as "x becomes equal to y".


The sheer number of bugs that could be avoided by using := in, let's say, C!

if (a = b) c=1;

oh boy... :)
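
For contrast, a minimal compilable Pascal sketch of the same check - since = is always comparison and := is always assignment, the C bug above can't even be expressed:

    program EqDemo;
    var
      a, b, c: Integer;
    begin
      a := 1; b := 1; c := 0;
      if a = b then   { '=' only ever compares }
        c := 1;       { ':=' only ever assigns }
      { 'if a := b then' would be a syntax error, not a silent bug }
      WriteLn(c);
    end.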


I definitely enjoyed learning and using Pascal and Delphi. The language made so much sense! The code was structured, disciplined, easy to follow. I get the verbosity may not be for everyone, but it also made it much easier to read.

Delphi was also in a league of its own for application development, and not just basic "forms + DB" apps. Here is a full-scale Civilization game clone written in Delphi: http://www.c-evo.org (I did write a mod for that game, but I was not the core dev)


A lot of my early formal learning was using Turbo Pascal circa 1990. It was a really nice language to learn in. Very easy.


Same for me as well. In school, we started with GW-BASIC and moved to Turbo Pascal. Pascal was a great learning language and the fundamental concepts I learned in Pascal translated very well to other languages. Looking back, I didn't even realize there was a difference between Turbo Pascal and standard Pascal. The Turbo Pascal IDE was all we knew and it worked very well on those IBM PCs.


Same here, although 5 years later. I learned static typing, byte handling, string handling, array handling, arguments by reference(!), arguments by copy, and even OOP with it. It was very easy to read and well structured: a clearly structured but powerful learning language for its time.


I used the P-code-based UCSD Pascal on my Apple ][+ and it was pretty interesting; you could even compile Fortran to run on the virtual P-machine.

However, when I got my hands on Turbo Pascal it was one of those eye-opening moments I'll never forget. It was such a smooth experience that I couldn't believe it was running on my dinky 8-bit computer, although it was a bit sad it was running on the Z80 and not the 6502, the best microprocessor of all time, ever.


I too loved programming on the 6502 processor. I did a fair amount of assembly language programming on micros that had it, like the BBC micro and Commodore 64, very early, when learning programming. I liked its instruction set, with all those addressing modes, such as indirect indexed or vice versa. Later when I learned a bit of 8088/8086 assembly, the latter seemed more complex, with its segment:offset format of specifying addresses.


The Pascal critics may like

"Why Pascal is Not My Favorite Programming Language" by Brian W. Kernighan, April 2, 1981 http://www.cs.virginia.edu/~evans/cs655/readings/bwk-on-pasc...


Which, despite the author, is not a comparison of C and Pascal. The backstory is that Kernighan wrote (with Plauger) a book called "Software Tools". They wrote it in RATFOR, which is a front-end for FORTRAN. They wrote significant chunks of code - not just examples.

Some time later, Kernighan got the idea to re-write the book using Pascal - a much more modern, easy-to-use language. And after doing so, he wondered "Why was that so hard? Pascal should be way easier to write in than RATFOR." And he wrote this paper, analyzing why it was hard to write Software Tools in Pascal.


"No reference to any computer or mechanism should be necessary to understand it."

I share the sentiment, but even Oberon has pointers. It's also a strongly typed programming language. You could argue that types are a way of mapping how computers represent data (inseparable from the advantages of type checking in program design). For example, you don't simply declare an integer in some languages, you declare it as 8-, 16-, 32- or 64-bit integer. And many compiled languages have specific memory-related syntax.

Although I prefer strongly typed programming languages, it's actually dynamic languages that have shielded programmers (for better or worse) from types, pointers and manual memory manipulation. Without the need to expose the computer's workings, dynamic languages fulfil the sentiment in Wirth's statement much more than strongly typed languages do.


I think Ada gets quite close to that approach. Mostly you describe the data structures you need, rather than a representation the computer understands. The compiler is supposed to sort out the transition from programmer need to computer need. E.g.

    type Day_Type is range 1 .. 31;
    type Integ_Type is digits 5 range -1.0 .. 1.0;

The compiler then chooses the best representation.
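
Classic Pascal subrange types express the integer case in the same spirit, with the compiler free to pick the storage - a small sketch (the digits-based floating-point declaration has no direct Pascal equivalent):

    program RangeDemo;
    type
      TDay = 1..31;   { subrange: the compiler chooses a suitable representation }
    var
      D: TDay;
    begin
      D := 15;
      WriteLn(D);
      { D := 42 is rejected at compile time; computed out-of-range values
        fail at runtime when range checking ($R+) is enabled }
    end.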


It depends on the dynamic language; Common Lisp can be used to write full-stack OSes.

Python also exposes pointers and manual memory manipulation via the ctypes package.


Happy Birthday Pascal!

Thanks for teaching me almost safe systems programming with your dialects and descendants.


I did most of my programming in Delphi when I was a kid. Unfortunately, I then got stuck on it.

Any other language is either too slow or too unsafe.

Nowadays, I mostly use FreePascal, because so many people refuse to use non-open-source software, but FreePascal never caught up to Delphi. And there is no open-source ecosystem, probably because there is no standard package manager. Even if there are Pascal libraries, it is hard to find them or they quickly become unmaintained, so it is more economical to write everything myself. And then I could not find a job, because no one is searching for Delphi programmers.


Where do you think Free Pascal hasn't caught up to Delphi? AFAIK they're more or less the same; the main things missing from Free Pascal are anonymous functions (they're in a separate branch though and will be added at some point "soon") and inline variable declarations (which won't be added because the Free Pascal developers feel that they aren't "Pascal-ish").

BTW Lazarus does have a package manager, just make sure you have the 'onlinepackagemanager' package installed in the IDE (should be installed by default with Windows installations but you may need to explicitly install it if you build from source).


>inline variable declarations (which won't be added because the Free Pascal developers feel that they aren't "Pascal-ish").

That is the big one

And just in quality. FPC compiles slower or even crashes during compiling. Debugging is worse with gdb. On Linux it uses longjmp exception handling which is rather slow. It tries to warn about uninitialized variables, but does not properly track which ones are initialized.

>BTW Lazarus does have a package manager, just make sure you have the 'onlinepackagemanager' package installed in the IDE (should be installed by default with Windows installations but you may need to explicitly install it if you build from source).

Lazarus is not FreePascal. They only include libraries that are Lazarus Packages (with an *.lpk file), and I use libraries that do not have one

On the other hand, I used Lazarus (only the LCL and lazbuild) in another project, and tried to submit it to an open-source app store, and they refused it, because they would not install Lazarus, because Lazarus was an IDE.

FreePascal actually has a package manager, too: https://fppkg.cnoc.nl/

But it is not advertised, so there are even fewer libraries on it.

Anyways, what is missing is a single command line call that installs all dependencies I used in my projects. And these dependencies are neither in opm nor fppkg


> That is the big one

Well, it isn't a matter of "catching up" then, since the developers have explicitly decided not to implement that feature on language design grounds (though IMO, considering how many additional dialects and subdialects the compiler supports, not adding it even behind a modeswitch is weird - but i wouldn't really call it a "big one").

> And just in quality. FPC compiles slower or even crashes during compiling.

I've been using FPC for almost two decades now and the only time i had FPC crash was when i fed it around 300MB of randomly generated code to see how it'd react - it was the 32-bit binary too so that probably helped :-P. I'm not saying it wouldn't crash for you, but personally, over the years, across many projects, machines, installations and OSes (i've used it on Windows, Linux, macOS, OS/2 - well, eComStation, same thing - and even Haiku), i never had it crash, so i doubt it is really a thing common enough to stand out.

In my experience FPC was always rock solid - Lazarus too as of the late 2000s (it used to be more crashy before that though).

For performance i agree that it isn't ideal, but on the other hand Free Pascal supports a much greater number of architectures and a bunch of different dialects and subdialects, and all that makes the compiler much more complicated and thus slower. And in practice even Lazarus, which AFAIK is almost 2 million lines of code, compiles in a matter of seconds (on my PC), so i do not really mind that much (though sure, it'd be nice if it was as fast as Delphi).

> Debugging is worse with gdb.

This is really gdb's issue though; there is a new debugger written specifically for Free Pascal that, from the little i've used it, is much more reliable... and i just read as i was typing this that it should now work properly under Windows, so i might try it (i also have issues with gdb).

> On Linux it uses longjmp exception handling which is rather slow.

Hm, perhaps, though i've often done profiling in my apps and never noticed anything related to exceptions enter the picture (i'm not using exceptions myself much though - especially not in any 'hot path').

> It tries to warn about uninitialized variables, but does not properly track which ones are initialized

There have been a bunch of changes very recently and in fact they introduced even more warnings :-P but TBH i only notice those whenever the initialization is based on conditions that may have other side effects and the use is based on those side effects - you, writing the code, know this can't happen, but the compiler cannot tell that (i'm not even sure it is possible in all cases). I usually just either initialize the variable to "default" or hide the message through an IDE directive (it'll show in command-line builds though).

> Lazarus is not FreePascal. They only include libraries that are Lazarus Packages (with an *.lpk file), and I use libraries that do not have one

Yeah, but Lazarus is by far the most common IDE used with Free Pascal, so it is a safe assumption that someone using one also uses the other. And besides, the Lazarus package manager doesn't contain every Lazarus package either; what you originally wrote was that there isn't a package manager, not that everything should be available from it.

> On the other hand, I used Lazarus (only the LCL and lazbuild) in another project, and tried to submit it to an open-source app store, and they refused it, because they would not install Lazarus, because Lazarus was an IDE.

Well, that isn't an issue with Lazarus either, though you should have told them that Lazarus does have a command-line project builder called lazbuild so it can also act as a building tool.

> Anyways, what is missing is a single command line call that installs all dependencies I used in my projects. And these dependencies are neither in opm nor fppkg

AFAIK fppkg was introduced in recent years, and from what i remember from the mailing list not everyone was interested in package managers (e.g. personally i am not, i tend to stick with whatever comes with the language since external stuff may stop being developed - and Free Pascal and Lazarus come with a ton of packages out of the box anyway) so chances are you won't ever be able to find all dependencies in a package manager. It is there more as a convenience to avoid downloading stuff manually.


>Well, it isn't a matter of "catching up" then, since the developers have explicitly decided not to implement that feature on language design grounds

I am very frustrated by their opinion

And the same goes for the uninitialized variables. With inline variables it would be much simpler to always initialize all variables.

>I've been using FPC for almost two decades now and the only time i had FPC crash was when i fed it around 300MB of randomly generated code to see how it'd react - it was the 32-bit binary too so that probably helped :-P.

The more complex my project becomes, the more it crashes. When I made too many units, I got this https://bugs.freepascal.org/view.php?id=37478 and now it has been going on for years (with the related issue)

And I use the trunk version (3.3.1). It is much less stable than the official releases.

>Hm, perhaps, though i've often done profiling in my apps and never noticed anything related to exceptions enter the picture (i'm not using exceptions myself much though - especially not in any 'hot path').

The problem with longjmp exception handling is that it is slow, even if no exceptions are used

Like this

    procedure abc;
    var
      s: String;
    begin
      s := '123';
    end;      
becomes

    project1.lpr:15                           begin
    0000000000401240 55                       push   %rbp
    0000000000401241 4889e5                   mov    %rsp,%rbp
    0000000000401244 488d642490               lea    -0x70(%rsp),%rsp
    0000000000401249 48c745f800000000         movq   $0x0,-0x8(%rbp)
    0000000000401251 488d55e0                 lea    -0x20(%rbp),%rdx
    0000000000401255 488d75a0                 lea    -0x60(%rbp),%rsi
    0000000000401259 bf01000000               mov    $0x1,%edi
    000000000040125E e8dd340100               callq  0x414740 <fpc_pushexceptaddr>
    0000000000401263 4889c7                   mov    %rax,%rdi
    0000000000401266 e8b50b0000               callq  0x401e20 <fpc_setjmp>
    000000000040126B 4863d0                   movslq %eax,%rdx
    000000000040126E 48895598                 mov    %rdx,-0x68(%rbp)
    0000000000401272 85c0                     test   %eax,%eax
    0000000000401274 7510                     jne    0x401286 <ABC+70>
    project1.lpr:16                           s := '123';
    0000000000401276 488d7df8                 lea    -0x8(%rbp),%rdi
    000000000040127A 488d35971d0900           lea    0x91d97(%rip),%rsi        # 0x493018 <.Ld1$strlab+24>
    0000000000401281 e8fa9c0000               callq  0x40af80 <fpc_ansistr_assign>
    0000000000401286 e825380100               callq  0x414ab0 <fpc_popaddrstack>
    project1.lpr:17                           end;
    000000000040128B 488d7df8                 lea    -0x8(%rbp),%rdi
    000000000040128F e88c9c0000               callq  0x40af20 <fpc_ansistr_decr_ref>
    0000000000401294 488b4598                 mov    -0x68(%rbp),%rax
    0000000000401298 4885c0                   test   %rax,%rax
    000000000040129B 740f                     je     0x4012ac <ABC+108>
    000000000040129D e8be390100               callq  0x414c60 <fpc_reraise>
    00000000004012A2 48c7459800000000         movq   $0x0,-0x68(%rbp)
    00000000004012AA ebda                     jmp    0x401286 <ABC+70>
    00000000004012AC 4889ec                   mov    %rbp,%rsp
    00000000004012AF 5d                       pop    %rbp
    00000000004012B0 c3                       retq   
That one innocent-looking function includes four different function calls for exception handling, although there are zero exceptions involved.

Any function that has automatic memory handling has to: register itself for exception handling at the beginning; unregister itself at the end; and catch any exception and then forward it to the next registered exception handler.
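
In other words, for every managed local the compiler acts as if the body had an implicit try/finally around it - a conceptual sketch of what the generated code above corresponds to, with comments naming the RTL helpers from the disassembly (clearing s stands in for the implicit fpc_ansistr_decr_ref):

    {$mode objfpc}
    procedure abc;
    var
      s: String;
    begin
      { entry: fpc_pushexceptaddr + fpc_setjmp register this frame }
      try
        s := '123';
      finally
        { cleanup runs even if an exception passes through the function; }
        { clearing s stands in for the fpc_ansistr_decr_ref call }
        s := '';
      end;
      { exit: fpc_popaddrstack unregisters the frame }
    end;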

>There have been a bunch of changes very recently and in fact they introduced even more warnings :-P

They are making it worse every year.

>I usually just either initialize the variable to "default"

Sometimes not even that works. Especially with nested functions accessing variables of the outer function.

If I had a say, there would be inline variables with mandatory initialization. Any variable not initialized at its declaration should be a compile-time error.
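
For reference, a rough sketch of the Delphi 10.3-style inline declarations under discussion (not accepted by FPC, as noted above), with initialization at the point of declaration:

    procedure CountSpaces(const s: String);
    begin
      var Count := 0;                  // declared, typed by inference, initialized in one step
      for var i := 1 to Length(s) do   // loop variable declared inline, scoped to the loop
        if s[i] = ' ' then
          Inc(Count);
      WriteLn(Count);
    end;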

>Well, that isn't an issue with Lazarus either, though you should have told them that Lazarus does have a command-line project builder called lazbuild so it can also act as a building tool.

They google Lazarus, see it says "IDE", and then there is no point in talking to them anymore

>(e.g. personally i am not, i tend to stick with whatever comes with the language since external stuff may stop being developed - and Free Pascal and Lazarus come with a ton of packages out of the box anyway)

Me, too. But that is really bad.

The RTL is coupled with the compiler, so you cannot really update it without updating FPC. Like the JSON parser does not work properly. They fix it (or I did it), and with a package manager you could update it. Or with an external library you could update that. But with FPC, you need to update FPC. But you cannot, because there has been no release for years. So you need to update to trunk and compile FPC yourself. But then you get an unstable, untested version, which is going to break a lot of unrelated code.

Today I actually looked at another JSON parser. It is probably better than FPC's, but I am not going to use it, because it is too much trouble to install it.

I looked at its code, and one of the first classes they implement in the JSON parser unit is a string builder. In 2021, string builders really should have been a solved problem. Implementing string builders is a big waste of time. FPC has a built-in string builder, but it is too slow; it is basically unusable.

I also wrote a string builder. It is around 40 times faster than FPC's. (I actually profiled it recently. Mine has about the same performance as the default Java one. I started using Pascal because I thought a native language would be faster than a bytecode language. But apparently not. Inlining the char-append method is important; without inlining, the Java version was twice as fast. On the other hand, I reimplemented append in amd64 assembly. It is 5% faster without inlining than the inlined version. With an active open-source community, we could write one assembly version for each platform. That would give maximal performance. But I do not want to maintain it: 5% speed is not worth the risk of having a bug on some platform, and I do not have time to test x86 and ARM versions. I need one version that works the same everywhere, so I removed my assembly version again.) They could have just used my string builder in the JSON parser instead of developing a new one. But they probably do not know about it. And they will probably think it is too much trouble to install it. So people keep reimplementing the same things again and again, and the ecosystem does not get anywhere.
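
For the curious, a minimal sketch of the approach described above (hypothetical names, not the actual code): grow the buffer geometrically and keep the per-char append small enough for the compiler to inline.

    {$mode delphi}
    type
      TMyStrBuilder = record
        Buf: AnsiString;
        Len: SizeInt;
        procedure Init;
        procedure Append(c: AnsiChar); inline;
        function AsString: AnsiString;
      end;

    procedure TMyStrBuilder.Init;
    begin
      Buf := '';
      Len := 0;
    end;

    procedure TMyStrBuilder.Append(c: AnsiChar);
    begin
      if Len >= Length(Buf) then
        SetLength(Buf, 2 * Len + 16);   { grow geometrically, never per char }
      Inc(Len);
      Buf[Len] := c;                    { AnsiString indexing is 1-based }
    end;

    function TMyStrBuilder.AsString: AnsiString;
    begin
      Result := Copy(Buf, 1, Len);
    end;

(Call Init before use - unmanaged fields of local records are not auto-initialized.)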


> I am very frustrated by their opinion

Right, but having different opinions on the language's design is not the same as Free Pascal having to catch up with Delphi.

> And I use the trunk version (3.3.1). It is much less stable than the official releases.

Well, yeah, it makes sense that an in-development version would be less stable than a stable release. This is why stable releases exist, otherwise their releases would just be nightly builds - and in fact the FPC developers even recommend avoiding development versions.

(though i also use the SVN version in one of my projects and didn't have any issues)

> The problem with longjmp exception handling is that it is slow [...] Any function that has automatic memory handling has to

Ok, so those two are different things. A function being "slow" is something you find by profiling your application. What you describe is something that could cause your application to be slow, but it is not something that is slow by itself.

As i wrote previously, i've profiled my code on Linux - in fact i use Linux almost exclusively for profiling my Free Pascal code because perf shows me very detailed results with mixed assembly and source code that i cannot find a way to get under Windows (every tool i've tried assumes i use PDB files...) - and exception code has never been part of my profiles. I've profiled things from string parsers, to physics code, to rendering code, etc.

But even beyond that if that is the case then... you can just solve it. If you think that using automated memory handling slows down your program's execution, then do not use it - automatic (or even manual) memory handling being a potential bottleneck isn't something surprising. Knowing the environment (which includes the way your compiler produces code) and adjusting your code to be performant for it is part of optimizing your program.

Though in practice, chances are fpc_ansistr_assign is going to be a bigger bottleneck than the exception support calls. And indeed, doing a quick benchmark with your code under Linux with perf (in a VM, though that shouldn't make much of a difference in practice) shows exactly that, despite this being a synthetic example that will never be an issue in real code. In fact, just modifying the code to do something more than an assignment, like counting the number of spaces in a string, pushes the exception stuff even lower in the profile, and modifying it to do something you may actually find in real code, like tokenizing the string, pushes the exception support calls to sub-0.8% levels, alongside some noise from the kernel.

This is basically a red herring: it looks like it might be a problem, but in practice you'll run up against a lot of other issues trying to optimize your code before that becomes a problem by itself.

> They google Lazarus, see it says "IDE", and then there is no point in talking to them anymore

Ok but again, this is a problem with the people you're talking with, it has nothing to do with Lazarus or Free Pascal.

> Me, too. But that is really bad.

I do not see it as bad at all, otherwise i wouldn't do it. The reason i stick with whatever comes with the language is that it tends to be stable, and the FPC developers (as well as the Lazarus developers) care about not breaking people's code and have a very good track record when it comes to preserving backwards compatibility. I have code from almost two decades ago that compiles fine under Free Pascal without changes. Even the big change that was introduced in Free Pascal 3.0 with the codepage-aware strings only had me change one line of code in my codebase, where i was treating a string as a byte array - i just changed that to rawbytestring.

Packages by random people on the other hand might break or become unavailable or whatever. I do not want to rely on that.

> The RTL is coupled with the compiler, so you cannot really update it without updating FPC. Like the JSON parser does not work properly. They fix it (or I did it), and with a package manager you could update it.

You can always download the fixed JSON parser sources from the trunk.

> But with FPC, you need to update FPC. But you cannot, because there has been no release for years. So you need to update to trunk and compile FPC yourself. But then you get an unstable, untested version, which is going to break a lot of unrelated code.

Yes, having to either wait or build your own is something you'll have with any project that doesn't release new versions just for the sake of releasing new versions. But on the other hand, Free Pascal's (and Lazarus') approach of actually releasing when the software is ready to be released (and only after making several release candidate versions) means that the version you get has a much higher chance of being stable.

> Today I actually looked at another JSON parser. It is probably better than FPC's, but I am not going to use it, because it is too much trouble to install it.

Ok, but you not wanting to use a library that supposedly works better for your case because you think it is too much trouble to download isn't really Free Pascal's problem.

> I looked at its code, and one of the first classes they implement in the JSON parser unit is a string builder. [...other stuff about string building...]

Free Pascal is made entirely by volunteers - there is the open source community you seem to want. Make a bug report, submit your improved string builder, and if it is deemed good it'll become part of FPC. A faster string builder that is a genuine improvement (assuming it doesn't break any existing code - remember that backwards compatibility is important) would be gladly accepted.

If not, just put it somewhere others may be able to take it from (we already established there is a package manager, but you can just put it on your site or whatever).

If you do not want to participate then ok, that is fine too.


Thanks for the detailed answers, though I am not the one who asked.


Sounds like you have a wonderful ecosystem, only that it is very small.

My recommendation is to make your next side project a package manager, or co-opt an existing one. It is amazing that the project is as vibrant as it is without the network effects of a package-sharing ecosystem.

I would really recommend using git itself as the basis for a package repository.


Pascal was the first language where programming clicked for me. Pascal also happens to be my given last name, which makes it just a bit more special to me.

Funnily enough I learned Pascal using Turbo Pascal, which makes me sound older than I am, because I learned Pascal around 2010 and my high school's CS department was just way behind the times. By that point we were using a nearly 20-year-old 16-bit compiler on Windows XP.


I always find it weird when schools use Turbo Pascal instead of Free Pascal, considering Free Pascal's IDE looks pretty much the same as Turbo Pascal 7's but also works in any OS.


Inertia and/or lack of knowledge, likely. There are colleges in India which even today use Turbo C (maybe v2.0) as the tool to teach C, for example, even though there are things like Bloodshed Dev-C++ and GCC - GCC even on Windows, probably via Cygwin, MinGW, or WSL (I haven't checked whether GCC works on all of those; I think I have seen it on one or two of them).


Good old memories. Circa 2006-2008, Azerbaijan. College students used to attend competitions, mostly ACM-like questions, solved on paper. Only the finals were solved using computers. Most of us used Delphi later on, but dumped it quickly. I need to check what's happening in the Delphi scene.


I think the article at https://people.inf.ethz.ch/wirth/Miscellaneous/IEEE-Annals.p... is even more interesting (thanks to user pjmlp).


Niklaus Wirth is a huge inspiration to me. I'm particularly struck by the compactness and internal simplicity of the Oberon system, as well as its associated hardware, Wirth's simple RISC (not to be confused with RISC-V) architecture implemented in the LoLa HDL.


The thing I most miss in modern languages is that, unlike Pascal, they don't include a built-in Set type (typically implemented with fast bitmap arithmetic).
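
For anyone who hasn't seen it, a small sketch of the classic Pascal set syntax - membership, union, intersection and difference all compile down to bit operations on small ordinal types:

    program SetDemo;
    type
      TDay  = (Mon, Tue, Wed, Thu, Fri, Sat, Sun);
      TDays = set of TDay;
    var
      Workdays, Meetings: TDays;
    begin
      Workdays := [Mon..Fri];           { set constructor with a range }
      Meetings := [Tue, Thu, Sat];
      if Sat in Meetings then           { 'in' is a single bit test }
        WriteLn('weekend meeting!');
      Meetings := Meetings * Workdays;  { '*' intersection; '+' union; '-' difference }
    end.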


Ah! A first language, kind of like a first love! :)


Not quite a billion-dollar mistake (like null in other languages), but the with statement in Pascal was, on reflection, a big mistake.

I once wasted a day trying to untangle some code that blew up when I removed a with statement. Unfortunately it didn't always cause compilation errors; in a few cases it silently bound to the wrong thing and totally misbehaved at runtime.
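
A small compilable sketch of that failure mode (hypothetical names): with rebinds identifiers silently, so removing or renaming a field quietly changes what a name refers to instead of producing an error.

    program WithDemo;
    type
      TPoint = record
        X, Y: Integer;
      end;
    var
      P: TPoint;
      X: Integer;    { same name as the record field }
    begin
      X := 0;
      P.X := 0; P.Y := 0;
      with P do
        X := 10;     { assigns P.X today; if the field X were removed or
                       renamed, this would silently assign the global X }
      WriteLn(X);    { prints 0 now - but 10 after such a refactoring }
    end.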



