"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
-- Edsger W. Dijkstra
As someone who got their start with BASIC -- mainly from "BASIC Computer Games" no less -- I was always kind of offended at that quote.
As someone whose first language was BASIC, which led me down a path that eventually culminated in a CS Ph.D., I have also never been a fan of that quote.
It implies that Dijkstra was a terrible educator who could only truly teach blank minds. For all his imagination and creativity in theoretical CS and math, he was very rigid in other ways. Imagine a history professor saying that students who have been exposed to various myths and ideologies are mentally mutilated beyond hope, or a literature professor saying the same of students who have been exposed to pop-culture retellings of the classics.
Remember that Dijkstra looked down on anyone who wrote using word processors, because he thought any academic should be able to work out their argument in their head and just write it down. Eventually he even came to reject the mechanical typewriter.
Dijkstra's maxim is both harmful and clueless. Many of the greatest programmers in the world started with BASIC. When I "graduated" from BASIC to Pascal, I very quickly dropped the BASIC approach to program flow, as any software engineer worth their salt would. Dijkstra infantilizes programmers, as if it weren't obvious that BASIC's approach is cumbersome, as if people just keep writing C++ as though it were BASIC...
Once you move to a "proper" programming language, there is nothing to "unlearn"; the new approach is so obviously better. At the same time, having an interpreter up within two seconds of turning the computer on was AMAZING and got a lot of people interested in programming.
Much of the software world you see today was built by people who started with BASIC!
BASIC is explicitly a beginner's language. It may not be the best design for advanced programming, but as an instructional tool for someone who doesn't even know what a program is, it's almost second to none. The lessons it teaches are profound: that programs are ordered sequences of instructions that computers can follow, that control flow can involve branching, loops, and conditionals (and what those even are), and the rudiments of breaking a program out into subroutines. This is a real mind-blower for people who've never actually programmed before -- it means they can make the machine do what they want, even very sophisticated (from their perspective) things. It's only when you've seen better -- Algol, Pascal, Lisp -- that BASIC seems miserably wrong in comparison.
As for Dijkstra -- arrogance in computer science is measured in nanodijkstras.
Same, and I always hated that quote also. If anything, BASIC exposes you to low-level assembly concepts: instructions are processed in order based on their 'address' (line number), unless you hit a conditional, where you jump (GOTO) to a new address. Variables are global by default, just as data at an address in memory would be.
I suspect that BASIC may have got a bad reputation in the same way as PHP or JavaScript did, where the accessibility of the language and infrastructure around it allows people who just want to achieve a specific goal to easily participate.
That influx of people with the attitude of "I don't care how computers work, I just want to know enough to solve my problem" shifts the stereotypes around those language users and may erroneously put the fault of it onto the language itself. It certainly feels that way during hiring, where it seems like developers of vastly differing skill or aptitude tend to cluster heavily around certain "friendly" languages.
BASIC didn't really raise the abstraction level away from imperative statements and loops. It let us hack faster, but it didn't help us to write programs that were provably correct. I think that is what Dijkstra was getting at. He was worried students would think this was good enough. And he was right. BASIC influenced a generation, my generation. The generation that gave us arguably the next BASIC, Python. A language with little coherence, that mixes imperative commands and mutation with the lambda calculus; and has no static types. A language that makes easy tasks easier, but hard tasks harder.
It's a very Sapir-Whorf sentiment, that somehow we are infected by what we learn, unable to imagine outside of those formative experiences. And like Sapir-Whorf, also false!
Sapir-Whorf in the conventionally conceived sense of language-as-local-limitation is clearly false, partly because it invalidates itself as an origin story. (How would you invent a language if you couldn't think outside of it?)
Sapir-Whorf in the sense of language-as-influence however is clearly true, even if academics occasionally state otherwise, because at their core all languages used by communities of a size greater than one individual rely upon semantics that are socially dictated.
Personally, Perl screwed me up a lot more than BASIC -- but who can argue with the expressive power of regexes for text extraction and matching problems?
I have noticed, however, that the more experience one has with C, the more flummoxed they are by Rust. It's why I tell C old hands who grouse about Rust: Rust isn't for you, it's for your replacement.
Do you think so? I would consider myself an expert C programmer. I haven't written a C compiler, but I can see how I would go about it; I know a lot of the weird arcana, and I followed WG14 enough to be heartened but not astonished when C23 got #embed while C++ still can't unstick their version of this work (and thus, in practice, if you're a C++ programmer you should hope your vendor's C++ compiler just offers the C feature anyway).
But I fell very much for Rust in about 2021, and now definitely wouldn't write any more C. Rust has coherent answers to a lot of questions that, to my mind, should be in any C programmer's head.
Now, maybe it helps my CS course's first language was SML/NJ and of course Rust is basically an ML in a trench coat pretending to be a semi-colon language like C. But that CS course was at least half a decade after I began writing C, which was in turn after many years of BASIC (and somewhere in there a little bit of Z80 assembler, but man assembler sucks when the CPU is as limited as the Z80 was, you young people who have a fucking floating point multiply instruction don't know what you've got etc...)
To me, the veteran C programmer, Rust's implementation makes a lot of sense, while at the same time its type system appeals to the more principled computer scientist I was taught to be, and its tooling to the engineer I became as an employee.
Two examples. Wrapping<i32> is exactly how a 32-bit signed integer actually works on any vaguely modern computer. The machine can do this, and so unsurprisingly on real hardware it is nice and fast, despite also being completely safe. At the same time, Wrapping<i32> is an elegant type: Wrapping is a polymorphic wrapper that grants integers wrapping (modular) arithmetic for their normal operations, with i32 just a type parameter. Beautiful, and yet also fundamentally exactly how the machine works, so it's fast.
Second, Option<&T> is exactly the same implementation as a C-style pointer. But it has the same ergonomics as rich Maybe types from a language like ML. So you use it in an inherently safe way, with the compiler catching many common mistakes you could make because they don't make coherent sense in the type system - but at runtime it's exactly the same performance as the nasty C code you'd have written where those mistakes wouldn't be caught.
Rust has had include_bytes! since 1.0, and include_bytes! is philosophically the same thing: given a filename, it gives you an immutable reference, living as long as your program, to an array of bytes from the file -- &'static [u8; N].
Me too! I even maintained an application written in Visual Basic in my spare time while working with Java professionally. I can honestly say I have 20 years of experience in BASIC :D