A while ago I started making a game (https://www.youtube.com/watch?v=GwBiJR_rj_w) that was going to be about typing in program listings from magazines and books (like "back in the day")... but the "twist" was going to be that the whole world turned out to be alterable and scriptable with BASIC. The WIP title was "BASIC Instincts".
The problem was I spent way too much time having fun making the interpreter (in JavaScript) and then Else Heart.Break() was released and it took the wind out of my sails (because it was great!)... I might go back to it now though, writing interpreters is really enjoyable!
I still have fond memories of the various BASICs. I first used BBC BASIC at school (unofficially), and it seemed like magic. I had an Atari ST and got a copy of STOS at home.
Whilst BASIC encourages all kinds of spaghetti, I'm still convinced it is a better language than Python to start on. Control flow is so obvious to non-programmers when it has line numbers.
Not all BASIC dialects have line numbers; there are some very elegant versions of BASIC out there, GFA BASIC and BBC BASIC for instance. Both have named functions and do not use line numbers.
Yes, you are 100% right, the line numbers were there in BBC BASIC, but mainly for editing rather than control purposes (though you could use the dreaded GOTO statement if you absolutely had to). Keep in mind that this was just before the days of the full-screen editor embedded in your favorite programming language.
GFA did away with them entirely.
The BBC had an odd editor, partly line and partly screen based: you could move a secondary cursor across the screen with the arrow keys, and a 'copy' key copied whatever character was under the second cursor to the input cursor as though that key had been pressed. It sounds terrible but it was actually quite quick in practice.
What you’re describing there was pretty standard across all of the 8-bit microcomputers. A few other machines had the copy feature (e.g. the CPC464), and the control flow in BBC BASIC wasn’t anything that couldn’t be done in pretty much every other dialect from that era (be it Microsoft BASIC, Locomotive or whatever).
Personally I think it’s a bit of a stretch to say line numbers were mainly there for editing. Even if you ignore GOTOs the most common method of running pseudo-functions on the BBC Micro was GOSUB.
The BBC personal computer (that ran BBC Basic) also had a cool feature that allowed you to drop down into 6502 assembly language right in the middle of your BASIC program, just by enclosing your assembly language statements in [ and ].
I remember trying it out at a British Council library which had one such machine available for members to use. Although I had done a little 6502 assembly language programming earlier (using both the PEEK and POKE approach as well as using an assembler), just for fun, on other home computers, I never got to do a lot of it, since by then, I moved on from BASIC and assembly to languages like Pascal and then C. Good times.
There were actually a few machines that ran BBC BASIC. I’ve got a BBC Micro model B hooked up in my “man cave” and I still play on that from time to time.
I’ve also got a few Amstrad CPCs - another range of British 8-bit micros. They ran Locomotive BASIC, which was largely inspired by BBC BASIC, so there is a hell of a lot in common between those two dialects.
BBC BASIC was pretty advanced, but most people’s exposure to it was in schools, so if kids wrote anything it was usually using a common syntax that they’d learned on their Sinclair, Commodore or whatever (BBC Micros weren’t cheap from what I recall).
So I think while the FN / FUNC approach was arguably better, I’d be surprised if it was more commonly used than GOTO or GOSUB. But who knows, maybe your circle of friends did things differently?
And the BBC manuals were pretty good. So were Newman and Sproull, as well as David Levy's Computer Gamesmanship. And now that we are talking about this, I do remember when I used the line numbers. I came to the BBC from the TRS-80 Color Computer/Dragon 32, and that one had the good old Microsoft ROM BASIC, which did not have such luxuries.
It really clicked for me when writing a musical score editing program, which I tried to put together using the 'old' way and then tried again using the 'structured' way. The second time around the program made it to completion and it was a lot easier to understand, so I never really looked back after that. So to me that became the 'normal' way of doing things, but I totally see how that experience might not have been typical.
You are also right that they were not cheap; my fully decked out 'beeb' (double drives, 256K RAM expansion) cost me more than a year's worth of savings. But looking back it was most probably the best investment I ever made. It gave me a career.
Interesting stuff. Thank you for sharing. I do think BBC Micros (or at least 8-bit BASIC machines in general) were responsible for a great many careers - mine included.
I never used a Dragon in the 80s but do have a Dragon 64 packed away, waiting for me to hook up and play. I keep meaning to get it out but between work, family, and the other retro hardware, it sadly never gets a look. I’ve heard Dragons were nice machines though.
Yes, I read that there were some more structured-programming-oriented BASICs that came out some time after the structured programming movement started [1]. IIRC, one such BASIC was by a company founded by John Kemeny [2], one of the creators of the first BASIC. It might have been called TrueBASIC or PureBASIC or some such name. Think I read about it in some computer magazine.
[ Kemeny's family settled in New York City where he attended George Washington High School. He graduated with the best results in his class three years later. In 1943 Kemeny entered Princeton University where he studied mathematics and philosophy, but he took a year off during his studies to work on the Manhattan Project in Los Alamos National Laboratory. His boss there was Richard Feynman. He also worked there with John von Neumann. Returning to Princeton, Kemeny graduated with his BA in 1947, then worked for his Doctorate under Alonzo Church, also at Princeton. He worked as Albert Einstein's mathematical assistant during graduate school. ]
And the company and BASIC dialect he created later were both called True BASIC:
Kemeny was a really interesting guy. His obituary[1] gives some good background on him. I read his 1972 book, "Man and the Computer"[2], in the early 90s. Some of his predictions about how computers would be used in the future (ubiquitous networking, online shopping, etc) were spot-on. His idea of "mainframes" providing centralized services seemed backwards to me in the early 90s, when I was just getting my PC onto the Internet for the first time, but it appears that he got that one right too.
Thanks for the info. I read his obituary that you linked to. It really is interesting, as you said. He had great vision. And did a lot practically too, of course.
I know, and they (like Python) all make excellent second steps. The line numbers are bad for structure, but good for stage 1 learning is what I am saying. In fact, letting people find the limitations of line numbers and GOTO statements is an excellent way to show them why you would want something more structured, rather than just telling them.
I do not know about that. Teaching people something bad first only to show them a better way later still wastes their time and ingrains bad habits. Personally I prefer teaching the better way right from the beginning (assuming I know what the better way is).
> Static types and Object Oriented Programming are better ways, I think most of us will agree?
No....
I'm with you on static typing but disagree on OO. I know other developers I strongly respect who hold every permutation of those views.
Personally, I probably wouldn't object to OO so much if Java weren't such a commonly taught first language, which gets right back to the discussion at hand.
I do not think the Scratch vs Java comparison is a fair one here. There is nothing wrong with Scratch.
But bad Scratch vs good Scratch or bad Java vs good Java would be a fair comparison and in those cases I would advocate for teaching the right thing from the start.
I still have a BBC BASIC re-implementation in C floating around here, code written in 1989 - would that help you? If so I'll be happy to tar it up and send it to you.
Programming without language-level support for structure isn't “something bad”, and in practice it resulted in learning how structure is implemented at a low level.
Plus, mandatory line numbers are what enable having a REPL that's also a program editor, which is a big win.
Does anyone remember that some installs shipped with a game called gorilla.bas? A fun little game where you had two gorillas across a cityscape, and the whole idea was to take turns throwing bananas at each other to see who could get a critical hit.
What made it hard was that you had to enter both an angle and a speed, and hope you had calculated the right combination to hit the other player.
Eventually I wrote my own version of the game in Flash/ActionScript... but it’s long gone.
When I was first learning about programming, I had hours and hours of enjoyment just from modifying that game even though I didn't quite know what I was doing. Just things like changing the banana speed or making bananas spawn other bananas when they hit.
Another fun one was called Labyrinth (I think it was spelled LABRNTH.BAS or something). I liked it so much that years later I started making two remakes (one in HTML canvas/JS, one in DCPU-16).
QBasic was a great introductory language for a kid wanting to learn programming IMO.
To get a bit theoretical, one way to write a lexer is to use a state machine called a deterministic finite automaton (DFA) - a fancy term for a state machine that has a finite number of states and always moves to the same next state for a given input. Each new character moves the state along, and there are terminal states: when you reach e.g. a space, you spit out whatever kind of token you just read.
You can show that the things a regex can match are exactly the things this kind of state machine can recognize. I believe that most regex engines use such a state machine under the hood, so it's very natural to use regexes to tokenize.
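Roughly, in Go (a toy sketch - the token kinds and the demo input are made up, and a real lexer would also handle strings, keywords, etc.):

    package main

    import (
        "fmt"
        "unicode"
    )

    // Token kinds for this toy lexer; the names are illustrative only.
    type Kind int

    const (
        NUMBER Kind = iota
        IDENT
        SYMBOL
    )

    type Token struct {
        Kind Kind
        Text string
    }

    // Lex walks the input one rune at a time. The "state" is which branch we
    // are in (skipping spaces, inside a number, inside a word); hitting a rune
    // that doesn't fit the current state is the terminal condition that emits
    // the token read so far.
    func Lex(src string) []Token {
        var toks []Token
        r := []rune(src)
        for i := 0; i < len(r); {
            switch {
            case unicode.IsSpace(r[i]):
                i++ // whitespace: nothing to emit
            case unicode.IsDigit(r[i]):
                start := i
                for i < len(r) && unicode.IsDigit(r[i]) {
                    i++
                }
                toks = append(toks, Token{NUMBER, string(r[start:i])})
            case unicode.IsLetter(r[i]):
                start := i
                for i < len(r) && (unicode.IsLetter(r[i]) || unicode.IsDigit(r[i])) {
                    i++
                }
                toks = append(toks, Token{IDENT, string(r[start:i])})
            default:
                toks = append(toks, Token{SYMBOL, string(r[i])})
                i++
            }
        }
        return toks
    }

    func main() {
        for _, t := range Lex("10 PRINT X1+2") {
            fmt.Printf("%d %q\n", t.Kind, t.Text)
        }
    }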
You can't do it in general, e.g. to tokenize Python you'd need a layer that counts leading tabs/spaces (because INDENT and DEDENT are tokens). That is, unless your regex dialect is Turing complete, and if my memory serves, some of them are? I can't really remember. In general though, you can't do it with "normal" regex.
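Roughly, that counting layer could look like this (a toy Go sketch; the token names and the spaces-only handling are simplifications on my part - the point is the stack of indentation widths, which a plain regular expression can't keep):

    package main

    import (
        "fmt"
        "strings"
    )

    // indentTokens keeps a stack of indentation widths and emits INDENT/DEDENT
    // markers as the depth changes. Tabs and error handling are ignored to
    // keep the sketch short.
    func indentTokens(src string) []string {
        var out []string
        stack := []int{0}
        for _, line := range strings.Split(src, "\n") {
            body := strings.TrimLeft(line, " ")
            if body == "" {
                continue // blank lines don't change indentation
            }
            width := len(line) - len(body)
            if width > stack[len(stack)-1] {
                stack = append(stack, width)
                out = append(out, "INDENT")
            }
            for width < stack[len(stack)-1] {
                stack = stack[:len(stack)-1]
                out = append(out, "DEDENT")
            }
            out = append(out, "LINE("+body+")")
        }
        return out
    }

    func main() {
        src := "if x:\n    if y:\n        z = 1\nprint(z)"
        fmt.Println(strings.Join(indentTokens(src), "\n"))
    }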
It's the way I always approach it when parsing a DSL, but always line-by-line, rather than a regex across the whole script.
I know it isn't going to be the most performant approach, but if you're comfortable with regular expressions it's really simple, and performant enough for most cases.
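For example (a toy Go sketch - the pattern and the little script are invented, and a real DSL would need more token classes):

    package main

    import (
        "fmt"
        "regexp"
        "strings"
    )

    // One alternative per token class; earlier alternatives win when two could
    // match at the same position.
    var tokenRe = regexp.MustCompile(`\d+|[A-Za-z_][A-Za-z0-9_]*|"[^"]*"|[-+*/=<>(),]`)

    func main() {
        script := "LET A = 3 + 4 * 5\nPRINT \"HELLO\"\nGOTO 10"

        // Tokenize line by line rather than running one regex over the whole script.
        for n, line := range strings.Split(script, "\n") {
            fmt.Printf("line %d: %v\n", n+1, tokenRe.FindAllString(line, -1))
        }
    }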
A fun project! I had a buddy who tried this and wrote it in C++ over a week. It had console, graphics, formatting and networking (I think).
The trick was to write a parser first, then use C++ objects for each construct and instantiate one for each parsed element. The objects had a 'list' and a 'run' method, and they were linked in programmatic order. So you run the first one, and it returns the next one to run (since the next one may vary with IF, GOTO, etc.).
Sounds like interpreting from an instance of the AST. Pretty easy, but there are gotchas, like handling break/continue in loops. It’s an interesting exercise everyone ought to try IMO.
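A rough sketch of that "run returns the next statement" shape, in Go rather than C++ (the statement types, Env and the little demo program are all invented, only the 'run' half is shown, and a real interpreter would evaluate expressions rather than carry bare ints):

    package main

    import "fmt"

    type Env map[string]int

    // Run executes one statement and returns the index of the next one, which
    // is how GOTO and IF get to redirect control flow.
    type Stmt interface {
        Run(pc int, env Env) int
    }

    type Let struct {
        Name  string
        Value int
    }

    func (s Let) Run(pc int, env Env) int { env[s.Name] = s.Value; return pc + 1 }

    type Print struct{ Name string }

    func (s Print) Run(pc int, env Env) int { fmt.Println(env[s.Name]); return pc + 1 }

    type Goto struct{ Target int }

    func (s Goto) Run(pc int, env Env) int { return s.Target }

    type AddTo struct {
        Name  string
        Delta int
    }

    func (s AddTo) Run(pc int, env Env) int { env[s.Name] += s.Delta; return pc + 1 }

    type IfLess struct {
        Name   string
        Limit  int
        Target int // jump here while Name < Limit
    }

    func (s IfLess) Run(pc int, env Env) int {
        if env[s.Name] < s.Limit {
            return s.Target
        }
        return pc + 1
    }

    func main() {
        // Roughly: I=0; loop { PRINT I; I=I+1; IF I<3 GOTO print }
        prog := []Stmt{
            Let{"I", 0},       // 0
            Print{"I"},        // 1
            AddTo{"I", 1},     // 2
            IfLess{"I", 3, 1}, // 3
        }
        env := Env{}
        for pc := 0; pc >= 0 && pc < len(prog); {
            pc = prog[pc].Run(pc, env)
        }
    }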
I once wrote a Logo interpreter in BASIC... For my Computer Studies GCSE (UK national exams at 16) final project. It never actually worked, but it was a heroic effort!
I once wrote an assembler in BASIC. We extended the instruction set of a microprocessor by some external circuitry, so the standard assembler was not very useful. BASIC was fine for the job. And it actually worked :)
Don't know if it's the same bug, but "PRINT 1/0" returns 1. "PRINT (1/0)" prints nothing. Using "LET a=1/0" raises a div by zero error. Also, "LET a=1%0" panics.
Thank you! At the airport and no access to my desktop so I could not run the code but I suspected that was going to cause a problem. Deskchecking ftw ;)
In the first case, maybe it's treating the '+' as string concatenation - so 3+4 is giving 34? Then it's just dropping the 5 because it can't multiply a string and a number.
Oooh wonderful! I wrote hundreds of games as a kid in QBasic and have had the idea for years to write a QBasic to JavaScript transpiler so I could actually showcase some of them online. With Go adding its WASM support, I’ve actually been considering a Go interpreter as an option.
Technically, I believe they (Paul Allen & Gates) ported the BASIC from the Harvard DEC-10 using a cross compiler. This was before there were licenses in code.