When I was around 12 years old, a group of 3 friends (myself included) decided we wanted to create computer games on the 64 (and later the 128). We started to learn about sprites and how to do rudimentary animations. Nearly every day after school we'd share some new thing we had learned the night before. We'd pool our knowledge. It was such an incredible time. Do you know what phrase I NEVER heard during those days? "What, you didn't know that?" I loved that collaborative feeling. If my friend was having a hard time understanding something, I just figured out a different way to explain it. It's such a far cry from the nonsense I see in the various software dev communities today. Perhaps it's because none of us felt like there was a competition between us.
This is exactly it one thousand times over. The C64 was a gateway drug and everybody that was 'into it' really was a kindred spirit.
C64 BASIC has ingrained one evil in me: "goto".
At least once a month I'll be working on some script and get stuck, and a little voice in the back of my head will say: "a goto would fix this part you are stuck on!" And damn if that doesn't reverberate into the past.
I went the hardware route.
I bought additional hardware for my C64.
First the 1541 (5.25" floppy drive), then the 1581 (3.5" floppy drive), then a 300/1200 baud modem, then a 2400 baud modem.
I've been hardware hooked ever since. I'm now a sysadmin working with *nix servers.
I tried programming. I really did. I meticulously copied one of the sample programs out of the back of the C64 manual. It never worked.
Programming BASIC was a lot like programming assembly language in terms of how the GOTO and GOSUB keywords worked. GOTO led to a lot of criticism over "spaghetti code", but the limitations of GOSUB were worse.
GOSUB would put the calling location on a stack so you could RETURN to it later but there was no stack for parameters, local variables or return values so you had to use global variables for all of those.
You could not write recursive functions in BASIC unless you implemented a stack yourself using arrays. It was easy to compute Fibonacci iteratively, but people had to sort in BASIC all the time and wrote bubble sort, shell sort and other algorithms that were slow but easy to code in BASIC as opposed to Quicksort.
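To make that concrete, a typical iterative Fibonacci "subroutine call" looked roughly like this sketch (variable names are just for illustration): the argument N, the result F, and every temporary are all globals.

10 N = 10 : GOSUB 100 : PRINT F
20 END
100 REM FIBONACCI SUBROUTINE: READS N, RETURNS F, CLOBBERS A, B, T, I
110 A = 0 : B = 1
120 FOR I = 1 TO N : T = A + B : A = B : B = T : NEXT I
130 F = A
140 RETURN

Call it from two different places that happen to use A or B for something else and you quickly learn why people wanted real parameters and local variables.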
If you have the kind of functions that exist in C, Pascal, LISP, ML, Python and many other languages then you can write a simple Quicksort in a few lines of clear code.
> Programming BASIC was a lot like programming assembly language in terms of how the GOTO and GOSUB keywords worked.
One of the things many people fail to understand when they criticize GOTO and GOSUB is that they are essentially an expression of what is happening at the hardware level. Modern languages didn't really do away with them. They simply added a layer of abstraction that made it easier to develop reliable software. Unfortunately that abstraction also has overhead, which was problematic when you had a few kilobytes of RAM to work with on early personal computers. I would imagine that it was also problematic on the early multi-user systems that BASIC originated on. It isn't that GOTO is evil. It simply became less necessary for developers to use it in high-level languages as technology improved.
Of course, the other thing that made spaghetti code inevitable was the line number based syntax. Until development tools improved, people were basically plopping new statements in random locations because they needed more "space" between existing statements. Yet line numbers were used in early systems since the BASIC REPL also served as a line editor. (At least on personal computers. I'm not sure how it worked on mainframes.)
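A contrived sketch of how that went wrong in practice: once the gap between two line numbers filled up, the only way to squeeze in more code was to GOTO off to some distant block of spare line numbers and jump back, which is exactly how the spaghetti got cooked.

10 PRINT "SETUP"
15 REM ADDED LATER
16 REM ADDED EVEN LATER
17 GOTO 500 : REM RUNNING OUT OF ROOM BEFORE 20, SO JUMP AWAY...
20 PRINT "MAIN LOOP"
30 GOTO 20
500 PRINT "MORE SETUP THAT WOULDN'T FIT"
510 GOTO 20 : REM ...AND CARRY ON AT THE MAIN LOOP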
Editing programs with a text editor, saving them on cassette tapes, assembling them, and so on: in the same amount of RAM you could have fit a FORTH implementation, and with a disk system you could have had an experience similar to BASIC, based around editing individual disk blocks. With 64K of RAM I could run a C compiler on that color computer, and people did the same with CP/M.
As far as mainframes go, at first they didn't have text editors; instead you would put together a deck of punched cards and submit that to the FORTRAN compiler, which would output the object code onto another deck of punched cards.
It wasn't unusual for people in the 1980s to use BASIC preprocessors that would read a text file, append line numbers, and let you use structured loops and GOTOs with named labels. I read about
MSX BASIC had "RENUM" or some such, which would renumber all the lines to, for instance, 100, 110, 120, etc. It also automatically updated all GOTO and GOSUB targets. So if you ran that all the time, it was almost like line numbers in a text file, but with an extra manual "RENUM" chore thrown in.
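Roughly like this, if memory serves (MSX-style RENUM defaults to renumbering from 10 in steps of 10, and rewrites the jump targets to match):

5 PRINT "A"
12 GOTO 5
RENUM
LIST
10 PRINT "A"
20 GOTO 10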
Dijkstra doesn't describe the go-to feature as "evil" but as "harmful" and it is.
The problem, which you even mention, is that it's impractical to work with larger programs because their control flow becomes too hard to understand. Now, if you have (as many of the earlier BASIC systems did) only 4096 bytes of RAM, you can't write such complex programs anyway; you don't have enough RAM. But even by the time these 4K home computers start to appear on the market, the price of a modest business computer is tumbling, and such a computer might have dozens of kilobytes of RAM.
Once a flow diagram you can draw on a whiteboard isn't a correct description of your whole program, but merely a high level summary, go-to is just a foot gun.
Definitely a gateway drug. I had an Apple ][+ around that time too. The manuals for that were incredible. Luckily, my dad's startup had tons of old hardware for it. Printers, Koala Pad, plenty of disk drives, modem, EPROM programmer, 80-column card with CP/M and Pascal.... even though it was about 10 years old, it was great. But the C64 always seemed far more approachable. That brown box with the clacky keys... And I was too young to know about Dijkstra's "Goto Considered Harmful". It probably could have made me a far better engineer if I had picked up better habits sooner.
> C64 BASIC has ingrained one evil in me: "goto". At least once a month I'll be working on some script and get stuck, and a little voice in the back of my head will say: "a goto would fix this part you are stuck on!"
goto is perfectly fine if used right. Kernel C code tends to use a lot of goto, to consolidate return points and to clean up resources on the way. A long function with no goto to one common return point, or multiple staggered ones, is suspicious.
The problem that led to the famous paper was rather that goto was abused, used in places where "higher level" control constructs like "for", "while" and so on would be better. Partly because the languages just didn't have any, like for example... C64 BASIC.
Yep, used a lot of goto's writing framework software that worked with CoreFoundation on Mac OS. We used it a lot in early Mac Toolbox framework code as well.
Often within the scope of a function we would need to allocate/create dictionaries, arrays, other objects requiring disposal. MacOS's CoreFoundation API often returned NULL when creation (or insertion, etc.) failed. The sane thing to do when you got an unexpected NULL was to "bail" from the function. But rather than return immediately, there was clean up code at the bottom of the function — code like "if stackDict != NULL {CFRelease(stackDict);}". So we often added a label (called typically "bail") just before the clean up code and would "goto bail".
If you programmed in machine code (without a fancy assembler), you couldn't simply move your instructions around in memory to make room for bugfixes, so instead you placed a couple of NOPs here and there (usually in strategic places like the start of a function or right after a conditional branch), so that you could later replace those NOPs with jumps or subroutine calls to add patches.
OK, I get it now. You brought me back. I now do remember peppering my x86 asm code with those NOPs when writing early code generators, for example when a JMP instruction might go past 127 bytes.
You could also theoretically understand everything (with enough time and patience). Down at least to what all the chips in the C64 did, if not even the transistors inside them. That's what I miss, the feeling I was in control of my computer. Now the complexity is so high I feel like I can barely scratch the surface.
I feel really bad for anybody starting off today. The technology landscape is terrifying, and it's really hard to know what to learn or even where to start. With a C64, you could master that machine inside and out very easily, and while you were at it, really understand how a CPU works. It's amazing what they were able to do with a CPU with around 3000 transistors (compared to the 50-odd billion in modern processors).
> Do you know what phrase I NEVER heard during those days? "What, you didn't know that?" I loved that collaborative feeling.
There were also far fewer variations of things to confound that info sharing. You had a C64, or 128, or Atari 800? You had the same everything as everyone else - same manual, same BASIC, same registers, same books available. You didn't have to worry about what version of something you had, or whether something got upgraded, or what video card you had, etc.
When there was a disagreement about something, it generally wasn't hard to at least point to a common base to start from.
The C64 Programmer's Reference Guide[1] was the game changer for me. 6510 instruction set, the computer's full memory map, register maps for all the chips, detailed information about I/O, the KERNAL, and so on. I don't recall it coming in the box, I had to save up my $$$ and actually buy the book.
Yes, and don't forget the great schematic diagram in the back! It definitely didn't come in the box with the C64; I had to save up as well. I still have my copy, tattered though it is, sitting here on my office bookshelf. That, and the book _Assembly Language Programming with the Commodore 64_ [1] changed the way I viewed computers.
Over in the other significant English-speaking economy, we were a lot poorer in the early 1980s, and as such anything priced in US$ was too expensive.
So things like the Atari 8-bits didn't sell well here. In fact, of the US machines only the comparatively budget C64 did, and even that was an expensive machine in early-1980s Britain.
Which was a good thing, because it encouraged a flourishing local market in locally-made computers.
Although the C64 sold in the millions, and so is familiar to many, Sinclair's ZX Spectrum was even more common over here. And although we didn't know it back then, it was huge behind the Iron Curtain too, in the form of dozens and dozens of unauthorized clones. Every Communist nation had its own ZX Spectrum clone, or maybe several. Some adapted to display Cyrillic, some built from imported bits and some from Soviet bits, some with discrete logic in place of Sinclair Research's ULA.
And all of the Euromicros had better BASICs than the C64.
Sinclair BASIC wasn't great but it did graphics and sound. The best was BBC BASIC on the BBC Micro from Acorn. Named procedures, with local variables and recursion. IF...THEN...ELSE, various loop constructs, and inline 6502 assembler.
I reckon it's partly the terrible BASIC of the C64 that turned everyone against the language:
It really was an amazing time. Seeing that bootup screen with the blue on blue still turns on that feeling of "Exciting discovery awaits".... Prior to accidentally taking "Computer Science" in school (it was that or drafting), I had no idea what it was about. It was like trying heroin for the first time. Commodore PETs at school and, a year later, I worked my ass off all summer to save up to buy a C-64 when it first came out. Parents chipped in for the tape recorder.
I had that experience. We were an Apple II middle school, and a bunch of us would stay after and work in the computer lab. I wrote a little game in low-res graphics. We'd trade tips. Lamentably, the teachers weren't much help. We'd ask the music teacher, who would be in there sometimes, as he seemed to have some good knowledge (music on the Apple II wasn't easy... you could "click the speaker", and by doing so rapidly you could get tones).
We had access to "Nibble" magazine, which had a lot of printed code. I got my "sound" routines from that and from "Beagle Bros." programs. It was fun. We never figured out machine code... We got the basics, but it was just too much.
Though we just had each other as our lab wasn't internet enabled (as was the style in the early 80s)
Ah good old `GR`... `PLOT`. When I found about `HGR`, `HCOLOR` and `HPLOT`, I thought it was the greatest thing ever. And having that tiny little built in speaker made everything I did with PEEK and POKE sound so terrible. But, it kept me trying new things over and over again.
My first thought back then with my C64 and BASIC: "I'll make it so when someone types LIST they won't be able to see the code, by writing more BASIC code that prevents it".
REM<SHIFT-L> - it's weird the things that stick in your head after all this time. Likewise SYS 64738, 64760, 64767 and a few others. One to really mess with people was:
POKE 53280,0 : REM SET THE BORDER TO BLACK
POKE 53281,0 : REM SET THE BACKGROUND TO BLACK
SYS 64767 : REM FAST RESTART WHICH DOESN'T RESET THE COLOURS
At that point, unless they know what's happened, they can't RUN/STOP-RESTORE their way out of anything.
I do miss the 64 (and the Amiga which followed). The variety of machines back then was really refreshing compared to now.
Well said. Perhaps it was also because you and your friends had the humility to realize that it was you who didn't understand something a few days earlier. Helping others to learn and working with a group where learning is allowed and nurtured is incredibly important.
“The best thing for being sad," replied Merlin, beginning to puff and blow, "is to learn something. That's the only thing that never fails. ... There is only one thing for it then — to learn. Learn why the world wags and what wags it. That is the only thing which the mind can never exhaust, never alienate, never be tortured by, never fear or distrust, and never dream of regretting. Learning is the only thing for you. Look what a lot of things there are to learn.”
> Do you know what phrase I NEVER heard during those days? "What, you didn't know that?" I loved that collaborative feeling.
If someone says "What, you didn't know that?" about something they only learned yesterday, they're being a jerk. Because this was all new to all of you, it became harder to feel like saying that.
Were there times when people who discovered a shared interest were more keen to mine each other's insights or differences than they are of late? Were people generally keener to develop, test and propagate their outlooks in person? Perhaps it's just my own foibles or relative age, but in later years it seems more contentious to conversationally dig into and challenge others' takes on their subjects of interest. It's as though as individuals we have no call to query another's interest, only to pleasantly listen and affirm them. Perhaps with interests now developed more through information technology, where contrasting perspectives are already collected and competitively marked, it has become harder to be intellectually generous and curious in person.
What's sadder is that nowadays you're expected to have not only breadth of knowledge (sacrificing depth) but also to learn fast. We romanticize speed. We want the chef who creates quality food at fast-food speeds!
Something that the author of the article could also touch upon is that the lack of resources was also a boon. I recall reading interviews with great programmers of yesteryear saying how they would "read the same book twice over to understand deeply" and "sometimes I'd read the same concept from another book to make it really click", one of them being John Carmack!
A naive assumption I made after reading statements like those was this: when performing, sometimes you go fast to pressure yourself, but under that pressure the odds of delivering quality work are greatly reduced. When comprehending something new, however, you cannot romanticize speed. You almost have to value comprehension and depth. Speed comes as a byproduct of the hours spent understanding.
Instead we favor speed over learning quality; we tag ourselves "jacks of all trades" because we know enough Next.js to ship the product, but not enough to tell what we could have done better, or even why we chose A over B beyond "the Stack Overflow answer told me to". That is, until the client complains. We really stretch "premature optimization is the root of all evil" into "premature readability", "premature anything". Gotta go fast!
tl;dr aside from the programming environment, not having access to "quick and dirty" answers to any problem affected everyone's expectations of delivery and performance, and that also played a role in how we learned.
“We” in your statement is the business community. Developers are craftsmen, and I don't think I know a single one personally who values speed over quality. It's the business people who don't want to pay for quality, just the minimum to turn a buck. The bottom line. Which isn't necessarily a bad thing, but it is a bad culture to embrace. There's a reason Toyota is the number one car manufacturer in the world today. And it wasn't from bean counting and focusing solely on the bottom line the way these Western so-called business geniuses do.
I wonder how much of the difference is because you were friends working collocated, and not mashed together arbitrarily as coworkers communicating via Slack and cameras-off calls.
There was probably a great deal of that. We also played instruments together in an attempt to be "in a band". I didn't find the uppity know-it-alls until college. Discussing Delphi/Object Pascal with people via NNTP was definitely my gateway into, "everyone is smarter than you".
Your age is off just a bit - I almost thought you might be the founder of a trio of C64 (and Commodore PET) programmers with the same idea. It was an exciting time to be 'into' computers!
My parents were happy that I wasn't wasting so much time on the computer anymore.
So no more computers for me. I think they never understood what I was actually doing. I had coded some nice games and also a "use your joystick as a musical instrument" program.
Also, parties and girls became much more interesting at that point in my life.
At 19 I needed a job, as I had become a father myself.
I applied for a dot-com job. Started coding again. Found out that not much had changed. Code was still code, and with enough trial and error, lots of reading and thinking about a problem, you could figure everything out.
> My parents were happy that I wasn't wasting so much time on the computer anymore. So no more computers for me. I think they never understood what I was actually doing.
As a parent I now wonder about the ways that I might do something similar to my kid.
Or perhaps more importantly, what is the c64 of my kid's generation that I can buy for her?
Yes, it's "retrocomputing on easy mode": an ARM Linux computer running VICE and preloaded with a carousel of C64 games. But it can be set to boot into BASIC just like a real C64, and it gets you about 80% of the way to the experience of the real thing without needing to recap mainboards or track down a missing pulled SID on eBay. And it hooks straight into modern TVs and uses modern peripherals (game pads, etc.). It can be programmed in BASIC or any other language/environment for the C64 that's sideloaded onto it.
If she wants to mess with more modern paradigms of computing, get her a Raspberry Pi 400 or something also. Old laptops with Linux are good for this use case also -- a used ThinkPad will work wonderfully. With modern tools, she can even more easily write 6502 programs to sideload onto her THEC64!
>Or perhaps more importantly, what is the c64 of my kid's generation that I can buy for her?
This may not be what you mean, but there are a couple of modern devices that function pretty much the same way, just with faster CPUs and better graphics:
> Or perhaps more importantly, what is the c64 of my kid's generation that I can buy for her?
Raspberry Pi. The Pi 400 is especially close, with the same keyboard form factor and all.
It’s not as immediately programmable as the 8-bit micros were (hard to beat booting up right into a BASIC prompt for that), but Linux is still more tinkerable than most other devices you’ll encounter and the default Pi distro has a bunch of educational stuff preinstalled.
Using an old desktop that my dad had installed Linux onto had a huge impact on me in my formative years. I credit that for at least part of my career success.
I appreciate this is a 3-day-old comment now, but consider MiSTer - https://github.com/MiSTer-devel/Main_MiSTer/wiki - which allows access to all of these older systems rather than being locked down to one of them. You could then also move through time, from an original Apple I, through the Commodore 64, Amiga, etc. It also permits your child to learn about FPGAs, hardware, etc. at any point.
Are you me? Similar experience - Commodore 64 was my first computer (albeit a hand-me-down). Folks kept yelling about spending too much time on the computer while I was trying to get stuff to run lol.
> The emulator implemented here only supports POKE for accessing screen memory. To keep things simpler, it starts at offset 0 rather than at offset 1024 as in the real Commodore 64. When you access an invalid address, e.g. by waiting until the ball runs off the screen, the operation fails and the program execution will stop.
He doesn't need a full emulator to show the principle.
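And for anyone curious, the real thing is only slightly more involved: screen RAM starts at 1024 (40 columns by 25 rows) and colour RAM at 55296, so something roughly like this puts a white ball a few rows down:

10 X = 10 : Y = 5
20 P = Y * 40 + X
30 POKE 1024 + P, 81 : REM SCREEN CODE 81 = FILLED CIRCLE
40 POKE 55296 + P, 1 : REM COLOUR RAM, 1 = WHITE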
I actually did skip those, I never made it that far.
I was viewing the C64 page in reader mode (due to the colors making it impossible for me to see anything), and so I never realized that the image to the right was actually an emulator; I thought it was just a screenshot.
My fault ^_^
Still though, in my opinion it'd then be better to skin the thing as something other than a C64, so as not to cause confusion for no reason.
> I'm not trying to create an accurate Commodore 64 simulator though. The point is to show a few things that we can learn from for future programming systems.
Apparently you haven't bothered to read the manual.
I agree with the author: one great thing about these systems was having BASIC as a kind of systems programming language, with high-level niceties and a REPL.
Something that younger generations, taught in the ways of C, have no idea about.
It is no small feat that what is essentially pre-debugged C64 BASIC pseudo-code is dynamically illustrated quite nicely by bespoke JavaScript rather than a pulled-in in-browser C64 emulation module.
Besides the unusual errors in the displayed code, the lesson seems to flow nicely, and I think it may be great when "actualised" to actual C64 BASIC. I only missed being able to scroll the screen with the arrow or space keys; mobile and mouse-wielding readers may not notice that issue.
This is a really weird little simulation of a faux-64.
The virtual machine is not in uppercase mode; typing stuff into it goes in as lowercase and doesn't get properly parsed. (This might be a Safari bug.)
As far as I recall there was no DELETE keyword, which this thing uses instead of NEW.
Different revisions of the C64 ROM did different things with regards to clearing the color memory when clearing the screen, my early c64 would set it all to white, while later revisions would (I think) set it to whatever the current character color was.
I guess this is trying to simplify things somewhat to make its points about the power of booting up straight to a BASIC prompt, but it sure does annoy anyone who spent time hacking on a real c64!
It’s kind of interesting to remember that those 8-bit computers essentially booted up into a BASIC REPL. Starting an input line with a number added to your code; otherwise, whatever you typed was executed immediately. I remember being confused about how Pascal worked, because how did you input a program without line numbers? The whole concept of a programming editor was alien to me.
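It really was that direct; from memory, the whole "development environment" boiled down to this:

PRINT 2+2        <- runs immediately and prints 4
10 PRINT 2+2     <- stored as line 10 of the program
LIST             <- shows the stored program
RUN              <- executes it, printing 4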
The early PCs and BASIC provided such a simple and effective computing environment, where any newbie could start reading the online manuals (known today as built-in manuals) that came with the BASIC interpreter and start writing some simple code immediately in the integrated editor that also came with BASIC!
I think the loss of a simple, integrated and interactive programming environment like BASIC has been a tragedy. Is there anything like BASIC today that any child, secretary, or grandmother can pick up and learn with as little hassle as possible?
I used to think that Python was the answer to this. Not dealing with dependency hell or any other "enterprisey" features, just plain Python with maybe a graphics lib for outputting graphics to the screen. It requires almost no rituals to get something going. But of course, you have to boot into your OS first, open a text editor or IDE, etc -- it doesn't replicate the "instant on" appliance feel of the C64.
I loved my C64 and BASIC was my first programming language... but boy was it limited. It wasn't a good BASIC for the standards of its time. It was hard to program games with it, and in order to do any graphics you had to PEEK and POKE, which can hardly be considered "programming in BASIC" or user-friendly...
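For anyone who never suffered it: "doing graphics" meant poking VIC-II registers by hand just to get into hi-res mode, then poking individual bytes of the bitmap. Something roughly like this (register numbers from memory, so treat it as a sketch):

10 POKE 53265, PEEK(53265) OR 32 : REM TURN ON BITMAP MODE
20 POKE 53272, PEEK(53272) OR 8 : REM PUT THE BITMAP AT 8192
30 FOR I = 8192 TO 16191 : POKE I, 0 : NEXT : REM CLEAR ALL 8000 BYTES (SLOWLY)
40 POKE 8192, 255 : REM LIGHT UP THE TOP-LEFT 8 PIXELS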
Thrilling. I am old but not normally prone to nostalgia. This site gets across the instant gratification one felt with the C64. I started with a Vic20 and soon chafed at the 3.2K (or so) RAM limitation. Obtaining the C64 was a quantum shift. As others have pointed out, the instructional materials were abundant and often good. Tomas Petricek's lovely site is also a model of interactive instruction.
When he explains
PRINT "HELLO WORLD"
he says "To a modern programmer, it is amazing how little it takes to get from booting the machine to printing hello world. "
When I wrote my first compiler in the 80s I made sure you could run a program that simple. I am somewhat surprised that only Python and arguably Javascript give you that kind of immediacy. With all the advances in languages it constantly surprises me that there's no universally available language that lets you start out so quickly.
I've been loving this series and I hope he continues it.
Being relatively young (25) and not having lived through the times of these classic computers, I knew the basic (heh) ideas behind programming for them, where the whole thing is essentially a REPL and you write line numbers to add to the program. But I never saw demos that went further than printing a line and then doing a GOTO to print it infinitely. I didn't understand the actual workflow/iteration process of programming for these machines, writing something more advanced where you're figuring things out as you go.
I have to say my conclusion is I'm glad that in modern times we have the ability to go back and insert as many lines as we want in between any other lines :)
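For reference, the demo I mean was literally just:

10 PRINT "HELLO"
20 GOTO 10

and, as I understand it, the "iteration process" was typing something like 15 PRINT "WORLD" to wedge a new line between the two, which is why everyone numbered in steps of 10.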
Just love the minimalism and simplicity of BASIC.
I tried to program in Scratch and Scratch Junior with my 7-year-old, and it's just overwhelming for me as an adult software developer, not to mention for a child. Click here, move here, this container and that, not to mention the browser/mouse issues.
I don't understand why it's so popular, maybe it's because that's the only application they teach kids in schools?
I'm thinking about just booting up my old 486 PC and showing MS BASIC.
One thing I recall from learning C64 BASIC when I was in 6th grade was that, back then, any sort of documentation was difficult to get your hands on. I was given one BASIC programming book as a birthday gift, and it was made clear to me that was all I was gonna get. Technical books were expensive back then, at least for my family they were.
That is why it was such a joy to buy magazines. Those magazines with their program listings, the ads, the letters. I remember to this day sitting with my girlfriend for a long time, typing in long listings of POKEs to hear music from the magnificent SID.
Not sure why, but I didn't really know about the magazines back then. All my knowledge came from friends and reverse-engineering others' code (de-blitz!)
It was, but if you could start building a collection, it didn't take that much to basically have documentation for Everything. All the ROM routines, timings, pin diagrams, BASIC features, etc, etc. Maybe a dozen books, maybe even half that? I don't remember. But at some point you pretty much had everything you'd ever possibly want to know. That was really the last computer I ever had that feeling with. Everything since is just layers and layers of complexity that changes faster than you'll ever get a handle on it and truly understand the whole machine inside out.
Some of my first programming experience was on a TI-84. I didn't have any documentation for it, so I spent hours looking at the function listing and guessing what arguments that function might take. It worked, I was able to write the programs I wanted.
This is beautifully done. I'll see if I can coax my kids into trying it.
However, this gives a lot of credit to Commodore for something that was completely standard at the time on all 8-bit computers (and then some). The real credit belongs to Dartmouth BASIC [1].
Especially when considering the original context (very very limited machines), the genius of the BASIC model (not so much the language) is something to appreciate.
Interestingly, John G. Kemeny, one of the two behind it, used to work for Richard Feynman and is one of the "martians" [2]
Memories of being 10 and writing a C64 program to generate D&D character stats, and generally dumb stuff like handling dice rolls and random encounters. Felt like magic.
Growing up with one of these, I really thought that a career in computer programming was going to be more or less like getting stuff to work on a C64. Shame it isn't.
I think nowadays the equivalent is to buy an inexpensive handheld PyGame console and write one's own games using Python. It at least teaches a more useful language and shields the internals from the user. As a bonus users get a hang of general game programming process by using a library.
I'm thinking about buying one for my best friend's son and teaching him PyGame programming.
For myself I prefer something more bare-metal, such as asm or C game programming. I think there is probably a NON-CARTRIDGE handheld out there somewhere to quench my thirst, but I need to find it.
This blue screen brings back so much nostalgia. It was such a limited computer, but as a kid, it felt like it had unlimited potential, with many secrets yet to be discovered.
I just picked up my first C64 last weekend at the midwest vintage computer festival. It needs a little work because it's one of the revisions with the weak video output, but otherwise it's in great condition.
I can't wait to start playing around. Seems like a great sandbox to finally get comfortable with assembly. I think I'll try diagnosing the issue this weekend if I can muster up the energy.
Hey I was there, too. For those in the area this is an event you cannot miss and this year it was packed. I went there with my 15 year-old, he spends a ton of time on Xbox so it was great to see him enjoy playing with a PET, Sinclair Spectrum and the C64.
So it’s not all old timers who love these old machines; the new generation can enjoy them, too. Given that, it's a mystery why some company doesn't just build and sell a C64 / Amiga / Sinclair clone for a reasonable price. Next YC batch, anyone?
This makes me sad on a regular basis. The C64 was my first computer when I was 16ish. I spent nights and weeks learning programming with BASIC and did some really cool things like a text editor (of course besides playing games like Giana Sisters or North and South).
But I was young and didn't see how important that was for my development. So I never took care of archiving or even backups; ultimately I just got rid of the disks while moving or cleaning up, I don't even remember. Now this old "Brotkasten" (bread box) is gathering dust in a drawer.
Nowadays I really miss my early work and take a lot more care when it comes to backups and archiving my work.
//EDIT
No wait, my first computer was some kind of weird Robotron machine my father brought home one day. That thing really fascinated me. I wasn't able to do much with it, because there was no documentation and of course I had no internet. But I really loved the green letters on the black screen.
I never did any C64 programming when I was a kid. Hearing the Commander X16 might be ready for manufacturing soon, and it being C64 BASIC compatible (except POKEs), I decided to go through some old guides for the C64 but used the X16 emulator to do them. My absolute favorite was a course originally on VHS "Commodore 64 Introduction to Programming" I found on YouTube, there's 2 levels[0][1].
I wrote almost exactly this on the C64 in the early 80s. The problem was that you couldn't see the ball moving very well, because the POKEs happened so rapidly compared to the rest of the code that the ball was just a quick blink. It didn't look as clean and tidy as it does in this emulator. Also, my 12-year-old brain wasn't smart enough to think of double buffering, or to draw the new position before erasing the old.
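In hindsight the fix is simple; something roughly like this (untested, written from memory) draws the ball at its new position before erasing the old one, so there's never a moment with no ball on the screen:

10 X = 0 : DX = 1 : OP = 0
20 P = 1024 + X
30 POKE P, 81 : REM DRAW THE BALL AT THE NEW POSITION
40 POKE 55296 + X, 1 : REM COLOUR RAM FOR THAT CELL
50 IF OP > 0 THEN POKE OP, 32 : REM ONLY NOW ERASE THE OLD POSITION
60 OP = P
70 X = X + DX
80 IF X = 0 OR X = 39 THEN DX = -DX
90 GOTO 20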
This is how I first learned programming, but on the C128. It came with wonderful manuals which included a Commodore BASIC manual. I also was subscribed to Commodore magazine which had pages of PEEKS/POKES I'd blindly type in. Didn't learn a single thing from that, but the toy games they ran were cool. :)
Seeing PEEK and POKE two pages into the article made me laugh. I always treated those like black magic, mysterious lines that I got out of a magazine that did cool stuff.
I loved the Simons Basic cartridge, that gave me a lot of the PEEK/POKE power wrapped up in new Basic keywords. Sprites were so cool!
I remember trying to write something ambitious on a Commodore 64. When faced with BASIC I tried to come up with some way of organizing my design. I really tried. But there was so little there I couldn't get traction and gave up.
We might have lost the ability to solve problems and create solutions with approachable yet powerful programming environments. But think about all the abstractions, frameworks, platforms, paradigms, and design patterns we now get to learn!
Here's a lost way of programming: I heard of someone who typed control characters on the C64 screen that matched 6510 machine code and then ran them straight out of the video buffer.
I agree with the sentiment, at least partially. Using a high-level library in JS to display a complex graph is nice when you build a production system, but for learning the direct, almost tangible control is so important. It enables our intellect to build high-level concepts on low-level ones, and allows us to slowly understand how hidden functionality complicates high-level design before actually having to deal with it.
This is exactly how I learned to program. My brothers only liked playing the games, but I preferred making my own (generally based on source code listings in the back of BYTE magazine).