I went through the process of gathering old sources for the games I wrote in that era. Using 68K Mac emulators (BasiliskII, for example) I was also able to get the compilers and other dev tools of the era, such that I could compile and run from source.
Looking through the sources, it was on one hand fun to marvel at the simplicity of the OS (and Mac Toolbox) from that era. I thought it was hard at the time though. Funny how hindsight (and lots and lots more experience) can change how you now see the past.
(I see so many cleaner styles of coding I could have adopted - have since adopted - when I look at my old sources.)
But I was amazed at how often the apps/OS crashed as I worked through trying to get them up and running within the emulators. At first I thought it was the emulator(s). But my next thought was ... maybe System 6 crashed all the time back then? I think so.
For example: I would run ResEdit (a development tool) and open a file that was also open in the IDE (THINK C or THINK Pascal). Crash!
Crashes required a reboot (often then requiring running Disk First Aid on the disk image and rebooting again...). I kind of see why Amiga guys laughed at us.
In fact I found a few bugs in sources from 35 years ago.
> But I was amazed at how often the apps/OS crashed
Which is one reason why Step 1 after unpacking the computer was to install the "Programmer's Switch" to get access to the reset button.
Rebooting the machine was completely routine. And it wasn't particularly slow, at least for the time (in context, by today's standards, everything was slow; all of those machines were glacial). I don't recall ever really having to run Disk First Aid on anything, though.
In fact, I remember making a "toy": a simple app whose only function was to bring up the "Bomb box" (the infamous dialog with the sizzling bomb and Reset button, akin to the Blue Screen of Death). The twist was that if you moused over to the Reset button in the dialog (which was, essentially, an autonomic response for most Mac users), the Reset button would move away from the mouse. So you "couldn't" hit the Reset button.
There was another button, typically disabled (I think it said Resume or something, I can't recall -- it was always disabled); if you clicked that, it would "melt" the screen and exit the app, no harm done.
I spent a lot of meticulous time duplicating the dialog box and bomb icon. Run the program, note the box and icon, hit the interrupt button (on the Programmer's Switch) to bring up the original, tweak, rinse and repeat.
Someone eventually uploaded it to the assorted BBSes. I found it in a book in a bookstore once.
The explanation I understood was that invalid expressions like “Finder” evaluated to an odd number. Since jumping to an odd address is disallowed on the older 68K machines, the default failure behavior was to exit the current application. People used to claim that the code above was more reliable, but maybe this was superstitious: in practice everything is unreliable once you’ve encountered a serious bug on a system without memory protection.
I just did a search and found collected PDFs of all of the Inside Macintosh volumes. (For those who don't remember, these were technical manuals on how to develop hardware peripherals and write software for the Macintosh.)
My first computer was a Mac LC; I got it as a 10-year-old, and learned to program on it (and its replacement Centris 660AV) as a 12/13-year-old. At the time, the Inside Macintosh series was way too advanced for me; I lived on little Pascal game creation tutorials. It's interesting to look back on them now, as a professional programmer with ~25 years of experience. It actually looks pretty simple now. Also interesting to find that I can still read Pascal, even though I never used it professionally and haven't worked with it in 30+ years.
I remain convinced that the 1990s rewrite of Inside Macintosh (which can be found here on your site: https://vintageapple.org/inside_r/) was the best documentation Apple ever produced.
For those unfamiliar, Mac files of the era had two sections: a resource fork and a data fork. Resources were structured chunks of information (icons, menus, dialogs, and so on), each identified by a four-character type and a numeric ID. You could open ResEdit and mod any app you liked.
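Not from the thread, just a hedged sketch for readers who never touched the Toolbox: this is roughly how an application pulled one of those chunks out of a file's resource fork via the Resource Manager. The file name and resource ID 128 are made up, and the uses clause is MPW-style (THINK Pascal pulled the Toolbox units in automatically).

    program ShowResource;
    { Open a file's resource fork and fetch an 'ICON' resource by ID. }
      uses
        Types, Resources;
      var
        refNum: Integer;
        iconHandle: Handle;
    begin
      refNum := OpenResFile('My Document');        { open the resource fork }
      if refNum <> -1 then
        begin
          iconHandle := GetResource('ICON', 128);  { look up one chunk by type and ID }
          if iconHandle <> nil then
            ReleaseResource(iconHandle);           { hand it back when done }
          CloseResFile(refNum)
        end
    end.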
I missed resource forks when they moved to the UNIX way in what's now macOS.
I used to use ResEdit to make new fonts. The fonts were fully defined as bitmaps in the resource fork; pop open the System file in ResEdit and you could make whatever you wanted. Turned in all my school papers during middle school in a font of my own design.
Confession: Back in the day, before Macs had a real virtual memory system, I abused the Resource Manager and a file's resource fork to serve as a crude VM for an object management library I created so that I could write apps that used more C++ objects than would fit in memory. It was a hack, but it worked surprisingly well.
“Thankfully, I knew of a similar system in Smalltalk, an object-oriented virtual memory called OOZE that was designed by Ted Kaehler, that swapped objects in and out of main memory as required. This was my inspiration for the Resource Manager. Find out more about OOZE here”
I think the Resource Manager was designed with the understanding that the "objects" would be an application's assets: code, graphics, menus, text, and so on. My abuse was to extend this to arbitrary, dynamically allocated blobs so that an app could, for example, scan thousands of files and store each file's metadata in virtual memory, way beyond what would fit in RAM.
This would have been in the early 1990s, I'd guess.
EDIT: I just checked the sources: it was mid 1993. Back then, a Macintosh Classic II would have been a common machine and shipped with 2 MB of RAM. Older machines often had only 1 MB of RAM.
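For flavor, a hedged Pascal sketch of the general idea (not the commenter's actual C++ library), assuming the target file's resource fork is already open as the current resource file: park each blob under a made-up resource type 'OBJb', flush it to disk, and let the Resource Manager page it back in on demand.

    program BlobStore;
    { MPW-style uses clause; THINK Pascal finds the Toolbox units on its own. }
      uses
        Types, Memory, Resources;
      var
        h: Handle;

      procedure StashBlob (id: Integer; data: Handle);
      begin
        AddResource(data, 'OBJb', id, '');  { the handle now belongs to the resource map }
        WriteResource(data);                { flush the bytes to the resource fork }
        ReleaseResource(data)               { drop the in-memory copy until needed again }
      end;

      function FetchBlob (id: Integer): Handle;
      begin
        FetchBlob := GetResource('OBJb', id)  { reloaded from disk if not already resident }
      end;

    begin
      h := NewHandle(1024);  { stand-in for one object's backing storage }
      StashBlob(1, h);
      h := FetchBlob(1)
    end.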
Ha, I worked in a computer lab as a student assistant and used to modify the NetWare for Mac client on the machines of some of the tech staff we worked for. I remember swapping the logout and login commands on our boss's computer.
Resorcerer had better templates for lots of common GUI elements, but ResEdit's coverage of system resources was complete. Don't forget the Rez and DeRez tools for the list, too.
> I missed resource forks when they moved to the UNIX way in what's now macOS.
You could mod Windows apps the same way (and at least one resource editor was also named ResEdit), yet the filesystem didn't provide multiple forks; resources were just a separate section in the binary.
In fact macOS apps are just a directory containing both the main binary and resources.
I remember Borland’s resource editor. I themed my VGA.DRV file quite extensively.
Later on, Windows NT gained an analogue in NTFS alternate data streams. An early security flaw of IIS was that you could ask it to serve the ::$DATA stream of any source file and see things like hard-coded credentials (because IIS would not execute the file, just serve its contents).
> In fact macOS apps are just a directory containing both the main binary and resources.
This is a much saner design decision, but it wouldn't have been possible before the file system got folders.
Forks caused all sorts of issues when you needed to transmit a file from one computer to another without using a floppy disk.
> But I was amazed at how often the apps/OS crashed
Early microcomputer systems usually didn't have memory protection hardware, but they also were typically single-user, single-app systems.
Cramming a GUI and OS into 64 or 128 KB (!) of RAM (and some ROM) meant that apps didn't have a lot of memory to work with; even to this day apps behave poorly when they run out of memory - we just have a ton of it now, as well as virtual memory, so things tend to get slow before they crash.
As I understand it, Apple's Pascal compiler supported range checks, but developers typically disabled them for speed and code size, and eventually switched to C, which didn't have range checks at all. Failing a range check would result in a graceful system bomb box rather than an ungraceful one. I also think that the Mac (following the Apple II) kept important data in low memory, making null pointer errors (perhaps from a failed memory allocation) particularly destructive.
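Roughly what a range check bought you, as a small Pascal sketch (the {$R} directive is the common spelling; exact settings varied by compiler): with checking on, the bad store below stops with a runtime error instead of silently scribbling on a neighbor.

    program RangeCheckDemo;
    var
      buf: array[1..10] of Integer;
      i: Integer;
    begin
    {$R+}                { range checking on; with {$R-} the overrun is silent }
      for i := 1 to 11 do
        buf[i] := 0      { i = 11 is out of bounds and triggers the check }
    end.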
It was such a big deal when Apple finally implemented system-wide memory protection that they added an app crash dialog box that said "The application SomeApp quit unexpectedly. Mac OS X and other applications are not affected" – which Steve Jobs demonstrated by running a "bomb" app while playing a video in QuickTime Player.
The basic problem with those old home computer operating systems was the lack of memory protection. Any program could overwrite the memory of any other program, or even vital components of the operating system.
While memory protection was touted as a huge improvement for end users, its real benefit was to developers. Just try debugging memory issues in the application's own address space. Now try to imagine doing the same where your only hint of a problem is someone else's code crashing because of memory you inadvertently overwrote.
Yes. It had an ugly UI and preemptive multitasking, but it didn't have memory protection and -- unlike MultiFinder -- didn't clean up after apps for certain resources, so you'd eventually need to reboot to get those resources back even if your system remained stable.
I remember there being "recoverable" errors in later versions of the OS. But if you got a "Guru Meditation" you were almost always done. With newer machines (68020+ w/MMU) you could protect the first page of memory which helped ("Enforcer" was the name of the program, I think.)
Netscape was crashy everywhere. I finally switched to IE in ~2000 despite being a die-hard Microsoft hater, because DoubleClick shipped some code that would hang Netscape and as a result the whole Internet broke.
Yeah, the main issue was the lack of memory protection on Mac so the crashes could take down the whole machine. I switched to NT4 though that had drawbacks of its own.
I remember my Mac Plus and Classic being very stable - but I think that had more to do with the testing that the applications went through than anything to do with MacOS.
I'll try to repro. I think with System 7 and on, a lot of improvements were made to the OS (things like an Extensions Manager come to mind). Perhaps we're remembering those better days.
At the same time, it took me a while to unlearn a kind of reflexive <command>-S muscle spasm that I think twitched about every 4 or 5 minutes.
Pre MacOS X did indeed crash all the time. I learned to only use one application at a time. It would still crash a couple of times a day, requiring a restart. On the other hand it was snappy and simple, in a way modern OSs just aren’t.
True. When a friend of mine and I showed my wife OS X when the first beta came around, I remember that when I went to shut down the machine, my wife said she didn't like how long OS X took to shut down (older Mac OS would shut down near instantly).
"Well, you'l have to find something else to like about Mac OS X," was my friend's reply.
System 7/apps would crash from time to time. I think we just had lower standards then. IIRC Macs prior to OS X didn’t separate one process’ memory from another?
Windows 95 was a big step forward though because it had actual process isolation and preemptive multitasking. While it wasn't very stable compared to Linux or NT, for the most part an application crash wouldn't bring down the whole machine the way it typically happened on classic Mac OS or Windows 3.1.