What we can learn from vintage computing (github.com/readme)
86 points by ibobev 22 days ago | 36 comments



I think the most important lesson to be learned from vintage computing is to make much wiser use of resources.

Over the last decade I've spent a lot of time tinkering with various branches of the Minimig FPGA core, which aims to be an Amiga on a programmable chip. There's a certain minimalist delight in not only running and using the classic OS and the super-lightweight applications of the day, but also having to extend the same principles of frugality and elegance to other parts of the design, to make it fit within the confines of a relatively small FPGA and a single SDRAM chip.


I've got a recently restored G3 iMac running Mac OS 9.2 sitting on my desk. Of course it's a lot of fun to play the old games, but even browsing through applications, sifting through menus, and playing with the old UI patterns is just so refreshing. It's technically slow, but the latency feels so much better than on modern machines.

Sure, the OS is technically inferior, an infinite loop can freeze the entire machine, and my goodness, that filesystem with all the goofy metadata is infuriating. There are all sorts of hidden resource metadata on each file that make even program-to-file association bonkers. But otherwise there's a lot I like about it, and a lot it gets right compared to modern operating systems. Older operating systems are just such a thin layer on top of the machine, compared to the mountains of abstraction and complexity of modern operating systems.


It's easy to pick up a bit of an old computer habit. I'm in the process of putting a machine I used to own back together, a Mac IIcx with a Trinitron monitor, running Emagic Logic (a MIDI sequencer), as this was a hugely influential machine when I was getting into audio.

In the process of finding the right bits for this, I've ended up with an SE/30, a Mac LC and a Mac II, basically in order to get hold of other bits, and the price was good. They are now all running, and I've got no real use for any of them, but it's great to convert broken doorstops into fully functional doorstops :)


I used an SE/30 well past its sell-by date as a dedicated writing/programming appliance using an old version of BBEdit, and then later as a local mail server using NetBSD.

Of the two, the writing/programming appliance turned out to be more useful. With a PDS Ethernet card and a cheap wifi AP, it's even wireless, sort of.

The SE/30 was truly an insanely great machine. Topped out at 128MB of RAM! And I really enjoy the toaster Mac format.


A/UX lan?


Yeah, that's definitely an option. I've got an ethernet board for the SE/30, and also for the IIcx, and enough RAM to keep A/UX happy.


Arguably Mac OS 9.2 was peak UI - it's visually attractive, maintains clarity throughout, and doesn't waste screen real estate. Playing with it today is quite the antidote to the flat, boring, monochromatic UIs that dominate now.


And it was fast. It's insane there are still moments where opening a window, or changing tabs, takes more than 1s on modern hardware. Usually it's apps like Photoshop that have web UIs instead of native windows...


If you haven't, look around. Mac OS 9 (I believe it was still called that?) has (had?) a crazy following still. There was a fork of an old Firefox being updated and maintained up until 2-3 years ago. macos9lives.com is a site I have handy.


So jealous! God I loved that machine. Used them a lot in libraries and always wanted to have my own.


> ... and my goodness, that filesystem with all the goofy metadata is infuriating. There are all sorts of hidden resource metadata on each file that make even program-to-file association bonkers

I was doing typesetting, using QuarkXPress (before InDesign ruled the day) on a Mac G3 running Mac OS 9.

Good memories but, geez, what a turd that filesystem was. Basically the only way to exchange files and stay sane was to stay in MacOS land. Having a G3 with a built-in ZIP drive did help though.


My local snack bar still has this DOS application where they enter all the orders, print receipts, etc. It was so refreshing to see that interface again. The DOS font, the blue background, no border radiuses or drop shadows, super snappy, and you don't even need a mouse!!


Lots of older companies still do. When I worked for Sherwin-Williams a couple of years ago, every aspect of the business was run by DOS-style software on some custom *nix OS.

Once you spend a little time memorizing the hotkeys, you can do things incredibly fast. No action is ever more than three or four keystrokes away. Menus and actions appear in ordered lists and never change. If an item is removed, its menu entry is simply left blank until something else gets added five years later. Everything is always exactly where it's been since 1980.

People want to talk about clean and minimal interfaces, but that art peaked in the 90s. There is no traditional GUI application anywhere on the market that can compete with the sheer speed and productivity of a DOS application that's been progressively optimized over decades.


If you look at how these machines were programmed, rather than used, you'll learn a lot.


Any resources you’d recommend?


Ben Eater's 8-bit projects are the easiest place to start if you don't already deeply understand how such computers work. He starts with the hardware and eventually builds up to writing non-trivial assembly for it.

https://www.youtube.com/watch?v=LnzuMJLZRdU&list=PLowKtXNTBy...


I have been slowly designing a fantasy retro terminal/computer. Imagine something uncomplicated enough that kids could learn the entire architecture, without the distractions of the web. And maybe a fantasy network, more like what could have happened if we had extended FidoNet instead of getting IP (overlaid on the modern internet, of course).


Somewhat related: https://x.com/id_aa_carmack/status/569658695832829952

> Teaching my kids programming on an Apple //c is like kung fu training in the primitive wilderness.


I used to joke that retro computing was my answer to being stuck as a support engineer at AWS working on Kubernetes all day (I was bait-and-switched on what was originally billed as a Linux support job and dropped into the "Containers" team instead. I had no prior experience with containers and would prefer to never work with them again!)

I built the "World's Fastest Windows 98 PC" [1], kicking off a small YouTube channel, and have demoed it live at the local computer festival here in Seattle [2]. Thanks to Re-PC and others in Seattle, I've slowly accumulated dozens [3] of old machines of all stripes that absolutely fascinate and delight me with their quirks.

Despite being thrown overboard from AWS over a year ago now, I still very much enjoy retro computing. I would love to turn my hobby of supporting "legacy machines" into a "real" sysadmin career, but it seems like YouTube is a more likely bet on that front these days.

The biggest thing I've learned from retro computing is an appreciation for viewing things in their historical context, something that is surprisingly difficult to teach "younger people" or programmers who've only ever really worked at FAANGs.

Younger people today bang the drum about rewriting the world in Rust. Rust is great, and I'm not here to argue for or against it.

But when they bemoan using C they seem to misunderstand exactly how big a leap forward C was from pure assembly programming. Just having portable code in the era of everybody having their own bespoke UNIX system (with their own processor architecture to match, of course!) was an absolute game changer. C is entrenched because it was an evolutionary leap in a landscape that was shifting by the day.

Retro computing has taught me that people weren't nearly as "clueless" or "incompetent" as they're derided for these days. These were smart people making the best decisions they could at the time with what they had available.

People in the 1980s didn't "want" to cause Y2K. They simply never thought their software would still be in use 20 years later. How could they? Technology was moving at such blinding speed. Just 20 years earlier, people were still using punch cards and teletype printouts on mainframes as big as houses!

[1] https://www.youtube.com/watch?v=YETxI4rA_gs

[2] https://sdf.org/icf

[3] https://wiki.cursedsilicon.net/


> But when they bemoan using C they seem to misunderstand exactly how big a leap forward C was from pure assembly programming. Just having portable code in the era of everybody having their own bespoke UNIX system (with their own processor architecture to match, of course!) was an absolute game changer. C is entrenched because it was an evolutionary leap in a landscape that was shifting by the day.

This is very true, but it's also like saying that magnetic tape was a great improvement over punch cards. It definitely was, but that was also a long time ago.

C portability always had a huge asterisk attached to it, which was that the language was not going to help you at all with this, nor was it even going to define basic semantics for much of anything. It was a long time before the language defined signed arithmetic as two's complement, which was bad news for anyone retrocomputing on a ones'-complement machine: you could compile code written for one on the other, but there was no guarantee it would actually work.

People built portability systems, such as autoconf, on top of this.
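
A minimal sketch of what that asterisk meant in practice (illustrative only, not from any particular codebase): before C23 finally mandated two's complement, code that cared had to probe the signed representation and integer widths by itself, roughly like this:

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* Before C23, signed integer representation was implementation-defined:
           two's complement, ones' complement, or sign-magnitude. A classic
           probe inspects the low bits of -1, which differ per representation. */
        if ((-1 & 3) == 3)
            puts("two's complement");
        else if ((-1 & 3) == 2)
            puts("ones' complement");
        else
            puts("sign-magnitude");

        /* Widths were similarly loose: the standard guaranteed only minimum
           ranges, so int could legitimately be 16, 18, 32, or 36 bits. */
        printf("int here occupies %zu bits of storage\n",
               sizeof(int) * CHAR_BIT);
        return 0;
    }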

> Retro computing has taught me that people weren't nearly as "clueless" or "incompetent" as they're derided for these days. These were smart people making the best decisions they could at the time with what they had available.

I don't think the Rust movement thinks that Kernighan and Ritchie were incompetent. I've never seen anyone say that directly. What they do argue is that new information is available and circumstances have changed, which means that some decisions that may have been reasonable then are no longer adequate.

And this is not new information! One of the books I learned C from was "C Traps and Pitfalls" by Andrew Koenig, in 1989. The Morris Worm was 1988.

While old software often had higher standards for shipping reliability, because you couldn't simply update it over the Internet (or at all, on cartridges!), security was not held to the same standard, because it is much harder to achieve.

(There is also what I might call the "we were always against C" faction: people who, even in the 80s, were using LISP or Pascal (1970) or Modula-3 (1988))


Rust's notion of portability is to have types like i16, i32, and i64, which is a trick that C programmers have used for decades to dodge real portability.

The first machine that C ran on didn't even have 16-bit types. The int type was 18 bits.

Modern C does not guarantee that the inttypes.h header will make available int16_t or any of the other exact width types.

Real portability means not knowing exactly how wide the types are, yet making your code work anyway.
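
A minimal sketch of the difference in C, using a made-up sample_t typedef: the exact-width int16_t is optional, int_least16_t is always there, so width-agnostic code targets the latter and clamps values rather than assuming 16-bit wraparound:

    #include <stdint.h>
    #include <stdio.h>

    /* int16_t is optional: it exists only when the target has an exact 16-bit,
       two's-complement type with no padding bits. Testing INT16_MAX is the
       standard way to detect it. */
    #ifdef INT16_MAX
    typedef int16_t sample_t;        /* hypothetical name, for illustration */
    #else
    typedef int_least16_t sample_t;  /* always available, "at least" 16 bits */
    #endif

    int main(void) {
        /* Clamp to the guaranteed 16-bit range instead of relying on the
           type wrapping at exactly 16 bits. */
        long v = 40000;
        sample_t s = (sample_t)(v > 32767 ? 32767 : (v < -32768 ? -32768 : v));
        printf("clamped: %ld\n", (long)s);
        return 0;
    }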



> Technology was moving at such blinding speed.

And it still is. Only 14 years ago basically no one had a smartphone. Now you are "forced" to use one for some real-life applications. Also LLMs, etc.

There is just so much technology at the moment that it doesn't feel like such a big leap, but it's moving incredibly fast. Apparently it's just the hardware leaps that got smaller.


The hardware used to leap, and the software grew new layers of abstraction to keep up. Then the hardware speedups slowed down, but the software keeps growing like a cancer.


C was not a great leap forward from Algol 68 and what have you.


Retro computing, to me, has two benefits:

1. As mentioned by robinsonb5 -- "wise use of resources". I believe developers should test their changes on old hardware to ensure they will be fast.

2. How much I really miss 4:3 laptop displays; 16:10 is a decent second choice. I hate 16:9, which is the most common.

Updated per kazinator's post. I got 16:9 and 16:10 mixed up.


> 16:10, which is the most common.

I'm so glad we're back in this world. I had a laptop in 2011 that developed a fault on its 16:10 (1920x1200) screen. Dell ran out of replacement parts, and laptops had standardised on 16:9 screens by that point, so they ended up replacing the entire laptop with one with a 16:9 (1920x1080) screen.

I wasn't thrilled at the loss of pixels.


Big fan of more vertical space. The Retina aspect ratio on the newer MacBooks is at least a decent compromise, being closer to 3:2.


What's your beef with 16:10? I prefer it over 16:9 most of the time, unless I want to watch 16:9 video.


16:10 allows for 16:9 video with captions/subtitles and basic video controls off the frame. At 1920 pixels wide, for example, 16:9 video fills 1920x1080, leaving a 120-pixel strip on a 1920x1200 panel.


I was thinking of watching TV shows, where I don't want those things. But you're right, when watching e.g. educational videos, having more room for controls and subs is nice.


I got them mixed up. I meant I like 16:10.


Ah, ok.


Should have (2022) in the title.


Doesn't matter to vintage computing.


It helps to identify if you’ve read it before.



