Working from home at 25MHz: You could do worse than a Quadra 700 (arstechnica.com)
141 points by Tomte on Dec 11, 2020 | 125 comments



The rose-tinted lens of vintage computing strikes again.

I too fell for this numerous times. In fact I actually went back to an Acorn RiscPC 600 for a bit once instead of a modern Windows machine. It was what I considered to be peak productivity for me. Built-in assembler and programming language, decent quality desktop applications and no distractions.

After about 5-6 days it was back on eBay because quite frankly I realised what a pain in the ass it really was. 99% of the stuff I was actually doing was collaboration and it's really difficult doing that on vintage computing platforms. On top of that the hardware is almost always like a hand grenade with the pin out. It's going to die, but you just don't know when.

I cite this as an example for the comedy timing: https://www.youtube.com/watch?v=TU55-7dWMi0

Now emulation I can get behind, because when you are completely fed up with the idea you had, it's easier to just delete it than to get rid of real hardware on eBay :)


I owned a Macintosh Quadra 650 in 2001 -- bought it for C$60 from an outgoing senior -- and it was already long in the tooth when I had it (the original Quadra 650 was released in 1993).

It ran System 7.1. It was a beautiful machine but I couldn't get any real work done on it -- the browser (IE for Mac) was too slow, and ClarisWorks was too simple.

I believe there are pivot points in computer technology. A machine just after a pivot will last a long time, while a machine just one year earlier will age badly.

My main desktop was a Dell Inspiron 530 Core 2 Duo from 2005 to 2020. The Core 2 Duo was a long-lasting chip, and I could watch YouTube, browse the web on the latest Firefox, etc. with no problems at all, right up to the start of the pandemic this year (I recently upgraded to a 2014 Dell i7, which I'll likely keep for another 10 years). If I had a Core Solo, I'd likely have dumped it.

I eventually sold the Quadra (and the Mac II, Mac SE, the Sun SPARCstation 1 pizzabox, and all the other vintage machines I had in my college apartment). They were fun to own for a while, but ultimately impractical to keep around.


I finally upgraded my home machine this year, up from an i7 4770K, a roughly seven-year-old processor. The memory was starting to fail, and I figured it was finally due. I've replaced the GPU a couple of times and upgraded storage, but that's about it.

It's a nice boost for certain workloads that thread well, but it's not a huge boost for single threaded workloads except for certain specialised stuff where there are newer instructions to take advantage of.

Contrast that to a roughly comparable time frame: 486SX 60MHz -> Athlon 1GHz.

Short of a major revolution, I expect this CPU/Memory will last me just as long.


> If I had a Core Solo, I'd likely have dumped it.

I do note that there was never a Core Solo desktop CPU. There were a handful of single-core Conroe Celerons released, but all main "Core" series parts were at least dual-core (i.e. Core 2 Duo or Core 2 Quad).


There was no Core Solo desktop CPU but the Core Solo did exist in Mobile Editions:

* https://ark.intel.com/content/www/us/en/ark/compare.html?pro...

* https://ark.intel.com/content/www/us/en/ark/compare.html?pro...


Thanks. I remember that the Core Solo was used on some Apple machines, including one of the first Intel Mac Minis (a pseudo-desktop machine?).

https://lowendmac.com/2006/mac-mini-core-solo/


Correct. I had one. It sucked. But less than my XP machine.

I have the spiritual successor, the M1 mini now.


Pseudo is accurate. Mac minis have always been made with bits from the MacBook parts bin.


The Mac mini Core Solo did exist [1]. My parents owned one.

https://everymac.com/systems/apple/mac_mini/specs/mac_mini_c...


> 99% of the stuff I was actually doing was collaboration and it's really difficult doing that on vintage computing platforms

Maybe it is, maybe it isn't. Are we talking Zoom or IRC here?

If you need to do the sort of work where you just need to concentrate with no distractions, a "vintage" platform is a superior choice. A few prominent authors swear by their obsolete word processors for example. Many people (myself included) find it hard to focus and resist procrastination on any modern Internet-connected system. But I can get my head down in First Word Plus on my Atari ST (or Word 5.1 on a classic Mac) and upload the file later for formatting in Word and sharing by whatever means.


> Maybe it is, maybe it isn't. Are we talking Zoom or IRC here?

Much as I have fond memories of IRC, collaboration requires the other people to use the same tools as you, and you don’t always get to choose what that is.


> collaboration requires the other people to use the same tools as you, and you don’t always get to choose what that is.

Indeed - my point being that whether a vintage platform is viable or even superior for your workflow is very situation-dependent, it's not "all or nothing" nor "one size fits all".

Hell, I am reading HN now when there is something I could be working on. I should go into the other room and boot up the ST.


Doesn't Slack allow you to email a channel?


Exactly that


Can you really say it's a rose-tinted lens if the author actually did use the system and writes from current-day experience?


Yes. It may be fine for the author, but the article promotes this as possible for everyone, or at least many other people.


> makes it a great choice for some classic Macintosh gaming and nostalgia tripping.

> the machine will always be hobbled by its technical limitations.

> I would never go as far to say that the Quadra 700 should have a place on your desk


And yet you should wonder if the fact that we all think regularly of going back might suggest something.


> Now emulation I can get behind

Makes sense. Other than perhaps latency, if software was truly better in previous decades, you can just run old software on new hardware [1] and get the best of both worlds. Given infinite time, you could probably find a way to boot directly into some kind of hypervisor that runs RiscOS or whatever full-screen. While also running LVM underneath it for online backups. Oops, a new software feature.

It's related to my pet peeve about Gemini: You can just choose to do all the stuff Gemini does with your own safe subset of HTTPS. If you have the political power and dev hours to maintain Gemini, you can also just maintain HTTP/1.1 with TLS.

Running Acorn hardware won't fix software bloat created by anyone but yourself, and running Gemini won't fix HTTP bloat created by anyone but yourself. Might as well run good software on good hardware, instead of running bad protocols or bad hardware out of spite.

[1] I'm on the young end of millennials, so my nostalgia is for Windows XP. But there's a valley for retro nostalgia - It's okay to use modern Linux, because you need security patches. It's okay to use Windows 10 because your employer forces it. It's okay to run an Amiga or RiscOS ironically, because it's cool and funny. But Windows XP? No! Evil security hazard! Deprecation means the software became bad overnight! Why won't you upgrade??


As a fellow young millennial doing red team security stuff professionally, XP is different from the others for a few reasons.

The first is that XP has a tendency to, by default, open a lot of forward-facing holes and services that to my knowledge most other "old timey" OSes don't. I'm not familiar with RISC OS or AmigaOS, but I do know old versions of OS X ship with basically no attack surface, while XP (IIRC) will enable SMB and guest access automatically.

The second is that unlike those other OSes, XP was and is very widely used, and there are exploits which target XP and a range of other commonly used Windows machines. Unlike basically any other niche retro hardware, Windows is both 1) pervasively networked and 2) very widely attacked, and lacks basically any modern mitigation for exploits, so a simple buffer overflow in a networked service can easily become an RCE.


> and running Gemini won't fix HTTP bloat created by anyone but yourself

No, but it does help build a community of like-minded people. And that is really the core idea of Gemini; isolationism is sort of a feature there. I see Gemini more as a flag people can rally under; the technical aspects are incidental.

(note: I don't use/run Gemini so I might be misrepresenting them here)


Windows operating systems may themselves represent the 'pivot points' that wenc talks about. Every so often, a version of Windows seems to come along that gives you a genuine improvement over what came before - in my view Windows 98 SE, Windows XP and Windows 7 for example, which may represent the nostalgia valleys. Other intermediate versions seem to at best introduce new problems to replace the old ones.

Of course, it would be great to have the Risc OS experience while doing modern computing tasks, so I'm pleased to hear that the lessons it brought to the world have not been forgotten.


I'm inclined to agree, but I do think that 95 needs to be on that list as well simply because it was such a massive step up from the MS-DOS plus Windows 3.1/3.11 era. 98 itself was a bit of a misstep, but I do agree 98SE was the next major improvement point.


Perhaps I'm a bit biased against 95 as I was happily using Risc OS at the time :) My memory of Microsoft from back then was that the Office apps could be quite capable, but the underlying OS was just something you had to put up with.


I don’t think you can reasonably expect to get useful work done on a vintage computer, although I occasionally use my 512k Mac for writing markdown. Mostly old computers are fun to repair and tinker with, and play old games.


> Now emulation I can get behind because when you are completely fed up of the idea you had it's easier to dispose of it than getting rid of it on ebay :)

Attaboy. I went the other way and realized that I wanted to buy some very modestly priced emulator-on-a-SoC for an old gaming console near and dear to my heart. That gave me the best of both worlds -- form factor and tactility of the real console with cost, speed, battery life of a present day android device. I must say, for a $40 Android device, I use it a lot more than I thought I would.


> …I use it a lot more than I thought I would.

I can't help but be curious: What are you talking about?


Doesn't the new Raspberry Pi 400 fit this role pretty perfectly?


Most people have missed the most obvious head-canon in that Jurassic Park scene - when Nedry is talking on the 'video phone' to his guy he's really just watching a QuickTime movie .... you can see the horizontal scroll bar moving as it's being played.

Of course, at the time almost no one had really seen video on a personal computer, so to the vast bulk of the movie-going audience it was just futuristic.

Also, in order to film that scene the video card had to be slowed down to 24 frames/sec and synced to the camera. It looked great on film but was probably headache-inducing for the people present.


Could have run at 72Hz (3x 24Hz), a normal refresh rate for many video cards & CRTs of the day. The SuperMatch 20-T in the movie could do 72Hz (why do I know this...)


could have .... but we at SuperMac had one of our best engineers on set for the filming doing what the film crew asked for ....

(running at 3x would likely have resulted in banding)


oh very cool! Must have been a pain to get the timing right to sync with the film. I assume you modified the video drivers then, or was it solely hardware based? Was it using a Thunder II? Is there anywhere these stories are written up?


From memory this was very much pre-Thunder cards (I designed the Thunder graphics accelerator).

We had done a lot of early video-work prior to quicktime actually being released, we had hacked together a motion-JPEG card as a daughter card on one of our unaccelerated cards that could play video in a window, I suspect this was based on that (but could be wrong, it was a long time ago!).

We certainly hacked on the drivers to slow down the refresh rate, but that was just a matter of changing some numbers in the video timing controller - in a time prior to multi-sync displays it may have involved getting a bespoke display manufactured.

I don't know of any archive, here's a few stories:

- when we first went to Sony to buy 19-inch monitors we asked them for 15k/year, they laughed at us, they had never ever made that many; unlike TVs, 19-inch monitors were made by hand on a single manufacturing line, and they didn't believe there was a market for that many. In the end we sold 25k+

- when we first sold monitors to Australia/NZ they were crap, people returned them; turns out all monitors were aligned in Tokyo facing east .... southern hemisphere ones needed to be aligned in a special magnetic Faraday cage

- we first demoed QuickTime in a window at MacWorld on an HDTV monitor, one of only 5 in existence in the US at the time - most people had never seen video in a window, or a screen that big, before - the monitor was so heavy it required a forklift

- Premiere was written by a Supermac employee, partly in his spare time, to work with our video capture card, we sold it to Adobe, he quit and followed it very soon after

- one April Fools' the boss carefully locked his door and set traps (people went in thru the false ceiling). Next morning he started work: his screen slowly got greener and greener; he rebooted his machine, it happened again; he started removing suspicious inits from his system folder, rebooted again, it did it again .... just before he wiped his hard drive and did a full install, people offered to put his original video card's ROM back in ....


Speaking of faraday cages - one of the NYC firms got a sweet deal on office space. Turns out it was close enough to the metro/rail system that it would cause a magnetic disruption to the monitors. The "cheaper" fix was putting in flat screen monitors that were crazy expensive back in those days. Apparently the hardware cost wiped out any rental savings they thought they were going to have.


heh - before I worked for SuperMac I worked for the company that ported A/UX to the Mac for Apple. The SuperMac guys brought in one of their early (not Sony) monitors for me with the hope that I'd make it work on A/UX for them (I did) .... I was sitting there showing the port of X I'd just done to some co-workers when suddenly the screen did this weird twisty spiral thing .... just outside the window a truck carrying some ex-BART rail stock was driving past ... essentially a giant bar magnet


UniSoft? Cool, what parts of A/UX did you work on?


Lots of stuff, I did many of the device drivers, the low level display console, the appletalk stack, the kernel event manager, the code that would build new kernels with new device drivers when stuff changed - A/UX 1.0 was essentially our stock System V port with BSD networking with drivers for the Apple hardware


Thank you for the stories! The folks over at 68kmla would probably love to hear from you, if you're interested in reliving those times a bit.


My naive assumption would've been that they could composite it into the scene. After all that movie was a relative breakthrough in compositing more difficult digital content into live action.


This reminds me of my first ever video file on a computer.

Don't Copy That Floppy on an LC III via a Nautilus subscription CD.

It was like a minor version of what I suspect the moon landing felt like. I watched with my jaw dropped, thinking about the day all video would be on computers.


Kind of surprising to me, given the thing that was most historically significant about that film, that they would have opted to do something like that, rather than just dropping some computer graphics in after filming.


You may be underestimating the cost of digitally replacing images in post production (in 1993 no less) -- why bother with CGI when practical effects get the job done?


yes, exactly, and as I mentioned most people at the time had never seen video playing in a window on a computer, the moving scroll bar didn't mean anything, it was just part of the magic


Exactly. It was way too early, in 1993, to be doing that sort of thing in Post. Much cheaper to do it on set.


That scene drives me nuts every time I rewatch Jurassic Park.


>was probably headache inducing to the people present

I doubt it. It probably just looked like film, no?


the phosphors on a monitor designed for 60+Hz are designed to degrade quickly, at 24Hz they'll be off ~1/2 of the time

Remember that when film plays back the projector shows you the same image for most of the 1/24 of the time before switching to the next frame


Random trivia: many of the old film cinema projectors are running with a flicker rate of 48Hz (showing each frame twice) or 72Hz (showing each frame three times).

https://en.wikipedia.org/wiki/Movie_projector#Shutter


PAL TVs ran at 25Hz. They most likely could have found a monitor that would have looked fine. It's hard to say what they actually did just from what you see in the film.

Filming monitors and TVs was and is a very common thing. You can probably get away with filming a 30 fps monitor if your shutter speed is tight enough and you're ok with a few pulldown frames sneaking in.

>Remember that when film plays back the projector shows you the same image for most of the 1/24 of the time before switching to the next frame

Is this true? Isn't it 50/50 or so? Take a look at a rotary disk shutter. Film is pretty delicate and you can't pull it through that fast.


TVs use different, slower phosphors than computer monitors; they also scan alternate (50 or 60Hz) fields within a frame to help with the illusion


Film projectors also actually show the same frame 2-3 times in a row to reduce perceived flicker (effectively giving you 48 or 72 Hz)


Want something fast? Mac IIci, 8MB memory, System 6. Word 4 screams.

Boots in literally seconds, cold boot is longer due to memory check.


It will never stop annoying me that my telephone takes longer to boot than the computers I grew up with.


Sigh. But the computer you grew up with couldn't even handle the colour palette and resolution of the screen that you now carry around in your pocket.

The computer you grew up with couldn't deal with taking, manipulating and storing photos from your phone's camera(s).

Sure, your phone could boot faster, but you also need to compare to the capabilities of both machines. Plus there's no need to reboot your phone on a daily basis anyway.


> Plus there's no need to reboot your phone on a daily basis anyway.

IMO this is the only vitally important point. If my phone’s screen was 640x480 and 8-bit colour, it would still be fantastically useful.

The fact that 4G cellular internet is higher bandwidth than the system bus on my first Mac (Performa 5200), the fact that every image the camera takes is too large to fit into that Performa's RAM even when it was fully upgraded, the fact that a single Netflix film takes up more storage on my phone than my Performa had in total, is an impressive feat; but I use most of the impressive technical performance to share dumb photos of weird cooking experiments [0], for the purpose of social connection with distant friends, and I could manage those connections in other ways if the tech was not there.

In practical terms, what I care about is the instantaneous responsiveness — the fact that the screen lights up when I lift the device, the way the system unlocks with facial recognition so I don’t have to enter a passcode, the way most apps are about a second away from use, that they auto-save, that crashes are rarer.

And that is because, as you say, there is no need to boot up the phone whenever you want to use it.

[0] e.g. https://travellingcurious.wordpress.com/2020/12/05/greenies-...


It's weird though isn't it...

> the computer you grew up with couldn't even handle the colour palette and resolution of the screen that you now carry around in your pocket

the other way of looking at this is that the phone is enormously more powerful, can render hi-res full colour depth 3D scenes at 60fps that would take minutes per frame on a 90s computer, can store and transfer amounts of data that would have been unimaginable back then

so why is it so slow to boot up?


In part…updatability.

Let's say you bought a 100-baseT network card in 1995. The features were fixed in metal, and so were the bugs. Even if a new revision of the board or ROM came out, you still had the exact same card. If a flaw was found in one of the chips they used, chances are someone would need to program around it. Settings were probably done once using DIP switches and never touched again. You probably never even knew if a new revision came out.

The wifi card you use today is going to boot up in a pretty dumb state. It'll identify itself on the bus and not much else. Your computer's kernel is going to poll every device on the bus to see who's out there. The wifi card driver is going to match the ID of the wifi card to one it knows and then upload firmware. Then the card will need to briefly reboot into the new firmware and identify itself to the driver again. Then the driver is going to configure it.

It may take a couple of seconds for all this to happen, and it needs to happen for almost everything. Even your CPU which booted the machine in the first place might get patched microcode uploaded when the machine boots.

Nothing is just a device anymore, and nothing is just a device driver.
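
For the curious, the common shape of this on the Linux side is the request_firmware() interface: the driver asks the kernel for a named blob, pushes it to the device, and only then finishes bringing the device up. A rough sketch only; the firmware filename and the upload routine are placeholders, not any real driver:

    #include <linux/firmware.h>
    #include <linux/device.h>

    /* hypothetical device-specific upload routine (placeholder, not defined here) */
    static int my_upload_firmware(struct device *dev, const void *img, size_t len);

    static int my_driver_load_fw(struct device *dev)
    {
        const struct firmware *fw;
        int ret;

        /* ask the kernel for the named blob (loaded from /lib/firmware or initramfs) */
        ret = request_firmware(&fw, "vendor/wifi-fw.bin", dev);
        if (ret)
            return ret;               /* no image found: the card stays dumb */

        ret = my_upload_firmware(dev, fw->data, fw->size);
        release_firmware(fw);         /* the kernel's copy is no longer needed */

        /* the device typically restarts into the new firmware here and
           re-identifies itself before the driver finishes configuring it */
        return ret;
    }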


Thing is, though, there's nothing really stopping devices from having a middle ground: have the device persist the firmware, and only have the kernel feed the device a new version of that firmware if actually needed (e.g. because there is indeed a newer version of it).

Ultimately, though, the device enumeration and initialization is a tiny fraction of the startup time on most computers, be they desktops, laptops, servers, phones, tablets, or what have you. Usually the actual source of long startup times is the incessant need for these machines to spin up oodles and oodles of background services doing who-knows-what, and it's remarkable how much faster a machine boots when these services are pared down to more reasonable minima. And worse, it's these background services that often make modern computers feel so slow even after they're booted up.


A phone needs to reboot like once a month.

A computer from the 80s probably rebooted multiple times a day.

How much effort is someone going to put into an event that happens once a month vs investing that same time into something like optimizing battery life or taking better pictures?

Also, in my personal experience with Android/Linux mobile devices, phones in 2020 boot significantly faster than phones in 2010.


Yes, the telephone is also superior in every way to the computers I used until well into adulthood! But it's difficult to break the persistent feeling that while those antiques were very limited, the software was much better optimized for what it did. Like most Mac users in my generation, my word-processing productivity peaked on Word 5.1a and everything since then has been a disappointment.


I don't know about word processing performance, as I never did word processing.

I worked with (3D) graphics and computation intensive processing as well as modelling packages and the experience was horrible compared to today.

I have particularly fond memories of the workflow we used for a film project back in the late 90s. The lab only had so many editing stations and they were shared across multiple groups. So for each editing session we first had to copy the project from multiple CDRs onto the editing machine (which took ages on the 4x SCSI CD drives), edit, then burn the project to a set of CDs - performing various rituals to ensure the ROM image wouldn't be corrupted by a butterfly coughing in the corner or something - clean up the HDD so the next group could get working and hope the CDRs were even readable.

The editing experience itself wasn't great either: forget about real time scrubbing or full resolution previews. Today you can edit films on a phone much quicker, more comfortably and with significantly higher quality. No disappointment there for sure.

Same goes for 3D modelling packages, rendering and just plain number crunching in Mathematica, Maple or MATLAB.

Word processing is the one area that simply is so primitive by its very nature, that it's trivial to reach peak-efficiency without throwing tons of compute power and tech in general at it. If word processing is all you do, you don't need more than a glorified electric typewriter.

For many, many other applications - and yes, that even includes just taking and sharing pictures in real time - today's technology is vastly superior to anything the 80s and 90s had to offer.


I work in scientific computing so I definitely don't miss SGI workstations either. My complaint is just that even the simplest tasks, like word processing, end up taking orders of magnitude more computing power (or just don't work very well). The average commercial web page is an extreme example of this - I'd rather go back to mid-90s layouts and animated GIFs than endure most news sites. (I wonder how much of the demand for steady increases in computing speed was simply driven by the need to slam the consumer with as many advertisements as possible.)

But yes, it is impressive that my $500 phone blows away the $50,000 workstations I started on.


I don't see why that makes a difference, though. The screen... okay, maybe a couple more milliseconds to draw the extra pixels, a bit more data for the hi-res textures/whatever, but given the CPU speed and dedicated GPU (no more rendering all graphics directly from the main CPU) that shouldn't matter. The camera isn't even on during the boot process, much less in use. On my phone at least, the system takes a few seconds after boot to connect to network, and I get that. But beyond that, it's not like the system needs to load every app into memory as it boots up, so why should the extra capabilities matter?


I reboot my phone all the time, because the fingerprint reader stopped working, and for some reason that makes it work again. I could enter my password after X number of failures, but rebooting is easier.


My 10 year old laptop boots faster than my phone. I think they have similar capabilities.


I am 100% certain that your 10 year old laptop uses at least two orders of magnitude more power to achieve that.

I'm also not quite sure about the capabilities. A 2014 $150 entry-level phone like the Motorola MotoG G4 has an eMMC controller, SD-card interface, two separate camera systems, WiFi, BT, 4G modem, GNSS (GPS/BeiDou/GLONASS), USB 2.0 (client), USB OTG (host), accelerometer, gyro, proximity sensor, compass, FM radio plus the usual stuff (DRAM controller, GPU, audio subsystem, battery controller, etc.).

That's a lot of components that the OS needs to initialise and load drivers for on boot. And all that needs to happen on 1 GB 533MHz LPDDR2 RAM and 4 Cortex A7 cores at 1.2 GHz.

Just to give a point of reference: that's less computing power than a $35 Raspberry Pi 4 and significantly slower memory and storage.

A 10 year old laptop would likely feature a dual core Intel Core i3, i5, or Core i7 with 35W up to 75W TDP clocked at between 2.5GHz and 2.9GHz, dual channel DDR3 memory running at ~530MHz with a 4x multiplier (i.e. 2133MHz effective vs 1066MHz on the phone) and your BIOS is most likely set to "fast boot", meaning it'll skip 90% of the hardware initialisation anyway. I also doubt your 10 year old laptop would last that long on a 2070 mAh battery (my own 9 year old model has a ~44000 mAh battery - just for reference).

TL;DR yes, even a vintage muscle car is still faster at 0-100 (or 0-60 if you prefer) than a 5 year old compact car. That shouldn't come as a surprise if you compare the specs.


What phone are you using? My pixel starts up in under 15 seconds. Faster than the first macs I had access to (power mac).


I remember my Microbee starting up from floppy disk in substantially less time than that.


You may like the Librem 5, a GNU/Linux phone booting in <15 seconds and giving you full control of the OS.


Mini vMac https://www.gryphel.com/c/minivmac/ exists for anyone who wants to try an office suite that doesn't lag.


System 7 is better and not as taxing as 8.


Looking at the IIci right next to me, it takes more than seconds to boot (well, with the config I have on it) but yep, I worked from home on that for years, running X over SLIP and a modem. Fun times.


I still think Word Perfect on the Mac was the better word processor and a bit faster.


Install System 7.5 on there and let us know what happens to boot time, though.


I had system 7 on an SE, so, yeah... slower.


Yeah!

No login, no passwords.


https://arstechnica.com/gadgets/2016/09/an-os-9-odyssey-why-...

There are apparently some DNA sequencer machines that only interface with older Macs. The linked article is just an 'exhibit A'... It doesn't go into the technical details as to why that is so. I can't find the article on why virtualization is difficult for this case, but it was very interesting. Cooperative multitasking (Mac System 5, System 6, System 7-ish era) meant that the running task had the responsibility to yield. If the task was something like 'sample the value of X instrument register every 10 ms' (or something like that), then it could do the quick maths based on the base system clock rate and know exactly how many cycles to sleep between each sampling.

In other multitasking scenarios, where the OS has its own things it wants to do, the program cannot deterministically know how many cycles to sleep. What if the process is arbitrarily delayed or interrupted, so some samples are taken 17 clocks apart and some 19? With the priority being JITTER FREE, it doesn't help if the OS thinks "ahh, I have a million clock cycles this second and only 1000 things to do, I'll just do some housekeeping"... so there are background processes using bus resources, network interrupts, etcetera, all batched up and dispatched because the OS thinks it has time to procrastinate this one task that just wants to sleep, sample a register, and sleep... but sleep for an EXACT duration.
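
Roughly what the jitter-free version relies on: under cooperative multitasking nothing can take the CPU away from the loop, so the spacing between samples depends only on the loop's own timing. A toy sketch in C; the register address, tick counter and tick constant are invented for illustration, not from any real sequencer:

    #include <stdint.h>

    /* hypothetical memory-mapped instrument register and free-running
       hardware tick counter -- addresses invented for illustration */
    #define INSTRUMENT_REG  (*(volatile uint16_t *)0x50F0A000)
    #define TICK_COUNT      (*(volatile uint32_t *)0x50F0B000)
    #define TICKS_PER_10MS  7833u   /* worked out once from the known clock rate */

    void sample_loop(uint16_t *buf, int n)
    {
        uint32_t next = TICK_COUNT + TICKS_PER_10MS;

        for (int i = 0; i < n; i++) {
            /* busy-wait: nothing can preempt us, so each sample lands
               within one tick of its target time */
            while ((int32_t)(TICK_COUNT - next) < 0)
                ;
            buf[i] = INSTRUMENT_REG;
            next += TICKS_PER_10MS;   /* schedule from the target, not from "now" */
        }
        /* only after the run does the task yield back to the OS */
    }

On a preemptive OS the busy-wait can be interrupted at any point, which is exactly where the 17-vs-19-clock jitter comes from.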


>There are apparently some DNA sequencer machines that only interface with older macs.

I used to work in a Human Genome Project lab. Two of the 1990s workhorse DNA sequencing machines, the Applied Biosystems models 373 & 377, connected to Mac machines of similar vintage (my recollection is we had a lot of Power Mac 7100s, or at least machines in the 7xxx form factor).

IIRC the sequencing instrument had an interface to plug into the LocalTalk port on the Mac, but I might be wrong on this, it's been 20+ years. And the software needed the classic MacOS.

The ABI 373 and 377 are so old and obsolete at this point that you might find one in your local science museum. I know my city's science museum has a display with an ABI 373 from my old lab.

ABI's successor Sanger sequencing machines, models 3700 (1998) and 3730 (2003), had Windows-based companion computers and connected to the instrument via Ethernet port, so you didn't have as much lock-in to super old hardware.


I've always felt that when dealing with hard real-time requirements, the real-time functionality should be offloaded to a separate dedicated processor that acts as a buffer.

As another option, it would be nice if modern OSes could dedicate cores on multi-core systems to specific real-time processes. Then the OS can still do whatever it needs, while the real-time process gets guaranteed CPU treatment.
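
You can get a fair approximation of this on Linux today: keep the stock scheduler off a core (e.g. with the isolcpus= boot parameter) and pin the real-time process to it with a real-time priority. A rough sketch; the core number and priority are arbitrary and error handling is minimal:

    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    int main(void)
    {
        /* pin ourselves to core 3, assumed isolated via isolcpus=3 on the
           kernel command line so nothing else gets scheduled there */
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(3, &set);
        if (sched_setaffinity(0, sizeof(set), &set) != 0)
            perror("sched_setaffinity");

        /* ask for a real-time FIFO priority so ordinary time-sharing
           tasks can't preempt us */
        struct sched_param sp = { .sched_priority = 50 };
        if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0)
            perror("sched_setscheduler");

        /* ... the jitter-sensitive sampling loop would run here ... */
        return 0;
    }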


I have been toying with this idea for far longer than I'd ever care to admit. It works very well.

Preemption is evil in a multicore world. Still necessary in spots, but evil nonetheless.


Even without OS preemption, or SMM on x86s, modern desktop CPU caches can cause large jitter. Writing/reading the GPIO registers can take a variable number of cycles. Though I've toyed with disabling Linux scheduling on the RPi3's last core and manually running a program on it. Fairly easy to do. It really does help jitter, but it's still not as reliable as a dedicated MCU.


> Email applications such as Mulberry are still just as useful today, just make sure to set up your IMAP and SMTP correctly and you're good to go

Really? There are email clients for Mac OS 8.1 that can understand IMAP and SMTP over TLS 1.2? Because otherwise you'll be using some old thing that only supports plaintext authentication.

The most secure way to do email on such a thing would probably be to ssh to another local machine on your LAN and run a terminal-based email client from that.


Assuming you trust your LAN, you could probably run stunnel with modern TLS to add encryption. Maybe you could get that to run locally too.
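
Something along these lines in stunnel's config (addresses, hostnames and ports here are placeholders) would let the old client speak plain IMAP/SMTP to a machine on the LAN while stunnel handles modern TLS to the real servers:

    ; stunnel.conf sketch: plaintext on the LAN side, TLS upstream
    client = yes

    [imap]
    accept  = 192.168.1.10:143
    connect = imap.example.com:993

    [smtp]
    accept  = 192.168.1.10:587
    connect = smtp.example.com:465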


I had completely forgotten about A/UX, what a blast from the past. Has A/UX gotten any recent open-source attention?


One feature of A/UX that I'd like to see in current Unix and Unix-like systems is Commando. It was a graphical help system for command-line tools.

You would invoke Commando from the shell for a command and it would bring up a dialog to help you construct the command line, which would be written back to the shell.

There's a screen shot of it for ls here [1], and mount here [2].

[1] http://toastytech.com/guis/aux3.html

[2] https://retrocomputingforum.com/t/running-a-ux-apples-at-t-u...


I wonder if Apple got the idea from IBM's OS for the late-80s AS/400 minicomputer line (today's System i servers). The OS/400 command language shell used its own rich knowledge of each command's syntax to present a full-screen form you could fill out to interactively compose a command: https://www.ibm.com/support/knowledgecenter/ssw_ibm_i_74/rza...

(Also, OS/400 commands all follow a very strict naming convention, such as the example WrkActJob in the link. The consistency of the words used and the abbreviations for those words were intended to make learning easier.)

You see a bit of this today in PowerShell's command auto-completion for built-in commands, PowerShell script files, and cmdlets (user-defined commands written as PowerShell script functions or .NET CLI methods). The graphical PowerShell Integrated Scripting Environment (ISE) even pops up windows that let you interactively fill out forms to enter command options, just like Commando: https://blog.netwrix.com/2018/02/21/windows-powershell-scrip...

Regrettably, PowerShell ISE is deprecated, and the recommended replacement of Visual Studio Code with the PowerShell extension isn't an exact one.


It's not recent, but there are some pages with "older" open-source attention like http://www.nleymann.de/appleAUX/AppleAUXMain.htm

I suspect that, like other pre-SysVR4/Solaris-y unixes, compiling anything newer than GCC 2.95-ish will be pretty much impossible. But, depending on your interest/curiosity, it looks like there's some fun stuff to play with.


What attention would it get? I mean, it's a neat OS, but it's an ancient enough unix that porting packages to it would be somewhere between difficult and impractical (even if you could compile ex. modern gcc for A/UX, it would need more disk+RAM than is likely to be on a machine of that age), and I'm not aware of any features (or even applications) that would justify trying to clone it (beyond the cool factor).


Random trivia. Back in the day, Apple was being boycotted by GNU, so the open source situation on A/UX wasn't great even compared to other weird unixes.


I run NetBSD on my Quadra 950, found a 100base-T ethernet card for it on eBay. It could have 256MB of RAM but I haven't gone that far yet.

Original Macintosh software like WriteNow feels really fast on it on System 7.


> I run NetBSD on my Quadra 950,

> on System 7

Dual boot, or VM?


The machine boots System 7, I have some native Macintosh applications installed. NetBSD/mac68k provides a Mac application that takes over the whole machine, loads the NetBSD kernel into memory and starts running it, the only way to exit from this is to reboot.


Oh, that's really cool. A sensible abuse of the lack of memory protections, I assume - reminds me of GRUB4DOS, which did much the same.

EDIT: Yep,

> The Booter also will certainly fail if Virtual Memory is enabled, so turn that off too while you're snooping around in the Memory control panel.

- http://www.netbsd.org/ports/mac68k/booter-manual/index.html


"This computer was released almost 30 years ago. On paper, it should be inconceivable that this can at all fit into a modern workflow. Present-day computers are gigascale monstrosities that should smoke something as old and plucky as the Quadra. And yet, they just... don't."

So terribly true. Feel the same way about my old 14 MHz Amiga.


That Infini-D screenshot is giving me a stronger pang of nostalgia than smelling my high school girlfriend's perfume would.


I absolutely love the Quadra 700 case, but getting one here in Denmark is almost impossible. I'm also not comfortable getting a working or repairable Quadra 700, just so I can attempt to fit a mini-itx board into it.


There could be a market for modern 3D printed mini-itx cases in the Snow White design language.

Just saying.


Interesting that he says it struggles with Wolfenstein 3D. I spent a majority of my time as a teen playing Bungie's wonderful Marathon on my PowerBook 165c, which was a 68030 running at 33MHz. In a pinch, it could be played on a IIcx with a 16MHz (?) 030, but only with a reduced resolution.

I still have the Powerbook, but it won't boot. I regret ever selling my PowerMac 9500 though.


> I regret ever selling my PowerMac 9500 though.

I also regret getting rid of my PowerMac 7600. Still have my SE/30 however! And a network card I have been meaning to install in it but can’t find the right screwdriver. With that I think I could do some “real work” on it.


"Apple recommends not running the 700 for longer than 20 minutes with the case off, otherwise the passively cooled 68040 processor melts down. Not the best design, but it works fine with the case shut."

Can you imagine if they tried something like that today? The cheese grater will not even turn on with the case removed


I used to write on a IIcx with an Apple 'Page' display and Word 3.0. Honestly, while I have left the Mac platform since the PowerPC disaster, that writing setup was very comfortable, and it would still be if you could focus on a pure single-tasking word-processing task.


What was disastrous about the PowerPC? Sure, they eventually moved on to x86, but I don’t think PPC itself really caused big problems. Apple was already in serious trouble at that point; PPC on its own didn’t save them but at least it didn’t kill them.

The early PPC Macs were good machines, and the brief flirtation with licensed clones at least allowed me to buy a Mac-compatible computer (Power 100) that I wouldn’t otherwise have been able to afford.


> What was disastrous about the PowerPC?

The G4 was amazing, but right from the start Motorola couldn't get acceptable yields on the higher-clock versions, and that kept happening. The IBM-sourced G5 was a power hog that was clearly not meant to be a consumer CPU.

The first computer I owned was a G4 Power Mac, I ordered with the student discount the day after it was announced. I got the middle 400MHz version. Apple was forced to abandon the high-end 450MHz version shortly after launch.

If Motorola could have manufactured the G4 according to plan, Apple probably never would have switched to Intel.


I lived through that era and I’m pretty sure they launched with 400, 450 and 500MHz. After getting back-ordered and not being able to fill demand for the high-end configurations they dropped the clock speeds to 350, 400 and 450MHz. Imagine the shitstorm that would be stirred up today if Apple dropped the specs of a Mac or iPhone after launch.


Yes, this is accurate. We purchased the 500 MHz at launch and then it got delayed for several months. I believe we were offered to downgrade to a 450 or wait it out, and we waited it out.


Thanks for reminding me! I did get the 450MHz version.


The transition was the disaster. The first-gen PowerPC Macs performed significantly worse running common user applications, while costing a premium. This was a software issue, but it left a very sour taste and many, including me, transitioned to Wintel as a result.


It depends on what you did. Photoshop wasn’t fully recompiled at first but they did get key performance tasks written to a PPC plug-in running inside 68k emulated Photoshop.

And it screamed. Faster than anything before it.

The fact that emulated 68k code could call native PPC was nuts to me but it was how most of the classic OS worked on PPC, with native parts being slowly rewritten over time.


I doubt there's benchmarks from ye olde days, but are you sure? I thought the first PPC machines managed to outrun 68040 machines, even in emulation. I fully admit my recollection could be wrong there.

But regardless, that entire pre-G3 era was the nadir of the platform. Those were dark days.


It was surprisingly similar to the current ARM transition if I’m remembering correctly. Many, but not all applications actually ran faster even in emulation. But there were certain tasks that didn’t emulate well and the pundits cautioned to wait to upgrade until the tools you used were fully ported.


I don't remember anything that bad either. They didn't fit the thermal requirements of laptops that well, but even the G4 with Altivec was plenty fine for many things.

The Intel move still made lots of sense, but that didn't make PPC unusable.


I still bail out when a Linux install won't pick up my wifi without intervention, yet I put OpenStep on a DEC, LinuxPPC on an iMac, NetBSD on a Performa, and BeOS on something I forgot I had.

Had these phones never existed, I'd get around to the Amiga that's in the garage.


I badly miss my old PowerMac 8600 -- I filled that sucker with RAM and had a big, persistent RAM disk to boot from, and it ran MachTen (for Emacs) and everything else I needed for work (CodeWarrior) and of course, it was the best Myth II machine ever made.

Sigh.


I was a teen in the early 90s, and was super into computers. I remember seeing one Quadra in real life. It was at my friend's house -- his dad was a composer for TV, and the Quadra was his main computer for that work. He ended up getting a couple of Emmy nods.

We weren't allowed to touch the computer or even get near it, and at the time that really pissed me off, but as an adult I totally get it now.


I begged and begged and got a Centris 610 (4MB RAM/80MB HD) without a CD-ROM drive when I was young. It was certainly a step up from the LC series with also a Motorola 68040. At least it wasn't a Performa from Circuit City lol. Those were awful.


Damnit. I've been watching some Quadras on eBay, I'm guessing more will join me now.


And meanwhile I've been on-and-off watching out for the other famous computer from Jurassic Park: the SGI Crimson.


I wonder how many worms are still looking around for System 7/8 machines to infect. One? Ten? It would be neat to get a virus on one of these in 2020.


Ah, the Motorola 68040. The last CPU I could wrap my head around. I still have, somewhere in a drawer, the full line of 680x0 CPUs.


I used Linux Nubus on a 6100 pizza box for a while.

I had to first boot MacOS6, and then jump to Linux via some kind of loadlin for Mac.

That was around 2000.


Should you be interested in including blogging in your workflow, you may be interested in a platform which would be compatible with that version of Netscape. Or write your own!


I imagine that not using a web browser for authoring blog posts would make the most sense. Maybe something like "post by email" would work better? https://wordpress.com/support/post-by-email/



Thanks for sharing this! I only now realize this is also available for Windows, which is much easier for me to emulate than anything Mac.


My engine supports back to Netscape 2.0, Mosaic 3.0, and IE 3.0, among many others. Support for even earlier browsers is intended.




