
Let's be clear, those workstations were hella expensive. (The Amiga was not in the true workstation range, but rather more of a glorified home computer; its workstation equivalents would probably be the stuff from NeXT.) Their closest modern equivalent is probably midrange systems like whatever Oxide Computer is working on these days. A workstation was simply a midrange-level system that happened to be equipped for use by a single person, as opposed to a shared server resource. The descendant of the old minicomputer, in many ways.



I'd say when you get into a fully kitted Amiga 2000 Video Toaster you get into 'workstation' territory, at least for my admittedly personal definition of 'workstation'. For me a 'workstation' is a machine built and optimized for a task, one that primarily runs that task and that task only. Sometimes it is the 'core hardware' that is interesting, but often many of the peripherals are more interesting. Things I consider workstations include Avid and other video editing systems, machines built for CAD, and yes, many of the 'desktop' SGI machines, which generally did nothing but run software like Softimage all day, every day.

The 'workstation' largely died because general off-the-shelf machines became fast enough to perform those tasks almost as well. You now see a more open market for the peripherals that help 'specialize' a general-purpose computer: Wacom tablets, video capture devices, customized video editing controllers, MIDI controllers, GPUs, etc.


Yep, the closest I ever got to an SGI was drooling over their product brochures as a kid. The cost of a modest Indy was about the same as a mid-range car. It's hard to grasp as a modern PC user that these workstations could handle classes of problems that contemporary PCs could not, no matter what upgrades you did. Today, it would be like comparing a PC to a TPU-based (or similar ASIC) platform for computing.

From what I've read, Oxide is making racks of servers and has no interest in workstations that an individual would use.


When a game company I worked at went out of business and couldn't unload their aging Indigo Elans and Indys, I picked up one of each for about a hundred bucks. I now have some regrets, simply because their monitors have strange connectors, so I keep them around, and they are heavy and annoying to store. That said, I could probably pay off my initial purchase and then some by unloading one of their 'granite' keyboards (Alps boards; collectors love them).


That 13W3 connector is the worst. I also had an Indy many years ago and getting an adapter together for it was a real challenge. These days I expect it to be somewhat simpler though.


Anecdotally, a friend had a computer store that sold Amigas and had his entire inventory bought out by the CIA, who never paid him, so they must have been good for something. This was in the late '90s. No idea what they were using them for. I used one to help a friend run a BBS. I could play games with incredible graphics whilst the BBS was running in the background.


If it was the late '90s, as much as I love the Amiga, it would have been for niche stuff like replacing a bunch of information screens, something where they could have switched to PCs but would then have needed to change their software setup. In terms of "power" the Amiga was over by the early '90s, even if you stuffed it full of expensive third-party expansions. It still felt like an amazing system for a few years, but by the late '90s you needed to seriously love the system and AmigaOS to hold onto it, and even for many of us who did (and do) love it, it became hard to justify.


Could have been a case of designing a platform around the hardware in the early 90s, then being desperate for parts to keep the platform going while designing the upgrade.


Maybe, in which case it'd likely still be something video-oriented. The Amiga was never particularly fast in terms of raw number-crunching. As a desktop computer it felt fast because of the pre-emptive multitasking, the custom chips, and the use of SCSI instead of IDE. Even the keyboard had its own CPU (a 6502-compatible SoC) on some of the models; "everything" was offloaded from the main CPU, so until PCs started getting GPUs etc. it didn't matter that much that Motorola, from pretty early on, was struggling to keep up with the x86 advances.

But for video it had two major things going for it: genlock, allowing cheap passthrough of a video signal and overlaying Amiga graphics on top of the video, and products like the Video Toaster, which was initially built around the Amiga.

So you could see Amigas pop up in video contexts many years after they had otherwise become obsolete, because of that.


Amigas were used for a lot of weird video things, like touch-screen video kiosks: genlock a serial-controlled LaserDisc player to the Amiga and put it in a cabinet with a serial-port touch screen.

A PC could certainly replace it by 2000, but if you developed your content in the mid-1980s then the Amiga was probably your solution, and you needed to keep it going for a while.


It seems entirely plausible to me that three-letter agencies could also have done render-farm-type things like this: http://www.generationamiga.com/2020/08/30/how-24-commodore-a...

(I think this is a subset of your comment rather than an extension, but: Babylon 5 reference ;)


That'd be Video Toasters. But NewTek ditched Amiga support in '95, and by the late '90s PCs or DEC Alphas would cream the Amigas for the render farms.

Even Babylon 5 switched the render farms for seasons 4 and 5.

Not impossible someone would still want to replace individual systems in a render farm rather than upgrading, but given the potential speed gains it'd seem like a poor choice.


Yeah, I had a friend in the late '90s who used an Amiga with genlock to fansub anime. I wouldn't be surprised if the CIA had some generic media rack kit or whatever that did something similar.

People also kept Amigas going past their prime for apps like Deluxe Paint.


Hell, even in the early aughts you would still see Video Toasters in use at small local television stations, until they were finally killed by HD.


More likely they needed to develop exploits for Amiga-driven SCADA systems.


No kidding:

https://daringfireball.net/linked/2019/12/17/sgi-workstation...

> The Octane line’s entry-level product, which comes with a 225-MHz R10000 MIPS processor, 128MB of memory, a 4GB hard drive, and a 20-inch monitor, will fall to $17,995 from $19,995.

Really makes the M1 Ultra look affordable.


The Indy, which predated the Octane, started much lower ($5k according to Wikipedia, presumably in mid-nineties dollars), but yeah your point very much stands.


The Indy, though, was notoriously underpowered. Very much the “glorified home computer” the GP described, albeit running MIPS.

Still, sure did stand out in the MIT computer labs!


Indys weren't truly that slow. The problem was the base models were memory constrained to the point where IRIX could barely boot. 16MB was not enough, and IRIX 5.x had memory leaks that made it even worse. An Indy with 96MB+ will run IRIX 6.5 pretty well.


That sounds right. I believe most or all developers at the place I worked had either 32 or 64 MB in their machines. At first (~1995) most were probably using IRIX 5.3, but by 96 or 97 I think most if not all had moved to 6.5.

Whatever I had, I don't recall lack of memory ever being a problem. And the GUI was quite snappy.


The GUI was fantastic: minimal, got out of your way as much as possible, and used the hardware acceleration to great effect. IRIX 6.5 was rock solid; I used it as my main driver for years before switching to Linux. We also had some Windows boxes floating around because we supported a Windows binary, but that and admin were the only things done on those; everything else was either SGI or Linux. I was still using my SGI keyboard two years ago, but it finally died.


I seem to remember a very classic rant about how bad IRIX was back then, but I can't seem to find it.


In some places SGI did a great job of giving good deals to computer labs. When I was at university in Oslo, there were rows and rows of Indys on one side of the biggest undergrad computer lab, and then a bunch of Suns with multiple monochrome Tandberg terminals hooked up on the other.

No big surprise that the Indy side always filled up first, and that "everyone" soon had XEarth and similar running as backgrounds on the Indys... Of course "everyone" loved SGI and were thoroughly unimpressed with Sun after a semester in those labs.


There's a running joke about the Indy that it's the Indigo (its much-more-expensive brother) without the "go".


> Really makes the M1 Ultra look affordable.

The amount of power you can buy today for under $1000 let alone under $10000 is insane compared to back then. The M1 Ultra is not that expensive compared to mid-range workstations or even high-end PCs of previous eras.


Yeah, I don't quite get the way people sometimes reminisce about the hardware costs of the past. We used to have consumer CPUs topping $1000 back when that was some serious money, and big-boy graphics workstations could easily run into the tens or hundreds of thousands.


Non-toy computers were only available to the relatively wealthy for nearly a decade. The original Apple II was the equivalent of around $5000, which certainly wasn't a casual purchase for most people.

If you look in back-issues of Byte the prices of early PCs with a usable spec are eye-watering, even before correcting for inflation.

Prices didn't start dropping to more accessible levels until the 90s.


An IRIX workstation is like buying a Countach. It's a thing you have because it's the thing that you always wanted.


I used to run an e-mail service with ~2m user accounts '99-'01. Our storage was an IBM ESS "Shark" stocked with 1.5TB of drives and two RS/6000 servers as the storage controllers.

Add on the web frontends and mail exchangers, and the entire system was slower and had less aggregate RAM and processing power, and less (and slower) disk (well, SSD in my laptop's case) than my current $1500 laptop.


That's just over $31,000 in 2022 dollars. I don't think I can even imagine what kind of modern desktop you could build for that much money.
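
A quick back-of-the-envelope sketch of that conversion; the CPI multiplier below is an eyeballed assumption, not an official figure:

    # Inflation check on the $17,995 Octane price; 1.74 is an assumed
    # 1999->2022 CPI multiplier, not an official figure.
    octane_price_1999 = 17_995
    assumed_cpi_multiplier = 1.74
    print(f"${octane_price_1999 * assumed_cpi_multiplier:,.0f}")  # ~$31,311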


Could easily get there with a Mac Pro: https://www.apple.com/shop/buy-mac/mac-pro/tower


I'm not even sure you could build a $31,000 desktop computer even if you wanted to without resorting to some ridiculous "expensive for the sake of being expensive" parts. Even quad RTX 3090 Ti's would only set you back $8,000 if you got them at MSRP.

EDIT: Just saw the other comment and I stand corrected.


You can run up costs pretty much arbitrarily with big memory and big storage. 2TB of RAM in a workstation will run you at least $30k if not more (it was $40k last time I checked), and you can go as high as 4TB in current systems. With big storage and NVMe arrays, it's almost a matter of "how much you got?"; you can scale capacity arbitrarily large if you've got the cash (although performance won't increase past a certain point).

This was always the dumb bit with the "apple wants HOW MUCH for a mac pro!?!?" articles about the "$50k Mac": it had $40k of memory in it alone, and the "comparable" systems he was building maxed out at 256GB theoretical and 128GB actual. That's great if it works, and using a lower spec will push costs down on both sides, but it's not comparable.


Top-of-the-line 512GB LRDIMM DDR4 will run you about $2,500 before tax if you buy name brand Samsung. I know this because that is what is in both of my dual Xeon workstations. It gets pricey when you go through Dell or HP of course.


> 2TB of RAM in a workstation will run you at least $30k

The trick here is to use a board with 32 DIMM sockets -- which requires an oddball form factor -- but it radically lowers the cost of reaching 2TB.

But your point remains: change your target to 4TB of RAM (which really isn't an absurd amount of RAM) and the astronomical costs come back (unless you go to 96-DIMM-socket systems, which have their own astronomical costs).
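
Rough arithmetic, with illustrative per-DIMM prices that are assumptions rather than quotes:

    # Why socket count matters for hitting 2TB: fewer sockets force
    # denser, disproportionately pricier DIMMs. Prices are illustrative
    # assumptions, not current quotes.
    target_gb = 2048
    cost_16_sockets = (target_gb // 128) * 2000  # 16 x 128GB LRDIMM @ ~$2,000
    cost_32_sockets = (target_gb // 64) * 350    # 32 x 64GB RDIMM   @ ~$350
    print(cost_16_sockets, cost_32_sockets)      # 32000 vs 11200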


> I'm not even sure you could build a $31,000 desktop

A decked-out Mac Pro can reach over $50,000, and it's not even as powerful as your quad 3090 Ti example, but that's the Apple tax for you.


Have you tried to price out a comparable PC? If you even can? Because there aren't many that will take as much RAM as that $50K Mac Pro, and when you do find a PC that will, all of a sudden you realize there isn't much of an Apple tax at all for the equivalent hardware.

Want to argue that Apple should have more variation in their offerings and price points? Sure - I heartily agree. But blithely tossing out a contextless $50K price tag as being some sort of "tax" is just silly.


> Have you tried to price out a comparable PC? Because there aren't many that will take as much RAM as that $50K Mac Pro, and when you do find a PC that will, all of a sudden you realize there isn't much of an Apple tax at all for the equivalent hardware

You're deluding yourself with that claim. If you had done 30 seconds of googling, you would have found that Dell, HP, Lenovo, and Puget Systems all sell workstations that can beat the top-spec Mac Pro, or you can build one yourself for way cheaper than any of them; but as a business you want the support and no headaches, so it makes sense to buy prebuilts.

So yeah, there's an Apple tax on it.

https://www.imore.com/53000-pc-competition-apples-53000-mac-...


It's reasonably easy to build a Threadripper workstation with 1TB of RAM (upgradeable to 2TB) and two 3090s for $20,000.

Lambda Labs will even sell you a prebuilt around that price with the same specs.

That said, I wouldn’t personally get into a pissing match over x86 Apples. I think their ARM offerings are far more interesting.


Quad RTX A6000s would be $24k, and that's what would go in a "workstation".


Venturing off-topic a bit here, but what exactly makes a "workstation" GPU? What's the difference between an RTX A6000 and an RTX 3090?


ECC RAM, different cooling setup (blower vs side fans), very different thermal characteristics, 24GB or 48GB, more bus width usually, optimized paths for data load & unload, GPU interconnects for direct GPU to GPU communication, shareable vGPUs between VMs, GPU store & halt, h/w support for desktop state, GPU state hand-off to another machine. It isn't just a "more memory" kind of thing.


> GPU interconnects for direct GPU to GPU communication

Note that NVLink is also available on the 3090, which also has the same bus width as the A6000.


So, interestingly, to get multiple RTX 3090 cards using NVLink you really need the blower style, which Nvidia recently stopped making in the consumer line; otherwise you are fighting thermal properties and getting into exotic cooling solutions. You also get into placement in motherboard slots with three-slot cards, and then into the number of PCIe PSU connectors on most supplies.

You also get into weird corner cases: four-way NVLink on 3090s acts as a "single card", but four-way NVLink on RTX A5000s is still independent cards as far as the OS goes. Regarding interconnects, 3090 cards only support NVLink (NVLink 3.0 IIRC), rather than NVSwitch.

With NVLink/NVSwitch, two RTX A5000 cards give me a total of 48GB of usable RAM for storing machine learning models, and so on for multiple cards in the RTX A5000/A6000/AX00 line. But with the consumer cards, even though each card has, e.g., 12GB of RAM on it, your deep learning model is also limited to 12GB of RAM, no matter how many 3090 cards you put in. You're also limited to a maximum of four 3090 cards in total, whereas with NVLink and NVSwitch on the professional-grade cards you are limited only by how many cards you can stuff in your data center; technically it is 16 GPUs on a single switch IIRC, and things get funky with the fabric once you progress beyond 16 GPUs on a single switch, but that's outside the scope of this conversation.

I may also have some details wrong; my brain is a little fuzzy after dinner.
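
If you want to check what a given box supports, here's a minimal sketch (assuming a CUDA build of PyTorch is installed); it only verifies peer-to-peer connectivity rather than pooling memory, and "nvidia-smi topo -m" will tell you whether a given path is NVLink or plain PCIe:

    # Minimal sketch: list the GPUs and check which pairs support direct
    # peer-to-peer access (NVLink or PCIe P2P). Assumes a CUDA build of
    # PyTorch; this only checks connectivity, it does not pool memory.
    import torch

    n = torch.cuda.device_count()
    for i in range(n):
        print(f"GPU {i}: {torch.cuda.get_device_name(i)}")

    for i in range(n):
        for j in range(n):
            if i != j and torch.cuda.can_device_access_peer(i, j):
                print(f"GPU {i} -> GPU {j}: direct P2P available")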


The A6000 has ECC memory and the 3090 does not. I think that's the chief differentiator between workstations and any other kind of desktop computer. Like a server, they will have ECC everywhere.


In addition to the other stuff people posted, you also get to use the certified GPU drivers, which means they actually tested that the card would work 100% with AutoCAD or whatever.


The main difference is you get twice the memory. If you don't need that, there is very little reason to get an A6000.


The RTX is really slow for FP64.


https://zworkstations.com/configurations/3010617/

24x 4.5GHz cores, 96GB memory, 48TB NVMe storage, 2 giant GPUs, etc.


That's wild, although it seems to be server parts in a workstation case. I guess none of Intel's "desktop" chips support a dual/quad-CPU configuration though, so that's your only choice. Quad 8TB NVMe drives are definitely one way to get to $30K of parts pretty quickly.


Neither Intel nor AMD supports SMP with consumer chips. To go dual-processor with AMD you have to buy EPYC SKUs, which are several times more expensive than their Threadripper core-count equivalents.


FWIW, EPYCs sell on eBay with $/core prices much closer to Threadripper prices -- presumably that's closer to what AMD is selling them for to large companies after discounts.

The MSRP on them is ... quite staggering though!


Well, dual Xeon SP2 CPUs, multiple RTX A5000 GPUs, 30TB of SSD storage, 512GB of RAM, and dual Blackmagic quad-input 4K capture cards can get you pretty darn close when it comes to your computer vision work.


You can easily get well past $40k once you start adding some Quadro GPUs, 192GB of RAM, and a few TBs of PCIe storage into any of the mainstream manufacturers' workstation products.


A loaded-up Amiga (i.e. with a CPU accelerator board, more RAM than most PCs could handle, specialized video processing cards, etc.) could get into the low end of workstation territory. But you are right that architecturally they had more in common with high-end PCs than with the workstations of their day. The Amiga's main claim to fame from a hardware standpoint was its specialized chipset.



