People on here have provided good reasons why the Amiga failed, and I don't really dispute any of them, but it still makes me very sad.
I feel like there was a roughly three-year period where the Amiga really was better than all the competition. AmigaOS was really impressive, the hardware was cheaper and more capable than Apple's, and categorically better than whatever Microsoft was doing. It feels like the management at Commodore didn't really realize the head start they had; it appears that instead of continuing to grow the OS, they were happy enough to keep things business-as-usual, and eventually Apple and Microsoft caught up.
That all makes enough sense, but I feel like there exists an alternate universe where Commodore was competently managed, and in 2024 we are all running Commodore machines instead of Apple.
The engineers knew what they had. Marketing failed to market it properly because they didn't understand what they had. Management, as you said, also didn't understand what they had. Thomas Rattigan understood, and started the A500 and A2000 projects. Then the chairman of the board, Irving Gould, fired him.
The root of Commodore's demise goes back to the 1960s, when Jack Tramiel let Irving Gould invest to keep Commodore from going bankrupt after Jack made some bad business deals. Without that mistake, Irving Gould wouldn't have caused Jack to leave by preventing Jack's sons from entering management, and wouldn't have pushed out Rattigan or cut the R&D that would have kept the Amiga chipset ahead of the industry.
Gould made Commodore his own personal piggy bank, but under Tramiel I have a hard time seeing Commodore ever purchasing Amiga in the first place. Tramiel's double-dealing and brazen nepotism didn't do Commodore (or Atari) any favors either.
That is an excellent point. Warner's Atari would have undoubtedly ended up with the Amiga, but would Warner's Atari have survived long enough to bring out an Amiga console or computer? It was hemorrhaging money. Once it died, who would most likely have bought up their IP? Commodore? Someone else?
I owned an Amiga 500 but had to switch to a PC when I saw the writing on the wall. For many years it was almost painful to think about the lost opportunity the world missed out on. If they had managed the platform better, we would probably see much better computers and operating systems sooner due to the increased competition.
I remember how primitive the PC looked compared to the Amiga. It was not until Windows 95 that I felt that the PC had caught up with Amiga. It would probably be hard to compete with the PC in the office, but I feel it could have competed with Apple to provide an affordable alternative for the home. Having 3 viable platforms would have "forced" software developers to think cross platform, like they often do today.
By the early '90s the design and graphics-focused software for color Macintosh systems was light years better than the Amiga's. And the high-end video modes on Macs simply worked out of the box. If you needed a genlocked system for your local TV station, sure - get an Amiga. Ironically, the custom chips that made the Amiga so special and a natural fit for broadcast design are also what left its high-resolution modes interlaced - the kind that made your eyes bleed unless you bought special hardware. Most folks by then didn't want to learn how to make the Amiga "do all the same things". They wanted to go to the store, buy a computer that worked like the ones at school or work, and use it. Period.
I have a Quadra 700 (same era as the Amiga 3000, which I also have) that can run circles around the 3000 with far less power or need for upgrades. For example, by adding a decent video card the Q700 can output to dual monitors, each running its own resolution and color depth - while multitasking. I can install triple-A CD-based software titles (like Photoshop or Illustrator or PageMaker or MS Word) and on and on. The tools on the Amiga pale in comparison, even if the hardware was cool. In the '80s the Amiga was magical. By 1992 the way it seemingly just stood still was tragic.
As previously stated, high-quality print had become a "killer app" and Amiga simply didn't get the same support. And a lot of this had to do with their native video modes.
I can throw video cards and processor cards at the Amiga and make it a much more technically impressive machine. But if I want to run A+ professional software on it I basically need to run a Mac emulator.
As the video points out, the gaming market by '92 was moving on in the US. For 2D the SNES and Genesis took over. For 3D the PC arms race had begun, and there was no looking back.
Re-watching Steve's talk at Stanford, I wondered about that calligraphy and font story. But these discussions about DTP reconfirmed how important that understanding was in making Apple unique, at the time and even afterwards, because of the importance of fonts.
From the same era was this oddball solution[0], which at the time I had thought was something that never got out of R&D but apparently did exist as a commercial product in laughably limited numbers.
The A2024 monitor did not require any changes to the Amiga that drove it, except that software needed to be aware that it existed - which naturally almost none did.
It basically just used the bitplane model as a tiling frame buffer: rather than displaying four or five bits of color depth, it dropped the color depth by half and interpreted the different colors as shades of gray using something like a translation profile in the monitor.
Again, software would have to know how to play ball with this, but the effect was that you got a rock-solid 60 Hz screen that displayed, initially, 8- or 16-shade greyscale at 1024 x 800 resolution.
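To make the tiling idea concrete, here is a rough sketch in C - with made-up names, and using the 1024 x 800 figure from above; the real A2024 details surely differ in specifics - of how a logical greyscale frame could be carved into four ordinary Amiga-sized quadrants, one transmitted per video frame:

    #include <stdint.h>

    /* Hypothetical illustration of the A2024-style tiling scheme. */
    #define LOGICAL_W 1024
    #define LOGICAL_H 800
    #define QUAD_W (LOGICAL_W / 2) /* 512: one ordinary Amiga display */
    #define QUAD_H (LOGICAL_H / 2) /* 400 */

    /* Map a logical pixel to its quadrant and position within it.
     * Quadrant n is sent to the monitor on frames where (f % 4) == n. */
    static void map_pixel(int lx, int ly, int *quad, int *qx, int *qy)
    {
        *quad = (ly / QUAD_H) * 2 + (lx / QUAD_W);
        *qx = lx % QUAD_W;
        *qy = ly % QUAD_H;
    }

    /* The monitor-side "translation profile": reinterpret a 4-bit
     * palette index as one of 16 grey levels. */
    static uint8_t grey_from_index(uint8_t index4)
    {
        return (uint8_t)(index4 * 255 / 15);
    }

With one quadrant transmitted per 60 Hz frame, any given quadrant's content can only change every few frames, even though the tube itself refreshes at the full rate.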
There was a limitation in that the individual quadrants of the screen could only be redrawn at 10 or 15 Hz, even though the scan rate of the tube was non-interlaced 60 Hz. Subjectively, having seen this thing in person, I didn't get the sense that it made much noticeable difference for the kinds of applications that would use it, principally DTP or coding. Software redraw of the entire display field was in general nowhere near quick enough for the raster quadrant paging to be noticeable, except in vertical scrolling.
Sadly, this was obviously a desperate if rather clever hack around the eye-strain drawbacks of the Amiga's interlaced display at high resolutions, so no software support appeared that would have driven further work in this direction. Had that happened, and had the technique in this monitor been updated to match the capabilities of the AGA chipset, the resolution could potentially have been as high as 2360 by 1024 at 75 Hz (edit: should be 60 Hz; 75 Hz possible "only" at 1600x1200), which comfortably exceeds HD (though only in greyscale) - but for 1992 it would have been safely out in front of almost anything else at any price range, let alone a few thousand dollars.
Given the awkward aspect ratio, a more sensible use of that resolution would have been splitting it vertically into a dual-monitor setup, although one would need two identical monitors to make that work. And the Amiga's version of multiple virtual desktops, pull-down screens, would have been impossible to make work effectively, so one would have to do multitasking entirely on the Workbench display in full-screen application windows, like the Mac.
But at least it would have been preemptive multitasking, which clearly was entirely worth all of the extra associated hazards of multitasking without an mmu-aware kernel. /s
> It was not until Windows 95 that I felt that the PC had caught up with Amiga.
By the early '90s, new DOS PCs had VGA and Sound Blaster cards as standard equipment, and were usually also including Windows 3.x. That put the typical PC clone on par with Amiga for most home/office use cases, especially including gaming.
> It would probably be hard to compete with the PC in the office, but I feel it could have competed with Apple to provide an affordable alternative for the home.
Apple didn't have much of a foothold in the home at this point -- their bread-and-butter market was schools, and they were just beginning to get a foothold in the design and DTP market -- and Apple itself was in deep trouble for similar reasons to Commodore.
The fact that IBM PCs and compatibles had become the de facto standard for business computing is one of the exact reasons why Apple, Atari, and Commodore started losing market share in the early '90s.
If you could buy a single machine that was good for both business and gaming -- one that ran Lotus 1-2-3, WordPerfect, Wolfenstein 3D, and King's Quest V perfectly -- why would you pay even more for a proprietary platform that only excelled at one set of use cases at the expense of the other?
This was the era of convergence between home and office computing. IBM compatibles dominated the office market, the capabilities of PCs had reached near-parity with the Amiga, and the PC was an open architecture with a large ecosystem of OEMs all competing with each other and driving prices down. Together, that spelled doom for any non-x86 computing platform by the mid-'90s.
Yeah, though people have explained to me that there were things in the Amiga that kind of "stayed primitive". While I think it's insanely cool how early they got preemptive multitasking, something that I think hurt them was the fact that, by design, it didn't really support any kind of memory protection.
The Amiga was a bit before my time, sadly, so I'm getting Wikipedia-depth knowledge of this; my dad had one when I was very young but all I ever used it for was playing games. Still, it's easy to look back at these things and feel like things should have been different; particularly I feel like Microsoft got way more slack than it should have. DOS, which felt really primitive by the early 90s, still seemed to be more-or-less standard until Windows 95.
The filfre.net series of articles on the history of Commodore and Amiga is truly great work, and should give you a much clearer picture than pretty much anything else.
Every weakness of the Amiga could have been solved if money was poured into it instead of poured out of it.
The entire chipset state could have been bankswitched in a multi-tasking fashion and each process could have gotten its own "virtual Amiga" to play in. So much can be done when you have full vertical integration, but Commodore never leaned into it.
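As a purely hypothetical sketch of that idea (nothing like this ever existed; the only real detail below is the 0xDFF000 custom-register base, and many of those registers are write-only, hence the shadow copies):

    #include <stdint.h>

    #define NUM_CUSTOM_REGS 256 /* word-sized registers at 0xDFF000 */

    /* Per-process shadow of the chipset state. Because many custom
     * registers are write-only, the kernel would have to track every
     * write to be able to restore them later. */
    struct chipset_context {
        uint16_t shadow[NUM_CUSTOM_REGS];
    };

    static volatile uint16_t *const custom = (volatile uint16_t *)0xDFF000;

    /* All chip writes go through the kernel so the shadow stays current. */
    void chip_write(struct chipset_context *ctx, int reg, uint16_t value)
    {
        ctx->shadow[reg] = value;
        custom[reg] = value;
    }

    /* On a task switch, replay the incoming task's chipset state.
     * (A real design would skip strobe and read-only registers and
     * swap copper list pointers rather than replaying blindly.) */
    void chipset_switch(const struct chipset_context *next)
    {
        for (int reg = 0; reg < NUM_CUSTOM_REGS; reg++)
            custom[reg] = next->shadow[reg];
    }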
Every perceived weakness of the Amiga is from the perspective of the design of the winning system, i.e. generic modular PC design where the hardware and the OS are at best uneasy friends.
Did anything really support protected memory back then?
I'm also still much more sold on the power of marketing, and on getting things into students' hands, as the power move that Microsoft pulled off. They were probably helped a ton by a lot of failed vendors along the way. It isn't that DOS and Windows were pure successes; rather, Microsoft managed to outsource a ton of their failures onto other companies.
> focusing on getting things into students' hands as the power move that Microsoft pulled off
That's a huge part of Microsoft's success. They looked the other way regarding "piracy" to gain market share. At least in my country, nobody paid for Windows at home. If students and home users had been forced to pay, the adoption of new Windows versions would have fallen drastically.
OS/2 1.x supported memory protection for apps targeted for OS/2. Since 1.x was written for the 286, it put all MS-DOS apps in the same address space, so one errant app could bring down the whole MS-DOS subsystem. It would take OS/2 2.0 to exploit the 386's Virtual 8086 mode, which allowed each MS-DOS app to run isolated.
I think this Amiga version of history ignores what Apple was good at by the late 80s: DTP. We collectively seem to have forgotten what a huge deal this was, laser printing, postscript etc. It genuinely overhauled the entire publishing industry, not entirely for the better either.
By the mid 90s Microsoft were persuading a lot of people WinNT would displace Apple in publishing, but that never quite worked out how they intended either. (This is one of the reasons Xara was developed on the PC and not the Mac, which with hindsight was a big mistake).
The Amiga had good DTP software and hardware at the time. I used them for large format printing. They were initially faster and much cheaper than the Apple solutions.
Large format printing from an Amiga? Surely that is a niche in a niche.
The only pro non-Apple DTP people I encountered in that era were Germans with DA's products on Ataris (famously kicked off by an ill-considered bet with a Mac owner), various more workstation or custom-hardware type things such as using the Quantel Paintbox for print work, and the odd Unix people doing more tech-publishing work.
The other thing was Apple became so sticky in part because so many people leveraged Quark Xtensions to a possibly misguided degree, and these proved very non portable. Whole catalogues and directories were being laid out that way.
Fair enough; I guess I'm applying a 2024 lens to a 1988 problem. We don't print a lot anymore, but I will acknowledge that printing was really, really important back in the '80s and '90s.
Only briefly. Before about 1988, Unix systems (Interleaf and Frame) were still clear leaders, and then Apple began to stumble from around 1990 - the IIfx was the top of the line for a looong time, and soon looked very dated compared to the 486-based systems selling for a quarter of the price.
But, yeah, for those couple of years the combination of the Mac II, Laserwriter II, and Ready Set Go / Pagemaker / Quark was king. The Amiga never really had a chance to gain a foothold.
The Amiga was dead before Wolfenstein 3D came out on the PC. A lot of Amiga users failed to recognize it at the time, but Commodore surely and truly killed that thing off with its horrible mismanagement. When Wolfenstein 3D came around, the situation really just shifted from keeping the Amiga on life support, hoping it would wake from its coma, to palliative care.
My parents called this before Wolfenstein even came out, though they did it with the Commodore 64 and 128. I really wanted a 128, and they insisted on an IBM compatible instead.
I was devastated at the time, but they were 100% correct in hindsight.
Your parents were wrong. Assuming you wanted to play games as well as run productivity software, the C128 was far superior to any PC in 1985. Yes the PC would eventually catch up in the 90s, but the C128 was discontinued by then.
I had a C128 in 1986, but I don’t see how it was superior to any PC. The 8086 was faster, it had more memory, it had more expansion slots, and of course there was way more software if you exclude gaming.
In 1988, I had saved up for an Amiga, but my father offered to match what I had saved for a hard drive if I bought a PC instead.
Like most kids in the 80s I would never “exclude gaming” from my evaluation.
The PC was 15x as expensive as the C128 and still wasn't any good for gaming, while the C128 had the entire C64 games library available!
It also had the CP/M software library and could run productivity software in 80 columns. Yes CP/M software wasn’t as good as DOS software, yes your spreadsheets didn’t calculate as quickly, but for a couple of years the C128 was the best option, until the Amiga came along.
The Amiga was certainly the best option in 1988. Arguably it was the best until about 1995.
That's correct in the short term, but my parents were focused on the long term: the computer was an investment for my future, rather than a tool/toy for the present.
It wasn't long before I admitted how correct they were.
Apparently a typical PC in 1985 cost upwards of $5000. The C128 was $299. So for rich people, yes, it may have made sense to pay a 15x premium to get hold of tech "from the future, today". But you could have easily had both. And for a child, both machines would be entirely obsolete by the time the child grew up and entered the workforce, so I would question how much advantage you were really buying.
$5,000 is nowhere close to accurate for a 'typical' PC of that era. In 1985 or 1986, I don't recall the exact year, my dad purchased a rather exotic machine called a Panasonic Sr. Partner, which was one of the first all-in-one portable PC compatibles on the market. It had built-in dual floppies, I think 384k of memory, a built-in monochrome screen and a built-in printer of all things.
I believe the out-the-door price was something like $2,300, with a model below that available for under $2,000 - I think with less RAM or only one floppy or something. But anyway, this was a fairly atypical take on the PC, and a regular white box of the same era could have been had for perhaps half of that with the same or better specs.
I had a Commodore 64 that I had bought with my own money, but my friend (rather his dad) had a Commodore 128 with a floppy drive. I was so impressed but also extremely jealous. My brother though bought a PC.
My experience is almost identical. I had a VIC-20, C64, C128, and then an A1000 as a kid; then my parents bought a 286 AT at 10 MHz with a 20 MB HDD and everything changed. They were absolutely right. Had I stuck with Commodore instead of switching in 10th grade, I would have been way behind.
Yeah. I actually agree with the other comment that the C128 was better than my IBM Compat at the time, but my parents weren't looking at the present. They were looking at the future, and even said so.
My first was a 386/20, which had a problem and Circuit City's guarantee saw us trade it out for a "better" computer for free... That turned out to be a 386/16. I didn't know enough about them at the time to spot the problem.
It wasn't long before I had a 486, though, so the problem wasn't long-lived. I've never forgiven them for the deception, but they went out of business, so I guess I won? (I see they're back now somehow. Wow, what a joke.)
My family had a Beeb with 128KB of RAM. I thought XTs were awful. Sure, they had more RAM, but for our use cases (games and word processing), it wasn't really necessary. I even learnt Pascal on it.
I moved from an 8-bit Atari to a CGA IBM XT clone. This was both a step forwards and a step backwards. It wasn't until I upgraded to a 386SX PC with VGA and a SoundBlaster that I could totally retire the Atari.
As a proud Commodore Amiga 500 owner starting in 1990, I agree.
When it was released in 1985, the Amiga was at least 5 years ahead of its time when it came to the graphics and sound technology. I don't care about angering Apple bots, but the capabilities of the Mac paled in comparison.
The mismanagement at Commodore that led to the fall of the company is well documented.
For me personally, as a tech professional, the lesson of the episode is clear: superior technology alone is not a guarantee of business success.
The technology companies that last are high-performing in both technology and business. The clearest example of what I mean is NVIDIA, which was launched while Commodore was still alive.
Commander Keen running on an EGA display was already enough for PC gaming to be good enough for me.
I remember how excited I was on my Atari 8-bit awaiting the production of Amiga Lorraine. By the time it came out I think I was less excited, and happy enough with my monochrome Atari ST. The Amiga would certainly have been more fun, but I was glad to have a machine that could run a (Megamax) C compiler. It helped that I got some beta software and dev docs through a connection. The Lorraine was supposed to have it all, graphics+sound+MIDI but the last part went to the ST instead.
It sure was weird when Commodore and Atari made PC-compatibles--the end of an era. In the late 80s I'd already moved past the Atari, Amiga, and even DOS/Windows, being all excited running all versions of OS/2 (until that failed too), but hey we got OS/2 3.0 (aka NT).
They failed to evolve their product while the competition was rapidly improving. The original Amiga was leaps ahead of anything else you could get at remotely the same price point, for the features that mattered to games.
But then they made only the smallest incremental improvements to the hardware until they launched the AGA chipset with the A1200 and A4000, which was too little, too late. Had the Amiga improved at the same rate as consoles or DOS PCs, they might have had a chance.
The OS (mostly Workbench) also did not see any big improvements over its lifetime.
The custom chipsets in the Amiga went from being its secret weapon to its biggest liability. Amiga simply could not afford to keep them competitive with commodity PC hardware, especially not while Irving Gould was busy looting Commodore of every penny it was once worth.
Anything is a liability if you don't invest in it. The Amiga was basically unchanged from the launch in 1985 to the Amiga 1200 in 1992. That's seven years of thumb-twiddling, and the 1200 was basically a small band-aid on a gushing wound. Reacting, not acting.
They never believed in themselves.
The IBM PC was launched in 1981 with 4.77MHz and 7 years later you could buy 40MHz PCs with wider buses.
Other companies have made the same mistake as Amiga, before and after.
They spent a ton of R&D getting ahead on things that would be commoditized in a few years. Later, their sound card had to be better than the best of a half-dozen PC variants, and so did the video card, CPU, monitor, keyboard, disk drive, etc, etc.
Sure, they could have bought commodity versions of some of those (and probably did), but then they’d have to be the best at choosing commodity components, and there were 100’s of strong competitors doing just that for PCs.
Now that moore’s law is slowing down, and everything has long-moved into system on chips, vertical integration seems to make sense again (look at Apple).
Android was trying to create an ecosystem of interchangeable hardware vendors (like Microsoft did with DOS, and, later, Windows), but it basically failed because driver compatibility is so poor in that space, and key components, like cell modems are locked behind anticompetitive OEM licensing agreements.
Android succeeded at other things, but it’s definitely not the case that you can assemble components into something with better specs than a flagship android for half the price, even if you had access to a pick + place / IC soldering machine. It was/is that way for desktop PCs.
Commodore didn't give the hardware development team the resources and go-aheads that they needed in order to keep the Amiga competitive, mainly as a gaming machine and incidentally as an extremely popular demo platform. And it wouldn't have had to be "insane" upgrades to at least buy Commodore more time in the race to figure out the platform's future shape:
* A few more (and ideally also a bit wider) sprites.
* 8 channels of sound, with actual panning. The low sample rates were "OKish".
* Chunky mode, or at least hardware chunky-to-planar like the CD32 has (see the sketch after this list).
* Bumping e.g. the 14 MHz EC020 in the A1200 to the 25 or 33 MHz variant.
* More chipmem and an equal amount of real on-board fastmem.
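On the chunky-mode point (third bullet above): in a chunky framebuffer each byte is one pixel, while the Amiga stored each bit of a pixel in a separate bitplane, so any Wolfenstein-style renderer had to burn CPU time converting. A naive sketch in C of what that conversion does - real Amiga code used heavily optimized 68k assembly, and the CD32's Akiko chip did it in hardware:

    #include <stdint.h>

    /* Naive chunky-to-planar conversion for one row of pixels.
     * chunky: one byte per pixel, low `depth` bits significant.
     * planes: `depth` separate bitplane rows, 1 bit per pixel each. */
    void c2p_row(const uint8_t *chunky, uint8_t **planes,
                 int width, int depth)
    {
        for (int x = 0; x < width; x++) {
            uint8_t pixel = chunky[x];
            int byte = x >> 3;             /* which byte in the plane row */
            uint8_t bit = 0x80 >> (x & 7); /* MSB-first, as on the Amiga */
            for (int p = 0; p < depth; p++) {
                if (pixel & (1 << p))
                    planes[p][byte] |= bit;
                else
                    planes[p][byte] &= (uint8_t)~bit;
            }
        }
    }

Doing that for every pixel of every frame is exactly the overhead a chunky mode would have eliminated.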
Yeah, it's technically available, just not feasible for doing much else. Nobody really made use of the productivity/DBLPAL/etc. modes just to get DMA sample rates beyond ~29 kHz. None of the trackers support it (DigiBooster might be the exception), and you forfeit so many CPU cycles per raster line.
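For reference, the ~29 kHz ceiling falls out of Paula's period register: the output rate is the chip clock divided by the period value, and in standard display modes the minimum usable period is 124. A back-of-the-envelope check in C (PAL clock figure from the hardware reference; numbers from memory, so treat as approximate):

    #include <stdio.h>

    int main(void)
    {
        const double pal_clock = 3546895.0; /* Paula ticks per second (PAL) */
        const int min_period = 124;         /* min DMA period, standard modes */
        printf("max sample rate: %.0f Hz\n", pal_clock / min_period);
        /* ~28604 Hz: the ~29 kHz ceiling. The doubled-scan-rate
         * productivity modes roughly halve the minimum period. */
        return 0;
    }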
I wonder if they boxed themselves into a corner by putting so many features into ASICs. Custom chips take forever to design and iterate even today, I can only imagine how painful that was in the 80s. Couple that with a frugal management and I can see how they would be reluctant to sink resources into a fantastically expensive development process.
I think with hindsight the tragedy is the home computer market getting split between Commodore, Atari and possibly Acorn. The ST looks like a far more reasonable foundation from which to start building a computing legacy than the Amiga, which really was a one trick pony.
Have an ARM in an ST running a preemptive multitasking OS in 1990... now that would have been an interesting machine.
They kinda already were getting there when Atari hired Eric R Smith to work on MiNT. MiNT also supported protected memory!! At least this is what I remember from my time playing around with it on my Atari (C-LAB) Falcon before I sadly sold it off.
As a collector, I vastly prefer the ST lineup over the Amiga for this reason. The hardware was more "bog standard" and easier to understand than the custom chipset on the Amiga, even if the Amiga seemed cooler on paper back in the day.
But was the Amiga's chipset really the noncompetitive part? My recollection from that era was that its graphics/sound capabilities were far ahead of the PC, but the PC had better CPUs, which was a problem of Intel vs Motorola.
Anyway, in the early 90s my father wisely chose PC over Amiga to replace our aging ZX Spectrum. The reason - PC was open and easier to upgrade. Maybe that's what actually killed Amiga and Atari ST.
The Amigas were open and easy to upgrade as well. The exception was graphical capabilities for the "home computer" models like the A500/600/1200 - they are quite compact and the available solutions consequently suffered limitations.
Motorola was ahead in the beginning with a better architecture and more performance, but didn't have as much money as Intel had from PC clones to pour into improving it and eventually fell behind Intel in performance. IBM wanted to use the Motorola 68000 in the IBM PC but didn't because the support chips for the 68000 weren't ready yet and they didn't want to wait a year.
I had an A1000 - even worked in high school at a Commodore/Amiga dealer.
What they really didn't understand is that software sells hardware: The OS was so far ahead, but in terms of basic productivity software? Even the Mac was ahead.
And [IMO] markets can maintain a leader and one strong competitor. But just one. So it became PC/Mac throughout the 90's on the desktop.
That Commodore didn't understand that software sells hardware was obvious back in the early '80s, before they bought Amiga. The VIC-20 was incompatible with the PET, the CBM-II line was incompatible with the PET, the C64 was incompatible with the VIC-20, the Plus/4 was incompatible with the C64, and the C128 was incompatible with the C64, although it had a compatibility mode. All the C64 compatibility mode did was ensure the C128 mode was rarely used.
The only reasons Commodore was successful is that their hardware engineering was great and their products were inexpensive. Apple had weak hardware and high prices, Atari had great hardware and high prices.
Jack Tramiel never used a computer until he was given an iPad. He never understood computers. He would have been just as successful selling bicycles.
Commodore's bankruptcy was primarily driven by Irving Gould to enrich himself. Mehdi Ali was his henchman. They pulled funding from R&D, preventing the chipset from keeping up with the industry.
As someone who followed the development of the Amiga and then bought the Amiga 1000 (plus everything else I could afford with my paper-route earnings) the first day it was available -- I do NOT yearn to relive those days. I loved it then and, as documented everywhere, it was ahead of its time. However, the instability of software, and how little Commodore iterated on the technology -- that's what killed the Amiga. I still have PTSD from the insane amount of crashing and rebooting my setup did.
Indeed, AmigaOS despite being ahead of its time in the '80s, was doomed to lose by the end of the '90s.
The biggest problem was that the Amiga did not have an MMU until very late, and the OS had been designed around an unprotected shared memory space. It was crashy, with fragmenting, leaky memory, and it could not support fork(). Later, AmigaOS 4 and MorphOS struggled to add full process isolation.
In retrospect, Microsoft has been prescient in adding virtual memory to 9x, and incredibly successful in switching to the NT kernel. AmigaOS would have needed the same to survive, instead of just sitting on their multitasking-a-decade-before-Windows laurels.
VideoScape 3D and Sculpt 4D were my bread and butter at the time. Although anything would crash.
One day I'll rip and upload a video I made in the late 80s of when I was typing up a chemistry lab report. My Amiga was producing so many guru meditation errors I had my VCR record my typing (in real-time) so I had proof that I did the report. I handed in a draft of the report and a VHS tape.
I'd also add 3D Studio to the list of assassins. It was not even close to LightWave 3D, but it did not require a Video Toaster card and it ran on commodity PC hardware. The Amiga never got a software package that could outcompete Apple in the graphics or DTP departments either. Add to that the dead end that was AmigaOS as well.
I feel like Amiga was never really alive in the US. I never knew anyone who owned one, or a 520st. Schools were almost exclusively Apple IIe, later some IIgs before Mac SE and LC. And of course PC. IIgs was more comparable to a Commodore 128 but it ran all the Apple II software. Sometimes you'd see a Mac II or SE/30 in an administrator's office running a screensaver.
It's one of those rare cases where being in the US or Europe gives a uniquely skewed perspective.
It's astonishing to think that the Amiga hardware was done in 1984 and the A1000 came out in 1985 - the same year the NES released in the US! It took Nintendo until 1991 to come up with something roughly comparable power-wise.
From what I understand, in the US the Amiga slowly petered out without ever truly taking off. In Europe the A1000 never was a thing, but we had four years of the press talking about this mythical monster of a machine and its custom chipset. Then in 1989/90, all of a sudden, everyone bought A500s to play Kick Off and Speedball II. That '89-'92 period was glorious.
At least in southern Europe Wolfenstein wasn't regarded as a killer app at all, it barely made an impact. Doom and Wing Commander most definitely were, though.
For another data point: as kids in Greece we made fun of our friend who first switched to a PC (we had Atari STs). Even after he got a Sound Blaster and a VGA card, we still had better games, like Kick Off and Dungeon Master. But one day he invited us to his house and showed us Wolfenstein. I distinctly remember the feeling. It was over!
Over half of the Amigas sold worldwide were the 500 model and, in Europe, most people with an Amiga had the 500 or 1200 at home for gaming so you would see it when you visited.
The Amiga was less popular in the US and used more for graphics, sound and video for TV and movies on the 1000, 2000, 3000 and 4000, and with the Video Toaster, but you wouldn't see one as often when you visited people at home.
At least from my standpoint, this is completely true. I was hardcore Amiga (AmigaOS and Amiga MINIX) up until 1992 and then Wolfenstein 3D came out and it got me to buy a PC. The OS experience was comparatively terrible, but that game was so good in its day.
I would agree with the other poster that, as far as being a living product, the Amiga was already dead, but as a usable platform, there was still a very helpful, very large community.
I was a kid back then, but as far as I understood the waves of computing, DOS and Windows 3.0 were dominant in office settings at least a couple of years before Wolfenstein 3D was released. Such machines trickled into people's homes over those years, became part-time gaming stations, and created pretty solid expectations in adults about how a computer should work and what to buy for their kids.
It wasn't entirely incompetence. Commodore marketing was incompetent but the bankruptcy was primarily driven by Irving Gould to enrich himself. Mehdi Ali was his henchman. They pulled funding from R&D, preventing the chipset from keeping up with the industry.
For me it was Comanche ('92) that spelt out the demise of my beloved Amiga. I had a floppy disk demo that I played on my father's crappy, crappy PC at work and it completely destroyed anything you could achieve on the Amiga.
I think a good part of what killed the Amiga and Atari was the fact that IBM PCs were the "serious" computers, the ones you bought for work, while the Amiga and Atari were taken less seriously. I know they had some success in their niches - music and publishing for Atari, graphics and design for the Amiga; I even saw an Atari on a warship. But in the end, most of the money at the time was made in professional settings, and neither Atari nor Amiga managed to get a sizable part of it.
I regret it, because going from GEM to DOS really felt like a step backward.
There were two reasons the IBM PC clones became dominant. The smaller reason is because IBM created it. The bigger reason is because everything but the BIOS was off-the-shelf parts and easy to copy. Apple had the same problem with Apple II clones where only a couple of companies managed to not lose when sued by Apple. IBM couldn't sue Compaq and others for their clean room BIOS replacements.
If the IBM PC had custom chips, it would not have been easy, or even possible, to clone and would have remained an IBM-only product. If that had been the case, I suspect CP/M and GEM would have won out and Microsoft would be much smaller and still producing language compilers today. The only question is, which version of CP/M would have won out? I think Motorola's 68K series would have won over Intel's x86 series.
The problem with Motorola winning the '80s was that, no matter what they did, they were facing the same problem in the '90s: the 88k was going to flop hard. And the early PowerPC would still have been so-so compared to the Pentium onwards.
That said, would the Pentium have happened without the PC industry? Everything up to the 486 would likely have continued to develop as it did, even without the PC, as those were strong embedded chips too.
I remember reading magazines when the Amiga 1000 was introduced, and even then the general conclusion was: amazing hardware, but doesn’t run PC software.
The home-computer wars of the 1990s have always confused me. There seems to have been a kind of tribal allegiance that computer buyers participated in when they became computer owners. I've never understood why it had to be PC vs Amiga or Nintendo vs Sega or whatever.
My best guess is that a lot of the buyers were young kids who didn't have the maturity yet to see the world in a more flexible way. I was certainly guilty of that back when I was a teen.
It was because there was a lot of difference between the systems then, unlike today where they are all the same technology with a different sticker on the front of the case.
If you cared about games, the system you bought would be completely different than if you cared about business.
In the 8-bit era, if you wanted business you got a CP/M system with 80x24 text. If you wanted games, you got an Atari 800 or Commodore 64 with colors, hardware scrolling and sprites.
In the 16-bit era, there was the IBM PC for business, the Mac for people who wanted ease of use, the Amiga for games, 4096 color graphics and stereo sound, and the Atari ST (512 colors, mono sound) if you couldn't afford an Amiga. That said, they could all do games but obviously some were better than others, and they could all do business, but the perception was that game systems weren't good at business so business apps weren't made available.
Many things were platform specific at that time. You didn't have major game titles or business applications widely available across platforms. Folks who used PCs/Apple/etc at work/school were more likely to buy similar for the family at home etc.
In my family, I get the impression we chose our home computers based on merit/value. That meant starting with the Commodore 128 in the mid '80s, and led to my brother buying an Amiga 1000 [with his hard-earned teenage Burger King minimum-wage $] in 1987.
In the 90s the advent of Windows 3.1 running on cheap PC clones left Commodore in the dust. The value for the money shifted to PC, even if it was inferior at first.
It was really sad that the Amiga did not continue to innovate - the hardware was astonishing, which can be seen by looking at the demo-scene output and games from the time relative to what was possible on other platforms.
I think it was mostly because $1000 items were a bit harder to come by back then so kids would do a lot of reading and day dreaming before getting enough cash to pull the trigger. By the time they could actually afford a computer/monitor/printer/software they had talked themselves into how great the product was.
I owned an Amiga 4k/040 (after an A500, later with a Retina Z3) at the time and was blown away by Wolfenstein that I played on the 386 (?) of the dad of a friend.
The hype on Usenet for DOOM was quite the thing to behold. It was super hyped because Wolfenstein was so interesting. It even spawned two different Usenet FAQs to track the madness. Wolf was also the first game where I found out about motion sickness.
I think the biggest reason is just the difference in exposure. Doom had a ton of pre-release buzz because of Wolf3D, but then built a buzz of its own. PC sales had also been steadily increasing, so just in the time between the two games there were more PCs in consumer hands, and pretty much all of them were capable of playing Doom. There were just more instances of Doom running than Wolf3D.
Another aspect is that Doom as an engine had longer legs. It got an official sequel not a year after its release, as well as official "clones" in the form of Hexen. Hexen's own box advertised its relationship with Doom.
I feel like I'm taking crazy pills because the Amiga was an architectural nightmare but everyone pretends to love it.
There were multiple types of RAM - and not like the PC's conventional/extended/expanded or whatever: there were multiple PHYSICAL types of RAM, and upgrading one over the other might make it so upgrade cards didn't work, or were slower, or had incompatibility issues.
When I upgraded the video card in my A2000 to a Picasso-something, I had to have three different drivers and switch between them, because AmigaOS didn't have the concept of "a driver" like the Mac and PC: one operating mode for old Amiga programs hard-coded to OCS, one mode for compatibility with later Amiga standards, and one mode for card-specific high-res/high-color modes for pro apps.
How can you tell which mode you have to use? Just do things until it crashes and then guess that the video mode was the culprit and switch modes until one works, of course!
I installed too much of one kind of RAM, so the address space of the RAM expansion board overran the space for other Zorro II slots, which switched my graphics card into a slower mode unless I disabled some RAM. Was this documented, either by the board manufacturer or Commodore? No. Did I find out on a random BBS while complaining about how my fancy new graphics card was so slow? Yes.
The 68000 in my A2000 was slow the day it was introduced, so a couple of months later I forked out for an accelerator card. Because everything was so tightly integrated with the original CPU, and the Amiga solution to everything seemed to be "just hang it off the CPU bus and hard-code it to the timing of the original CPU", things would be blazing fast until they hit some kind of rarely-used Paula/Denise/Agnus function that HAD to occur at 7 MHz and crashed - at which point I had to turn the A2000 off, flip a toggle switch I had drilled into the back of the case, reboot, and redo what I was doing at the original speed.
But yeah, the last time I turned on my A2000, which I still have, (except to play Lemmings or pull some files off its hard drive) was around the time that Wolfenstein 3D came out.
edit: also, until its dying day Commodore refused to even acknowledge that computer networking existed. It was easier to share files and resources between an Apple II and a PC over a network than it was to share files and resources between two Amigas.
You are not taking crazy pills. You just had a bad experience because you bought a low quality accelerator and video card for your Amiga.
The different types of RAM were part of the reason the Amiga was so powerful and was a feature, not a flaw. The fast RAM was only used by the CPU so it could run faster without the custom chips slowing the CPU. Because the CPU mostly used the fast RAM, the custom chips weren't slowed down by the CPU. This effectively made the data bus nearly twice as fast.
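That split was baked right into the OS: you told exec.library which kind of memory you wanted. A minimal sketch using the real AllocMem()/FreeMem() calls (error handling trimmed; real code would fall back to MEMF_ANY when no fast RAM is fitted):

    #include <exec/memory.h>
    #include <proto/exec.h>

    void example(void)
    {
        /* Chip RAM: the only memory the DMA-driven custom chips can
         * see, so bitplanes and audio samples must live here. */
        APTR sample = AllocMem(8192, MEMF_CHIP | MEMF_CLEAR);

        /* Fast RAM: CPU-only, never contended by chipset DMA, so code
         * and data run at full speed from here. */
        APTR buffer = AllocMem(8192, MEMF_FAST | MEMF_CLEAR);

        /* ... use the buffers ... */

        if (buffer) FreeMem(buffer, 8192);
        if (sample) FreeMem(sample, 8192);
    }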
By the time custom graphic cards were coming for the Amiga, people were just trying to keep up with the IBM PC. Those video cards weren't designed to match the architecture of the system. That's not the Amiga's fault, but a fault of those aftermarket video cards. That is also why you had problems with RAM overlapping. I never used those cards and had no problems.
If you bought an IBM PC at 4.77 MHz and put a 286 accelerator in it, it would be hobbled too. If you wanted more speed, you should have gotten the Amiga 3000 in 1990 with a fully 32-bit 68030 @ 25 MHz with 32-bit RAM, like I did, and I never had to deal with crashes and it was faster than your accelerator.
I don't know how you can say Commodore refused to acknowledge that computer networking existed when they had released the A2065 Ethernet card for the A2000. Few people had LANs at home back then so it was mostly used in businesses.
It sounds to me like you made a number of bad purchase decisions and ended up with a mess of your own making. Sure there were good quality accelerators and video cards but apparently you didn't buy those ones or didn't want to wait for the Amiga 3000.
Here we go again. Someone flagging a post they don't like without rhyme or reason. A post that the community is obviously passionate about and enjoying to share their experiences and recollections. How annoying.
Another thing that killed the Amiga was its lack of hackability and approachability. On the C64 you could start writing code right away, and instantly go into assembly with an extension.
What extension are you referring to? It wasn't anything that came with the computer.
Neither the Amiga nor any of the other 16-bit systems had an assembler built in either. They had interpreted BASICs just like the 8-bit systems, except on disk instead of in ROM.
On the PC you could at least get by with DEBUG, which shipped with MS-DOS:

    C:\>debug
    -a 100
    (type your instructions, pressing Enter until you reach the end of your source code at whatever address)
    (enter a blank line, no opcode, to mark the end of the assembly range)
    -g =100

voila, your (naturally, bug-free, right?) code assembles and runs.
It's living about as close to the land as you can get when it comes to an assembler, but it did the job.
From MS-DOS 2.0 onwards, DEBUG was apparently re-entrant and could read its own binary from disk, "un"assemble it, assemble it again, and run another instance of itself from inside itself. When I first discovered this at 12 or so, it felt like I'd stumbled across a tiny doorway to some alternate dimension.