Advanced Amiga Architecture (1992) (archive.org)
111 points by doener on April 30, 2023 | 97 comments



For context: this document describes the design goals for the successor chipset to the AGA chipset shipped in the Amiga 1200 and 4000. If you watch the Deathbed Vigil video of the last day at Commodore there's a brief shot of a prototype, but this never made it to production before Commodore folded.


The document was published by Dave Haynie himself, who worked on the design and is also the producer of the Deathbed Vigil video. Haynie was one of the main hardware engineers for later models of the Amiga.


It's my understanding that AAA was supposed to be the successor to ECS, but business problems forced the creation of AGA, a stop-gap (so they thought) successor since ECS was rapidly becoming too out-of-date to compete with PCs.

Commodore never resolved its business problems and the rest is history.


Furthermore, Ranger, not ECS, was supposed to be the successor of OCS, with much higher specs.

Ranger was even finished and ready to go, but CBM management decided against it. It would have shipped around 1988.

Management culture of overruling architects was one of many reasons CBM went to hell.


> Management culture of overruling architects was one of many reasons CBM went to hell.

I used to be a massive fan of Amiga (still am) to the point where I'd still be waving the banner in the late 90s as my friends all were playing Quake and laughing at me. I've often been tantalized by these 'what ifs', but it's pretty clear in retrospect that IBM PCs were always going to win.

IBM (accidentally) democratized hardware expansion with PC clones, so technological advances could be stolen from other architectures, and competition could reduce costs. And, crucially, they already had a massive value advantage by having an excellent suite of business software, justifying their purchase price for many households. Looking at the graphs in this article it's pretty clear they had already won the war by the mid-80s.

https://arstechnica.com/features/2005/12/total-share/5/


It sounds like you're speaking from the assumption that a monopoly is inevitable in computing - the PC was so good, all other platforms had to fail. But all other platforms didn't fail.

If Commodore had run their company properly, delivered viable products, and marketed them properly, there are people who would have bought them. It's an open question whether that could have supported a viable niche business, and for how long.


What would have saved the other platforms is what (I argue) saved the Mac - the Internet itself.

Unfortunately widespread Internet access came a bit too late for too many of the platforms, and Moore's Law finished everything else off.


> monopoly is inevitable in computing

it's not a monopoly, it's an open standard, which is the opposite of a monopoly.


Conceded, that's the wrong word to use for the PC hardware market. But Windows + Intel definitely had something approaching a "desktop computer platform" monopoly in the mid-to-late 90s. And it could have become much worse, had Jobs et al. not turned Apple into a functional business again. Still, I think it's hard to say there wasn't (isn't!) room for more than one alternative platform. Commodore were obviously lousy at setting a sane strategy and following through on it to make money, to the point where it almost looks like they could have failed without any competitors at all - so I don't think it's particularly easy to say what a moderately well-run Commodore, with actual new products up through the 1990s, could have been able to do.


One, to me, obvious path forward for such a fictional Commodore, would have been for them to "embrace and extend" the PC platform. It could have been able to run PC "business" software fine. It could also have been able to run "Amiga" games and gfx productivity software.

Sort of like Voodoo had a 3D monopoly for a hot minute.


They did have something like that for the original A1000. The Sidecar, I think it was called? An add-on with an x86 for running DOS apps.


The other way around might have worked. If Amiga became a graphics card company they could have leveraged a market that was several orders of magnitude greater in size. Imagine a PC but with the sound and graphics of early Amigas, instead of CGA/EGA + PC speaker. Look again at the article in my earlier post[0], specifically the graph[1]. No way was Amiga going to supplant PC compatibles as a business necessity in most middle-class households. But as GP mentioned, graphics card companies are still ticking along today.

[0] https://arstechnica.com/features/2005/12/total-share/6/ [1] https://cdn.arstechnica.net/wp-content/uploads/archive/artic...

> In 1985, Bill Gates wrote an amazing memo to Apple management. In the memo, he praised the Macintosh for its innovative design, but noted that it had failed to become a standard, like the IBM PC was becoming. He correctly deduced that it was the advent of inexpensive, 100%-compatible clone computers that was propelling the PC ahead, and that any defects in the design of the computer would eventually be remedied by the combined force of the many companies selling PCs and PC add-on products, such as new graphics cards


They also had "bridge boards" for the later models, which supported up to a 386SX: http://amiga.resource.cx/exp/a2386sx


Precisely. IBM PCs also had a software ecosystem and brand around delivering business value. Just like Mac did. They both survived because of this. Once those ecosystems are established, it's hard to supplant them.


Apple is an existence proof that other platforms could have succeeded if well-managed. Apple almost died due to incompetent executive management, not because of the PC per se.


Apple had a much higher price point than the Amiga, it's not at all comparable.


They also had an exceptional suite of 'multimedia tools'. Apple cornered the market for desktop publishing, video editing, graphics tools, etc. well before the Amiga could. Amiga had some early inroads with broadcast media and CGI, but that was an awfully small niche. I believe Apple had the first machine with spreadsheet software.

Just like PCs had a serious business market from the beginning, so did Apple.


This was mostly true in the USA; Amiga was big in Europe.

This is generally attributed to a marketing failure by Commodore. They failed to communicate the Amiga's value to potential users and developers.


> mostly true in the USA

It didn't matter what part of the world, Apple simply had the superior graphic design and video editing tools. Amiga provided cut-price competition for 'non-linear' video editing, compositing, and 3D CGI (e.g., Newtek products), but once again, not taken as seriously, and not at the same scale as Apple products.

It's not so much that CBM failed the Amiga as that Apple aggressively beat CBM at developing markets for their platform:

> According to Eric Peters, one of the company's founders, most prototypes of "the Avid" were built on Apollo workstations. At some point, Avid demoed one of their products at SIGGRAPH. Says Peters: "Some Apple people saw that demo at the show and said, 'Nice demo. Wrong platform!' It turned out they were evangelists for the then new Mac II (with six slots!). When we got back to our office (actually a converted machine shop) after the show, there was a pile of FedEx packages on our doorstep. They were from Apple, and they contained two of their prototype Mac II machines (so early they didn't even have cases, just open chassis). Also there were four large multisync monitors. Each computer was loaded with full memory (probably 4 megs at the time), and a full complement of Apple software (pre-Claris). That afternoon, a consultant knocked on our door saying, 'Hi. I'm being paid by Apple to come here and port your applications from Apollo to Macintosh.' He worked for us for several weeks, and actually taught us how to program the Macs."

https://en.wikipedia.org/wiki/Media_Composer


How about both? CBM was (temporarily) succeeding despite great effort at failing.


I’d say that Apple was the exception, not the rule. Like a modern Silicon Valley startup, it was founded by the pairing of a technical mind and a business-savvy mind, who (unlike Tramiel) remained at the helm during the worldwide shift of business computing to the IBM PC. Also, Apple didn’t participate in the race to the bottom, and concentrated on making a high-quality product that justified its price tag as a serious business machine. Amiga started out trying to be the cut-price Apple, but they entered the game too late. Still, it is tantalising to think what could have been if Amiga had been founded alongside someone with business savvy.


I see that said a lot, but communicating the value of the Amiga was easier said than done. It was a machine that could do everything, yet it was built in the first place as a gaming machine (according to the original developers themselves). The C64 was also mostly used for gaming at that time, so the image of a computer that could be used for professional purposes was not really in their DNA. Companies went for PCs by default and did not care about the graphics capabilities of the Amiga. Apple had its own market, and the Amiga was mostly in a place where it was affordable for many consumers but lacked the professional support and software to turn it into something else.


It’s pretty sad how the platform stagnated until 1992 with AGA. ECS didn’t add much. AGA was too little, too late.


In Commodore's defense, Ranger needed what was at the time very expensive VRAM, which would have made Ranger-equipped Amigas prohibitively expensive.


An alternative history where Amiga originally shipped at the same or higher price point than Macintosh is interesting to imagine. Follow ons would then presumably have had more budget headroom for advanced custom hardware designs. Less initial users but possibly a shot at longevity.


"low-end" A500 could have been ECS, with Ranger being A2000-only.

Instead, A2000 was that much more expensive, while not adding anything that'd justify its pricing. Insanity.


So true. The base A2000 was way overpriced. All you got for the extra money was empty expansion slots!


That was CBM's line of argument.

CBM didn't see value in having the best machine, or being first to market.


The VRAM requirement suggests it was bandwidth starved. The easiest way of adding bandwidth without bumping the clock is going wider, and that's what AGA did. Ranger wouldn't have mattered anyway, with its 7-bit color and AGA-like planar arrangement, once games went all in on high-color bitmaps. Wing Commander runs at ~1fps on AGA (CD32) because of planar to packed conversion.


> Wing Commander runs at ~1fps on AGA (CD32) because of planar to packed conversion.

Are you sure about that? It was running decently on the Amiga 500; I would expect it to run much faster on A1200 hardware (including the CD32).


OK, it drops to 1fps; the normal AGA Wing Commander framerate is around 3-6fps (https://youtu.be/n5wXmeI8w9k?t=479). The A500 is ~2-6fps (https://youtu.be/iR1Nc4hLKq4?t=936), and this was a full £25/$50 retail release with 85-90% ratings. Simply amazing :o


The never released AAA, unlike AGA, had non-planar aka packed aka chunky modes.

The CD32, unlike the A1200/A4000, had an extra chip handling many tasks specific to that console, and that chip also had dedicated hardware for converting between packed (chunky) and planar formats.
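To make the cost concrete, here is a minimal C sketch of what a naive software chunky-to-planar conversion involves; shipped games used hand-tuned 68k assembly (or the CD32's conversion hardware), but the amount of bit shuffling per frame is the same in principle:

```c
#include <stdint.h>

/* Naive chunky-to-planar conversion: one byte per pixel in, 'depth'
 * separate bitplanes out. Every pixel's bits get scattered across
 * all of the planes. */
static void chunky_to_planar(const uint8_t *chunky, /* width*height pixels    */
                             uint8_t **planes,      /* 'depth' plane pointers */
                             int width, int height, int depth)
{
    int bytes_per_row = width / 8;           /* width assumed to be a multiple of 8 */

    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            uint8_t pixel = chunky[y * width + x];
            int     byte  = y * bytes_per_row + x / 8;
            uint8_t mask  = 0x80 >> (x & 7); /* leftmost pixel is the MSB */

            for (int p = 0; p < depth; p++) {
                if (pixel & (1 << p))
                    planes[p][byte] |= mask;
                else
                    planes[p][byte] &= (uint8_t)~mask;
            }
        }
    }
}
```

At 320x200 with 8 bitplanes that is roughly half a million read-modify-write bit operations per frame, which is the kind of tax a chunky-rendered game pays on planar hardware without conversion hardware.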


Commodore biz decisions were, to steal an expression, a Fractal of Bad Biz Design.

Even the tragically underpowered CD32 and Amiga 1200 machines would have been twice as fast for gaming if they had 64 kilobytes of extra RAM. Why? They had 2 megs of "slow" RAM: slow because it shared bus cycles with the graphics chips and because the CPU caches had to be turned off for it.

If they had just a sliver of dedicated CPU RAM, games could have put their inner loops in that RAM and achieved 2 to 4 times the speed in many calculations.

You could buy expansion RAM (2 megs or 4 megs were common) which did that, but very few customers did, and almost no games took advantage.

This was at the tail end of Commodore's life, but it's such an obvious example of a penny-wise, pound-foolish decision.


They could've also gone with a different configuration: 1 meg chip, 1 meg fast. Or even have spent a couple bucks more and made it hardware configurable! Few Amiga games needed 2 megs chip. Remember, the original systems only had 512K chip RAM.


>Few Amiga games needed 2 megs chip.

Can you name any? Non-AGA that is. (As AGA machines all came with 2MB chip)

I can't even name any that require 1MB chip. There were machines like the A500+ and A600 that shipped with 1MB chip, but they didn't sell anywhere near as much as the A500.

Some A500s (like mine) have an Agnus that supports 1MB chip, but unfortunately, unless the board is modified (a trace needs to be cut and replaced by another, a "solder jumper"), the trapdoor expansion shows up as slow RAM rather than extra chip RAM.


I really can't. I was being optimistic, assuming there were some late-stage Amiga games that required more chip RAM. Seems unlikely, I agree.


I touched an AAA Amiga 3000 prototype that Dave Haynie had, in an elevator.


Well, that's a bold move.


I never owned an Amiga until very recently; I owned an Atari back in the day. 30 years later, I understand one thing: Amiga/Commodore had the best engineers, Atari had the best business folks.


> Amiga/Commodore had the best engineers Atari had the best business folks.

This. The business folks at Atari weren't shortsighted like the ones at Commodore; adding a MIDI port alone helped place the ST in its own music-production niche ("I want a Mac but can't afford it"), which prolonged its life for years after Atari ceased production. Adding a MIDI port to the Amiga would have been trivial. I built and sold several at the time on local BBSes, and the cost was a few bucks: one 74LS04, one photocoupler, one voltage regulator and a few analog parts. Making it internal would have been even cheaper, with no serial DB25 connector or enclosure to add, not to mention parts purchased in the tens of thousands. Unfortunately the business folks at Commodore constantly cut corners by removing whatever they thought unimportant for what they still saw as a game console, so one would buy a desktop computer only to find it had no hard disk or real-time clock: boot from floppy and set the date/time on every boot? Come on! One wonders how it could not fail.


> Adding a MIDI port to the Amiga would have been trivial

The Amiga is unsuitable for serious MIDI work because of a hardware design flaw. There are four timers, and the timer interrupts are at a higher priority than the serial-port interrupt. There is only a 1-byte receive buffer for the serial port, so it was possible to lose data if one of the higher-priority timers fired at the wrong time.[1]

[1] https://dreamertalin.medium.com/music-x-b4abc68d6f78


I can't verify that since I sold everything ages ago, but before buying the A4000, first with my A500 and then the A2000 (w/ no acceleration) I could easily sample a complex flam+roll figure I did on my old Roland R8 pads at crazy granularity (software and hardware were capable of recording and playing 1/384 notes), and it didn't miss a single note. That figure was obtainable by pressing both flam and roll buttons while modulating the dynamics on the instrument pad; very handy to simulate natural cymbal rolls during song pauses, endings etc. I used it during a song start with the snare, and the only editing necessary was performed afterwards to cut the inevitable leftover notes because I was playing with my fingers. Software used was Dr T's KCS, which was a lot more optimized and snappy than MusicX, which I remember to be quite buggy too.


And yet, it was successfully used for serious MIDI work. It's simple: Don't use the serial port for MIDI.

Peripherals (via expansion port) can trigger level 2 and level 6 interrupts[0].

Notably, level 6 is the highest priority level in 68000, short of the NMI (level 7).

I have to agree with the parent, adding a MIDI port to the Amiga would have been trivial.

0. https://sites.google.com/one-n.co.uk/amiga-guides/amiga-inte...


Jay Miner, the engineer who created the Amiga's AGNUS/DENISE/PAULA chips, came from Atari where he previously created the TIA in the Atari 2600, and then the ANTIC/CTIA/POKEY chips in the Atari 8-bit computers. The Amiga is the 16-bit successor to the Atari 800 that Jay wanted to create at Atari but was told, "No".

But, yes, Commodore had a lot of great engineers too, some of which followed Jack Tramiel to Atari and created the Atari ST and others who remained at Commodore.


>The Amiga is the 16-bit successor to the Atari 800 that Jay wanted to create at Atari but was told, "No".

Later, that same Amiga team finished Ranger, the next generation chipset, and was told "No." at Commodore.

Amiga's history gets more depressing the more you know about it.


> Amiga/Commodore had the best engineers Atari had the best business folks.

It's debatable, as things were different in the 8-bit era. A total of ~2 million Atari 8-bit computers [1] were sold, compared to the 12.5-17 million figure for the Commodore 64 [2] alone. Arguably the Atari 800XL was engineered better [3] (faster, better graphics, worse sound).

[1] https://en.wikipedia.org/wiki/Atari_8-bit_family

[2] https://en.wikipedia.org/wiki/Commodore_64

[3] https://dfarq.homeip.net/atari-800-vs-commodore-64/


You're ignoring the Spectrum and MSX on this side of the pond. There were no Ataris in sight during the 8-bit days.

The UK and the Iberian Peninsula were all on the Speccy, the Netherlands was big into MSX, France and Germany were on the C64; no idea about the others, but surely not Atari.


True. I was replying to a comment that compared Commodore and Atari.


There were 8-bit Ataris in the UK; I still have a couple.


I feel like Commodore 64 had better sprites than the Atari 8-bit.


Yes, sprites (Player-Missile Graphics) were weird on Atari, probably for historical (i.e. Atari 2600) reasons.


The Commodore 64 also had better sprites than the Amiga. Jay Miner didn't really like hardware sprites. I think he was wrong about this.


There were tradeoffs. The C64's were eight pixels wider and could also be pixel-doubled, but the Amiga's were 3-colour without losing resolution and were not limited in height. The biggest limitation was that both systems supported only eight of them.

The AGA chipset supported sprites up to 64 pixels wide, but I dunno if any game took advantage of this.


On the other hand, you had a lot of flexibility and options both with what you did with the available hardware sprites (e.g. some games drew backgrounds using sprite hardware!) and how you could manage without them (using blitter objects instead).


I'd rather have lots of hardware sprites like the Genesis, in addition to the blitter. I think a lot of Amiga games felt like they ran slow because it's a lot easier to move a sprite than to move a blitter object.


> Amiga/Commodore had the best engineers Atari had the best business folks

While I had an Amiga and thought it was ahead of its time, I think both the Commodore 64 and Atari ST have aged better than the Amiga. The Commodore 64 has the SID chip so it's basically a programmable synth and still used even today, the Atari ST has MIDI ports, so it's a programmable MIDI Controller with timing that is arguably better than anything modern. Of course, the Amiga was Jay Miner's design and the successor to the Atari 800, but was released by Commodore. But I fail to see what has survived from the Amiga, personally I prefer hardware sprites to the blitter/copper and the bitplane graphics. The playback of sound samples was nice at the time, but the SID chip is much more distinctive. HAM was interesting at the time, but not relevant at all today. The pre-emptive multitasking was interesting, but not that useful at the time. I prefer the single tasking of the Atari ST especially for music apps, because the timing is more precise.


It's interesting to me that one of the big ideas that made the Amiga different was having a unified memory (the chip memory) where the specialised processors and CPU had equal access; it's now very mainstream in the gaming arena, with the XBox and PlayStation both using AMD processors that do just that.


This model of bus sharing was extremely common at the time, just splitting access to the bus on the off cycle of the bus. Commodore 64 and Atari ST were also like this, among others. I forget if the Atari 8-bit machines (also partially Jay Miner designs) were fully like this, but I believe they were.

It worked because the world then was almost the opposite of the way it is now. Back then, memory and the bus were as fast as or faster than the CPU, so you could do this kind of thing. To the point where there were architectures where the CPU itself had basically no registers (TMS 99xx), and the processor did register (or register-like, as in the 6502 'zero page') accesses via SRAM.

Now memory access is many times slower than the processor, keeping things in on-die cache and in registers is key.


It's pretty different compared to the Amiga. The Amiga flipped the bus between the CPU and the chipset every cycle, so the CPU could only access chip RAM every other cycle. No matter how little the chipset was doing, the CPU could only access RAM half the time. The Xbox/PS5 SoCs have both the CPU and GPU behind the memory controller, so even under load the CPU still has access to its full memory bandwidth.


>No matter how little the chipset was doing, the CPU could only access RAM half the time.

Only applicable to CHIP RAM. CPU could access FAST RAM, the default allocation target, all the time.

It also was not as simple as odd/even, although there's sense to the simplification. 68000's design meant it could ultimately only access RAM every other cycle, thus it was sensible to aim for that as the default.

Higher 68000 models did not have that restriction.

And even the original chipset also wasn't limited to half the cycles. This was only true of the default video mode (4 color "hires", 640 pixels wide).

The blitter also had a flag (BLTPRI, aka "blitter nasty") that allowed it to hog DMA, stealing cycles from the CPU.


>Only applicable to CHIP RAM. CPU could access FAST RAM, the default allocation target, all the time.

"default" haha. Afaik the only Amiga that ever shipping with fast ram was A4000. It was introduced at $4600 in 1992. In 1993 it was down to $2500, same price ARL was selling Pentium 60MHz system with 8MB ram.


>"default" haha.

Yes, default. AmigaOS AllocMem() will use fast32, then fast, then slow, then chip.

Thus software needs to request Chip explicitly when required.

This is fortunately well respected by software, thanks to "slow ram" being commonly mounted inside Amiga 500's trapdoor.
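For anyone who hasn't written Amiga code, a minimal sketch of what that looks like in practice, using the standard exec.library calls (the wrapper function is just for illustration, and error handling is trimmed):

```c
#include <exec/types.h>
#include <exec/memory.h>
#include <proto/exec.h>

void alloc_example(void)
{
    /* Generic allocation: exec hands back the "best" memory available,
     * i.e. fast RAM first if any is fitted. */
    UBYTE *work = AllocMem(4096, MEMF_ANY | MEMF_CLEAR);

    /* A bitplane (320x256, 1 plane) must be requested as chip RAM
     * explicitly, since only chip RAM is visible to the display,
     * blitter and audio DMA. */
    UBYTE *plane = AllocMem((320 / 8) * 256, MEMF_CHIP | MEMF_CLEAR);

    if (work && plane) {
        /* ... use the buffers ... */
    }

    if (plane) FreeMem(plane, (320 / 8) * 256);
    if (work)  FreeMem(work, 4096);
}
```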

>Afaik the only Amiga that ever shipping with fast ram was A4000

A3000 already shipped with a 1M+1M CHIP/FAST split in 1991.

For the other Amigas, fast RAM was a priority expansion.

Typically over 2x CPU performance for the A1200 just from adding any, via the trapdoor expansion slot.

On A500, the most sold model, you'd get this through the left expansion slot, either as a standalone board or mounted inside an HDD adapter, back in the day.

Today, A500 CPU socket adapters giving 8MB Fast RAM are common.


The flip side of that was that if you were writing a vanilla program that didn't take advantage of the special hardware, your performance was directly tied to what resolution mode the screen was in.


Well... only for a 32-colour/EHB/HAM lowres screen, or 8/16-colour hires screen, or if you were blitting. 16-colour lowres and 4-colour hires fit within the odd cycles and don't lock out the 68000 during display fetch.

Most people had a 4-colour hires Workbench so wouldn't be affected. People with a faster CPU would also likely buy Fast RAM to go with it.

Details: http://www.theflatnet.de/pub/cbm/amiga/amigadev.elowar.com/r...

Diagram: http://www.theflatnet.de/pub/cbm/amiga/amigadev.elowar.com/r...


... as long as there was no "FAST RAM" left.

Allocations would get fast ram with priority, unless chip ram was specifically requested.

Fast ram is exclusive to the cpu, and unaffected by the chipset accessing chip ram.

Amiga had this fast ram concept. Atari ST unfortunately did not.


> Atari ST unfortunately did not.

The Atari TT did. And it was possible on the Falcon with an expansion.


It was only really possible by not using the OS.

The main issue is that the OS did not have a way to request "chip RAM", thus software compatibility hell if added too late (never actually got added AIUI).


Not that it mattered by that point but TOS 3 for the TT did? For what, I don't know, really. Maybe people running Calamus or something could use it?

I think Atari Corp had a burst of engineering competence at the end, after Tramiel Sr handed over the reins to his sons. But it was too late; the market had moved on.

The Falcon is a lovely machine. The last TOS releases addressed some longstanding problems. Hiring Eric Smith to work on TOS/MultiTOS was a good, but late, effort. Porting System V and trying to make a budget Unix workstation was a doomed but clever effort (I still remember the UnixWorld news clip box headline "Up from toyland").

I bought a 486 late 1992, put Linux 0.97 on it, and moved on.


Nah that’s not true. Mxalloc() takes fast/chip flags just like AllocVec() etc. Fastram support was available since TOS v2, 1988(89?).
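For illustration, roughly what that looks like on the Atari side. This is a sketch from memory; the Mxalloc() mode values shown are my recollection and worth checking against the GEMDOS docs:

```c
#include <osbind.h>

void mxalloc_example(void)
{
    /* Mode values as I recall them: 0 = ST RAM only, 1 = Alt/TT RAM only,
     * 3 = either, preferring Alt/TT RAM. */
    void *st_ram  = (void *)Mxalloc(65536L, 0);
    void *alt_ram = (void *)Mxalloc(65536L, 1);
    void *any_ram = (void *)Mxalloc(65536L, 3);

    if (st_ram)  Mfree(st_ram);
    if (alt_ram) Mfree(alt_ram);
    if (any_ram) Mfree(any_ram);
}
```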


>Fastram support was available since TOS v2, 1988(89?).

Realize how late that is, relative to ST's release.


There were no machines with any kind of fast RAM until then.


There was OS support when fast RAM arrived (obviously). Since their chipset supported 4 megs of chip RAM (14 megs on later machines), they didn't have to worry about slow RAM etc. So it wasn't really an issue for them.


I'd add that honestly people weren't really complaining about the ST's speed at the time it was current. It beat the Mac and PC for horsepower and price.

While it was kind of a crappy games machine compared to the Amiga, it certainly was better than a PC or a Mac, and as an all-around general home productivity computer it was the best value for the money out there at the time. Which I think was the Tramiels' aim.


> Amiga had this fast ram concept. Atari ST unfortunately did not.

But the Atari ST didn't have custom chips, so there was no need for chip ram/fast ram?


The Atari ST had different chips, including e.g. a blitter (which they added in a later machine).

The problem persisted through the Falcon, whose 68030 was tremendously held back by the lack of a fast RAM concept.


And also of course in most x86 computers and Macs.


I would have killed to have access to this document at the time. Amiga documentation was very hard to come by in New Zealand.


I used to get Byte magazine back then. The cover date was three months in the past by the time I got it. Before the internet the News section was still relevant even after three months. Some books I ordered took 6 months if they arrived at all.

This was Invercargill, New Zealand in the 80's.


Plus one for this. It was easier to get people to mail me computer books from Europe than to buy them in 1990s rural Oregon.


Don't worry, it was almost just as difficult in non-rural areas of Oregon at the time. Eventually there was one Amiga dealership in Clackamas, which is I guess one more than, say, Bend had, but I still never managed to make it there before they eventually closed.


It was in the UK too.

I mostly ended up disassembling demos and using ST books for source code - non-GEM - like “how to do vector graphics with pre-made sin/tan tables”


The pixel format description on page 6 still makes my mouth water:

- bitmapped formats up to 16 bitplanes

- paletted chunky formats for 2/4/8 bits per pixel, and 16/24 bits per pixel without color palette

- block-compressed pixel formats with 12x and 6x compression ratio (looks quite similar to modern hardware-compressed texture formats used in 3D rendering - more details on page 123)



Would this document be enough to be able to create an AAA Amiga in FPGA?



Here's a YouTube video review. If I were going to retro-game on the Amiga, I would seriously take a look at buying one.

https://www.youtube.com/watch?v=YRr74v2tPzM


I'd suggest a miSTer instead.

Minimig is open hardware, which is the way to go.


Shame.


Not really. It might have had some relevance for a year if released in '93, but '94-'95 saw a rapid increase in CPU power that made 2D-accelerated graphics obsolete for PC games. This chip wouldn't have saved the Amiga; it's just another choice that sealed their fate.


1993 with bitplane graphics and Doom around the corner, dead in the water.


That new chipset would have had a chunky pixel format similar to mode 13h (1 byte per pixel, indexing a color palette), plus a lot of other chunky pixel formats, from 4 pixels per byte up to 3 bytes per pixel.

There are even hardware-compressed pixel formats mentioned (on page 6) which provide 6x and 12x data compression, which looks quite similar to today's block-compressed texture formats (more details are on page 123).

(also 68030 and 68040 CPUs were still competitive with 386 and 486)
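As a rough illustration of why chunky modes matter for this kind of game: plotting a pixel in a mode-13h-style framebuffer is a single address calculation and one byte write, versus a read-modify-write in every bitplane on planar hardware. A generic sketch, not anything AAA-specific:

```c
#include <stdint.h>

/* Chunky, mode-13h style: one byte per pixel, the byte is a palette index. */
static inline void put_pixel_chunky(uint8_t *fb, int width,
                                    int x, int y, uint8_t color)
{
    fb[y * width + x] = color;   /* one write, no bit shuffling */
}
```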


The Amiga could have had a 3D accelerator at that point, and much more CPU performance.


There's a lot of reasons Amiga/Commodore failed, but I think the core issue is that they were creating computers, but not ecosystems. No one wanted to do a full reset on their software again, something 80s computing was rife with.

PCs insulated themselves from this: hardware variability was baked in the moment PC clones appeared on the market, forcing the software market to design compatibility in (the bet on x86 was also a smart one). The Macintosh managed to survive the transition by forcing developers to use abstractions, making a 68k-to-PPC emulation layer more viable.

There was no pathway for Amiga to accomplish this. So much software was written direct-to-metal that the only feasible way to achieve performant backwards compatibility would have been to include a whole OG Amiga in whatever PPC Amiga they could cook up. Expensive, in an environment of 486+VGA IBM PC compatible clones that ran existing software, only faster.

Not to say this was the thing that killed them, but it would have been even if everything else was going great.


This is just layers on layers of hypotheticals. One could just as easily argue that since they controlled the architecture vertically, they could have virtualized the whole platform. (Exactly like how Sony made the PS2 compatible with the PS1, and the PS3 compatible with the PS2.)


Sony could pull that off due to their licensing model allowing hardware to be sold at a loss; being a massive conglomerate sitting in many different industries also helped. There was no pathway for Amiga to do this: they had vertical integration, but neither the control over developers that Apple had nor the licensing income console manufacturers had.

Commodore had its fans (well, not of Commodore so much as of the Amiga itself) and a bunch of niche use cases in the video production industry, but that wouldn't have been enough to sustain the R&D required to be competitive with the rest of the industry. By 1993 a lot of the industry was working on consumer-targeted 3D acceleration, and here's Commodore making some 2D chip.


Sony's backwards compatibility was nearer to the "include the old chips in the new system" approach that a hypothetical next Amiga would probably have had to use, at least for the immediately previous generation (PS1 on PS3 is emulation AFAIK[1]). It's not a good sign for that approach that the PS2 chips (and compatibility) were dropped pretty much immediately once Sony realised they needed to get PS3 costs down.

[1] I think some download games use a PS2 emulator on PS3, but that's not available in a general purpose "put any disc in console" mode.


Yes sure whatever, I’m just saying Amiga had an insane fanbase willing to throw money around, but Commodore declined.



