Erm... I see XOR for blinking cursors is obvious and should never be patented, but to blame only that for Amiga's demise is... a bit of an exaggeration. Management deserves most - in fact, almost all - of the credit for driving Commodore into the ground.
It's such a shame. Commodore made great computers.
The Amiga had sort of an identity problem. It was born a videogame console and NTSC timing was pervasive throughout the system and that made the design more complex as the machines evolved. They should have gotten rid of that as soon as they launched their second-generation machines.
To be fair, the author does not claim that the XOR patent was to blame for Commodore's demise. Rather, the claim is that:
1) Management "bet the farm" on CD32.
2) XOR patent infringement claims killed CD32.
3) The failure of the CD32 product was the last nail in the coffin.
Tragic. I loved my Amiga 500 dearly. It makes one wonder what might have happened if Commodore could have held on for a few more years and (perhaps under new management) ridden the wave of internet-driven hardware sales to new heights.
Technically the Amiga platform has always been rather agnostic about its timing & video output. It spits out PAL / NTSC / RGBS in various funky resolutions; all you have to do is ask, or set a jumper, depending on the model. Secondly, if it weren't for the spot-on NTSC carrier frequency the Amiga could output, we would never have seen products such as the Video Toaster. http://en.wikipedia.org/wiki/Video_Toaster
Vaguely related: I once hooked up a monochrome VGA monitor to my Amiga 1200 by soldering the monitor's cable to a suitable Amiga video connector and setting it up as 640x480 VGA in the OS settings.
To be fair, the A2024 had a refresh rate of 60 Hz, but due to the way it "multiplexed" 4 screens, it had an effective refresh rate of ~15 Hz (60/4). A much different, and less interesting, reality.
I don't think the first generation Amigas were that agnostic. You had chip RAM that was tied to video timings and fast RAM that wasn't. That indicates the motherboard timings were very coupled to the timings of the video generation circuitry and the dedicated chips.
That's not really the case. It was designed with computer graphics in mind - far more akin to a workstation than a game system. It was always intended as a full-fledged computer, as indicated by a multitasking operating system and peripherals such as the Sidecar for the A1000 and the Bridgeboard for the A2000. Standard serial and Centronics ports are further evidence of Commodore's intent.
Using standard video timing and providing stereo sound created significant benefits for audio-visual applications - and, in true Commodore tradition, the video system allowed the Amiga to be used with a television as the monitor, thereby reducing the cost of a basic system.
The Amiga 1000 was launched as a computer, but the product Amiga Corp (or Hi-Toro?) had developed when Commodore bought it was initially conceived as a videogame console. It was developed as a computer only after the games market collapsed.
Almost all mass-produced computers of that era (and essentially even present ones) have much of their timing derived from NTSC, not because they are explicitly designed to be connected to a television, but because of component costs. Most of the commonly used VGA video modes are essentially NTSC, and the 33MHz PCI clock (used as an essentially "global" clock now) of a modern PC is still generated by a PLL from a crystal that usually has some NTSC-friendly frequency. The reason is simple: you need a PLL anyway to generate lots of different clock signals, so you may as well use the cheapest xtal you can find, and that usually happens to be something used in lots of mass-market NTSC equipment. Today there are also some other common mass-produced xtals (for USB, Ethernet and such things), but in the '70s and '80s the only mass-produced things that needed exact timing were clocks and television equipment.
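To make the point above concrete, here is a small arithmetic sketch (the specific numbers are my own illustration, not from the comment): several classic machine clocks fall out as small integer ratios of the NTSC colorburst frequency, which is defined as 315/88 MHz.

```python
# NTSC colorburst is exactly 315/88 MHz by definition (~3.579545 MHz).
NTSC_COLORBURST = 315e6 / 88

# A very common "NTSC-friendly" crystal is 4x colorburst:
xtal_4x = 4 * NTSC_COLORBURST      # ~14.31818 MHz

# The NTSC Amiga's 68000, for example, ran at half that crystal,
# i.e. exactly twice the colorburst frequency:
amiga_cpu = xtal_4x / 2            # ~7.15909 MHz

print(f"colorburst:      {NTSC_COLORBURST / 1e6:.5f} MHz")
print(f"14.318 MHz xtal: {xtal_4x / 1e6:.5f} MHz")
print(f"Amiga CPU clock: {amiga_cpu / 1e6:.5f} MHz")
```

Picking one cheap mass-market crystal and dividing/multiplying it with a PLL is exactly the cost argument the comment makes.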
It's easy for HN to see that the XOR patent is obvious, and not just in hindsight. Convincing a court is not so easy. The solution is not to get rid of quote-unquote bad software patents, but to adhere to the Supreme Court precedents that make software per se non-statutory. Any patent whose novelty and non-obviousness is claimed only in software should rightly be invalid.
I disagree with their description of the 1981 Diehr opinion as saying "the only new feature of this invention was the timing process controlled by the computer". The dissent said this, not the majority. The dissent was based on the facts of the case, not law. Diehr was not a departure from earlier precedent.
EDIT: Not only did the Court majority not say what bitlaw says in the Diehr opinion, they contradict it: "According to the respondents, the continuous measuring of the temperature inside the mold cavity, the feeding of this information to a digital computer which constantly recalculates the cure time, and the signaling by the computer to open the press, are all new in the art."
I would argue that it doesn't say anything that conflicts with them. But you were citing old cases as authority for software being per se nonstatutory subject matter, and that's pretty clearly not true in the post-Bilski world.
Theoretically, if the only opinions that conflict with those old cases are from lower courts, then they're still valid precedent. However, you're quite right that in today's world you're unlikely to get a patent overturned because of Benson or Flook. This difference between theory and practice irks me.
Don't get me wrong, I don't think SCt Bilski does anything but muddy the water. But it's simply not the case that there is binding precedent that software is nonstatutory. That's all I was talking about.
From a practical standpoint, you're absolutely right, and I certainly hope nobody takes what I'm saying as legal advice.
From what's actually written in the Supreme Court opinions I've read, every software patent I've seen discussed on HN should be invalid. Benson and Flook were super clear. Diehr was long and hard to read, but when you boil it down, it didn't change anything; it merely clarified that software as part of an invention doesn't automatically make the whole invention nonstatutory. I haven't read Bilski, but it sounds like SCOTUS punted on clarifying things because they think it should be Congress's job.
Until Congress steps up and clarifies, I think the previous Supreme Court decisions should be law. But they aren't. I find that extremely annoying.
Somehow, when I got my first computer as a kid, it was an Amiga 3000. The Amiga was already dying, but as a kid I wasn't able to realize that at first.
What personally killed the Amiga for me was the graphics power. The best graphics mode of the Amiga 3000's Enhanced Chip Set was 320x200 with 64 colors (Extra Half-Brite). Meanwhile, 800x600 and 1024x768 displays with 256 colors were starting to become popular on the PC side. The Amiga was clearly behind.
The Amiga was also capable of 640x400, but only with 16 colors. And the "HAM" mode, 4096 simultaneous colors in low-res 320x200, suffered from "color fringing" and was mostly unusable.
What would have made me happy back then, was a 736 x 482 overscan mode with 4096 real simultaneous colors.
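For anyone curious why HAM fringed, here is a toy decoder (my own sketch, not anything from the comments; the control-bit layout is HAM6's). Each 6-bit pixel either sets a full color from a 16-entry palette or holds the previous pixel's color while modifying a single 4-bit component - so changing all three components takes three pixels, which is the fringing at sharp color transitions.

```python
# Toy HAM6 decoder. Control bits (top 2 of 6): 00 = set from palette,
# 01 = modify blue, 10 = modify red, 11 = modify green.
def decode_ham6(pixels, palette):
    """pixels: 6-bit values; palette: 16 (r, g, b) tuples of 4-bit values."""
    out = []
    r = g = b = 0  # simplified: start the scanline from black
    for p in pixels:
        ctrl, val = (p >> 4) & 0x3, p & 0xF
        if ctrl == 0:        # SET: take a full color from the palette
            r, g, b = palette[val]
        elif ctrl == 1:      # MODIFY blue, hold red and green
            b = val
        elif ctrl == 2:      # MODIFY red
            r = val
        else:                # MODIFY green
            g = val
        out.append((r, g, b))
    return out

palette = [(0, 0, 0)] * 16
palette[1] = (15, 0, 0)  # pure red
# Set red, then try to reach white: two intermediate "wrong" pixels appear.
print(decode_ham6([0x01, 0x1F, 0x3F], palette))
# -> [(15, 0, 0), (15, 0, 15), (15, 15, 15)]
```

The magenta pixel in the middle of that output is exactly the kind of artifact the comment calls fringing.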
I wish I could restore all the game demos that I made back then with AMOS Basic: lots of simple games, a basic Doom clone, musical apps, etc. I would love to take a look at that code again.
Hmm, I wonder about the accuracy of this, specifically about CD32s not being imported.
I worked, at that time, for the largest Amiga dealer in the US, and we built a couple hundred multimedia display kiosks using CD32s and Paravision SX-1s. We had no trouble getting them (well, other than the usual Commodore supply issues of the time).
I remember reading about it at the time. As I recall, some units had gotten in before the majority were seized. I have a CD32 (complete with an SX-1) in my garage that likely was one of your display kiosks at one time.
I have never been quite as excited over a machine as when I got my Amiga 500. I had owned a Commodore Plus/4 and a 128 since middle school and high school, and bought the 500 to start college. It was quite an amazing little machine for its day.
Then again, I haven't ever been 17 again either, so that might explain it.
Wow, I have a few cd32 discs in the basement somewhere - want them?
We would distribute burned CDs containing graphic, sound, and video/animation assets to the customers. This was '94, and if I remember correctly blank CDs were something like $8-10 each (and the burner must have been close to $1000), and the burning process was so fragile that I'd burn discs at night just to try to prevent any possible issues due to vibration, or power fluctuations, or whatever other gremlins caused burn failures. I think we still had only like a 75%-ish success rate.
I remember when I had a C64, and my friend wanted a computer for Xmas, and we were pestering his dad to get the 128, which was the best computer we knew of at the time. Then on Xmas day, he got an Amiga 500! We didn't even know about those. He was pretty happy. I was pretty bummed I only had a C64.
I first had a C64 and then an Amiga 500 ... those were some of my best memories as a child/teenager. My experiences with these machines undoubtedly led me to being a Computer Scientist as an adult. There will always be a place in my heart for Commodore.
If you Google define:factoid, you'll see the definitions are like "something resembling a fact; unverified (often invented) information that is given credibility because it appeared in print".
Looking at the suffix "-oid", it makes sense that the definition is akin to "similar to a fact."
However, I don't think that's what the author intended. "Factoid" has come to mean, "a little fact", it seems.
Although I used to be a huge Amiga fan, I think that Amigas of the time lagged behind other systems for gaming. PCs having chunky display formats (as opposed to planar on the Amiga) meant that PCs could do far more 3D graphics than an equivalently powered Amiga, and Amigas were often underpowered anyway. The Amiga kicked ass at 2D games, but could not hack it in 3D.
At the end of the day, had the Amiga been a viable business, $10M in patent fines would not have killed them off.
Bitplaned graphics was definitely an impediment to efficient rasterization. But when you look at what demo sceners have done with 3D on the Amiga, it's hard to argue that was its main downfall.
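The chunky-vs-planar cost difference is easy to show in a sketch (my own illustration; the function names are made up). In a chunky framebuffer, plotting a pixel is a single byte store; with bitplanes, you need a read-modify-write on one bit in each of N separate planes, which is what hurt texture-mapped 3D.

```python
WIDTH, HEIGHT = 320, 200

def put_pixel_chunky(framebuffer, x, y, color):
    # One store: the whole 8-bit color lives at a single address.
    framebuffer[y * WIDTH + x] = color

def put_pixel_planar(bitplanes, x, y, color):
    # One read-modify-write per plane (e.g. 5 planes for 32 colors):
    # bit i of the color goes into plane i at this pixel's bit position.
    byte, bit = (y * WIDTH + x) // 8, 7 - (x % 8)
    for plane in range(len(bitplanes)):
        if (color >> plane) & 1:
            bitplanes[plane][byte] |= 1 << bit
        else:
            bitplanes[plane][byte] &= ~(1 << bit)

chunky = bytearray(WIDTH * HEIGHT)
planes = [bytearray(WIDTH * HEIGHT // 8) for _ in range(5)]
put_pixel_chunky(chunky, 10, 5, 21)
put_pixel_planar(planes, 10, 5, 21)
```

A Doom-style inner loop does one of these per texel; five read-modify-writes per pixel versus one store is why demo coders on the Amiga often rendered chunky in RAM and did a "chunky-to-planar" conversion pass at the end.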
What killed the Amiga was stagnation in both technology and marketing. It also never had much of a foothold as a non-gaming machine in the home.
The IBM-compatible PC was the focus of innovation from many different companies. It wasn't a one-vendor platform like the Amiga. That is the ultimate cause of its demise. If the fate of the PC had stayed tied to IBM, it would have suffered the same end. Apple had a similar problem with the Macintosh in the 1990s and it almost killed them. If Apple's stewardship ever wavers, that is sure to happen again. It will take a lot more before it can become an immortal (undead?) platform like the PC.
The planar mode graphics had been a benefit in the more memory restricted days before; it was starting to become a problem but still, at that point Amigas tended to play the games of the day better than PCs costing 3x as much until Doom pushed 3D to the forefront. There were enough 3D games on Amigas, but Doom was a level beyond them for a while. Still, I did enjoy Acid Software's Gloom which wasn't that far behind.
As a hardware platform, it was more closely tied to the individual chips but that was rapidly fading, with OS abstraction APIs for sound and graphics coming in around then.
As a software platform, while it lacked MMU-dependent features, in other ways it had major benefits. Small, fast, pretty tidy, modular, customisable. Quite frankly I'd still like something to replace the Amiga's Datatypes concept, 20 years later.
A sad loss to computing. To this day I'm baffled how Commodore managed to lose the not-PC slot to Apple in spite of having cheaper hardware with more features and a vastly more capable operating system. I've said it before here: if Commodore had done their job properly, we'd be saying Steve Who?
I loved the C64 and the Amiga. They were my first and second computer. But it's painfully clear in retrospect that Commodore never stood a chance. Jack Tramiel ran the show like a cigar-puffing industrialist straight out of a Dickens novel. There was none of Gates's vision of computing or Jobs's sense of quality from the top of the company. The Amiga line was developed almost entirely outside Commodore and then bought up and brought in house.
I wouldn't blame Tramiel for Commodore's demise either. The Amiga was a very fine platform, and Tramiel was also behind the ST line (called, at that time, the "Jackintosh") after departing to Atari.
The ST line was truer to the Commodore of the VIC and 64 era in that it was simple (much simpler than the Amiga), powerful (somewhat less than the Amiga) and inexpensive.
At that time, nobody was predicting the collapse of the PC market around the IBM PCs (which were much more expensive than the home computers of the mid-to-late 80's and much clunkier).
I don't know much about the Amiga, but it sounds to me like the way in which it was a "nice computer" was that it was pushing a quirky design well past where it should actually have been, and that it was rapidly hitting the end of the road technically regardless of what the management was doing. That is to say, listening to descriptions of the Amiga reminds me of listening to descriptions of OS 9; yes, it's full of nice things and there's all sorts of ways to argue about how it's "better than PCs", but in the end you can't handwave around the fact that the foundation is at or beyond its capacity and there's no way to incrementally advance.
Apple managed to make the leap to OS X, as Microsoft managed to move to the NT base before it. I don't see how Commodore was going to do it without being something fundamentally different. Their machines were nice but it seems to me they were always building machines for today, never thinking much about tomorrow. But tomorrow comes... it always does.
I'm posting this because I'm curious about reactions; if this is flamingly wrong I'd love to hear. I don't have direct experience, except with the Commodore 64, which looking back was already experiencing the "make a computer for today" problem in many ways.
I have to disagree. The Amiga's design was not the decisive problem. The IBM PC prospers to this day with many traces of its legacy surviving as junk DNA from its haphazard evolutionary path. Had the Amiga survived in some form, no doubt the fully evolved platform would bear similar marks of its gaming-centric past, mostly in the hardware interface, not the operating system.
You yourself bring up the Mac's leap to OS X. On a technical level that wasn't so much a transition as a hard reboot, only made possible by some truly heroic programming of compatibility systems software. OS X got off to a bumpy start but not enough to kill it. The Mac since went from PowerPC to x86 by another feat of compatibility wizardry in the form of Rosetta. But what would have been easier than looking at the state of the Macintosh in the late 90s and pronouncing it a dead-end, a lost cause?
Commodore's problem was first and foremost cultural. I've always found myself in the unusual position of loving the C64 and the Amiga but hating Commodore.
The problem is that the management of the Amiga did little to evolve it. It didn't need a revolution, ala OS X. It just needed refinement over time, of which there was none. The Amiga that Commodore purchased was the Amiga that died.
The Amiga had preemptive multitasking in 1984. Had VideoScape 3D in 1987.
It's hard to say that they were building computers for today, when their machines were doing things the mainstream OSes (Mac and Windows) wouldn't do for a decade.
It would be like introducing the iPhone in 2000. And then doing nothing for 10 years.
I honestly can't think of another piece of computing technology that was so far ahead of the field when released. And yet virtually all of it squandered.
> I honestly can't think of another piece of computing technology that was so far ahead of the field when released.
Maybe the PDP-1? Or the Tera MTA? Or the Transputer? Or the CDC 6600? Or VisiCalc? Or the 6502? Or Intel's first X25-M SSD? Or Gmail?
Preemptive multitasking by itself isn't very interesting. It's useful if you're running a real-time control task (in which case, on MS-DOS, you would hook an interrupt, and fuck the lack of OS) or if you have memory protection, which the Amiga didn't. That's why the mainstream OSes didn't have it for another decade: because it didn't matter. (And concurrency control is much simpler without it.)
The Amiga had lots of other awesome stuff in the late 1980s, stuff that (as others pointed out) still isn't mainstream. But preemptive multitasking without an MMU is mostly a waste of time.
I did use an Amiga, but not long enough to really get a feel for it.
Gmail was so much better than the competition — Hotmail and Rocketmail — that Microsoft thought it was an April Fool's joke and advertised that new Hotmail users would get something like a terabyte each.
It's disappointing that you deleted your post, because I hadn't heard of the http://en.wikipedia.org/wiki/CDC_1604 before. The way that I heard the PDP-1 was significant was that it was something like 100 times the speed of any machine within a factor of 10 in price. But I can't source that in any detail. Is it possible that that statement was true but only until the 1604 came out in the same year?
Sorry, deleted my last post. Thought I was a bit snarkier than I usually am. :-)
I will say that I thought Yahoo was almost strictly better at Gmail launch, except one thing... storage. And Yahoo was actually reducing the size if I recall correctly.
I do use gmail now, although I still find the UI, on the web at least, to not be to my liking.
The 1604 was a great device, along with the PDP-1. The PDP was certainly smaller and cheaper (1/4 the price), but the CDC was faster. Both were among the first (maybe the first two?) transistorized computers.
Amigas of the day ran Mac software much faster than the Mac did - they shared processor architecture (68000), so there was no need to translate code, and unlike the Macs of the day, it had hardware acceleration for 2D graphics ("blitter") and sound (Paula, Denise, or Agnus - can't remember which).
The Amiga was a technology marvel when it came out - it could do background 14-bit stereo sample playback in 1985 (the PC got there around '92), and true useful multitasking with 128K (the PC got there around '94, but not with so little memory).
Graphics wise, it was on par with the arcade machines of the era - the PC got there in '93 or so.
Technically, it was superior. And then the business part was miserably run, and the tech side stagnated (that is, was also miserably run).
The Amiga was unique when it came out due to its custom sound and graphic chips that placed it beyond any existing home computer at the time (1985). Unfortunately, the hardware didn't change all that much and was slowly eclipsed first by graphics (VGA came out 1987 offering more colors) and then sound (SoundBlasters in the early 90s).
The other issue was the operating system, which is really three pieces---Exec, DOS and Intuition. Exec is the kernel, written in 68000 assembly, and appears as just another shared library to the system (a neat design, really). It's small (I was able to recreate over half of it in C in just a few hours) and would have been able to keep up as-is until multicore processors became popular, which would have required a somewhat major change (critical sections were handled by literally disabling task switching). It also has no notion of users (being a single-user machine).
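The "disabling task switching" bit works via Exec's Forbid()/Permit() pair, which maintain a nesting count so paired calls compose. Here is a toy model of that idea (my own sketch, not Exec source; the class and field names beyond TDNestCnt are made up):

```python
# Toy model of Exec's Forbid()/Permit() critical sections.
# Exec keeps a signed nesting counter (TDNestCnt) that starts at -1;
# task switching is enabled only while the counter is negative.
class ToyExec:
    def __init__(self):
        self.tdnestcnt = -1  # -1: scheduler may preempt

    def forbid(self):
        self.tdnestcnt += 1  # >= 0: scheduler will not switch tasks

    def permit(self):
        self.tdnestcnt -= 1
        # On the real machine, dropping back to -1 re-enables
        # preemption and performs any pending reschedule.

    def switching_enabled(self):
        return self.tdnestcnt < 0

ex = ToyExec()
ex.forbid()
ex.forbid()                        # nested critical section
ex.permit()
print(ex.switching_enabled())      # False: still inside the outer Forbid
ex.permit()
print(ex.switching_enabled())      # True
```

On a single CPU this is a perfectly serviceable lock; on a multicore machine a counter that only stops the local scheduler protects nothing, which is exactly why the comment says multicore would have forced a major change.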
DOS, the file system. This itself was a port of a portion of Tripos (written in BCPL) which only took two weeks to do (or four; it was a really short period of time). It's a bit quirky (due to it being written in BCPL) but it worked and given the nature of device drivers in the Amiga (a combination of shared library and a potential thread) it could easily support other file systems rather transparently. This is probably the least problematic aspect of the operating system to update for modern machines.
And finally Intuition, the GUI. This was written in C and, unfortunately, was rather closely tied to the underlying hardware. Not only could you specify a new screen (different screen resolution and color depth), but you had access to the blitter (which was also used by the floppy disk driver to decode the bit stream, I kid you not) and even the raw video co-processor (a three-instruction CPU that could be used to reset the video registers at an almost arbitrary point of the video scanning beam). Some of the upper layers of Intuition were fairly video-independent, but not enough of it to survive a major redesign of the video subsystem.
You're pretty much right in that it was a "nice computer" (I found it really fun to program) but it didn't evolve all that much while on the market. Exec would need some work to work on today's multicore hardware and Intuition would have to be reworked from the ground up.
In 1988 there was nothing like Caligari on the PC side for 3D graphics - at least not at an accessible price running on accessible hardware (1MB A500/M68000) from floppy disk - e.g. Autodesk's AutoShade was $1000 without a modeler, and views could not be updated interactively as they could with Caligari. In other words, AutoShade was in the same price range as the entire Caligari system (sans monitor) and required several thousand dollars in additional software resources.
Considering the cost of the 80287 math coprocessor required by the modeler (ACAD) and the cost of VGA adapters in 1987, the cost of doing 3D graphics on a PC was substantially more than on the Amiga.
Sorry, I was talking more about 3D games, in particular those that became popular around the time Doom came out (1993) which is the same year the CD32 was released.
The Amiga was great for 3D production graphics. Software such as Lightwave and Imagine 3D punched way above its weight (and cost) for a while.
Realistically the Amiga and CD32 were dead by the time 3D PC gaming came out (although it was still my main machine at the time, but I knew I was on a dead platform).