Some of these articles were written by Robert Kowalski (homepage: http://www.doc.ic.ac.uk/~rak/), who was part of the initial Prolog
gang/community, which was split between Marseille (FR) and Edinburgh (GB).
I went to the magazine expecting to read a 1985 article about declarative languages and instead (also) found a review of the first Amiga computer. This sentence in the editorial column made my day: "Dazzling graphics and audio and an open expansion bus make the Amiga the intellectual and technical heir to the Apple II." ... if only history had played out differently!
Well, it was more the heir to the Atari 800, whereas the Atari ST was actually the heir to the Commodore 64. The Atari 800 was far and away better at graphics and sound than the Apple II.
I remember buying this one, and seeing an Amiga a week later in the mall. The Prolog article blew my young mind. How in the heck is it figuring this out, BASIC doesn't do this?!?
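For anyone else who wondered the same thing: the magic is that Prolog doesn't run your statements in order the way BASIC does; you hand it facts and rules, and it answers queries by searching over them and backtracking. A very rough sketch of that "search over facts" flavor in C (the family facts and names are made up, and a real Prolog engine does general unification rather than this brute-force loop):

    #include <stdio.h>
    #include <string.h>

    /* A few made-up "facts", roughly like Prolog's parent(tom, bob). */
    struct fact { const char *parent, *child; };
    static const struct fact parents[] = {
        {"tom", "bob"}, {"bob", "ann"}, {"bob", "pat"}, {"pat", "jim"},
    };
    static const int nfacts = sizeof(parents) / sizeof(parents[0]);

    /* grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
       The "engine" just tries every combination of facts instead of
       following an explicit recipe you wrote out step by step. */
    static void grandparents_of(const char *z)
    {
        for (int i = 0; i < nfacts; i++)
            for (int j = 0; j < nfacts; j++)
                if (strcmp(parents[i].child, parents[j].parent) == 0 &&
                    strcmp(parents[j].child, z) == 0)
                    printf("grandparent(%s, %s)\n", parents[i].parent, z);
    }

    int main(void)
    {
        grandparents_of("ann");   /* prints grandparent(tom, ann) */
        grandparents_of("jim");   /* prints grandparent(bob, jim) */
        return 0;
    }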
I agree the Amiga is clearly influenced by the Atari 800, as Jay Miner's mere presence would indicate, but the Atari ST was mostly off the shelf hardware and the Commodore 64 was anything but. About all those two systems have in common philosophically or otherwise is an intentional skew to the low end of the market, and Jack Tramiel (but I repeat myself).
This. I would even go so far as to say the Atari ST was an intentional knock-off of the Amiga 1000, styled after the Commodore 64/128.

Tramiel spent his life building Commodore BUSINESS Machines (after his family was killed by the Nazis) only to have it taken away by a new chairman (Irving Gould) who reportedly "gained control in a poker game." Tramiel reportedly dropped $45K buying the dead Atari name (from Warner Bros, IIRC?), hired a hack for the design, relied on the MC68000 to do all of the processing (including graphics, though it probably had a DSP), and then dishonestly marketed the hardware as equal or superior to the Amiga, knowing full well it did not have the enabling chipset that Amiga Inc. had spent four starving years developing in Los Gatos before he acquired them for a song while still at Commodore. (Many of those engineers later died of self-neglect in the aftermath of the Gould/Tramiel war.)

The novice computer users and even the business/data-processing types of that era couldn't tell the difference between the Amiga 1000 and the Atari ST based on that level of marketing or the few demos available, but they sure could see the price difference, and that's how Tramiel took his revenge on Irving Gould. Worse, "in business, perception is the reality" and "be careful who your enemies are, as they will define you" -- so despite the then 30-year legacy of Commodore BUSINESS Machines, and despite Atari being nothing but a Halloween costume worn by Jack Tramiel, Commodore quickly came to be seen by Corporate America as a video-game toy maker like Atari. (Even you allude to this perception in saying that they had an "intentional skew to the low end of the market".)

Bill Gates, who knew better of Commodore, didn't help things any when he brilliantly decided, probably after consulting with Steve Jobs, against porting his MS Office to the Amiga, which totally ruined the Amiga's credibility -- because why wouldn't Gates want to support the very best chipset on the market, if it truly were? Then dystopia...
Please forgive me for this flippant speculation - if there is a biographical record of those who died of self-neglect, presumably in despair, please can you share information or suggest sources* - but I can't help thinking how much simply rebranding to CBM could have done for Commodore Business Machines. Granted, three-letter abbreviations are jealously guarded across industries, but usually speciously and purely with pseudo-legal hubris and wishful thinking. I may have just touched on a good reason why something like this wasn't attempted: I can't imagine the original Commodore folk going in for any superficiality or legal shenanigans, let alone anything affecting their identity. Thus Tramiel set the tone for commerce in the computing age much more broadly, I'm thoroughly disheartened to imagine.
* Not least because the industry was infinitely more artisanal (in the historical sense of the word) and the work more personalized, I very much want to read about this. If I may be so forward, it sounds like you have personal history close to this. I'm seriously interested in doing anything possible to help get history like this published, with effective general-reader marketing, and I can tap a variety of serious resources and experience.
> make the Amiga the intellectual and technical heir to the Apple II
The hallmark of the Apple II was its simplicity. A person could understand all of it, from the hardware up, quite easily.
The Amiga, OTOH, is a very complicated machine. There is no denying it was an incredibly capable machine back then, but, ultimately, its complexity became an issue when VGA and PCM sound became a thing on PCs and the ISA bus made it trivial to add new capabilities to a PC while Commodore had to push out a new computer (or, at least, a new chipset) for the same capabilities.
The GFX and sound for the Amiga were great but the 68k CPU was overrated. When you factored in how the memory bus worked, 68k machines didn't perform that much better than the Apple II.
Even Motorola gave up on the 68k line and every computer manufacturer that depended on it such as Apple, Commodore, Atari, Sun Microsystems and many others either scrambled to switch to a new CPU or went out of business.
The computer press of the 1980s tells a compelling story about the rise of the 68k but I've never seen a good account of the fall other than the account of why the BBC Micro didn't use it.
Only in 8-bit (and some 16-bit) operations and in interrupt responsiveness. And it's a royal PITA to program for, especially if you're working with anything larger than 64k.
3 registers. In two modes. None can hold more than 16 bits. Sometimes only 8 bits. Direct page (nicely movable) so you can .. sort of... have more registers. But none of those have an ALU and they take more cycles than a register.
You can address 24-bits, but never hold a pointer to any of that in a single register. So real fun doing framebuffer operations, etc.
Writing a compiler for it totally sucks.
The 816 looks great on a spec sheet until you actually try to write programs for it.
People did neat things with the IIgs. But I'm pretty sure they did it with their teeth clenched.
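To make the framebuffer complaint above concrete (a rough sketch, not real IIgs code; the base address and pixel offset are made up): since no single 65816 register holds a 24-bit address, you end up carrying every "pointer" as a bank byte plus a 16-bit offset, and patching the data bank register yourself whenever the offset wraps. In C terms:

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical linear framebuffer address, purely for illustration. */
    #define FRAMEBUFFER_BASE 0x012000UL   /* bank $01, offset $2000 */

    int main(void)
    {
        uint32_t linear = FRAMEBUFFER_BASE + 70000UL;  /* some pixel offset */

        /* On a flat 32-bit CPU this is one pointer. On the 65816 you carry
           it as two pieces, because no register holds 24 bits: */
        uint8_t  bank   = (linear >> 16) & 0xFF;  /* goes in the data bank register */
        uint16_t offset = linear & 0xFFFF;        /* what a 16-bit index can reach */

        printf("bank $%02X, offset $%04X\n", bank, offset);
        /* And every time 'offset' wraps past $FFFF you have to notice,
           bump 'bank', and reload the data bank register -- by hand. */
        return 0;
    }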
Faster at 32-bit math, but you don't always do 32-bit math, and people didn't do a lot of 32-bit math back then anyway. In fact, 32-bit math is where the 6502 goes to die, because it has nowhere near enough registers.
In terms of real experienced performance in the applications people ran at the time the 68k was a disappointment.
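For anyone who hasn't done multi-byte math on an 8-bit CPU, here is roughly why 32-bit arithmetic is where the 6502 goes to die: with one 8-bit accumulator you add a byte at a time, carrying between limbs, and the operands live in memory (zero page) because there is nowhere else to keep them. A rough C sketch of what a single 32-bit add expands into, mimicking the byte-at-a-time flow rather than actual 6502 syntax:

    #include <stdint.h>
    #include <stdio.h>

    /* One 32-bit add done the way an 8-bit CPU has to do it:
       four byte-wide adds, propagating the carry, with operands in memory
       (think zero page) because the register file is so small. */
    static void add32_bytewise(uint8_t a[4], const uint8_t b[4])
    {
        unsigned carry = 0;                      /* CLC */
        for (int i = 0; i < 4; i++) {            /* low byte first, like the 6502 */
            unsigned sum = a[i] + b[i] + carry;  /* LDA / ADC */
            a[i] = (uint8_t)sum;                 /* STA */
            carry = sum >> 8;
        }
    }

    int main(void)
    {
        /* 100000 (0x000186A0) + 200000 (0x00030D40), little-endian byte order */
        uint8_t a[4] = {0xA0, 0x86, 0x01, 0x00};
        uint8_t b[4] = {0x40, 0x0D, 0x03, 0x00};
        add32_bytewise(a, b);
        printf("%02X%02X%02X%02X\n", a[3], a[2], a[1], a[0]);  /* 000493E0 = 300000 */
        return 0;
    }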
In terms of real experienced performance: I had both an Apple II+ and an Amiga and you couldn't be more wrong if you tried.
And we didn't actually do much 32 bit math on the 68K except for address calculations; we mostly did 16 bit math. In fact, only the address ALU was 32 bit (two 16 bit ALUs, to be precise); 32 bit ops on data registers had to go through the 16 bit data ALU twice... which means that if you had to do 32 bit arithmetic, you could win some perf if you could express it as address calculations (LEA, I am looking at you!)
And of course 16 bit math was common enough that the Apple II included a virtual 16 bit machine in its ROMs, for code-density purposes, but with an obvious further speed hit [1]
But even comparing 16 bit instructions 1:1 on a 7 MHz 68K (Amiga, Mac) with 8 bit instructions on an Apple II, the 68K is faster, and the 1:1 comparison doesn't make sense anyway because the 68K has so many more registers and more powerful addressing modes and of course does more work per instruction, and and and.
So not really sure where this idea of slow 68K came from.
Maybe it is due to Wirth's law: "Software gets slower more quickly than hardware gets faster". The 68K was so much faster that people were much more ambitious in what they tried.
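For a rough back-of-envelope (cycle counts from memory, so treat them as approximate): a 16-bit add from zero page on the 6502 is something like CLC plus two LDA/ADC/STA sequences, around 20 cycles, which at the Apple II's ~1 MHz is roughly 20 microseconds. On a ~7 MHz 68000, ADD.W between registers is 4 cycles and with a memory operand around 8, i.e. about 1 microsecond, before you even count the bigger register file and the richer addressing modes. That's an order of magnitude per add, which matches how the machines actually felt.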
I don't recall anyone who actually put it into a system regretting the choice. Unlike the 6502, the 68k also had a viable path forward, which only really ended when all the workstation vendors + Apple decided to jump to RISC.
Yes, but by 1990 Byte had already degenerated into useless "application" reviews and mostly uncool ads. Something that happened to nearly all the magazines after some years (including e.g. Personal Computer World). And when that happened, long-time subscribers like myself just left.
The 68k fell from grace because Motorola -- unlike Intel -- didn't seem to really understand working on an ISA for the long run. And so they went off and worked on the 88k and signaled to their 68k customers that that was the future. The 88k was a failure. So then they went off to work on PowerPC. And in the interim didn't really advance the 68k.
Intel made similar blunders (i960, i860 among others). But while they were making those blunders they never killed off their x86 line, and continued to work on it. And they threw enough engineering $$ at x86 that they pulled the Pentium out of their hat and made x86 really fly, despite not being RISC.
Motorola could have, but never did make a 68k ISA equivalent to the Pentium. Freescale kind of did with the ColdFire, but many years too late.
Anyways, it's really apples and oranges when talking about 68k applicability in those old home computers.
Yes, the 68k was not nearly as cycle-efficient as a 6502; memory-access efficiency and interrupt responsiveness were way lower. And for the Amiga (and Atari ST), yes, the 68k may not have been the ideal CPU in that sense.
But if you look at the 6502 instruction set... there's just no reasonable way to make it 16-bit -- let alone 32-bit -- without major hacks. The 65816 does it with terrible hacks.
There really was no other option at the time other than the 68000 or x86. The 65816 (in the IIgs) came later and was, as I said, a hack job. x86 was gross. The NS32xx was full of bugs and probably had the same laggard cycle efficiency. The Z8000 was also weird and segmented, and probably slow.
68000 got you 32-bits, big linear address space, and seemed to have a future. And the Mac was using it, so.
I loved writing assembly for the 68k. The big-endianness is weird in retrospect, but it was fine at the time. Once they brought out the 020, 030, etc. and got a cache in front of the memory bus, it really started to fly, performance-wise. It's a shame that they EOL'd it.
PowerPC was nice, but ended up being effectively abandoned as well. And the move from 68k to PowerPC was a real struggle for Apple. MacOS "Classic" was really hardcoded for the 68k, and I can't imagine the millions of man-hours it took at Apple to make that transition. System 7.whatever on PowerPC was crashy as hell. It took them years to make a stable system out of it, and by then OSX was almost done. Then they only got another 5 or 6 years before Freescale just dropped the ball on them and they had to jump to Intel.
Now you know why they just make their own processors these days.
> But if you look at the 6502 instruction set... there's just no reasonable way to make it 16-bit
I remember being partial to the 6809 - it seemed like Motorola handled the 8-bit (6800) to 8/16-bit (6809) transition pretty well with that processor. But IIRC the only "mainstream" system using it was the Radio Shack Color Computer. I had the Z80-based Timex Sinclair color computer (I think it was known as the 2068), but I wanted to play with 6809 assembly programming, so I built an expansion card with a 6809 on it and wrote my own 6809 assembler in Z80 assembly. I could assemble 6809 code on the TS, send the resulting program over to the RAM on the expansion board, signal it to run the program, and then have it send results back to the Z80 side. I remember being so excited when I got that all to work. (Oh, to be young again and have time and energy for such projects :)
> I loved writing assembly for the 68k.
Later on I got an Atari ST and bought the full development system (C compiler, and a bunch of Xeroxed docs). 68K assembly was indeed so nice - it's just sad that Intel won and we were stuck with the x86 ISA for so long. I really never got into assembly programming for x86 because it was so ugly by comparison that I just couldn't bear to look at it.
In the early 1990s the "common wisdom" was that CISC architectures were obsolete and that RISC architectures would take over any day. Motorola was part of the alliance that developed the PowerPC architecture (even if it is often described as just coming from IBM). The lack of focus on the 68k was a self-fulfilling prophecy as current versions just couldn't keep up with Intel or the PowerPC, and newer versions weren't being created.
But they didn't throw the x86 under the bus in its name. If they had, Intel would be a has-been chipmaker the same way Motorola is.
In 1964 IBM realized it was a revolutionary idea to keep the same architecture from one generation of computers to the next. Intel was the second company to take this vision seriously and realize it, and that's why Intel not only made the first microprocessor but is still a dominant producer today. (Alternatively: the Apple II has no heirs because there was no progression to a compatible Apple 3, Apple 4, etc.)
The Apple II did have heirs, lots of them, right up to the late 1980s, which meant it was a viable platform for well over a decade: from the ill-fated Apple III to the various smaller/faster/better IIc/GS/e and whatnot. It's interesting to watch "The Computer Chronicles" from the 1980s on YouTube and be reminded of how diverse the personal computer industry was back then.
Apple had a dual strategy for far longer than is generally remembered, the Macintosh was a hit but it wasn't a home run for a long time.
It's still the same issue. Remember, Intel never got to 64-bit x86 on its own, and even now we call it amd64, which confuses people - we have an Intel chip, so why is it called amd64?!
No, Intel has abandoned x86 time and again. Until a competitor called them out, they carried on just like Motorola did, and that is why their CPUs got no real update for a decade. Once again AMD saved the day for x86.
And as for Intel's moves to RISC... they barked up the wrong tree, and even gave up on ARM because it was not invented here.
Just a badly managed engineering firm. It needs a woman in charge, I think - or at least it shouldn't ignore women when it comes to innovation. And it should keep the family going.
Intel's amd64 chip name was chosen by the individual contributor (and racing-car enthusiast) at Microsoft behind the NT kernel. There is an interview on YouTube with the actual person.
I coded on Apple IIs, Atari STs, and early Macs, and I remember the 68k machines being quite a bit snappier than the Apple II. Keep in mind, the ST and Mac had a lot more to do in order to make a higher-resolution screen and GUI perform at all.
Also, the big reason the 68k eventually fell out of favor was because it (1) wasn't ready on time for the IBM PC, and (2) couldn't keep pace with Intel on the low-end.
IBM chose the Intel 8088 over the 68000 for the IBM PC, which was more popular than IBM expected. Since the IBM PC was heavily documented (you could get not only schematics but BIOS listings for it), it was easy (if you were careful) to clone, and with Microsoft underselling the other operating systems for it, it became very popular. This caused Intel to push billions into making their chips faster and expanding into 32 bits. Unix workstations, which mostly started out on the 68000, typically switched to RISC architectures as Motorola couldn't keep up speed-wise (it didn't have the budget that Intel did thanks to the Wintel duopoly, and Motorola had other concerns than just chip design).
And I don't see how one could expand the 68k into a 64-bit design, but that was still a few years down the road by the time the 68k was in decline (sad to see, I was a fan of the 68k architecture).
Oooh that issue also has a preview of the Commodore Amiga, with epic block diagrams and tech specs. Having grown up learning the ins and outs of the Amiga 500, that is really nostalgic.
Also seeing "Computing at Chaos Manor", Jerry Pournelle's column, in the ToC gave me the warm fuzzies. He always came across as, uh I don't know, likable? Like somebody's magically technical super-nice granddad/uncle or whatever. When I read Byte I had no idea at first about his books, all I knew about him was what I gleaned from the columns. So weird. I guess in a way he was an influencer waaay before the term even existed? :)
'Super nice' is just not the vibe I got from his writing, personally. I liked him anyway. For more nostalgia there's his collection of non-computer columns, A Step Farther Out; IIRC the main topics were space and energy. Wasn't shy about despising anti-nuclear environmentalists, for a counterexample to "super nice".
What I remember is that something happened to Larry Niven (did he have a health problem like Heinlein did?) and then whenever there was a Niven book it was always a Niven-Pournelle book which wasn't as good as a Niven book. Pournelle was also known for his right-wing politics.
> Pournelle was also known for his right-wing politics.
And Niven isn't?
Niven said a good way to help hospitals stem financial losses is to spread rumors in Spanish within the Latino community that emergency rooms are killing patients in order to harvest their organs for transplants. “The problem [of hospitals going broke] is hugely exaggerated by illegal aliens who aren’t going to pay for anything anyway,” Niven said.
My opinion of Jerry Pournelle was that he was very opinionated and very arrogant. I enjoyed his writing style, although I wonder if it was his editor who encouraged him to recycle the many phrases that he seemed to think were cute.
I think he knew more than me about computers (not hard, because I was living in a hut in Africa, without running water or electricity, most of the time I was reading Byte), but much less than a lot of other people who could have written a better column.
What a surprise to go to Byte Magazine from 1985 and see Picasso’s “Interior with a Girl Drawing” on the cover. I hand-painted a copy of it using oil on canvas in a painting class in the mid-90s. [1] I still have it on the wall of my backyard cottage.
What a massive amount of information (including useful catalog-style ads) packed into a monthly magazine I could pick up at Waldenbooks. I might not have had the internet growing up, but I did have 400 pages of this every month, not to mention heavy volumes of QuickBasic manuals and Norton books, so I definitely wasn't starved for information as a budding computer nerd!
Just wanted to write the same. Today's magazines seem to have both much less content (< 100 pages) and less technical depth. It seems like one could have learned a ton just from reading those magazines back then, while today the information is spread much more thinly across millions of blog posts.
This year, Prolog turned 50!