"The original IBM PC had a clock speed of 4.77 MHz. The 8088 processor inside was actually specified to run at 5 MHz, so how did IBM end up at 4.77?"
"
At some point an IBM hardware design engineer made a leap: The Color Graphics Adapter would need a 3.579545 MHz signal to create a color subcarrier; one way to get that signal was to divide 14.31818 MHz by four; 14.31818 MHz is only about 5% less than the maximum speed of the Intel 8284 clock generator (which would divide it by three to get 4.77273 MHz for the 8088 processor). Thus by sacrificing 5% in performance, around $0.50 could be saved in parts"
One man's cheap'n'nasty is another man's elegance. And $0.50 was a lot of money back in those days ;)
Anyway, for computers of the era that generated TV output, it seems to have been common to start with whatever clock speed the video output required and work backward from there. Hence the odd CPU speeds of the C64, Amiga, Atari 8-bit, etc.
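Just to sanity-check the arithmetic (Python; 315/88 MHz is the exact NTSC color subcarrier frequency, so this is my own back-of-the-envelope, not from the article):

    # NTSC color subcarrier is exactly 315/88 MHz = 3.579545... MHz
    subcarrier = 315 / 88        # MHz
    master = 4 * subcarrier      # 14.31818 MHz reference crystal
    cga = master / 4             # back down to the subcarrier for CGA
    cpu = master / 3             # 8284 divide-by-three for the 8088
    print(master, cga, cpu)      # 14.3181..., 3.5795..., 4.7727...
    print(f"{1 - cpu / 5:.1%} below the 8088's rated 5 MHz")  # 4.5%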
For some real excitement, look at how the Apple 2's graphics worked. If you ever worked with hi-res mode, you know how screwy the colors were. It turns out they basically serialized the bits right out of RAM into a crude NTSC signal, and the bit patterns ended up producing different frequencies--hence different colors (the high bit of each byte specified whether to delay the other 7 bits by half a clock).
Looking at an Apple 2 schematic is eye-opening. There's hardly anything there!
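If you want a feel for the hi-res color rules, here's a rough rule-of-thumb sketch (Python, names mine, nowhere near a real NTSC model): a lone lit dot reads as purple or green depending on whether its column is even or odd, the high bit's half-clock delay shifts those to blue and orange, and adjacent dots merge to white:

    # Classic Apple II hi-res rule of thumb -- illustrative only,
    # real NTSC decoding is messier at color boundaries.
    def dot_color(column, lit, neighbor_lit, high_bit):
        if not lit:
            return "black"
        if neighbor_lit:              # two adjacent lit dots -> white
            return "white"
        even = column % 2 == 0
        if high_bit:                  # pattern delayed half a clock
            return "blue" if even else "orange"
        return "purple" if even else "green"

    print(dot_color(0, True, False, False))  # purple
    print(dot_color(0, True, False, True))   # blue (same dot, delayed)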
I have to do something similar with embedded devices to support USB. Of course, 12 MHz multiples don't look as odd as 3.579545 MHz to the casual observer.
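(For the curious -- a sketch of the arithmetic, assuming the common 4x-oversampling scheme; exact clocking varies by part:)

    # Full-speed USB signals at 12 Mbit/s; a typical transceiver
    # oversamples the bus 4x to recover the clock, so the core wants
    # an exact multiple like 48 MHz -- hence 12 MHz crystals.
    bit_rate_mhz = 12
    oversample = 4
    print(bit_rate_mhz * oversample, "MHz")   # 48 MHz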
As his first post reveals, he started writing the blog a week after his libel suit was dismissed. It's never too late to correct what you believe are inaccuracies that hurt your reputation.
Everything he's posted since then has been fascinating technolore, worth reading.
I was always under the impression that DOS (or QDOS, if we want to pick nits) was essentially a clone of CP/M, even to the point of using CP/M code.
I know the historical record is muddy, and there are competing versions of the story, but I'm uncomfortable calling Tim Paterson the "original" author when CP/M creator Gary Kildall was the one who really did so much from scratch.
EDIT: A very strong piece of evidence is the recent libel suit Paterson brought and lost against an author who called DOS a copy of CP/M:
"The Judge also agreed that Paterson copied CP/M's API, including the first 36 functions and the parameter passing mechanism, although Paterson renamed several of these. Kildall's "Read Sequential" function became "Sequential Read", for example, while "Read Random" became "Random Read"."
Did you read his blog? Because he addresses this very topic at some length.
Short version: he's never seen the CP/M code, and the similarity between the products isn't surprising, since they both are implementations of the same API.
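For the flavor of it, here's how closely the low function numbers line up (paraphrasing the manuals; an illustrative sample, not a complete list):

    # First few shared function numbers -- CP/M takes the number in
    # register C and CALL 5; DOS takes it in AH via INT 21h (and kept
    # the CALL 5 entry point for compatibility).
    shared_api = {
        1:  "console input",
        2:  "console output",
        9:  "print string, '$'-terminated",
        15: "open file (via FCB)",
        16: "close file",
        20: "sequential read",
        21: "sequential write",
    }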
He claims to have never seen the CP/M code, and I certainly don't have any inside knowledge as to whether that's true or not.
I've read claims that actual lines of CP/M could be found in QDOS. I've also read that Paterson "bought a CP/M manual and used it as the basis to write his operating system in six weeks."
Whatever the truth of the matter, it seems clear that QDOS got a real head start thanks to the existence of CP/M.
I'm not saying Paterson didn't do a lot of hard work, I'm just saying that the word "original" might be a stretch.
> I've read claims that actual lines of CP/M could be found in QDOS.
[citation needed]
Given the large differences in hardware, this seems pretty unlikely. But given they did implement the same API, I suspect that could be the source of some confusion.
> I've also read that Paterson "bought a CP/M manual and used it as the basis to write his operating system in six weeks."
Paterson owned a machine running CP/M and wrote software for it.
> Whatever the truth of the matter, it seems clear that QDOS got a real head start thanks to the existence of CP/M.
He actually attacks that notion head-on in the blog, if you'd bothered to read that far. Basically, he says they needed an API, and he thought using an existing one would pull developers in. If someone had sued them right away, they would have just changed the API--they weren't really wedded to it, and it's not the most important part of the OS.
He also claims that if CP/M proved anything, it was that a "real" OS could run on a microcomputer.
" At some point an IBM hardware design engineer made a leap: The Color Graphics Adapter would need a 3.579545 MHz signal to create a color subcarrier; one way to get that signal was to divide 14.31818 MHz by four; 14.31818 MHz is only about 5% less than the maximum speed of the Intel 8284 clock generator (which would divide it by three to get 4.77273 MHz for the 8088 processor). Thus by sacrificing 5% in performance, around $0.50 could be saved in parts"
Thus setting the tone of the future of the PC.