I'm really curious to know (as an amateur non-expert fan of chip hardware history, and local history) -- when Intel or Fairchild, etc. developed a new chip with whatever capabilities, how did they explain or get people to quickly understand what it could do?
I haven't yet found a good popular-level explanation of this, such as from reading https://en.wikipedia.org/wiki/Intel_8086. I see the technical info, but have no idea how I would know whether this is fundamentally, amazingly better than what I have now, if I were in 1978 for example.
How did they figure out who would be their customers? Did their customers have engineers who could look at a chip spec and see that it was 3x better on speed, power, etc? Did the chip designers have some use case in mind when designing, and those would be the first people sold to by the sales team?
Was there a big sales effort needed for such new chips? or did they basically sell themselves?
> How did they figure out who would be their customers?
To some extent they didn’t. They didn’t think anyone would want the 4004, 8008, 8080 for computers. They started out marketing them for use in calculators. The personal computer market didn’t exist yet. PCs were originally built by hackers, many of whom belonged to the homebrew computer club. The first one to go into production was the MITS Altair 8800, which used an Intel 8080, but when you bought it you got a bunch of chips you had to solder onto the board yourself, so only hackers had any interest in it.
If you’re really interested in this stuff, I highly recommend the book Hackers: Heroes of the Computer Revolution by Steven Levy [1]. The book traces the history of hackers from its beginnings at the Tech Model Railroad Club at MIT through the Homebrew Computer Club at Stanford, and on into the beginnings of the computer game industry. It’s a fantastic chronicle of some very interesting and entertaining characters, with some real pranksters in the bunch. A very fun read!
>They didn’t think anyone would want the 4004, 8008, 8080 for computers.
The Wikipedia page on the 8008 says Intel didn't want to make CPUs at first, even though people were interested, because their business was mostly memory chips and they didn't want to compete with clients.
The 8008 was a commissioned project, and when the client decided to abandon it, they gave the IP to Intel in lieu of paying the bill. So Intel was like "what the heck, let's sell them at $120 apiece", in 1972.
I'm not that familiar with the history, although I did read Hackers a long time ago, but it sounds like CTC[1] may have largely designed what became the 8008 and gave rise to the 8080 and x86. Just looking at the Wikipedia pages for the 4004 and 8008, it seems like the latter generally resembles x86 and the former does not, so perhaps the whole dynasty is not exactly based on an Intel foundation. Reminds me of the way Microsoft got going with OSes.
That's basically correct. The 8008 was a single-chip version of the TTL processor in the Datapoint 2200 desktop computer / terminal. It is entirely unrelated to the 4004 except that many of the same Intel people worked on it. In other words, the view that the 4004 led to the 8008 is entirely fictional.
The Intel 8080 (used in the early Altair computer) was a slightly cleaned up version of the 8008, and the 8085 was a 5-volt version of the 8080. Intel's next processor was supposed to be the super-advanced 8800 with hardware support for objects and garbage collection. That chip fell behind schedule, so Intel threw together the 8086 as a stop-gap chip, a 16-bit processor somewhat compatible with the 8080. The 8800 was eventually released as the iAPX 432, which was a commercial failure but is definitely worth a look for its bizarre architecture -- a vision of what could have been.
The 4004 was designed for calculators. The 8008 was for a terminal, the CTC Datapoint 2200, and Intel (and TI) implemented their instruction set… and that's why today x64 FLAGS has PF.
That's a big area to discuss. Yes, chip manufacturers had large sales departments. Magazines like Electronics had articles discussing new chips (and other components) as well as lots of ads explaining the benefits of new products. (@TubeTimeUS posts a lot of these old ads on Twitter [1])
Intel in particular put a huge effort into support tools for microprocessors (assemblers, libraries, development systems, documentation, etc). They worked closely with customers to get the chips adopted. For instance, "Operation Crush" was an effort to get customers to use the 8086 instead of the Motorola 68000.
For more info on Operation Crush, I recommend John Doerr's Measure What Matters. The beginning of the book outlines the intense competition Intel had with Motorola, Intel's strategy, and Andy Grove's sales/business philosophies.
Those ads are a trip! It looks to have been like the Cambrian explosion of every kind of chip manufacturer and chip type back then. It must have been a very exciting, very seat-of-your-pants time -- amazing to imagine.
> I'm really curious to know (as an amateur non-expert fan of chip hardware history, and local history) -- when Intel or Fairchild, etc. developed a new chip with whatever capabilities, how did they explain or get people to quickly understand what it could do?
Apart from what other comments already correctly stated, the more straightforward answer to your question is probably: they did, and still do, develop and build reference designs. Take any silicon you can buy today and very likely you will find a detailed reference design in the data sheet. These designs don't exist only on paper; they are built and used for demonstration purposes and sales. As a customer, you can often buy what is basically the reference design in the form of evaluation boards too.
> no idea how I would know whether this is fundamentally amazingly better than what I have now, if I were in 1978 for example.
To start, you should define what role you imagine yourself having in 1978.
Do you imagine working in some company that has to decide which CPU to use for its new computer? Then the decisions were made exactly like now: you'd first consider the options with which you are more familiar, or for which you already had something "prepared." Second, you'd also want to avoid any option you had learned has weaknesses.
That's what moved the people in charge of building the first IBM PC, for example. They had experience with Intel, and also experience with developing using Intel chips.
In his previous post Ken linked to the text "A Personal History of the IBM PC" by David Bradley:
Intel did it by hiring engineers with athletic backgrounds (aka more charismatic/better looking ones) and fast-tracking them into the marketing/sales department :-)
If you're selling to business customers you just need to demonstrate value, it doesn't have to be technical e.g. "Our new generation of chips will save you a dollar in power while finishing in half the time" etc.
Computing hasn't changed all that much since then, and people haven't changed at all: for those in the know we have benchmarks; for those who aren't, we have regular marketing.
The first 'computers' (Babbage?) were deemed interesting because they could calculate sine tables, for example. Lots of those tables still contained errors because they had been calculated by humans.