The Intel 8086 processor's registers: from chip to transistors (righto.com)
148 points by chmaynard on July 18, 2020 | 38 comments



I'm really curious to know (as an amateur non-expert fan of chip hardware history, and local history) -- when Intel or Fairchild, etc. developed a new chip with whatever capabilities, how did they explain or get people to quickly understand what it could do?

I haven't yet found a good popular-level explanation of this, such as from reading https://en.wikipedia.org/wiki/Intel_8086. I see the technical info, but have no idea how I would know whether this is fundamentally amazingly better than what I have now, if I were in 1978 for example.

How did they figure out who would be their customers? Did their customers have engineers who could look at a chip spec and see that it was 3x better on speed, power, etc? Did the chip designers have some use case in mind when designing, and those would be the first people sold to by the sales team?

Was there a big sales effort needed for such new chips? Or did they basically sell themselves?


> How did they figure out who would be their customers?

To some extent they didn't. They didn't think anyone would want the 4004, 8008, or 8080 for computers; they started out marketing them for use in calculators. The personal computer market didn't exist yet. PCs were originally built by hackers, many of whom belonged to the Homebrew Computer Club. The first one to go into production was the MITS Altair 8800, which used an Intel 8080, but when you bought it you got a bunch of chips you had to solder onto the board yourself, so only hackers had any interest in it.

If you're really interested in this stuff, I highly recommend the book Hackers: Heroes of the Computer Revolution by Steven Levy [1]. The book traces hacker culture from its beginnings at the Tech Model Railroad Club at MIT, through the Homebrew Computer Club at Stanford, and on into the early days of the computer game industry. It's a fantastic chronicle of some very interesting and entertaining characters, with some real pranksters in the bunch. A very fun read!

[1] https://en.wikipedia.org/wiki/Hackers:_Heroes_of_the_Compute...


> They didn't think anyone would want the 4004, 8008, 8080 for computers.

The Wikipedia page on the 8008 says Intel didn't want to make CPUs at first, even though people were interested, because their business was mostly memory chips and they didn't want to compete with clients.

The 8008 was a commissioned project, and when the client decided to abandon it, they gave the IP to Intel in lieu of paying the bill. So Intel was like "what the heck, let's sell them at $120 apiece" in 1972.

I'm not that familiar with the history, although I did read Hackers a long time ago, but it sounds like CTC [1] may have largely designed what became the 8008, which then gave rise to the 8080 and x86. Just looking at the Wikipedia pages for the 4004 and 8008, it seems like the latter generally resembles x86 and the former does not, so perhaps the whole dynasty is not exactly built on an Intel foundation. Reminds me of the way Microsoft got going with OSes.

[1] https://en.wikipedia.org/wiki/Datapoint


That's basically correct. The 8008 was a single-chip version of the TTL processor in the Datapoint 2200 desktop computer / terminal. It is entirely unrelated to the 4004 except that many of the same Intel people worked on it. In other words, the view that the 4004 led to the 8008 is entirely fictional.

The Intel 8080 (used in the early Altair computer) was a slightly cleaned up version of the 8008, and the 8085 was a 5-volt version of the 8080. Intel's next processor was supposed to be the super-advanced 8800 with hardware support for objects and garbage collection. That chip fell behind schedule, so Intel threw together the 8086 as a stop-gap chip, a 16-bit processor somewhat compatible with the 8080. The 8800 was eventually released as the iAPX 432, which was a commercial failure but is definitely worth a look for its bizarre architecture -- a vision of what could have been.

I've written a detailed history of early microprocessors here: https://spectrum.ieee.org/tech-history/silicon-revolution/th...


The 4004 was designed for calculators. The 8008 was for a terminal, the CTC Datapoint 2200: Intel (and TI) implemented its instruction set, and that's why x64 FLAGS still has PF today.
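
A minimal sketch of that leftover (NASM-style x86-64; the values are just illustrative): PF still reflects the parity of the low byte of a result.

    bits 64
    mov  al, 0x03    ; 0b00000011: two bits set, i.e. even parity
    add  al, 0       ; arithmetic ops update FLAGS, including PF
    setp bl          ; BL becomes 1 because PF is set (even parity of the low byte)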


That's a big area to discuss. Yes, chip manufacturers had large sales departments. Magazines like Electronics had articles discussing new chips (and other components) as well as lots of ads explaining the benefits of new products. (@TubeTimeUS posts a lot of these old ads on Twitter [1])

Intel in particular put a huge effort into support tools for microprocessors (assemblers, libraries, development systems, documentation, etc). They worked closely with customers to get the chips adopted. For instance, "Operation Crush" was an effort to get customers to use the 8086 instead of the Motorola 68000.

[1] e.g. A Zilog ad explaining the benefits of the Z-80: https://twitter.com/tubetimeus/status/1276912575913984001

A long, interesting thread of component ads from 1967: https://twitter.com/TubeTimeUS/status/1280643791037140992


For more info on Operation Crush, I recommend John Doerr's Measure What Matters. The beginning of the book outlines the intense competition between Intel and Motorola, Intel's strategy, and Andy Grove's sales and business philosophies.


For much more detail about Operation Crush you can read "Marketing High Technology" by William Davidow https://www.amazon.com/Marketing-High-Technology-William-Dav...


Those ads are a trip! It looks to have been a Cambrian explosion of every kind of chip manufacturer and chip type back then. It must have been a very exciting, very seat-of-your-pants time -- amazing to imagine.


> I'm really curious to know (as an amateur non-expert fan of chip hardware history, and local history) -- when Intel or Fairchild, etc. developed a new chip with whatever capabilities, how did they explain or get people to quickly understand what it could do?

Apart from what other comments have already correctly stated, the more straightforward answer to your question is probably: they did, and still do, develop and build reference designs. Take any silicon you can buy today and you will very likely find a detailed reference design in the data sheet. These designs don't exist only on paper; they are built and used for demonstrations and sales. As a customer you can often buy what is basically the reference design in the form of an evaluation board, too.


> no idea how I would know whether this is fundamentally amazingly better than what I have now, if I were in 1978 for example.

For a start, you should define what role you imagine yourself having in 1978.

Do you imagine working at some company that has to decide which CPU to use for its new computer? Then the decisions were made exactly like now: you'd first consider the options you were more familiar with, or for which you already had something "prepared." Second, you'd want to avoid any option you had learned had weaknesses.

That's what drove the people in charge of building the first IBM PC, for example: they had experience with Intel, and experience developing with Intel chips.

In his previous post Ken linked to the text "A Personal History of the IBM PC" by David Bradley:

https://dl.acm.org/doi/10.1109/MC.2011.232

The text is very informative, but behind a paywall.

Here's a TED talk by him too, which explains the context for a non-specialist audience and is nice in its own way:

https://www.ted.com/talks/dr_dave_bradley_how_did_ibm_create...


Thanks for that!

If you search for the following link on your favorite scientific-paper illegal-sharing site, it's available as a PDF: https://ieeexplore.ieee.org/document/5984815


Intel did it by hiring engineers with athletic backgrounds (i.e. the more charismatic, better-looking ones) and fast-tracking them into the marketing/sales department :-)

It was called 'Crush': https://www.youtube.com/watch?v=xvCzdeDoPzg An enormous success.


If you're selling to business customers you just need to demonstrate value, and it doesn't have to be technical, e.g. "Our new generation of chips will save you a dollar in power while finishing in half the time," and so on.

Computing hasn't changed all that much since then, and people haven't changed at all: for those in the know we have benchmarks; for everyone else we have regular marketing.


The first 'computers' (Babbage?) were deemed interesting because they could calculate sine tables, for example. Lots of those tables still contained errors because they had been calculated by humans.

Faster horse and all.


For those interested in the number of physical registers on a modern CPU, Henry Wong and Travis Downs have done some great work deducing the physical implementation of recent Intel cores:

https://travisdowns.github.io/blog/2019/06/11/speed-limits.h...

http://blog.stuffedcow.net/2013/05/measuring-rob-capacity/

https://travisdowns.github.io/blog/2019/12/05/kreg-facts.htm...

https://travisdowns.github.io/blog/2020/05/26/kreg2.html


This was just before (1977) computers were used to design chips (Mead & Conway 1980), so you see some irregular hand-drawn parts here. I did a fair amount of hand-taped circuit design myself in the 1970s. Pre-computer designs were prone to current bottlenecks and dead-end wires; a computer could greatly reduce such errors.

By the mid-1980s, computer-designed CPUs were works of art with their regular, symmetric lattices, and they shimmered with rainbow colors as circuit lines shrank to the wavelengths of visible light.


Though Mead & Conway's students used CAD from the start, there is nothing in their method that would keep it from also being used for hand-taped designs.

Current attempts to design CPUs with open-source tools are not so nice, since they tend to come out as one big standard-cell blob. Adding some PLA and RAM generators would improve this.


I used that effect in the early nineties to make earrings out of them. In this form factor, delidded. [1] http://www.cpushack.com/chippics/Intel/8028x/IntelC80286-6.h...

It was difficult to reliably attach the clip to them; a dental-lab technician friend of mine wasted several diamond drill bits on them. I asked several jewelers, but of course they wanted insane prices while admitting they were unsure because they didn't know how to work with that stuff. So I resorted to superglue, with the effect that when you moved your head too fast one fell to the ground, made a very high-pitched PING, and cracked cleanly in half along its diagonal axis.

Didn't matter since I had quite a few of them :-)


This is extremely fascinating. (Incidentally, Feynman's description of this technology in his Lectures on Computation is also well worth reading.)


Interesting discoveries. Perhaps some of the multi-port features on the registers were for the REP instructions (i.e. REP STOSB/MOVSB, which update multiple registers) or for some of the more generic instructions like PUSH/POP?
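
For example (a rough 8086-style sketch, addresses and count just illustrative), each iteration of REP MOVSB reads and writes three registers at once:

    cld              ; clear direction flag so SI/DI increment
    mov cx, 16       ; byte count
    mov si, 0x100    ; source offset
    mov di, 0x200    ; destination offset
    rep movsb        ; each iteration: [ES:DI] <- [DS:SI], then SI++, DI++, CX--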


The plumbing around the BX/BH-BL registers is something I'd love to look at. BX, BP, SI, and DI were the only registers available for indexable operations (think "MOV WORD PTR [BX+2], AX"), and I wonder if the genesis of this behavior sits at the register file, or further away. BX/BH-BL is the only register capable of being used for indexable operations that is also 8-bit addressable.
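
Roughly, the restriction looks like this (8086 assembly, offsets just illustrative):

    mov word [bx+2], ax     ; fine: BX can appear in an effective address
    mov word [si], ax       ; fine: so can SI, DI, and BP
    mov word [bp+di+4], ax  ; fine: base (BX/BP) plus index (SI/DI) plus displacement
    ; mov word [cx+2], ax   ; not encodable: AX/CX/DX can't be used in addressing
    mov bl, 5               ; and among the index-capable registers only BX has 8-bit halves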


A byte-addressable index register was required for 8080 assembly-language compatibility with the H and L registers.


By "genesis of this behavior" I meant the implementation, not the reason (spec) for the behavior.


The other thing I'm interested to learn -- did the evolution of RAM proceed as kind of a "hand-me-down" technology from the CPU industry, basically tied to it?

That is, I imagine memory is just about cramming more and more into the same space, and doesn't require the same complexity of innovation as CPUs (maybe some new developments in addressing, buses, or whatever) -- mostly just increasing the density and fitting more storage locations into the same area?

Or are there very interesting stories about RAM too? I do know some of the advances in hard disk magnetic breakthroughs, but silicon memory, not so much.


Don't get me started on RAM. There's a whole lot of history there, especially core memory.

But to answer your specific question, Intel started off as a RAM company and their first product was a 64-bit (in total) RAM chip [1]. Processors were a sideline compared to the RAM market until the mid-1980s when Intel bailed out of DRAM as Japan took over.

When Intel created a new chip process back then (HMOS through HMOS-III), they would first build a static RAM chip with the process. Once that worked, they would then use the process for microprocessors like the 8086.

[1] http://www.righto.com/2017/07/inside-intels-first-product-31...


If you study for an MBA, Intel's memory business is its own separate case study:

https://www.gsb.stanford.edu/faculty-research/case-studies/i...


Memory circuits were a relatively easy entry point for low-cost Asian countries, so about 90% of that business migrated out of the US and Europe to Asia.


RAM came first. Even in the late 1960s, semiconductor SRAM was faster than core memory. Far more expensive, but faster. That created a market segment for caches and registers for high-end mainframes.

DRAM quickly followed, and the 1103 DRAM of 1970, holding a total of 1024 bits, was Intel's first breakthrough product: https://en.wikipedia.org/wiki/Intel_1103

DRAM and SRAM were already a multimillion dollar market by the time the first microprocessors came along a few years later.


DRAM requires fab steps that are incompatible with CMOS (and probably NMOS as well). This is one of the reasons you don't see much DRAM in CPUs, or compute in RAM.


The first few generations of DRAM used the same fabs as logic circuits. But the capacitor for each bit took up a proportionally larger and larger area with newer processes, so they came up with the idea of making the capacitors vertical instead of horizontal (essentially a deep well etched into the silicon, with the walls of the well acting as the capacitor), and the fab steps diverged.

You can still make DRAM with a logic process (normally for last-level caches), but it will be less efficient than the dedicated process.


Another reason is that DRAM is a lot slower than SRAM, and you want registers and cache to be as fast as possible.


Very interesting. I always wondered why it was called a register file, as if the registers did not represent physical locations in hardware but some abstract file, so it's good to know that's not the case.


> The registers of the 8086 still exist in modern x86 computers, although the registers are now 64 bits.

But they are logical (architectural) registers and are mapped at run time to one of the physical registers.
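
A small sketch of what that renaming buys (x86-64 assembly; the physical registers are invisible to software, and the mapping described in the comments is illustrative):

    mov rax, [rdi]   ; the renamer can give this write of RAX a fresh physical register
    add rax, 1
    mov [rdi], rax
    mov rax, [rsi]   ; a new physical register again, so this chain has no false
    add rax, 2       ; dependency on the one above and can run in parallel with it
    mov [rsi], rax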


The article is gold but the real gems are in the footnotes.


Huh, it has 15 registers? AX, BX, CX, DX, SI, DI, BP, SP; CS, DS, ES, SS, IP, flags. What am I missing?


There are two internal registers (IND and OPR) that aren't visible to the programmer. See the block diagram at the bottom of the post. The flags are in the ALU, not in the register file.


I believe the status (flags) register.



