
The Intel 8086 processor's registers: from chip to transistors - chmaynard
http://www.righto.com/2020/07/the-intel-8086-processors-registers.html
======
supernova87a
I'm really curious to know (as an amateur non-expert fan of chip hardware
history, and local history) -- when Intel or Fairchild, etc. developed a new
chip with whatever capabilities, how did they explain or get people to quickly
understand what it could do?

I haven't yet found a good popular level explanation of this, such as from
reading
[https://en.wikipedia.org/wiki/Intel_8086](https://en.wikipedia.org/wiki/Intel_8086).
I see the technical info, but have no idea how I would know whether this is
fundamentally amazingly better than what I have now, if I were in 1978 for
example.

How did they figure out who would be their customers? Did their customers have
engineers who could look at a chip spec and see that it was 3x better on
speed, power, etc? Did the chip designers have some use case in mind when
designing, and those would be the first people sold to by the sales team?

Was there a big sales effort needed for such new chips? Or did they basically
sell themselves?

~~~
chongli
_How did they figure out who would be their customers?_

To some extent they didn’t. They didn’t think anyone would want the 4004,
8008, 8080 for computers. They started out marketing them for use in
calculators. The personal computer market didn’t exist yet. PCs were
originally built by hackers, many of whom belonged to the Homebrew Computer
Club. The first one to go into production was the MITS Altair 8800, which used
an Intel 8080, but when you bought it you got a bunch of chips you had to
solder onto the board yourself, so only hackers had any interest in it.

If you’re really interested in this stuff, I highly recommend the book
_Hackers: Heroes of the Computer Revolution_ by Steven Levy [1]. The book
traces the history of hackers from its beginnings at the Tech Model Railroad
Club at MIT through the Homebrew Computer Club at Stanford, and on into the
beginnings of the computer game industry. It’s a fantastic chronicle of some
very interesting and entertaining characters, with some real pranksters in the
bunch. A very fun read!

[1]
[https://en.wikipedia.org/wiki/Hackers:_Heroes_of_the_Compute...](https://en.wikipedia.org/wiki/Hackers:_Heroes_of_the_Computer_Revolution)

~~~
perl4ever
>They didn’t think anyone would want the 4004, 8008, 8080 for computers.

The Wikipedia page on the 8008 says Intel didn't _want_ to make CPUs at first,
even though people were interested, because their business was mostly memory
chips and they didn't want to compete with clients.

The 8008 was a commissioned project, and when the client decided to abandon
it, they gave the IP to Intel in lieu of paying the bill. So Intel was like
"what the heck, let's sell them at $120 apiece", in 1972.

I'm not that familiar with the history, although I did read Hackers a long
time ago, but it sounds like CTC[1] may have largely designed what became the
8008 and gave rise to the 8080 and x86. Just looking at the Wikipedia pages
for the 4004 and 8008, it seems like the latter generally resembles x86 and
the former does not, so perhaps the whole dynasty is not exactly based on an
Intel foundation. Reminds me of the way Microsoft got going with OSes.

[1][https://en.wikipedia.org/wiki/Datapoint](https://en.wikipedia.org/wiki/Datapoint)

~~~
kens
That's basically correct. The 8008 was a single-chip version of the TTL
processor in the Datapoint 2200 desktop computer / terminal. It is entirely
unrelated to the 4004 except that many of the same Intel people worked on it.
In other words, the view that the 4004 led to the 8008 is entirely fictional.

The Intel 8080 (used in the early Altair computer) was a slightly cleaned up
version of the 8008, and the 8085 was a 5-volt version of the 8080. Intel's
next processor was supposed to be the super-advanced 8800 with hardware
support for objects and garbage collection. That chip fell behind schedule, so
Intel threw together the 8086 as a stop-gap chip, a 16-bit processor somewhat
compatible with the 8080. The 8800 was eventually released as the iAPX 432,
which was a commercial failure but is definitely worth a look for its bizarre
architecture -- a vision of what could have been.

I've written a detailed history of early microprocessors here:
[https://spectrum.ieee.org/tech-history/silicon-revolution/th...](https://spectrum.ieee.org/tech-history/silicon-revolution/the-surprising-story-of-the-first-microprocessors)

------
exmadscientist
For those interested in the number of physical registers on a modern CPU,
Henry Wong and Travis Downs have done some great work deducing the physical
implementation of recent Intel cores:

[https://travisdowns.github.io/blog/2019/06/11/speed-limits.h...](https://travisdowns.github.io/blog/2019/06/11/speed-limits.html#ooo-table)

[http://blog.stuffedcow.net/2013/05/measuring-rob-capacity/](http://blog.stuffedcow.net/2013/05/measuring-rob-capacity/)

[https://travisdowns.github.io/blog/2019/12/05/kreg-facts.htm...](https://travisdowns.github.io/blog/2019/12/05/kreg-facts.html)

[https://travisdowns.github.io/blog/2020/05/26/kreg2.html](https://travisdowns.github.io/blog/2020/05/26/kreg2.html)

------
peter303
This was just before (1977) computers were used to design chips (Mead &
Conway, 1980). So you see some irregular hand-drawn parts here. I did a fair
amount of hand-tape circuit design myself in the 1970s. Pre-computer designs
were prone to current bottlenecks and dead-end wires. A computer could greatly
reduce such errors.

By the mid-1980s, computer-designed CPUs were works of art with their regular
symmetric lattices, and shimmered with rainbow colors as circuit lines shrank
to the wavelengths of visible light.

~~~
jecel
Though Mead & Conway students used CAD from the start, there is nothing in
their method that would keep it from also being used in hand-taped designs.

Current attempts to design CPUs with open source tools are not so nice since
they tend to be one big standard cell blob. Adding some PLA and RAM generators
would improve this.

------
Koshkin
This is extremely fascinating. (Incidentally, Feynman's description of this
technology in his _Lectures on Computation_ is also well worth reading.)

------
mmastrac
Interesting discoveries. Perhaps some of the multi-port features on registers
were for some of the REP features (ie: REP STOSB/MOVSB, which updated multiple
registers) or some of the more generic instructions like PUSH/POP?
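To see why REP would stress the register file, here's a toy Python model of REP MOVSB (my own sketch of the documented semantics, not Intel's microcode): every iteration reads and writes CX, SI, and DI at once.

```python
def rep_movsb(mem, si, di, cx, df=0):
    """Toy model of 8086 REP MOVSB: copy CX bytes from [SI] to [DI].

    Each iteration touches three registers (CX, SI, DI), which hints at
    why multi-ported register-file access would pay off here. Segments
    and wrap-around are ignored to keep the sketch small.
    """
    step = -1 if df else 1      # direction flag picks increment or decrement
    while cx != 0:
        mem[di] = mem[si]       # move one byte
        si += step              # both index registers advance...
        di += step
        cx -= 1                 # ...and the count decrements, every iteration
    return si, di, cx

# Copy 3 bytes from offset 0 to offset 10
mem = list(b"abc") + [0] * 20
si, di, cx = rep_movsb(mem, 0, 10, 3)
```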

~~~
EvanAnderson
The plumbing around the BX/BH-BL registers is something I'd love to look at.
BX, BP, SI, and DI were the only registers available for indexable operations
(think "MOV WORD PTR [BX+2], AX"), and I wonder if the genesis of this
behavior sits at the register file, or further away. BX/BH-BL is the only
register capable of being used for indexable operations that is also 8-bit
addressable.
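To make that asymmetry concrete, a quick Python enumeration (mine, just restating the 8086's 16-bit ModR/M addressing rules) shows BX sitting alone in the overlap:

```python
# The eight 16-bit memory addressing forms the 8086 can encode
# (displacements omitted). Only BX, BP, SI, and DI may appear.
ADDRESSING_FORMS = [
    ("BX", "SI"), ("BX", "DI"), ("BP", "SI"), ("BP", "DI"),
    (None, "SI"), (None, "DI"), ("BP", None), ("BX", None),
]

# The registers that are also addressable as 8-bit halves (AH/AL, BH/BL, ...)
BYTE_ADDRESSABLE = {"AX", "BX", "CX", "DX"}

used_in_addressing = {r for pair in ADDRESSING_FORMS for r in pair if r}
overlap = used_in_addressing & BYTE_ADDRESSABLE  # BX is the only member
```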

~~~
drmpeg
A byte-addressable index register was required for 8080 assembly-language
compatibility with the H and L registers.

~~~
EvanAnderson
By "genesis of this behavior" I meant the implementation, not the reason
(spec) for the behavior.

------
supernova87a
The other thing I'm interested to learn -- did the evolution of RAM proceed as
kind of a "hand me down" technology from the CPU industry and basically tied
to that?

In that, I imagine memory is just about cramming more and more into the same
space, and doesn't require the same complexity of innovation as CPUs (maybe
some new developments in addressing, bus, or whatever) -- mostly just
increasing the density and getting more storage locations in the same area?

Or are there very interesting stories about RAM too? I do know some of the
advances in hard disk magnetic breakthroughs, but silicon memory, not so much.

~~~
kens
Don't get me started on RAM. There's a whole lot of history there, especially
core memory.

But to answer your specific question, Intel started off as a RAM company and
their first product was a 64-bit (in total) RAM chip [1]. Processors were a
sideline compared to the RAM market until the mid-1980s when Intel bailed out
of DRAM as Japan took over.

When Intel created a new chip process back then (HMOS through HMOS-III), they
would first build a static RAM chip with the process. Once that worked, they
would then use the process for microprocessors like the 8086.

[1] [http://www.righto.com/2017/07/inside-intels-first-product-31...](http://www.righto.com/2017/07/inside-intels-first-product-3101-ram.html)

~~~
rasz
If you study for an MBA, Intel's memory business is its own separate case
study:

[https://www.gsb.stanford.edu/faculty-research/case-studies/i...](https://www.gsb.stanford.edu/faculty-research/case-studies/intel-corporation-dram-decision)

------
porknubbins
Very interesting, I always wondered why it was called a register file, as if
registers did not represent physical locations in hardware but some abstract
file, so its good to know thats not the case.

------
mjbrusso
> The registers of the 8086 still exist in modern x86 computers, although the
> registers are now 64 bits.

But they are logical (architectural) registers and are mapped at run time to
one of the physical registers.
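A minimal sketch of that mapping (mine, not any real microarchitecture's rename logic): each write to an architectural register grabs a fresh physical register, so independent writes to "RAX" don't serialize on one storage location.

```python
from itertools import count

class Renamer:
    """Toy register renamer: architectural name -> current physical reg."""

    def __init__(self):
        self.free = count()   # unbounded pool of physical regs, for the sketch
        self.map = {}         # architectural name -> newest physical reg

    def write(self, arch_reg):
        phys = f"p{next(self.free)}"
        self.map[arch_reg] = phys   # later readers see the newest mapping
        return phys

    def read(self, arch_reg):
        return self.map[arch_reg]

r = Renamer()
r.write("RAX")   # first write lands in one physical register...
r.write("RAX")   # ...a second write gets a different, independent one
```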

------
rahimiali
The article is gold but the real gems are in the footnotes.

------
a1k0n
Huh, it has 15 registers? AX, BX, CX, DX, SI, DI, BP, SP; CS, DS, ES, SS, IP,
flags. What am I missing?

~~~
kens
There are two internal registers (IND and OPR) that aren't visible to the
programmer. See the block diagram at the bottom of the post. The flags are in
the ALU, not in the register file.

