
History of MOS 6502 [video] - teknotus
https://www.youtube.com/watch?v=wOJj-IdYZxI
======
PhantomGremlin
Not a bad talk, considering that the guy wasn't even alive when the original
6502 was designed. He was confused on a few minor points, some of which
Wikipedia clears up.[1][2]

a) There were really two chips: the 6501 and the 6502. They differed in only a
few pins; the 6501 was fully pin-compatible with the 6800, while the 6502 was
easier to design into a system because it did not require a two-phase non-
overlapping 5V rail-to-rail clock.

b) The 6501 cost $20, the 6502 cost $25. Quantity 1. They ran at 1 MHz. You
could send a letter to the company in Pennsylvania, with a check enclosed, and
buy a single chip at that price. Try buying one CPU chip from Intel nowadays. :)

c) The lawsuit by Motorola that he mentions resulted in MOS agreeing to
discontinue the 6501. But that didn't matter much for 99% of their potential
customers. First, the pin differences were very minor. Second, the TTL-level
clock input in the 6502 meant that an external clock driver chip wasn't
needed.

[1]
[http://en.wikipedia.org/wiki/MOS_Technology_6502](http://en.wikipedia.org/wiki/MOS_Technology_6502)
[2]
[http://en.wikipedia.org/wiki/Motorola_6800](http://en.wikipedia.org/wiki/Motorola_6800)

~~~
PhantomGremlin
I forgot to mention another thing he didn't realize was useful:

d) BCD instructions, which he thinks are "kind of silly", and perhaps they are
now that computers run at many GHz. But they were very useful in the old days.
E.g. CBASIC [1] implemented BCD floating-point math. Here's what Wikipedia says
about it:

    CBASIC proved very popular because it
    incorporated 14-digit binary-coded decimal
    (BCD) math which eliminated MBASIC's rounding
    errors that were sometimes troublesome for
    accounting.

As the reference manual [2] linked from Wikipedia says:

    Real numbers are stored in eight bytes of
    memory. The first byte is the sign and
    exponent. The exponent is maintained in
    excess 64 code. The seven remaining bytes
    contain a normalized mantissa stored as
    packed decimal digits. The high order four
    bits of the rightmost byte is the most
    significant digit of the mantissa.

That was of course long before IEEE 754 floating point.[3] BTW, did anyone know
that IEEE 754 also defines decimal floating-point formats (added in the 2008
revision)? I didn't know that until just now. It's probably not that widely
used.

When a CPU runs at 1 MHz instead of 3 GHz, doing FP in BCD means much simpler
and much faster conversion between the internal representation and the
displayed digits. A lot of early microcomputers were used by small businesses.
BCD is inferior for heavy scientific calculation, but it is ideal for simple
small-business accounting, which was CBASIC's target market.
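To make the quoted format concrete, here's a sketch of decoding such an
8-byte real in Python. The layout follows the manual excerpt above; the digit
ordering and the exact scaling of the mantissa are my reading of that excerpt,
not verified against CBASIC itself:

```python
from decimal import Decimal

def decode_cbasic_real(b):
    """Decode an 8-byte CBASIC-style real, per the manual's description:
    byte 0 = sign bit plus exponent in excess-64; bytes 1-7 = 14 packed
    BCD digits, most significant digit in the high nibble of the last
    byte. Digit order and mantissa scaling are assumptions from the quote."""
    sign = -1 if b[0] & 0x80 else 1
    exp = (b[0] & 0x7F) - 64              # excess-64 exponent
    digits = ''
    for byte in reversed(b[1:]):          # rightmost byte holds the MSD
        digits += f'{byte >> 4}{byte & 0x0F}'
    mantissa = Decimal('0.' + digits)     # normalized fraction 0.d1...d14
    return sign * mantissa * Decimal(10) ** exp

# Example under those assumptions: exponent byte 0x41 (exp = +1),
# mantissa digits 31415926535898 -> 3.1415926535898, exactly.
print(decode_cbasic_real(
    bytes([0x41, 0x98, 0x58, 0x53, 0x26, 0x59, 0x41, 0x31])))
```

Using `Decimal` keeps the point of the exercise visible: every one of the 14
digits round-trips exactly, which is exactly what the accounting users wanted.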

[1] [http://en.wikipedia.org/wiki/CBASIC](http://en.wikipedia.org/wiki/CBASIC)
[2]
[http://www.cpm.z80.de/manuals/cbasic-m.pdf](http://www.cpm.z80.de/manuals/cbasic-m.pdf)
[3]
[http://en.wikipedia.org/wiki/IEEE_floating_point](http://en.wikipedia.org/wiki/IEEE_floating_point)
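For reference, here's roughly what the 6502's BCD instructions did: with the
decimal flag set, ADC treats each byte as two packed decimal digits. A Python
sketch of the add-with-decimal-adjust (simplified - the real chip's N/V/Z flag
behavior in decimal mode is notoriously quirky, and is not modeled here):

```python
def adc_decimal(a, b, carry=0):
    """Sketch of 6502 ADC in decimal (BCD) mode: each byte holds two
    packed decimal digits, so $19 + $28 yields $47, not binary $41."""
    lo = (a & 0x0F) + (b & 0x0F) + carry
    if lo > 9:
        lo += 6                            # decimal-adjust the low digit
    hi = (a >> 4) + (b >> 4) + int(lo > 0x0F)
    lo &= 0x0F
    if hi > 9:
        hi += 6                            # decimal-adjust the high digit
    carry_out = int(hi > 0x0F)
    return ((hi & 0x0F) << 4) | lo, carry_out

print(hex(adc_decimal(0x19, 0x28)[0]))     # 0x47, i.e. decimal 19 + 28 = 47
```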

~~~
ANTSANTS
An interesting related tidbit: the BCD implementation was the only part of the
6502 that MOS patented. Since you could not copyright an IC mask at the time,
Nintendo/Ricoh were able to clone the 6502 without paying license fees simply
by cutting the traces to that part of the chip. (Ricoh was a licensed second
source for the 6502. I imagine Commodore was not happy with them after
that...)

------
blueatlas
Here is an excellent interview with the lead designer of the 6502 - Chuck
Peddle.

[http://retrobits.libsyn.com/show-123-an-interview-with-
chuck...](http://retrobits.libsyn.com/show-123-an-interview-with-chuck-peddle-
part-i)

------
Theodores
There are some cringeworthy aspects to this talk if you know the story. That
doesn't actually detract from it, because it is interesting to watch someone
tell the story who reveres the 6502 yet has never coded for it.

At the time the 6502 hit the computer scene in a mass-market way, you didn't
type your code straight into a computer. Instead you wrote it out on paper,
looked up the opcodes, and wrote them in the margin; those opcode bytes were
what you typed in on many personal computers of the era. The BBC Micro had an
inline assembler you could drop into from BBC BASIC, so if you wanted to do
something really cool then assembly was an option.
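That look-up-the-opcodes step can be sketched mechanically. A toy
hand-assembler in Python - the three opcode encodings shown are real 6502
opcodes, but the tiny table and the triple format are my own scaffolding:

```python
# A few real 6502 opcodes; the addressing mode matters, which is why the
# key is (mnemonic, mode). This table is deliberately tiny.
OPCODES = {
    ('LDA', 'imm'): 0xA9,   # LDA #immediate
    ('STA', 'abs'): 0x8D,   # STA absolute
    ('RTS', None):  0x60,   # RTS
}

def assemble(program):
    """Turn (mnemonic, mode, operand) triples into the opcode bytes
    you'd have pencilled into the margin."""
    out = []
    for mnemonic, mode, operand in program:
        out.append(OPCODES[(mnemonic, mode)])
        if mode == 'imm':                          # one-byte operand
            out.append(operand)
        elif mode == 'abs':                        # two bytes, little-endian
            out.extend([operand & 0xFF, operand >> 8])
    return out

# LDA #$41 / STA $0400 / RTS hand-assembles to: A9 41 8D 00 04 60
listing = assemble([('LDA', 'imm', 0x41),
                    ('STA', 'abs', 0x0400),
                    ('RTS', None, None)])
print(' '.join(f'{b:02X}' for b in listing))
```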

This was the pre-internet era, where you might have a few books and plenty of
magazines - piles of them, normally, with every back issue kept. You could not
Google answers and have some SO answer pop up seconds later; you had to rifle
through the indexes of the few books you had. This was not an obstacle to
learning: you could actually memorise the whole instruction set and be fairly
sure of the opcodes. So, in a way, you could be your own Google.

I had exposure to the Z80 before the 6502, and I also had some 6809 and 68000
knowledge. If you were new to programming, though, it took quite a conceptual
leap to understand what an index register could do. Then things like the Z80's
alternate registers - what do I do with these, please? Hence the little 6502
was nice and easy: although there were just three registers, you could master
them and do what you needed to do with just that.

To a certain extent the 6502 forced people to build a proper computer rather
than getting the CPU to do things like drive the display (as Sinclair did).
You didn't really have to care about system interrupts for things outside your
code, plus the performance was pretty good compared with systems where a Z80
had to 'do everything'.

Although the Z80 might be a bit hard for a beginner, the 6502 was a bit more
manageable. With concepts learned on the 6502 you could go on to other CPUs
with relative ease - e.g. the 68000 was easy if you knew the 6502.

Going back to the talk: I had it easy - manuals, complete computers you could
buy, even dot-matrix printers, plus the community in those magazines. The
generation before me, the ones who actually made the stuff I picked up, were
on punched cards and reel-to-reel tapes. I cannot imagine how hard it was for
them, or what problem-solving was like given the tools available. However, I
imagine that leap is similar to the leap required by this speaker to
understand the little takeaway here: you could actually program the 6502
without being overwhelmed by the size of the instruction set and, through
doing so, get pretty good at it.

~~~
kjs3
This is completely wrong, and it's obvious the author is the one who is
"cringeworthy". I most certainly "coded straight into a computer" back in
those days. The Commodore PET and Apple's 6502 computers had built-in BASIC in
ROM, and assembler environments were easy to find if you had a floppy drive.
The Apple at least had any number of other languages, like Pascal and Forth.
And the idea that the 68000 was easier if you knew the 6502 is laughable.

~~~
jonjacky
No, he's right. The PET and Apple were second-generation 6502 machines,
intended to be personal computers. The first 6502 products were evaluation
kits like the KIM-1, which had only a keypad and seven-segment LEDs for
entering and viewing code in hex. Even that was quite an advance over the
previous generation, like the Altair, where you had to enter code by flipping
front-panel switches. This page has a picture, explains the process, and shows
some handwritten source code:

[http://blog.jgc.org/2013/04/how-i-coded-
in-1985.html](http://blog.jgc.org/2013/04/how-i-coded-in-1985.html)

(Despite the title, this was more typical of 1976 or so.) It seems awkward,
but it was quite workable for embedded controllers, which was the original
application the microprocessor manufacturers had in mind.

~~~
vidarh
That may be true, but the timeline was short: the 6502 was introduced in 1975,
the KIM-1 arrived in 1976, and the PET in 1977. There was really only about a
year where the 6502 was a realistic option without there being "proper"
computers based on it.

That said, while I never wrote assembly opcodes by hand, I did know the hex
values for most of the 6502 opcodes at some point. And I did later debug M68k
assembler by annotating dot matrix printouts until '92 or '93 or so - I used
to bring them to school with me so I could work on my compiler projects during
recess.

The things you do when you don't have portable computers or network access.

------
paulannesley
Given that it's a 6502 talk at some kind of Ruby on Rails event, here's a 6502
emulator in Ruby I was working on:
[https://github.com/pda/c64.rb/blob/master/c64/instructions.r...](https://github.com/pda/c64.rb/blob/master/c64/instructions.rb)
(I've since moved to a Go version)

------
coupdejarnac
Really enjoyed the talk. The website he mentioned, visual6502.org, has a bunch
of interesting resources. The visual sims are fun to explore.

------
Taniwha
He sort of misses that the 8080 and 6800 were on the market before the 6502 -
it was a latecomer.

The big architectural advance was the two index registers; otherwise it was
really sort of a 6800 (from the point of view of a programmer at the time).

As mentioned above, having a clock that could be generated by mere mortals was
a big advance (previous chips required you to drive what must have been a
completely unbuffered clock tree - rail-to-rail voltages and high peak
current).

The thing he misses about yield is that, for a fixed defect density per wafer,
shrinking the die to 1/4 the area not only quadruples the dies per wafer but
also cuts the yield fallout by almost as much, since each smaller die is far
less likely to land on a defect.
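To put rough numbers on that, here's a simple Poisson yield model - my
illustration, not anything from the talk. Per-die yield is exp(-expected
defects per die), so a quarter-size die wins twice: once on die count and once
on yield:

```python
import math

def die_yield(defects_per_die):
    """Poisson yield model: probability a die contains zero defects."""
    return math.exp(-defects_per_die)

big = die_yield(1.0)     # average 1 defect per big die  -> ~37% yield
small = die_yield(0.25)  # quarter-size die, same wafer  -> ~78% yield

# ~4x the dies per wafer, times roughly double the yield:
print(f'good dies per wafer improve ~{4 * small / big:.1f}x')
```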

~~~
vidarh
It was _explicitly_ a "sort of 6800". Chuck Peddle was on the 6800 design
team, and so was his 6502 co-designer Bill Mensch, as was Rod Orgill, who did
the 6501.

Peddle tried to sell Motorola on making a cost-reduced version of the 6800.
When Motorola refused to let him work on it, he left to do the 65xx.

Their first model, the 6501, was designed to be pin-compatible with the 6800,
and it resulted in a lawsuit from Motorola that almost bankrupted MOS.

