
Reverse-engineering the first FPGA chip, the XC2064 - matt_d
http://www.righto.com/2020/09/reverse-engineering-first-fpga-chip.html
======
jhallenworld
A lot of this era of FPGAs was about how to configure the chip: nobody wanted
to use Xilinx's one-time-programmable serial ROMs.

I used to use some one-shots to generate a CCLK pulse delayed from an RS-232
start bit. The idea is that you would load one bit at a time into the FPGA,
one bit for each RS-232 byte. Once the FPGA was configured, a UART implemented
in FPGA logic would come alive and allow it to communicate with a PC.

Anyway, I used XC2064s in the mid 90s for a project once (they were obsolete
then, but I had a tube of them). I modified a point-of-sale terminal (a little
credit-card machine) so that it could be automated. This was for a methadone
clinic: they had to check each recipient's Social Security number every day to
verify their Medicaid eligibility. So the FPGA would simulate keypresses and
read the vacuum fluorescent display back to the PC. (My consulting fee was
much cheaper than the Medicaid software interface.)

Later I used an XC4010E for a PCI-bus interface for a video capture card.
Again, how do you avoid the serial PROM? In this case, I used timed-out PCI
configuration accesses to load the serial data. Even if the PCI device does
not respond, the address lines of the configuration access make it through to
the target, so you just need a fast flip flop to latch DIN and CCLK. Once the
FPGA was loaded, the card would appear on the PCI bus (but long after
enumeration). This worked until PC BIOS started to turn off the PCI clock on
slots with no cards in them to save power.
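A toy model of that external latch, purely to illustrate the mechanism (the bit position and class shape here are made up, not from the actual design): each timed-out configuration access still drives valid address lines, so the flip-flop can treat the access itself as a CCLK edge and sample one chosen address bit as DIN.

```python
class ConfigShiftRegister:
    """Toy model of the flip-flop trick: every PCI configuration
    access (even one that times out with no device responding)
    presents valid address lines to the target, so we treat each
    access as a CCLK edge and latch one address bit as DIN."""
    DIN_BIT = 2  # hypothetical: which address line carries the data

    def __init__(self):
        self.bits = []

    def config_access(self, address: int) -> None:
        # One access = one clock edge; capture the chosen address bit.
        self.bits.append((address >> self.DIN_BIT) & 1)

sr = ConfigShiftRegister()
for addr in (0b100, 0b000, 0b100):  # drive DIN = 1, 0, 1
    sr.config_access(addr)
```

After enough accesses, the accumulated bits form the configuration stream that gets shifted into the FPGA.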

Both of these were in the pre-Verilog days. To design the FPGA you drew a
schematic into (in my case) OrCAD. Xilinx had a meta-design capability in
those days (called XBLOX, I think?), where one wire in the schematic was
equivalent to an entire bus. The actual width was a parameter on a primitive.
Yes, I made a PCI interface in schematics; it was not fun.

------
ncmncm
The thing I find frustrating about FPGAs is how they are stuck in the "ASIC-
replacement" mind-set.

In principle, these things could change their programming on the fly, as fast
as everything happening in the regular logic, but everybody thinks they have
to be programmed at power-up and left that way.

If the programming bit-stream could be fed, separately, to each section of the
die, under control of logic on other parts of the die, a system of totally
adaptable logic would become possible. Different programming could be
installed according to the needs of the moment, replaced with other logic the
moment needs change.

It would be a project to figure out how to usefully program such a system, but
we are up to it. Our phones take on completely different uses each time we tap
on an app icon, but the performance of apps running on CPUs is strictly
limited.

~~~
tails4e
Active partial reconfiguration is a thing and works pretty much as you say:
some parts of the chip continue running while others are swapped out. An
example use case is video encoding: swap in the encoder engine optimised for a
given standard, rather than building every standard into the same design,
which would require a much larger device.

~~~
ncmncm
Thank you. Which devices support active partial reconfiguration? I have not
encountered any mention of it, before. The "Three Ages of FPGAs" paper in IEEE
has "reconfiguration" in the title of a single reference, and no mention in
the text.

~~~
tails4e
It's recently been renamed dynamic function exchange (marketing, I guess?) but
it's supported in Xilinx 7 series and later:
[https://www.xilinx.com/products/design-tools/vivado/implementation/dynamic-function-exchange.html#deviceSupport](https://www.xilinx.com/products/design-tools/vivado/implementation/dynamic-function-exchange.html#deviceSupport)

~~~
ncmncm
Thank you.

The DFX apparatus all looks distressingly proprietary, not to say clumsy. It
seems like it will be a long time before this capability leaks out to
mainstream use.

After it becomes more accessible, one could hope to compile programmatically
generated logic on the fly, and immediately configure some chip area to
execute it, perhaps with on-demand / JIT swapping between software and
hardware implementation. O Brave New World!

~~~
slrz
Even for moderately complex designs the synthesis times are way too long for
that to be feasible. Unfortunately, a lot of it is inherent to the
problems/algorithms involved. So getting rid of the clumsiness of proprietary
synthesis tools won't be sufficient, you are going to need another
breakthrough.

~~~
ncmncm
It is always easy to invent ways for ambitions to be impractical. But,
surprise, those are not the things people actually do with new capabilities.
Instead they do clever, sensible things that work for them.

------
joezydeco
Recently @tubetime reverse engineered a Snappy Play video capture device. The
mysterious PLAY “HD-1500” at the heart of the device is actually an XC2064.

Follow the thread here:

[https://twitter.com/TubeTimeUS/status/1301990455182155776](https://twitter.com/TubeTimeUS/status/1301990455182155776)

[https://twitter.com/TubeTimeUS/status/1302704667143544833](https://twitter.com/TubeTimeUS/status/1302704667143544833)

~~~
kens
I'm working with @TubeTimeUS to see if I can reverse-engineer the bitstream.

~~~
joezydeco
That’s awesome! Thank you!

------
kens
Author here to answer all your questions :-)

~~~
angel_j
Would it be safe to say that modern FPGAs, although larger, are not more
complex in terms of the bit stream that programs them?

In other words, that a modern chip's bit stream is not more complex, only
longer?

~~~
kens
Some FPGAs such as the Xilinx 7 series support bitstream encryption, so that's
a new layer of complexity.

Modern FPGAs also have much more in each CLB, and have multiple types of CLBs,
so the bit stream is more complex in that way. See e.g.
[https://www.xilinx.com/support/documentation/user_guides/ug384.pdf](https://www.xilinx.com/support/documentation/user_guides/ug384.pdf)

I don't know a whole lot about modern FPGA bitstreams, so I'd be interested if
anyone has more details. Is the bitstream still essentially a pile of raw bits
controlling things, or is there more structure inside?

~~~
anfilt
The bitstream is probably decrypted as it enters the chip. After that, at best
you might have some simple XORing at each CLB to make life harder for someone
probing with needles.
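To illustrate what that kind of per-frame XOR whitening could look like (this is purely a sketch of the idea; real devices' internals are undocumented and certainly differ):

```python
def xor_whiten(frame: bytes, key: bytes) -> bytes:
    """Illustrative only: XOR a configuration frame with a repeating
    key before it fans out to the logic, so bits probed on internal
    lines don't directly match the decrypted bitstream.  Because XOR
    is its own inverse, the same function unmasks the data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(frame))

masked = xor_whiten(b"\x12\x34", b"\xFF")
original = xor_whiten(masked, b"\xFF")  # round-trips to the input
```

This buys obfuscation, not security: anyone who recovers the keystream once can unmask every frame.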

Also, keeping each data line off the top or bottom layer makes non-destructive
access harder. At that point you're going to need careful application of acid
or focused ion beam ablation.

~~~
trollied
The encryption has been broken. You can read about it here
[https://news.ycombinator.com/item?id=22915831](https://news.ycombinator.com/item?id=22915831)

~~~
anfilt
I wasn't saying anything about that, but if you look at that paper or video
you can see the bitstream is passed through an AES decryption stage before
programming the logic fabric. This break found a way to read the data after
that AES stage.

However, I was just explaining how encryption generally works for these
things, whether or not breaks exist for some chips.

------
jecel
I am pretty sure that I paid something like $300 for the XACT software in
1986. It would be really educational to have students play around with it
today. You could edit the bits in the LUTs directly and see the equations
change in the bottom pane, or you could edit the equations and see the LUT
bits change.
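That LUT-bits-to-equation view can be sketched as follows. This is a generic 2-input example of the underlying idea, not XACT's actual format or syntax:

```python
def lut_to_sop(init_bits, names):
    """Turn a LUT's truth-table bits into a sum-of-products equation,
    the way a tool like XACT could display it.  init_bits[i] is the
    LUT output for the input combination whose binary value is i;
    names are the input signal names, LSB first."""
    terms = []
    for i, bit in enumerate(init_bits):
        if bit:
            lits = []
            for j, name in enumerate(names):
                # bit j of index i gives the value of input j
                lits.append(name if (i >> j) & 1 else "~" + name)
            terms.append("*".join(lits))
    return " + ".join(terms) or "0"

eq = lut_to_sop([0, 0, 0, 1], ["A", "B"])  # only A=1,B=1 set -> "A*B"
```

Editing a single bit in `init_bits` changes the equation, which is exactly the bidirectional LUT-bits/equation relationship described above.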

One fun thing was the mode where as you routed a signal it would show the
delay so far, so you could backtrack and try a different path to see if it
would be faster.

The 2064 had 8x8 = 64 configurable blocks, but many people would assume it was
equivalent to a 64-NAND-gate gate array. So for the next chip they decided it
would be better to use "equivalent gates" in the name, so the 2018 would be
able to replace an 1800-gate gate array.

------
myself248
Since the bitstream maps directly to the connections, and wrong connections
can create short circuits, I wonder how often a corrupted bitstream actually
destroys a chip?

~~~
ThrowawayR2
Not an expert but from what I recall reading, it used to be possible on the
early ones; people would put a finger on the FPGA when testing a new
configuration and immediately cut power to the board if it started heating up
abnormally. In modern FPGAs, there's thermal protection built in.

[EDIT] See also:
[https://electronics.stackexchange.com/questions/445323/can-you-actually-break-an-fpga-by-programming-it-wrong](https://electronics.stackexchange.com/questions/445323/can-you-actually-break-an-fpga-by-programming-it-wrong)

~~~
mng2
Only the most modern FPGAs have thermal protection, but before that they added
a CRC check to the bitstream. This leaves open the question of malicious
bitstreams...
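The integrity check works like an ordinary frame CRC. A minimal sketch using CRC-32 (actual FPGA bitstream CRCs use their own polynomials and per-frame framing; this only shows the mechanism):

```python
import zlib

def append_crc(bitstream: bytes) -> bytes:
    """Append a CRC-32 so the configuration loader can reject a
    corrupted bitstream before it ever reaches the fabric."""
    return bitstream + zlib.crc32(bitstream).to_bytes(4, "big")

def check_crc(data: bytes) -> bool:
    """Recompute the CRC over the body and compare to the trailer."""
    body, crc = data[:-4], int.from_bytes(data[-4:], "big")
    return zlib.crc32(body) == crc

blob = append_crc(b"\x55\x99\xaa\x66")        # sync-word-like payload
ok = check_crc(blob)                           # intact -> True
corrupted = bytes([blob[0] ^ 1]) + blob[1:]
bad = check_crc(corrupted)                     # bit flip -> False
```

As the comment notes, this only catches accidental corruption: a malicious bitstream can simply carry a valid CRC, which is why later devices added authentication on top.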

------
egsmi
Thanks for another cool article! This one has a lot of meat so it will take
some time to go through it.

One question, how useful are the patents linked to in the footnotes for your
work? (I hesitate to even ask this in fear of stirring yet another patent
flame war.)

P.S. Do you have a patreon, or equivalent, page? If not, you really need one.
I would happily subscribe.

~~~
kens
I would rate those patents as moderately helpful. They go into a lot of
detail, but not exactly the details I wanted :-)

For the broader patent question, there is a huge range in patent usefulness.
Some patents explain everything clearly and at the end I'm like "Oh, now I
understand the problem in detail and how they solved it." Most patents,
however, are lawyer-edited mush where I learn nothing and it's not clear what
they are even describing.

Some Texas Instruments patents are super-detailed, to the point that I could
build a calculator simulator from the schematics and source code. The Intel
8086 patents were also extremely helpful (although they covered only half of
what I wanted to know).

My opinion on patents is that the examination process should be much different
and force patents to clearly describe the specific problem and solution,
giving useful background information, more like a conference paper. If a
patent doesn't contribute anything useful to the reader, it should be
rejected.

As for a Patreon, I don't have one. But CuriousMarc (who I worked on the AGC
and other projects with) has one at
[https://www.patreon.com/curiousmarc](https://www.patreon.com/curiousmarc)

