
The Evolution of Bitcoin Hardware [pdf] - Katydid
http://cseweb.ucsd.edu/~mbtaylor/papers/Taylor_Bitcoin_IEEE_Computer_2017.pdf
======
lnsru
Nice history of money-printing machines. A very nice and very easy FPGA
application back then: just two SHA-256 pipes.

Is there something worth pursuing nowadays for FPGAs? Machine learning?

~~~
simias
FPGAs are worth it for anything that's too fast for software and too niche
for an ASIC. In my experience they are very popular for "pro tools": products
too specialized, with too small a volume, to justify creating an ASIC, and
expensive enough that the price of the BoM is not really a concern.
For instance, I work on products that do a lot of video processing on FPGAs
(way too high bandwidth for software, too specialized for an ASIC).

Now if you look at consumer electronics you obviously won't find many FPGAs.
Too expensive, too power hungry, and at large scales ASICs are a better
match.

~~~
abakker
Bernie Meyers of IBM once told me that he believed FPGAs would be the future of
Moore's law. He felt strongly that they provided a performance/flexibility
tradeoff that made them ideal for use in virtually all servers as
coprocessors for certain functions. The fact that FPGA code can be audited
was a plus as well.

~~~
simias
I'm curious about what Intel will do now that they own Altera. Maybe they'll
be able to push FPGA technology into mainstream CPUs.

~~~
lnsru
Not sure about the economics, but a separate PCIe card with an FPGA and its own
storage is not that hard to build. Such a setup could accelerate big-data
operations for sure.

------
tw1010
Amazing what innovations naturally spring out from nothing once the incentives
are in place.

~~~
heavenlyblue
There's absolutely zero innovation in creating new hardware for Bitcoin mining:
the only thing that stops us from having it is economies of scale. Please
stop mistaking one for the other.

~~~
tw1010
I think bias is making you more unfair than you need to be.

~~~
simias
I think he's being fair. The real difficulty in making an ASIC is justifying
the huge up-front cost of production. The good old "the first chip costs $10
million, the second costs $5", or something like that.

Designing a Bitcoin mining IP is not exactly difficult; it's basically two
rounds of SHA-256. It's still some work of course, but as far as ASICs are
concerned it's very low on the difficulty scale.
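For reference, the "two rounds of SHA-256" amounts to double-hashing an 80-byte block header and checking the result against a target. A minimal Python sketch of the idea (the all-zero header prefix and the easy target are made up for illustration, not real Bitcoin parameters):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    # Bitcoin's proof-of-work hash: SHA-256 applied twice.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header_prefix: bytes, target: int, max_nonce: int = 1 << 20):
    # Brute-force nonces until the double hash, read as a little-endian
    # integer (as Bitcoin does), falls below the target.
    for nonce in range(max_nonce):
        header = header_prefix + nonce.to_bytes(4, "little")
        h = int.from_bytes(double_sha256(header), "little")
        if h < target:
            return nonce, h
    return None  # no nonce in range; real miners then vary the rest of the header

# Toy run with a deliberately easy target (roughly 1 in 256 hashes qualifies).
prefix = b"\x00" * 76  # placeholder for version/prev-hash/merkle-root/time/bits
found = mine(prefix, target=1 << 248)
```

An ASIC pipelines exactly this double-SHA-256 datapath and nothing else, which is why the design effort is low relative to a general-purpose chip.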

~~~
mrb
It took much less money than that to develop the first Bitcoin ASICs.

 _~150k USD for 130nm, 200-300k USD for 110nm, and ~500k USD for 65nm, as of
2013_ [http://blog.zorinaq.com/asic-development-costs-are-lower-tha...](http://blog.zorinaq.com/asic-development-costs-are-lower-than-you-think/)

~~~
dragontamer
I doubt that a 65nm chip would be able to beat a GPU (which are all 16nm-class
or better) at the task, however. If you make an ASIC but the $3000 Titan V is
more power efficient anyway, then you've wasted your time and money.

65nm and other "old node" designs are primarily about mass-manufacturing a
design. They can probably beat an FPGA on cost and margins once mass produced.
But for performance and power efficiency you gotta do way better: maybe 28nm-
or 22nm-class or better to beat the GPUs (or even standard CPUs like EPYC with
its built-in SHA-256 accelerators).

Also, you gotta beat your competition. If someone else makes a 10nm-class BTC
ASIC ([https://techcrunch.com/2018/01/31/samsung-confirms-asic-
chip...](https://techcrunch.com/2018/01/31/samsung-confirms-asic-chips/)),
then your 28nm or 22nm design is obsolete.

~~~
mrb
A 65nm or even 130nm Bitcoin ASIC handily beats a 16nm GPU at mining, in terms
of perf per watt and perf per dollar. And by handily I mean one or two orders
of magnitude.

------
k__
How long will it take till I can buy GPUs again?

~~~
dpc_pw
GPUs are no longer used for Bitcoin mining. The recent GPU shortage is caused
by the rise of thousands of little altcoins. It's obviously unsustainable;
it's going to end in a big crash, and all these GPUs will flood the
market. It's hard to predict when that will happen (I'm guessing this
year), but if you're patient you'll be able to buy GPUs very cheaply.

~~~
rdl
I wouldn't consider Ethereum to be "a little altcoin". While there may be a
correction in the market, I don't think it will be "bitcoin vs. everything
else"; at this point it would probably be all cryptocurrencies, or something
more specific (individual coins, or classes of coins, but probably not
"scrypt-based coins as a class").

------
doots
Michael Bedford Taylor misrepresents the most important aspect of these "ASIC
clouds" and of the Bitcoin algorithm: the puzzle is incredibly simple. Every
implementation of a Bitcoin miner simply generates a bunch of random guesses
at the hash equation.

On page 60, graph (a), Professor Taylor uses distorted, manipulative log charts
for the financial data.

The chart on page 61 is the most insanely manipulative chart I could ever
imagine to represent the history and progression of Bitcoin mining in relation
to hardware speed.

Try charting

      watts per block over time
      ROI in BTC for a Core i5 over time
      newly minted coins per user over time

Michael also either doesn't understand "computational demand scaling with the
number of users", or chooses to misinform the reader and the journal about it.

The Bitcoin network grows increasingly inefficient with every unit of hash
power added, and as more users use it, it clogs due to limited bandwidth and
inefficiencies of the algorithm. Additional hardware speed does nothing to
scale with network growth; in fact the complete opposite occurs, as it takes
more (energy, computer resources) to do the same thing (transactions per
second) for less (rewards per user). All of these aspects point toward a
system of rules that exploits new, uninformed users to benefit "early
adopters".

Sad but not surprising to realize this guy is a professor at the University of
Washington.

Is Michael Bedford Taylor promoting something here through intentional
misdirection and omission of basic facts to manipulate readers?

Would Professor Michael Bedford Taylor be under a conflict of interest if he
sells Bitcoins?

~~~
yorwba
Using log charts is appropriate for data that spans multiple orders of
magnitude. I don't know why you're complaining about distortion, unless you
think the data itself is wrong?

~~~
doots
The log chart is used to hide volatility.

~~~
yorwba
I guess it depends on how you're used to eyeballing volatility, but for
relative swings (percentages) log charts are actually best, since constant
vertical distances on the chart correspond to constant percentage changes.
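Concretely: on a log axis, a given percentage move has the same height regardless of price level, which is what you want when eyeballing volatility across orders of magnitude. A quick numeric check (the prices are made up for illustration):

```python
import math

# A +10% move at two very different price levels.
low_before, low_after = 100.0, 110.0
high_before, high_after = 10_000.0, 11_000.0

# On a linear axis the two moves have wildly different heights...
linear_low = low_after - low_before      # 10
linear_high = high_after - high_before   # 1000

# ...but on a log axis both correspond to the same vertical distance,
# log(1.1), so equal relative swings look equally volatile.
log_low = math.log(low_after) - math.log(low_before)
log_high = math.log(high_after) - math.log(high_before)
```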

