Is there something worth pursuing nowadays for FPGAs? Machine learning?
Now if you look at consumer electronics you obviously won't find many FPGAs. Too expensive, too power hungry, and at large volumes ASICs are a better match.
FPGAs don't trend towards openness and you have to get a 100x win from their flexibility to break even on performance.
I have a sneaking suspicion that this is a more general truth and that in many applications where FPGAs are touted as compute accelerators they were actually chosen to minimize the length of the critical real-time data path rather than to do more computation for fewer dollars.
People seem to forget that much of the allure of FPGAs is the 'field programmable' part - you can fix bugs and upgrade hardware after it's shipped.
FPGAs are expensive by themselves, expensive to integrate into your solution, run hot, are relatively slow, and require rare expertise from developers. But they have one feature that beats everything: they can be patched in production.
Designing a Bitcoin mining IP is not exactly difficult; it's basically SHA-256 applied twice. It's still some work of course, but as far as ASICs are concerned it's very low on the difficulty scale.
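For the curious, here's a minimal Python sketch of that core operation: SHA-256 applied twice to the 80-byte block header, with made-up placeholder field values and a toy target (real miners do this in hardware, billions of times per second):

    import hashlib
    import struct

    def double_sha256(data: bytes) -> bytes:
        # Bitcoin's proof-of-work hash: SHA-256 applied twice.
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    # Hypothetical 80-byte block header (all placeholder values).
    header = (
        struct.pack("<I", 2)             # version
        + b"\x00" * 32                   # previous block hash
        + b"\x11" * 32                   # merkle root
        + struct.pack("<I", 1528791600)  # timestamp
        + struct.pack("<I", 0x1d00ffff)  # difficulty bits
        + struct.pack("<I", 0)           # nonce (overwritten below)
    )

    target = 1 << 240  # toy target; the real network target is astronomically smaller

    for nonce in range(2**32):
        candidate = header[:76] + struct.pack("<I", nonce)
        # The digest is compared against the target as a little-endian integer.
        if int.from_bytes(double_sha256(candidate), "little") < target:
            print("found nonce", nonce)
            break

The ASIC's job is essentially to unroll and pipeline that loop as densely as the process node allows.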
The innovation at the time was overturning the conventional wisdom that it's "effectively impossible" to do small custom ASIC runs relatively cheaply. People were laughed at back then when this topic was brought up for other use cases.
While I'm sure none of that was super exciting to someone who works on custom ASIC design for some enterprise, it was pretty neat watching what was effectively a bunch of hackers figure the process out and do it for a tenth of the predicted "first chip" costs. That has a lot of value in and of itself: simply proving something is possible for $a_lot_of_money + skill vs. $epic_truckloads_of_money.
Nowadays it's not super interesting since you're back to needing to be a "big player" to get into the game - but for a year or so it was a real fun time to be a bystander and watch the rapid pace of development.
~150k USD for 130nm, 200-300k USD for 110nm, and ~500k USD for 65nm, as of 2013 http://blog.zorinaq.com/asic-development-costs-are-lower-tha...
65nm and other "old node" designs are primarily about mass-manufacturing a design. They can probably beat an FPGA on cost and margins once mass produced, but for performance and power efficiency you've got to be way better, maybe 28nm- or 22nm-class or beyond, to beat the GPUs (or even standard CPUs like EPYC and the SHA-256 acceleration built into it).
Also, you gotta beat your competition. If someone else makes a 10nm-class BTC ASIC (https://techcrunch.com/2018/01/31/samsung-confirms-asic-chip...), then your 28nm or 22nm design is obsolete.
There's a bit more to this; there are clever optimisations you can do (based on the Merkle–Damgård structure of SHA-256 and the format/semantics of the data being hashed): http://www.mit.edu/~jlrubin/public/pdfs/Asicboost.pdf
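AsicBoost itself is more involved than this, but the simplest optimisation that falls out of the Merkle–Damgård structure is plain midstate reuse: the 80-byte header spans two 64-byte SHA-256 message blocks and the nonce lives in the second one, so the compression of the first block can be computed once and shared across all nonce candidates. A rough Python sketch (placeholder header bytes, not a real block):

    import hashlib
    import struct

    header = bytes(76)  # placeholder: first 76 bytes of an 80-byte block header

    # Hash the first 64-byte message block once; copies of this state
    # (the "midstate") are reused for every nonce tried.
    midstate = hashlib.sha256()
    midstate.update(header[:64])

    def hash_with_nonce(nonce: int) -> bytes:
        inner = midstate.copy()  # reuse the work done on the first block
        inner.update(header[64:76] + struct.pack("<I", nonce))
        return hashlib.sha256(inner.digest()).digest()  # second SHA-256 pass

    # Sanity check: identical to hashing the whole header from scratch.
    nonce = 42
    full = hashlib.sha256(
        hashlib.sha256(header + struct.pack("<I", nonce)).digest()
    ).digest()
    assert hash_with_nonce(nonce) == full

AsicBoost goes further and amortises work across the second block too, which is what the linked paper describes.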
OP is catching flak for the comment, but Filecoin and others are doing cool things exploring the possibilities of using incentives to drive massive resource re-allocations.
On page 60, graph (a), Professor Taylor uses distorted, manipulative log charts for the financial data.
The chart on page 61 is the most insanely manipulative chart I could imagine for representing the history and progression of Bitcoin mining relative to hardware speed.
Better metrics to chart would be: watts per block over time, ROI in BTC for a Core i5 over time, and newly minted coins per user over time.
The Bitcoin network grows increasingly inefficient with any additional hash power added, and as more users use it, it clogs due to limited bandwidth and inefficiencies of the algorithm. Additional hardware speed does nothing to scale with network growth; in fact the complete opposite occurs, as it takes more (energy, computing resources) to do the same thing (transactions per second) for less (rewards per user). All of this points towards a system of rules that exploits new, uninformed users to benefit "early adopters".
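The mechanism behind "more hash power does not buy more throughput" is difficulty retargeting: every 2016 blocks the difficulty is adjusted so blocks keep arriving roughly every 10 minutes, so added hash rate just raises the difficulty (and the energy burned per block) while transactions per second stay flat. A toy back-of-the-envelope sketch of that feedback loop (made-up numbers, ignoring the retarget clamps and other details):

    TARGET_INTERVAL = 600    # seconds per block the protocol aims for
    WORK_PER_UNIT = 2**32    # expected hashes per unit of difficulty

    def retarget(difficulty: float, hashrate: float) -> float:
        # Expected seconds per block at the current difficulty and hash rate.
        block_time = difficulty * WORK_PER_UNIT / hashrate
        # Scale difficulty so blocks come out every ~10 minutes again
        # (the real rule averages over 2016 blocks; the window cancels out here).
        return difficulty * TARGET_INTERVAL / block_time

    difficulty = 1e12
    for hashrate in (1e18, 2e18, 4e18):  # network hash rate doubling twice
        difficulty = retarget(difficulty, hashrate)
        blocks_per_hour = 3600 * hashrate / (difficulty * WORK_PER_UNIT)
        print(f"{hashrate:.0e} H/s -> difficulty {difficulty:.2e}, "
              f"{blocks_per_hour:.1f} blocks/hour")

Run it and the difficulty doubles right along with the hash rate while blocks per hour stays pinned at ~6, i.e. all the extra hardware buys security and heat, not capacity.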
Sad but not surprising to realize this guy is a professor at the University of Washington.
Is Michael Bedford Taylor promoting something here through intentional misdirection and omission of basic facts to manipulate readers?
Would Professor Michael Bedford Taylor be under a conflict of interest if he sells Bitcoins?
This is an academic article; making that assumption would be negligent.