
I have designed ASICs and FPGAs for nearly 30 years, and have seen the evolution of this technology first hand. To say that FPGAs have the wrong abstraction is to not understand what an FPGA is and what it is intended to accomplish.

Transistors are abstracted into logic gates. Logic gates are abstracted into higher-order digital functions like flip-flops, muxes, etc. It is the mapping of algorithms/functions onto gates that is the essence of digital design. This is difficult work that would be impossible at today's scales (5 billion+ transistors) without synthesis tools and HDLs. And, given that an ASIC mask set costs $1MM+ at a modern geometry, it needs to be done right the first time (or at least the 2nd). Furthermore, the mapping to gates needs to be efficient: throwing more gates at a problem increases area, heat, and power, all of which need to be minimized in most contexts.
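For a concrete sense of what that mapping looks like from the designer's side, here is a minimal, hedged Verilog sketch (module and signal names are mine, not from the article): the HDL states the intended behavior, and the synthesis tool decides which gates, or on an FPGA which LUTs and flip-flops, implement it.

    // Minimal RTL sketch: a 2-to-1 mux with a registered output.
    // The designer states intent; synthesis maps the ?: operator onto
    // gates (or a LUT) and the always block onto a flip-flop.
    module mux2_reg (
        input  wire clk,
        input  wire sel,
        input  wire a,
        input  wire b,
        output reg  y
    );
        always @(posedge clk)
            y <= sel ? a : b;
    endmodule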

My first job out of college was designing 386 motherboards. Back then we were still using discrete 74xx ICs for most digital functions. The boards were huge. PLDs allowed better integration and were cost effective, since a single device could implement many different functions, reducing board area and power consumption. CPLDs moved this further along.

FPGAs grew out of PLDs/CPLDs and allowed a significantly higher level of integration and board-area reduction. They offered a way to reduce the cost of a system without requiring the investment and expertise an ASIC demands. But FPGAs are themselves ASICs, implemented with the same technology as any other ASIC. So FPGAs are a compromise; the LUTs, routing, etc. are all a mechanism to make a programmable ASIC. Compared to an ASIC, however, an FPGA requires more power and can implement less capability for a given die size. But it allows a faster and lower-cost development cycle. To bring this back around, the LUTs and routing mechanisms are functions that have been mapped to gates. To use an FPGA, algorithms still need to be mapped onto the LUTs, and this is largely the same process as mapping to gates.
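To make the LUT point concrete, here is a hedged Verilog illustration (the LUT4 primitive is Xilinx-style and needs the vendor's library; the INIT value is my own worked example): a k-input LUT is just a small truth-table memory, and mapping logic onto the fabric means filling in those tables.

    // Two ways to express the same 4-input AND. The behavioral assign is what
    // you would normally write; the mapper packs it into one 4-input LUT.
    // The explicit LUT4 instance shows the truth table the tool infers:
    // only entry 15 (a=b=c=d=1) is 1, hence INIT = 16'h8000.
    module and4_demo (
        input  wire a, b, c, d,
        output wire y_rtl,
        output wire y_lut
    );
        assign y_rtl = a & b & c & d;

        LUT4 #(.INIT(16'h8000)) u_lut (
            .O (y_lut),
            .I0(a), .I1(b), .I2(c), .I3(d)
        );
    endmodule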

This article was pointless; even the author acknowledges: "I don’t know what abstraction should replace RTL for computational FPGAs." And, "Practically, replacing Verilog may be impossible as long as the FPGA vendors keep their lower-level abstractions secret and their sub-RTL toolchains proprietary." As I have argued above, knowing the FPGA vendors' lower-level abstractions won't make the problem any better. The hard work is mapping onto gates/LUTs. And that analogy is wrong: "GPU : GPGPU :: FPGA : " An FPGA is the most general-purpose hardware available.

The best FPGA/ASIC abstraction we have today is a CPU/GPU.




> As I have argued above, knowing the FPGA vendors' lower-level abstractions won't make the problem any better.

Where did you argue that? Why is it reasonable to expect that proprietary synthesis tools are going to be better than open-source ones? That definitely was not the case, long term, with the proprietary C compilers of yesteryear. LLVM is the future, and mostly because of LLVM-IR. If ASTs can be optimized well in a general intermediate format, why shouldn't digital logic circuits be similar? Yes, actually mapping this to xlnx (etc.) primitives is going to be different for each vendor... in the same sense that mapping LLVM-IR to aarch64 and amd64 is going to be different. So what? That doesn't mean that all is lost.

> The hard work is mapping onto gates/LUTs.

I think it's reasonable to expect that things like FIRRTL have the potential to outperform the synthesis tools that exist currently. The closer the representation gets to a pure graph theory problem, the better chance we have at reasoning about it.

The author makes a good point about Verilog being the current interface. Look at how FIRRTL has to be transpiled back to Verilog to be piped into synthesis tools. That's madness: it's very opaque, and a lot of information is lost that we just have to trust the tools to recover. Verilog is a lossy format; that's the takeaway from this article for me, and you haven't addressed that point at all.
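As a hedged illustration of that lossiness (this is only the general shape of Chisel/FIRRTL-generated Verilog, not the output of any particular tool version): a typed bundle and a register with an initial value come out the other side as flattened, underscore-joined ports and an explicit reset mux, and the downstream synthesis tool has to rediscover whatever structure it can.

    // Roughly what a FIRRTL-to-Verilog pass emits for a small accumulator
    // whose ports were originally a typed "io" bundle. The bundle is flattened
    // into underscore-joined port names, and the register's init value becomes
    // an explicit reset branch. Sketch only; real tool output differs in detail.
    module Acc (
        input        clock,
        input        reset,
        input  [7:0] io_in,
        output [7:0] io_sum
    );
        reg [7:0] sum;
        assign io_sum = sum;

        always @(posedge clock) begin
            if (reset)
                sum <= 8'h0;
            else
                sum <= sum + io_in;
        end
    endmodule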


I argued it in the second quote: "The hard work is mapping onto gates/LUTs." It doesn't matter whether it's gates or FPGA LUTs, the work is the same.

I never argued anything regarding proprietary vs. open-source tools. I love open-source tools, and projects such as FIRRTL, SymbiFlow, Yosys, Chisel, and Clash have amazing potential. Having access to the FPGA vendors' low-level abstractions enables the broader use and development of these tools, which is important. My point was only that gates/LUTs are the fundamental building blocks of all digital computing. They are not easily abstractable, and to say that FPGAs have the wrong abstraction is not the best way to look at the problem. FPGAs aren't going to fundamentally change; they aren't going to evolve from FPGA into some "GPFPGA" (to answer the author's analogy). But tools can always be improved to make FPGA design more accessible.


> Why is it reasonable to expect that proprietary synthesis tools are going to be better than open-source ones? That definitely was not the case, long term, with the proprietary C compilers of yesteryear. LLVM is the future, and mostly because of LLVM-IR.

Lol, citation needed. Last I checked, the performance-oriented commercial closed-source C/C++ compilers still outperform Clang and LLVM. And so does GCC in most cases, for that matter.


Clue: GCC is Free Software too.

New languages typically are not implemented with GCC, though. So, long term, LLVM probably wins.


I don’t follow.

Though GCC still trumps Clang in some areas, neither GCC nor LLVM beats commercial compilers like ICC across a wide range of workloads... obviously there is still a market for ICC and AOCC.

The original point was that open-sourcing would necessarily lead to better-performing tooling. And to that I again maintain... citation needed.


As an autodidact with a focus on analogue circuits and experience with microcontrollers, the thing that always threw me off about FPGAs is that there was never a real oh-don't-mind-me-I'm-just-looking kind of FPGA environment.

When I started with MCUs I started with an Arduino. The thing it did for me was give me a feeling for when to use a microcontroller and when to use something else entirely.

Of course the level of control I had with an Arduino was far from optimal, but it worked out of the box and guided me into the subject (a bit like a children's bicycle: neither fast nor special, but it helps the learner avoid pain and frustration).

I wish I had had this kind of thing in an affordable FPGA form: simple enough to get me hooked, with examples, good sane defaults, etc.

This is what mainstream means: idiots like me who didn’t get a formal education on the subject but want to try things out.


Here you go:

Cheap FPGA boards for educational purposes: https://store.digilentinc.com/fpga-for-beginners/

The software is free: https://www.xilinx.com/products/design-tools/ise-design-suit...

The hard part is the several semesters' worth of textbooks to work through, covering digital logic (try Mano's "Digital Design: With an Introduction to the Verilog HDL" to start with) up through computer architecture, in order to know what to do with the board.
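To give a sense of where those textbooks lead, the first project on almost any of these boards is an LED blinker, a few lines of Verilog like the sketch below (the counter width is an assumption; the actual clock frequency and pin mapping depend on the board and its constraints file).

    // Classic first FPGA design: divide the board clock down and blink an LED.
    // With a clock in the tens of MHz, bit 24 of the counter toggles on the
    // order of once per second. Pin assignments live in a constraints file.
    module blinky (
        input  wire clk,
        output wire led
    );
        reg [24:0] counter = 25'd0;

        always @(posedge clk)
            counter <= counter + 1'b1;

        assign led = counter[24];
    endmodule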


A much, much larger and better $113 FPGA development board, with a free 1-year license: https://www.microsemi.com/existing-parts/parts/139680 You never pay the $159 list price. Besides, the FPGA alone would cost you >$350 retail. It arrives with a RISC-V core preprogrammed and 'hello world'-type demos for WiFi and USB.


A really exotic and complex board to start with. No community support (compare with Digilent or Terasic), and I would guess no examples either. I attended a seminar about RISC-V, and the Microsemi presentation was really weak on this topic. I do not recommend this for getting started; it is only for experienced users looking for pain. A cheap and affordable board is, for example, the MAX1000 from Arrow.


Examples: RISC-V softcore, ADC, WiFi, 12.5 Gbps SerDes(!), tic-tac-toe, console echo. https://github.com/Future-Electronics-Design-Center/Avalanch... https://github.com/RISCV-on-Microsemi-FPGA/PolarFire-Eval-Ki...

Community support is indeed just beginning, but the RISC-V community and the HiFive1/SiFive community support these PolarFire FPGAs. As many as 50 RISC-V softcores fit on this FPGA.


After the first year, the software costs $995 per year ("The board includes a 1 Year Libero Design Software Gold License worth $995!"). Not really a good deal for a beginner.


You buy a new board every year for $113 to extend the license. My point remains: it's by far the most powerful FPGA you can get for $113, and it is good for beginners.

Just a happy user, not affiliated with Microchip/Microsemi


There are also some boards listed here: https://symbiflow.github.io/#boards

If you want something even cheaper, look for some iCE40 boards, like the UP5K MDP or the TinyFPGA.

A few interesting things to try out: https://www.cl.cam.ac.uk/teaching/1112/ECAD+Arch/background/... https://www.cl.cam.ac.uk/teaching/1112/ECAD+Arch/files/Thack...

An intro to Verilog: http://zipcpu.com/tutorial/
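You don't even need a board to start poking at Verilog: a free simulator (Icarus Verilog, Verilator, etc.) plus a tiny testbench gives you exactly the oh-don't-mind-me-I'm-just-looking experience asked for upthread. A hedged sketch (names and timing are mine):

    // Minimal testbench: drive a counter for a few cycles in a simulator
    // (e.g. iverilog tb.v && vvp a.out) and watch the output. No hardware needed.
    `timescale 1ns/1ps
    module tb;
        reg clk = 0;
        reg [7:0] count = 0;

        always #5 clk = ~clk;          // 10 ns period clock, simulation only

        always @(posedge clk)
            count <= count + 1'b1;

        initial begin
            $monitor("t=%0t count=%0d", $time, count);
            #100 $finish;              // stop after ten clock cycles
        end
    endmodule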


Leaving aside the fact that open-source toolchains exist for various Lattice FPGAs (the iCE40 and ECP5, etc.):

About the simplest environment for doing what you just described is http://papilio.cc/

While using the existing Xilinx WebPack tools for the actual synthesis, place and route, etc., the Papilio Design IDE will LITERALLY let you add peripherals to a virtual Arduino like appendages! It takes advantage of a number of community projects like the Wishbone bus, and achieves a nearly drag-and-drop level of visual design.

Once you have loaded your custom Arduino chip onto the Papilio board's FPGA, you can program it with a modified version of the Arduino IDE!

One of the virtual chips you can start with IS the Arduino ATmega328! Another is the ZPUino, an implementation of the Zylin ZPU (a 32-bit MCU) by Alvie Boy, which lets you program this much more powerful device ALSO with the Arduino IDE!


There are a bunch of FPGA project breakout boards on Crowd Supply, at generally low prices, cheap enough to try one at a time until you find one that clicks for you.


I highly recommend PYNQ. It comes with a Jupyter interface through an ethernet port that lets you interactively program and execute. Of course, actually designing the RTL overlay is the hard part, and you'll need to get comfortable with Vivado to do it properly.

http://www.pynq.io/


Yes! Part of what an abstraction/API should do is map the difficulties of the actual physical/electrical problem into the language. Arguing that the language makes things too hard is then missing the point... generically mapping gates and routes and clocks is hard, and that hardness has been exposed to you as LUTs and flip-flops and wires and clocks, in something that looks like a programming language, with a compiler that still sometimes fails to make implementations fit into the FPGA structure.

That some difficulties are hidden doesn't mean they are easy. Unless you have at least a proposed solution (language, compiler, architecture, ASIC) that lets people solve problems similar to those the FPGA toolchain solves, it's just complaining.


Question: What is the simplest publicly available/open design for a 386 motherboard? (74xx ICs are preferable, no matter how large they would make the entire board... I envision myself building an ancient 386 motherboard in the future, to teach myself aspects of computer engineering...)


Decade of FPGA work here; thought the same thing.

Fundamental misunderstanding of FPGAs; presents no alternatives. A zero-worth article.




