Free Range VHDL – VHDL programming book available for free (freerangefactory.org)
101 points by EvgeniyZh on Jan 29, 2017 | 52 comments



I learned VHDL a few months ago. Since then I've built a few projects with it (some PCIe and Ethernet stuff). I was very excited to learn an HDL because FPGAs seem like extremely powerful devices.

However, I now know why FPGAs aren't as popular as they could be: the tools are horrible. There is no way FPGAs will become popular among software developers unless the tooling improves dramatically. You can really tell that it was all built by hardware engineers who know their electronics very well but not good software design.


> There is no way FPGAs will become popular among software developers unless the tooling improves dramatically.

Care to expound on that? I don't disagree with your opinion completely, but I suspect our reasons differ.

IMHO, the barrier isn't so much the tools, but a clashing of paradigms. All too often, traditional software developers without a solid foundational background in digital hardware architectures think they can easily pick up an HDL as if it were yet another programming language. The HD part is quickly forgotten as behavioral processes are hacked away using all too familiar constructs like if-then-else, for/while loops, etc. Inference warnings are ignored, timing constraints are shrugged off, metastability and synchronization aren't even a thing, let alone driver/receiver selection, pinmap planning, and signal integrity considerations at the PCB level...set to default and things should just work, right?
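To make the metastability point concrete for the software folks: a single bit crossing between unrelated clock domains needs something like a two-flop synchronizer, which has no software analogue. A rough VHDL sketch (the entity and signal names are mine, not from any particular design):

  library ieee;
  use ieee.std_logic_1164.all;

  entity sync_2ff is
    port (
      clk      : in  std_logic;
      async_in : in  std_logic;  -- from a foreign clock domain
      sync_out : out std_logic   -- safe to use in clk's domain
    );
  end entity;

  architecture rtl of sync_2ff is
    signal meta, stable : std_logic := '0';
  begin
    process (clk)
    begin
      if rising_edge(clk) then
        meta   <= async_in;  -- first flop may go metastable
        stable <= meta;      -- second flop gets a cycle to settle
      end if;
    end process;
    sync_out <= stable;
  end architecture;

If you've never had to reason about why the first flop is allowed to misbehave, that's the paradigm gap in a nutshell.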

Anecdotally speaking, the only thing in common that HDLs have with traditional programming languages is a superficial charset. I think every software developer who has ever used tools like VS, Eclipse, Emacs, Vim, Make, Doxygen, Git, etc. would agree that each has its quirks and it takes a bit of use before a comfortable flow is discovered. FPGA vendor tools are no different, except--testbed simulations aside--there's that immutably distinct custom hardware integration end-game that most traditional software has the pleasure of conveniently abstracting away.

Any kernel/driver developers with HDL proficiency care to chime in?

P.S. ground-up PCIe/Ethernet FSM+datapath architectures after a few months without prior HDL experience is really impressive.


I went to CMU for a BS in EE, where I used a lot of Verilog and also a bit of Cadence for layout in one class. But I've worked as a software engineer since then.

I totally agree that HDL tools are astonishingly crap. The market is just so much more specialized, expensive, locked down and closed.

Imagine that you had to use the Intel compiler to get source code compiled for Intel processors. You had to use their special headers, you practically had to use their huge 8 GB Intel IDE with a million lists and buttons, and you had to choose which features were on your Intel processor (VT-x? SSE4?). Imagine AMD is the same, but separate. This is all closed source stuff, of course. And you have to use it to run on any smartphone/laptop/desktop/server class processor.

Imagine that there's no GCC, no CLANG, no GDB, no Jetbrains, no Eclipse. All the related tools we use wouldn't have much of a reason to exist because just about everyone had to use the huge IDEs for each vendor anyway.

The vendor stuff is huge and inefficient and crap, because features sell, and crap still sells when there's no meaningful competition.

To be more specific, "crap" means high bloat and low reliability: crashes, inexplicable errors and failures. But you can make a tweak, try again, and move on, so any sane hardware person just gets used to it; it's not like they can do anything about it.


Apologies but I really can't tell if your point is against the proprietary nature of reconfigurable devices, or vendor tool bloat.

> Imagine that there's no GCC, no CLANG, no GDB, no Jetbrains, no Eclipse.

This is quite easy to imagine when you remove the key element which allows these L7 abstractions to be meaningful: an underlying kernel with a well-defined interface. What equivalent does the reconfigurable world have when every target device requires its own unique "kernel", if you will? I can't think of any...which would explain why 3rd-party tools are constrained to synthesis, while place-and-route is an explicit function of the vendor tool. Perhaps we too easily conflate the size of the tool with the size of the target?

On the flip side, supposing there were some open reconfigurable interface standard, I don't think it would fly in the current market given the high-performance nature of these devices. The top two FPGA vendors apparently change their slice/CLB structure every generation; they're hardly going to agree on an open fabric interface standard.

> Crashes, inexplicable errors and failures. Huge amounts of bloat.

Putting the whole kit and caboodle aside and focusing on just synthesis, isn't it strange that even the big, specialized 3rd party vendor tools (e.g. Cadence, Mentor Graphics, Synopsys) suffer just as much? I think it's a genuinely difficult problem given the multi-disciplinary nature of the things EDA engineers have to deal with. As much as I dislike dealing with flaky tools, I'm nevertheless humbled by their challenges.


You've pretty much hit the nail on the head. What people outside of the EDA field fail to understand is that they are coding at a much higher level. RTL does not model all the aspects of the logic. You have different signal delays due to lots of different reasons: fanout, wire lengths, etc. As you go through the different stages of synthesis via these tools, a lot of decisions are made, governed by the constraints files. Each decision affects the performance of the design. It's slow because it's doing a lot of work to optimize speed or area. Some designs won't fit the constraints. Then it reports errors.

The tools are trying to help. The end product is the physical device. The various models are all just abstractions of the physical device. The tools are reporting the problems on the abstractions to assist you to improve the physical device. If you can understand the reports, you can improve things, either altering the RTL or adding more constraints.

The point being that the only time RTL is actually run like a software program is during simulation. This simulation is only an approximation of how the actual thing will work. It is not like SW. The tools do a lot of other things with that RTL. Maybe if people don't throw garbage in, it won't crap itself trying to figure it all out.
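To make "altering the RTL" concrete: a classic response to a failing timing path is breaking a long combinational chain with a pipeline register. A sketch (signal names invented; assumes a, b, prod_reg and acc are declared with numeric types elsewhere):

  -- Before: acc <= acc + (a * b); -- one long combinational path
  -- After: registering the product splits the path in two, each
  -- easier to place and route, at the cost of a cycle of latency.
  process (clk)
  begin
    if rising_edge(clk) then
      prod_reg <= a * b;
      acc      <= acc + prod_reg;
    end if;
  end process;

The tool reports tell you which paths fail; this kind of restructuring is how you act on them.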


> The vendor stuff is huge and inefficient and crap

Thankfully, we now have a pretty decent open source alternative:

http://www.clifford.at/icestorm

and you are welcome to contribute!


Interesting project, but it hardly qualifies as "a pretty decent open source alternative" considering it targets a single family of FPGAs from a vendor with comparatively insignificant market share.

P.S. The only dabbling I've had with Lattice devices was designing a custom tool to configure the legacy (pre-JTAG) ispLSI family of CPLDs using our tech, not theirs. The goal was to eliminate their obsolete tool chain (ispLever Classic and a piece of shit USB dongle) from the configuration management loop for existing stable applications in long-term lifecycle sustainment mode.


> Imagine that there's no GCC, no CLANG, no GDB, no Jetbrains, no Eclipse. All the related tools we use wouldn't have much of a reason to exist because just about everyone had to use the huge IDEs for each vendor anyway.

I don't need to imagine, none of them existed when I started programming.


> Crashes, inexplicable errors and failures.

I bought a Xilinx Spartan dev board 10 years ago and it was just like this. ISE was 4 GB of Java bloatware, plus 2 GB (!) more of "updates"; it crashed frequently, barely ran, and imploded under its own weight. I managed to make some lights go blinky and said "eff this".

It's sad to hear that 10 years later nothing has changed.


When I started the learning process I did expect to have a hard time. I expected VHDL to be very different from software. I expected all kinds of problems related to the "clashing of paradigms". I have encountered those problems. But at the same time I've been fighting with complex config files, unhelpful warnings, horribly long debugging sessions... I feel like the tools could be better. They could have helped me instead of being in my way. I now know how to do all of it, but it took me a LOT of time to learn.

And yes, maybe my goals were too ambitious. Ethernet and PCIe have proven to be hard to master.


I always emphasize that it is hardware design, not hardware programming.

You describe a physical and electrical reality at register transfer level when writing implementation code in Verilog and VHDL to be used in FPGAs and ASICs.

I've seen quite a few implementations (esp in VHDL) by people not understanding that what they write will end up as hardware.

As an example, if your design contains numerous wide variables written to and read by several instances, that means there will be buses going back and forth between your logic gates. There are only a few levels of wires crossing each other (that can either be built in an ASIC or are available in an FPGA device). When that limit has been reached, you will have to route around.

Think HW (at fairly ideal RTL) and write code to efficiently describe that in a way that _all_ tools in your chain can understand and parse correctly.
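A toy illustration of the wide-variable point (top and consumer are made-up names): one 128-bit signal read by three instances means three 128-bit buses the router has to find tracks for:

  architecture rtl of top is
    signal wide_bus : std_logic_vector(127 downto 0);
  begin
    -- Every instance that reads wide_bus drags its own physical
    -- 128-bit bus across the fabric to wherever it is placed.
    u0 : entity work.consumer port map (clk => clk, din => wide_bus);
    u1 : entity work.consumer port map (clk => clk, din => wide_bus);
    u2 : entity work.consumer port map (clk => clk, din => wide_bus);
  end architecture;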


I have only briefly touched FPGAs in college, but to me writing Verilog/VHDL is akin to writing C for a bare-metal MCU. In both cases it is not too difficult to write correct code that simply does not work, or, more appropriately, makes no sense, on the given hardware.

As for the tooling, to me it looked too low-level, with too many knobs for a beginner without significant domain knowledge to grasp. Maybe that or [Footnote] was what intimidated GP. I understand the reasons why high-level building blocks in high-level programming are easy-to-use, self-contained modules while high-level modules in HDL have that many twists and knobs to twiddle with.

I guess software developers are so used to taking prebuilt modules and something-something-fudge-fudge-rinse-repeat-until-works'ing them together (not meant as an insult in any way; I am myself guilty of that) that we sometimes forget that `x if y else z` actually moves the magic smoke and pixies in ICs. So I think I can sum up your comment on traditional developer sloppiness and GP's comment on the atrociousness of tooling in one sentence: too often we forget that HDL programs must be formally correct.

[Footnote]: I remember some fragments of my fights with FPGAs when things worked intermittently because my signals were not stable (race conditions, dear past me) or whatever I had written did not even make sense on the given hardware (probably no hardware would support such weird trigger conditions, but there I was, staring at the screen with blank eyes and only beginning to seriously consider actually thinking), yet the tooling happily synthesised my code.
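A classic instance of code that synthesises happily but makes no sense as hardware is the incomplete conditional in a combinational process. A sketch (en, d, q are invented names):

  -- q must hold its old value whenever en = '0', so synthesis
  -- infers a transparent latch instead of the mux the author
  -- probably intended, often with nothing more than a warning.
  process (en, d)
  begin
    if en = '1' then
      q <= d;
    end if;  -- missing else branch
  end process;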


> IMHO, the barrier isn't so much the tools, but a clashing of paradigms.

I guess another issue is that most EE guys are happily using GUI tools and don't have any ideological problem with commercial tooling, as what matters is the final result.


I think you can treat 60-70% of the FPGA design flow as open source. For example, I am developing a system using PCIe, 10GBASE-T and some logic to send and receive network packets, and to design the HDL and test it I am using two open source tools predominantly (Icarus Verilog and cocotb). I just use the FPGA P&R tools for building the design once I am satisfied it works. You can also run these tools on the command line quite easily and automate most of the process (they all use Tcl for scripting up the flow).

Sure, there are a few FPGA-specific interfaces you have to deal with (transceivers, DDR4, PCIe hard IP), but you can pretty much treat these as black boxes and write your tests to target the interfaces in and out of the logic. Also, for things like transceivers, the interface is really not that different between Xilinx and Altera (I treat them as a black box that generates 32 bits every 322 MHz cycle for 10GBASE-T).

The flow to my mind is not that dissimilar to a traditional software development flow. I have simulation tests and test cases, I use continuous integration to run tests every time something is committed, every time I build the FPGA with the P&R tools I kick off hardware tests automatically, etc.
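The simulation half of such a flow is easy to automate because a self-checking testbench can fail the build on its own. In VHDL terms (the poster uses Verilog tools, but the idea carries over; clk, din and dout are invented names), a sketch:

  -- Minimal self-checking testbench process: drive a stimulus,
  -- wait, check the result. A failed assert with severity failure
  -- makes simulators like GHDL exit non-zero, which is exactly
  -- what a CI job needs.
  stimulus : process
  begin
    din <= x"DEADBEEF";
    wait until rising_edge(clk);
    wait until rising_edge(clk);
    assert dout = x"DEADBEEF"
      report "loopback mismatch" severity failure;
    report "all tests passed" severity note;
    wait;  -- suspend forever so the simulation can end
  end process;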


Agreed on the closing note, but I'm not certain that's the case. I mean a good tool is a good tool, regardless of foundational discipline.

It comes down to choice...or rather lack thereof. EDA tools are very expensive and tend to have a steep learning curve. Professional divisions tend to standardize to keep costs in check. Here's an example of one that's "affordable"[1].

[1] http://www.digikey.com/products/en/development-boards-kits-p...


I feel the need to echo metaphor's reply. Even if tools for FPGAs do become better through the years, it doesn't mean they'll become "sw dev friendly".

If you do software you should be fluent with data structures, while on the FPGA/hardware side you "speak" metastability control and timing constraints. Just separate fields of interest.

When high-level synthesis tools arise (such as Xilinx HLx, C -> straight to FPGA implementation), I still feel they are aiming at the wrong audience: these tools still require a more-than-average knowledge of the FPGA fabric and implementation to work with. That may be a good thing, but it defeats the HLS paradigm at its roots IMHO.

Source: humble experience through my job as an FPGA designer


Cool! I'm going to attempt to learn VHDL, as I just picked up a Zybo board to learn with.

I'm just wondering what you used for PCIe, out of interest? A while ago I picked up an Igloo2 board, but found their software impossible to use under Linux, alas.


I've been working with a NetFPGA 10G board. It's quite cheap for academic research, but not for hobbyists. It's also rather old now. The project comes with open source HDL, but I've had immense trouble getting it to work. You need a special license from Xilinx just to get started (Ethernet core). I wish I had chosen something else. I tried a Lattice ECP5 Versa board, but you need to sign an NDA just to get the Ethernet documentation, which of course is impossible.


There are many projects started some 10 years ago with Cyclone III FPGAs, and vendors do not want to scare away the folks who have been working with the same Quartus since then.

I wish I could simply synthesize my algorithms from C code instead of writing that VHDL or Verilog code. SystemVerilog is a bit better, but the company does not allow using it.


Cyclone III was last supported in v13.1[1] of Quartus, which had a Nov-2013 release date[2]. Curiously, Figure 2 in this white paper[3] depicts all Cyclone models active as of March 2014, suggesting their tool went through significant refactoring with some big picture in mind, considering support for a largely active family wasn't continued. Not sure how much scaring folks away factored into their decision.

[1] http://dl.altera.com/devices/

[2] http://dl.altera.com/13.1/?edition=web

[3] https://www.altera.com/content/dam/altera-www/global/en_US/p...


It's 2017, VHDL 2008 was released 9 years ago, and this book is using VHDL 93.


Compared to Verilog and SystemVerilog, VHDL has a much smaller user base and is less well supported by tools. And HW designers are hugely conservative when it comes to languages.

A reason for this is that we have so many tools, and the subset we can use is the GCD of all parsers in all tools.

Linter, EC checker, simulator (at least one, often more than one), planner, synthesis/build tool, integration tool, etc.

The synthesis subset of Verilog 2001 and 2005 is, as far as I have seen, accepted by virtually all tools. I tend to err on the side of caution and use Verilog 2001.


As the old hardware commandment goes: if it ain't broke, don't fix it?


VHDL as implemented at the time was broken, at least as of 2010. [0] VHDL 2008 support would have definitely had a positive impact on the VHDL source code (making it smaller and more comprehensible). There were so many 2008 features I wished to use but simply couldn't. Given the slow pace at which FPGA toolchains progress, I'd imagine the situation isn't vastly better today.

Luckily I read [1] and realized that VHDL can be written in a better way, even with old toolchains. This 2proc style (sketched after the links below) is so much easier to debug than the typical RTL style. It was unfortunate that the 2010-era Quartus II toolchain did not optimize my behavioral code well. The CPU caches were the worst offenders, which isn't too surprising. Tons of enormous, almost certainly inefficient muxes really pushed the LE limits on the FPGA and my patience during the very long synthesis time.

[0]: https://github.com/jevinskie/mips--/tree/master/project4/sou...

[1]: http://www.gaisler.com/doc/vhdl2proc.pdf
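For anyone curious, the 2proc style from [1] boils down to: all state in one record, all logic in one combinational process working on a variable, and one trivial clocked process. A bare-bones sketch (names are mine, not from [1]; assumes ieee.numeric_std):

  type reg_type is record
    count : unsigned(7 downto 0);
    done  : std_logic;
  end record;

  signal r, rin : reg_type;

  comb : process (r, start)
    variable v : reg_type;
  begin
    v := r;               -- default: hold current state
    v.done := '0';
    if start = '1' then
      v.count := r.count + 1;
    end if;
    if v.count = 255 then
      v.done := '1';
    end if;
    rin <= v;             -- drive the next-state signal
  end process;

  regs : process (clk)
  begin
    if rising_edge(clk) then
      r <= rin;           -- the only clocked assignment
    end if;
  end process;

The debugging win is that the entire next state is visible in the single variable v, in one process, instead of scattered across dozens of little clocked processes.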


You can always contribute; the link to GitHub is provided in one of the comments.


Incorrect: nobody can immediately contribute, because contributing takes time.


I didn't say immediately. I said that if you think the book can be improved, you can improve it by contributing.


Quite a few people have been working on making VHDL more accessible by generating it all from Python. Tim Ansell gave a talk about it at LCA 2017 (https://www.youtube.com/watch?v=MkVX_mh5dOU), and it seems pretty cool.


No way to specify different clocks or resets from what I can see. That's a massive limitation to your designs.



Note that it has probably been updated since then. Release: 1.18, Date: 8 March 2016.


The LaTeX source for the book on GitHub shows what & when:

https://github.com/fabriziotappero/Free-Range-VHDL-book/comm...


Can I write an ASIC with it?


Yes you can!

(no idea why emddudley is claiming otherwise)


I was under the impression you could too. Out of curiosity, can you somehow convert it to a format that, say, Magic VLSI could open?


Magic is a layout tool - you need to synthesize your HDL (Verilog and/or VHDL), then map it to the technology you are trying to target.

In short, not directly - HDL is a front-end tool, and Magic is a different kind of front end (direct, full-custom layout).


You can design an ASIC with HDL before it is produced, but an ASIC is not programmable. An ASIC is produced as it is. So you CAN'T WRITE an ASIC.


See Introduction to ASIC Design for a detailed description:

  https://www.u-cursos.cl/ingenieria/2010/2/EL653/1/material_docente/bajar?id_material=305352


No, because VHDL is used for configuring FPGAs. An ASIC is a different beast entirely.


Actually yes - the H stands for hardware: that's any kind of digital logic. You just need a different "back end" and then in theory the same model can run on various targets - in fact there is a whole business model around getting a design up and running on an FPGA and then porting it to an ASIC once you can justify the volume.


You can most certainly build ASICs in VHDL. VHDL is not restricted to any target technology. There are (commercial) tools which allow you to synthesise the masks needed for ASIC production from RTL (register transfer level) VHDL code given a cell library for your target technology. See random Google search result [1].

[1] http://web.engr.oregonstate.edu/~traylor/ece474/vhdl_lecture...


No, it is not entirely different. Don't forget that the FPGA was originally made for prototyping ASICs, and HDLs came about to describe digital circuits before the FPGA was invented.

Of course an ASIC cannot be made with only HDL, and I'm not sure 'to write' an ASIC is the correct expression.


VHDL and Verilog are used in pretty much all CPU design.


Please don't use VHDL. Use Verilog, as a great cycle-accurate FOSS simulator/compiler exists for it (Verilator).


While I also use Verilog, I don't understand your plea. VHDL has its benefits over Verilog, most notably an actual type system (Verilog has none - you'll be redefining the same signal bundles over and over and over again). But yes, Verilator is quite nice.
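To make the type-system point concrete, a sketch (names invented; in practice the record type lives in a shared package so every module sees the same definition):

  -- Define the bundle once...
  type axis_t is record
    data  : std_logic_vector(31 downto 0);
    valid : std_logic;
    last  : std_logic;
  end record;

  -- ...then pass it through port maps as a single unit.
  entity filter is
    port (
      clk : in  std_logic;
      s   : in  axis_t;
      m   : out axis_t
    );
  end entity;

In plain Verilog you'd re-list every one of those wires at each module boundary.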

I think both Verilog and VHDL are fairly terrible and are unnecessarily stuck at the connecting-wires-together paradigm. I have high hopes for systems like Clash[1]/Chisel[2]/SpinalHDL[3] to gain more widespread usage and finally make higher abstraction levels and metaprogramming standard in the industry.

[1] - http://www.clash-lang.org/

[2] - https://chisel.eecs.berkeley.edu/

[3] - https://github.com/SpinalHDL/SpinalHDL


Wow, that is some terrible advice. VHDL is just as valid a choice as Verilog. If a FOSS simulator is your requirement, VHDL has ghdl. Also, many companies only allow VHDL.

As better advice: it is worth writing something in both languages to get a feel for which one you like better. Both have their own advantages and disadvantages (VHDL: strongly typed; Verilog: less verbose).


This is not a good reason not to use VHDL - GHDL is open source and simulates VHDL.

Verilator is only useful if you don't need back-annotated timing constraints.


> cycle accurate FOSS simulator

And how do you propose simulating asynchronous logic and delays? How do I back-annotate synthesized net and cell delays into your one-true-simulator?


I'd rather use an Ada-like syntax.


Any good Verilog/SystemVerilog book you can recommend?


http://www.doe.carleton.ca/~jknight/97.478/97.478_02F/Peterv...

I do FPGA design as part of my job. This is the most concise document I've found that covers everything you need to know to get started. Many Verilog books just focus on the language. This short PDF focuses on how to use the language to do hardware design, which is what most people actually want to learn.


Many thanks for that link. I wrote VHDL for 10 years and sometimes had to interface to Verilog modules. I only know enough Verilog to read it and get a rough idea of what it does, sometimes asking for help from an expert.



