
Free Range VHDL – VHDL programming book available for free - EvgeniyZh
http://freerangefactory.org/
======
AlexanderDhoore
I learned VHDL a few months ago. Since then I've built a few projects with it
(some PCIe and Ethernet stuff). I was very excited to learn an HDL because
FPGAs seem like extremely powerful devices.

However, I now know why FPGAs aren't as popular as they could be: the tools
are horrible. There is no way FPGAs will become popular among software
developers unless the tooling improves dramatically. You can really tell that
it was all built by hardware engineers who know their electronics very well,
but not sound software design.

~~~
metaphor
> There is no way FPGAs will become popular among software developers unless
> the tooling improves dramatically.

Care to expound on that? I don't disagree with your opinion completely,
but I suspect our reasons differ.

IMHO, the barrier isn't so much the tools, but a clashing of paradigms. All
too often, traditional software developers without a solid foundational
background in digital hardware architectures think they can easily pick up an
HDL as if it were yet another programming language. The _HD_ part is quickly
forgotten as behavioral processes are hacked away using all too familiar
constructs like if-then-else, for/while loops, etc. Inference warnings are
ignored, timing constraints are shrugged off, metastability and
synchronization aren't even a thing, let alone driver/receiver selection,
pinmap planning, and signal integrity considerations at the PCB level...set to
default and things should _just work_ , right?
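To make the metastability point concrete: the standard fix that
software-minded newcomers tend to skip is a two-flop synchronizer on every
asynchronous input. A minimal VHDL sketch (names illustrative; a real design
would typically also carry vendor attributes such as ASYNC_REG):

```vhdl
library ieee;
use ieee.std_logic_1164.all;

-- Two-flop synchronizer: brings an asynchronous input into the clk
-- domain. The first flop may go metastable; the second flop gives it
-- a full cycle to resolve before the signal is used downstream.
entity sync_2ff is
  port (
    clk      : in  std_logic;
    async_in : in  std_logic;
    sync_out : out std_logic
  );
end entity;

architecture rtl of sync_2ff is
  signal meta, stable : std_logic := '0';
begin
  process (clk)
  begin
    if rising_edge(clk) then
      meta   <= async_in;  -- may violate setup/hold; can go metastable
      stable <= meta;      -- has a full cycle to settle
    end if;
  end process;
  sync_out <= stable;
end architecture;
```

Note this only works for single-bit signals; multi-bit buses crossing clock
domains need handshakes, gray coding, or async FIFOs instead.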

Anecdotally speaking, the only thing in common that HDLs have with traditional
programming languages is a superficial charset. I think every software
developer who has ever used tools like VS, Eclipse, Emacs, Vim, Make, Doxygen,
Git, etc. would agree that each has its quirks and it takes a bit of use
before a comfortable flow is discovered. FPGA vendor tools are no different,
except--testbed simulations aside--there's that immutably distinct custom
hardware integration end-game that most traditional software has the pleasure
of conveniently abstracting away.

Any kernel/driver developers with HDL proficiency care to chime in?

P.S. Building ground-up PCIe/Ethernet FSM+datapath architectures within a few
months, with no prior HDL experience, is really impressive.

~~~
ploxiln
I went to CMU for a BS in EE. I used a lot of Verilog, and also a bit of
Cadence for layout in one class. But I've worked as a software engineer since
then.

I totally agree that HDL tools are astonishingly crap. The market is just so
much more specialized, expensive, locked down and closed.

Imagine that you had to use the Intel compiler to get source code compiled for
Intel processors. You had to use their special headers; you practically had to
use their huge 8 GB Intel IDE with a million lists and buttons; you had to
choose which features were on your Intel processor (VT-x? SSE4?). Imagine AMD
is the same, but separate. This is all closed source stuff of course. And you
have to use it to run on any smartphone/laptop/desktop/server class processor.

Imagine that there's no GCC, no CLANG, no GDB, no Jetbrains, no Eclipse. All
the related tools we use wouldn't have much of a reason to exist because just
about everyone had to use the huge IDEs for each vendor anyway.

The vendor stuff is huge and inefficient and crap, because features sell, and
crap still sells when there's no meaningful competition.

To be more specific, "crap" means high bloat and low reliability. Crashes,
inexplicable errors and failures. Huge amounts of bloat. But you can make a
tweak, try again, and move on, so any sane hardware person just gets used to
it; it's not like they can do anything about it.

~~~
metaphor
Apologies but I really can't tell if your point is against the proprietary
nature of reconfigurable devices, or vendor tool bloat.

> Imagine that there's no GCC, no CLANG, no GDB, no Jetbrains, no Eclipse.

This is quite easy to imagine when you remove the key element which allows
these L7 abstractions to be meaningful: an underlying kernel with a well
defined interface. What equivalent does the reconfigurable world have when
every target device requires its own unique "kernel", if you will? I can't
think of any...which would explain why 3rd-party tools are constrained to
synthesis, while place-and-route is an explicit function of the vendor tool.
Perhaps we too easily conflate the size of the tool with the size of the
target?

On the flip side, supposing there were some open reconfigurable interface
standard, I don't think this would fly in the current market given the high-
performance nature of these devices. The top two FPGA vendors apparently
change their slice/CLB structure every generation; expecting them to agree on
an open fabric interface standard seems far-fetched.

> Crashes, inexplicable errors and failures. Huge amounts of bloat.

Putting the whole kit and caboodle aside and focusing on just synthesis, isn't
it strange that even the big, specialized 3rd-party vendor tools (e.g.
Cadence, Mentor Graphics, Synopsys) suffer just as much? I think it's a
genuinely difficult problem given the multi-disciplinary nature of the things
EDA engineers have to deal with. As much as I dislike dealing with flaky
tools, I'm nevertheless humbled by their challenges.

~~~
gluggymug
You've pretty much hit the nail on the head. What people outside of the EDA
field fail to understand is they are coding at a much higher level. RTL does
not model all aspects of the logic. You have different signal delays due
to lots of different reasons: fanout, wire lengths, etc. As you go through the
different stages of synthesis via these tools, a lot of decisions are made,
governed by the constraints files. Each decision affects the performance of
the design. It's slow because it's doing a lot of work to optimize for speed
or area. Some designs won't meet the constraints; then the tools report errors.

The tools are trying to help. The end product is the physical device. The
various models are all just abstractions of the physical device. The tools are
reporting the problems on the abstractions to assist you to improve the
physical device. If you can understand the reports, you can improve things,
either altering the RTL or adding more constraints.
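For anyone who hasn't seen one, the constraints file in question is typically
a handful of SDC/XDC-style lines alongside the RTL. A hedged sketch (the
clock and port names here are made up for illustration):

```tcl
# Define the primary clock: 100 MHz on port clk
create_clock -period 10.000 -name sys_clk [get_ports clk]

# Tell the tools how late external data arrives/departs relative
# to that clock, so I/O paths can be timed against the board
set_input_delay  -clock sys_clk 2.5 [get_ports data_in]
set_output_delay -clock sys_clk 3.0 [get_ports data_out]

# Paths crossing into another clock domain are handled by
# synchronizers, so exempt them from single-cycle analysis
set_false_path -from [get_clocks sys_clk] -to [get_clocks other_clk]
```

Every one of these lines changes what the place-and-route engine considers a
legal or failing implementation, which is exactly why the tool's decisions
feel opaque when the constraints are wrong or missing.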

The point being that the only time RTL is actually run like a software program
is during simulation. This simulation is only an approximation of how the
actual thing will work. It is not like SW. The tools do a lot of other things
with that RTL. Maybe if people don't throw garbage in, it won't crap itself
trying to figure it all out.

------
oelang
It's 2017, VHDL-2008 was released 9 years ago, and this book is still using
VHDL-93.

~~~
metaphor
As the old hardware commandment goes: if it ain't broke, don't fix it?

~~~
jevinskie
VHDL as implemented at the time was broken, at least as of 2010 [0]. VHDL
2008 support would have definitely had a positive impact on the VHDL source
code (making it smaller and more comprehensible). There were so many 2008
features I wished to use but simply couldn't. With the slow pace that FPGA
toolchains progress, I'd imagine the situation isn't vastly better today.
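To make that concrete, one of the 2008 features in question is the
`process(all)` sensitivity list, which removes a classic source of
simulation/synthesis mismatch (a forgotten signal in a hand-written list). A
small sketch (entity name illustrative), assuming a toolchain that actually
accepts VHDL-2008:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity mux4 is
  port (
    sel        : in  std_logic_vector(1 downto 0);
    a, b, c, d : in  std_logic;
    y          : out std_logic
  );
end entity;

architecture rtl of mux4 is
begin
  -- VHDL-2008: process(all) infers the full sensitivity list.
  -- In VHDL-93 this would have to be process(sel, a, b, c, d),
  -- and omitting any of them silently breaks simulation.
  process (all)
  begin
    case sel is
      when "00"   => y <= a;
      when "01"   => y <= b;
      when "10"   => y <= c;
      when others => y <= d;
    end case;
  end process;
end architecture;
```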

Luckily I read [1] and realized that VHDL can be written in a better way, even
with old toolchains. This 2proc style is _so_ much easier to debug than
typical RTL style. It was unfortunate that the 2010-era Quartus II toolchain
did not optimize my behavioral code well. The CPU caches were the worst
offenders, which isn't too surprising. Tons of _enormous_ , almost certainly
inefficient, muxes really pushed the LE limits on the FPGA and my patience
during the very long synthesis time.

[0]:
[https://github.com/jevinskie/mips--/tree/master/project4/sou...](https://github.com/jevinskie/mips--/tree/master/project4/source)

[1]:
[http://www.gaisler.com/doc/vhdl2proc.pdf](http://www.gaisler.com/doc/vhdl2proc.pdf)
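For readers who haven't seen [1], the two-process ("2proc") style puts all
combinational logic in one process operating on a record of the design's
state, with a second, trivial clocked process holding the registers. A
minimal sketch of the idea (my own example, not taken from the paper):

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity counter2p is
  port (
    clk, rstn : in  std_logic;
    enable    : in  std_logic;
    count     : out unsigned(7 downto 0)
  );
end entity;

architecture two_proc of counter2p is
  -- All state lives in one record type
  type reg_type is record
    cnt : unsigned(7 downto 0);
  end record;
  signal r, rin : reg_type;
begin
  -- Combinational process: compute next state 'v' from current
  -- state 'r' and the inputs, then drive rin and the outputs.
  -- Debugging means single-stepping this one process.
  comb : process (r, enable)
    variable v : reg_type;
  begin
    v := r;
    if enable = '1' then
      v.cnt := r.cnt + 1;
    end if;
    rin   <= v;
    count <= r.cnt;
  end process;

  -- Register process: the only clocked logic in the design
  regs : process (clk, rstn)
  begin
    if rstn = '0' then
      r.cnt <= (others => '0');
    elsif rising_edge(clk) then
      r <= rin;
    end if;
  end process;
end architecture;
```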

------
cyphar
Quite a few people have been working on making VHDL more accessible by
generating it all from Python. Tim Ansell gave a talk about it at LCA 2017
([https://www.youtube.com/watch?v=MkVX_mh5dOU](https://www.youtube.com/watch?v=MkVX_mh5dOU)),
and it seems pretty cool.

~~~
gluggymug
From what I can see, there's no way to specify different clocks or resets.
That's a massive limitation on your designs.

------
dang
Discussed in 2012:
[https://news.ycombinator.com/item?id=3580537](https://news.ycombinator.com/item?id=3580537).

~~~
EvgeniyZh
Note that it has probably been updated since then: Release 1.18, 8 March 2016.

~~~
harry8
Latex source for the book on github shows what & when:

[https://github.com/fabriziotappero/Free-Range-VHDL-
book/comm...](https://github.com/fabriziotappero/Free-Range-VHDL-
book/commits/master)

------
notforgot
Can I write an ASIC with it?

~~~
emddudley
No, because VHDL is used for configuring FPGAs. An ASIC is a different beast
entirely.

~~~
sofayam
Actually yes - the H stands for hardware: that's any kind of digital logic.
You just need a different "back end" and then in theory the same model can run
on various targets - in fact there is a whole business model around getting a
design up and running on an FPGA and then porting it to an ASIC once you can
justify the volume.

------
zump
Please don't use VHDL. Use Verilog, as a great cycle-accurate FOSS
simulator/compiler exists for it (Verilator).

~~~
EvgeniyZh
Any good Verilog/SystemVerilog book you can recommend?

~~~
blackguardx
[http://www.doe.carleton.ca/~jknight/97.478/97.478_02F/Peterv...](http://www.doe.carleton.ca/~jknight/97.478/97.478_02F/PetervrlQ.pdf)

I do FPGA design as part of my job. This is the most concise document I've
found that covers everything you need to know to get started. Many Verilog
books just focus on the language. This short PDF focuses on how to use the
language to do hardware design, which is what most people actually want to
learn.

~~~
BillBohan
Many thanks for that link. I wrote VHDL for 10 years and sometimes had to
interface to Verilog modules. I only know enough Verilog to read it and get a
rough idea of what it does, sometimes asking for help from an expert.

