
Why hardware development is hard, part 1: Verilog is weird (2013) - yarapavan
https://danluu.com/why-hardware-development-is-hard/
======
tonmoy
Verilog seems "hard" because people often use it in a similar fashion to a
programming language, and in most cases that does not make any sense. The
proper way to use it is to design the hardware, then code it up using verilog
(which is trivial compared to the actual design). In terms of time spent on
design, the actual coding part does not take much time at all. Indeed, this
very article uses a motivating example that does not make any sense.

~~~
raverbashing
That's like saying a certain "high level language" is weird because you need
to first write your program in ASM, then translate it to this language.

It makes no sense

I don't want to design from the bottom up _and then_ if I'm "lucky" the
compiler will give me what I wanted.

~~~
jwise0
Well, what you want and the physical realities of synthesis are different :-)

In the early days of compilers, of course, you were mostly writing C as a
macro language for your system's assembly language. If you wanted your program
to perform well, you'd have to write C that was, more or less, a translation
of assembly that you'd constructed in your head first. If you wrote bizarre C,
you'd either get incorrect results, or if you were lucky, you'd get correct
but inefficient results.

But that's also Dan's point: Verilog isn't a "high level language". You don't
write programs with it, you describe hardware with it. (In fact, that is why
it is called a 'hardware description language'!) So if you try to write a
program, instead of describing hardware, you'll get something that isn't
really either.

~~~
raverbashing
> If you wanted your program to perform well, you'd have to write C that was,
> more or less, a translation of assembly that you'd constructed in your head
> first.

Maybe in really old compilers

> You don't write programs with it, you describe hardware with it

Which is fair enough, but it seems the "hardware description" pretends to be
of a higher-level than it really is.

If you need the user to describe gates and flip-flops and how they connect
then make them describe this.

~~~
vvanders
> If you need the user to describe gates and flip-flops and how they connect
> then make them describe this.

You're talking about RTL, which is exactly what these languages output.

Fundamentally, they're not programming languages; unfortunately, the initial
instinct is to treat them as such, and it leads to a ton of confusion.

~~~
wolfgke
> You're talking about RTL, which is exactly what these languages output.

If VHDL/Verilog actually output RTL, you could easily analyze it just as you
analyze the assembly output of your favorite compiler. Unfortunately, the
output is some proprietary bitstream for the FPGA.

~~~
k0ngo
Just as a software compilation flow is divided into preprocessing,
compilation, assembly, and linking, a HDL flow is divided into synthesis,
mapping, place and route, timing analysis and bitstream generation. RTL is the
output of the synthesis stage and is readily available to the designer,
typically both as code and as a graphical schematic.

------
CPU_Verif
I'll throw in my opinion on this. The author notes how it's routine for
people in the hardware industry to be older. I work as a verification
engineer on a CPU project.

From what I see, every company in this sector works off of years of
experience. Even if you are god's gift to the world of CPU design, the career
options available to you are linked to the years you have worked.

I recently attended a presentation by ARM, where they boasted of this quote
"At ARM, we don't really expect our graduates to be able to do anything until
they're about 2 years into their careers, at which point they're expected to
be decent at something"

And they boasted of it! To a theater full of CS students. And this was a 35
year old line manager making the comment.

If you want a career where you can rise based on merit, and not the number of
years on your CV, don't pick hardware. I've seen it as an insider: the
occasional "kid" comment being thrown around (ARM), designers refusing to work
with junior verif staff (Imagination Tech), people who have written a project
from scratch over 2 years not given tech lead roles as the project grew, etc.

Such an environment quite often bleeds younger engineers. It saps desire and
motivation. Because there are so few startups, good practices don't matter.
Here are some conversations I've had:

"We don't need to write tests for our software ISG, it's a waste of time."

"What is wrong with me writing an extension to the software as a special
case? It only executes that code if the username is me."

"We don't do training for new people."

I routinely see younger engineers with 3-4 years of experience try and spin
out of CPU/Hardware and into Software.

~~~
kbenson
> And they boasted of it! To a theater full of CS students.

I think this is highly contextual, and how you interpreted that statement
depends on how you were primed to interpret it. You can assume they meant that
it's an age based hierarchy, or you can assume that it's an extremely complex
job that a CS degree doesn't adequately prepare you for, so it takes a couple
years of apprenticeship before you are qualified to do something yourself.

There are plenty of industries like this, where it's not just about being
smart and using your intellect to put the relatively small amount of parts
together into something that works, but instead has a large chunk of learned
wisdom that is not quick to impart, the importance of which is sometimes
discounted until it's dealt with, and the consequences of messing up are
fairly high.

Some examples: electricians, plumbers, architects. Fields that we've decided
often require certification and that certification requires an apprenticeship
period, because we don't want electrical fires or dead electricians, or water
damage to our homes, or leaking sewage, or collapsing buildings.

In a lot of ways, I think the problem with the software engineering industry
is that we don't have a system like this in place. Name me a programmer that
thinks he was just as good at 22 at not making architectural or security
mistakes as he was after 10 years more of programming and I'll name a
programmer I never want to work with. Chances are they'll be the same person.

~~~
eli_gottlieb
Yeah, when I want kernel or firmware code written, I want it done by someone
experienced, or someone trained by someone experienced. I've had the very
fortunate experience of being the latter person (up until only a little while
ago), so looking back, I can damn well see the difference.

------
shubb
These objections don't totally match my limited experience (not done this in a
long time). When I did a lot of HDL work, I found that the main challenge was
to stop thinking of hardware design like programming.

This is why dynamic typing (even if it is static type inference) is a bad idea
in the hardware world. Embedded software programmers want C style types
because they are often flipping bits or packing data into a specified data
structure so that some hardware that expects particular bytes in particular
places knows where to get it. In HDL, you are fitting your logic into very
limited space, and need to constantly keep in mind what hardware the code you
are writing is going to synthesize to.

Later, I realized that HDLs are often not really used the way we'd think.
Having written the logic in C or maybe Verilog, a lot of shops figure out
what the actual desired hardware is and then write Verilog to specify exactly
that. So in Verilog, rather than getting the compiler to infer a flip-flop,
you specify one and use Verilog to wire it up. Professional shops often
basically use Verilog to describe a circuit design in text, like a C
programmer dropping into assembly.

The thing is that, like in the 1980s, you are fighting so hard to get the
fastest, smallest implementation possible that, like an '80s programmer, many
default to "assembly": specifying the logic elements rather than leaving it
to a compiler.

That means Verilog as written in many places is not really used like Verilog
at all; it's more like using Verilog to express a netlist. Verilog is used to
wrap vendor-specific code, so you specify a flip-flop rather than a Xilinx
flip-flop, to make it easier to move to another vendor or to ASIC, but the
clever features of Verilog are only used for your test harnesses.

Part of the reason for this is that FPGAs are full of specific purpose
hardware like dual port memories, and the compiler won't always infer them for
you. That means you need to explicitly tell it to use them - and once you are
explicitly specifying some parts of your system, you tend to do it more and
more.

~~~
wolfgke
> Part of the reason for this is that FPGAs are full of specific purpose
> hardware like dual port memories, and the compiler won't always infer them
> for you.

You can only use a HLL if you can rely on the compiler nearly always putting
out much better code than you would be able to write yourself in a reasonable
amount of time. This is the reason why even in the late '90s, the critical
parts of performance-intensive code (games, multimedia) were often still
written in assembly instead of C/C++.

The problem is that for FPGAs the bitstream format is very closed, so you can
hardly dive into the low-level abyss, "look at the generated assembly", and
rewrite the output until you are satisfied with the result; you _have_ to use
the HLL (VHDL/Verilog), where you cannot trust the quality of the generated
code.

~~~
blackguardx
Instead of looking at the "assembly" or bitstream, FPGA and ASIC designers
use a graphical tool to look at the inferred hardware blocks. There are also
tools to visualize how these blocks are placed on the die/FPGA.

------
AceJohnny2
As a SW dev who frequently gets to interact with HW folks (embedded dev FTW),
I frequently find their conservatism w.r.t. software tools puzzling.

Dan Luu's explanation makes me think they're driven by trauma inflicted by
their existing tools. ;)

(but really, I figure they have enough trouble dealing with actual HW concerns
that they don't have the bandwidth to deal with SW stuff not immediately
relevant to their HW concerns.)

~~~
InitialLastName
The problems you name are true, but another issue is that errors in HW tend to
be much more expensive than errors in SW. If, say, my timing analysis tool is
wrong and I spin a chip based on its results, I'm out ~$100,000 and a few
months of development time. That's also why we're willing to pay $thousands
for licenses to proprietary SW; you want to be really confident in your tools.

~~~
AceJohnny2
Right, except the promise of better tools is a) fewer costly errors and b)
faster development time. If cost and time are what you're optimizing for,
they'd be a clear better choice.

But they're not.

So what's missing from the picture?

~~~
user5994461
> So what's missing from the picture?

The hardware tooling sucks.

IMO: hardware people suck at writing software, so they can't make anything
better. And software people don't want to go there, so there is no one to do
better.

------
codebook
Designing hardware is done with Visio and Excel, not Verilog. Verilog is just
an implementation. :)

Added: Visio is used to draw timing diagrams, block diagrams, and flow
charts. Excel is used for designing FSM next-state logic / outputs.

~~~
trevortheblack
In my experience it's visio and Matlab.

You need to prove that you can get a certain performance with a matlab
algorithm, then you spend a week writing it into verilog.

Oh, and as someone whose bread and butter is ICs, I go out of my way to avoid
creating FSMs. They're slow, difficult to debug, and hard to pass on to your
successor.

~~~
moftz
Interesting, in my digital design and microcontroller classes, they have been
beating FSMs into our heads since day one. There were a few projects where I
knew an FSM wasn't strictly necessary but used one anyway to get the checkbox
on the rubric, but there were many designs where I had no clue how I could do
it without one without making the code highly complex.
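For reference, the classroom pattern in question is usually the two-block
Moore FSM style; the states here are a made-up start/busy/done sequencer,
just to illustrate the template:

```verilog
// Minimal Moore FSM: one sequential block for the state register,
// one combinational block for next-state logic.
module fsm_example (
    input  wire clk,
    input  wire rst,
    input  wire start,
    input  wire finished,
    output wire busy
);
    localparam IDLE = 2'd0, RUN = 2'd1, DONE = 2'd2;
    reg [1:0] state, next_state;

    // State register: the only place state is updated, on the clock edge.
    always @(posedge clk) begin
        if (rst) state <= IDLE;
        else     state <= next_state;
    end

    // Next-state logic: pure combinational logic, no storage.
    always @(*) begin
        case (state)
            IDLE:    next_state = start    ? RUN  : IDLE;
            RUN:     next_state = finished ? DONE : RUN;
            DONE:    next_state = IDLE;
            default: next_state = IDLE;
        endcase
    end

    assign busy = (state == RUN);
endmodule
```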

------
nom
Isn't it more appropriate to say that Verilog is weird because of hardware?
Once you go down to the lower levels you enter another realm that requires a
completely different way of thinking. HW dev is not comparable to software in
any way just because Verilog is a language.

------
jeremycw
I remember reading Verilog HDL Synthesis: A Practical Primer by J. Bhasker in
University and found it incredibly useful for learning the synthesizable
subset of Verilog and all the unwritten rules of how you need to write the
code to make it synthesize a certain way.
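One of those unwritten rules, as a hedged sketch: constructs like `#` delays
and `initial` blocks simulate fine but are ignored or rejected by synthesis,
so synthesizable code sticks to recognized templates.

```verilog
// Simulation-only: a `#` delay and `initial` block make a testbench
// clock generator, but describe no synthesizable hardware.
module sim_only (
    output reg clk
);
    initial begin
        clk = 0;
        forever #5 clk = ~clk;
    end
endmodule

// Synthesizable: a clocked always block with a recognized template.
module synthesizable (
    input  wire clk,
    input  wire a, b,
    output reg  y
);
    always @(posedge clk)
        y <= a & b;  // an AND gate feeding a flip-flop
endmodule
```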

I used that knowledge to write Tetris in Verilog that output to VGA, which,
after testing at the waveform level in the editor, worked the first time I
actually loaded it onto real hardware. [0]

[0] [https://github.com/jeremycw/tetris-verilog](https://github.com/jeremycw/tetris-verilog)

------
alain94040
Explaining Verilog to programmers is a recurring topic and the hardware
community is not doing a great job at it [yet]. So feel free to contribute to
this open book:
[https://en.wikibooks.org/wiki/Programmable_Logic/Verilog_for...](https://en.wikibooks.org/wiki/Programmable_Logic/Verilog_for_Software_Programmers)

~~~
wolfgke
I believe I understand the ideas behind Verilog. I see the problem rather in
the fact that there are few tutorials available that lead you from beginner
to decent "Verilog programmer". The problems begin with which FPGA board to
buy, how to set up and use the environment, what software is freely
available, how much the code depends on the specific board, how to debug the
code, etc.

For programming, good material on all of this is available; for VHDL/Verilog,
not so much.

------
noescape
Is it impossible to do hardware development in C and automatically convert C
to Verilog?
[https://en.wikipedia.org/wiki/C_to_HDL](https://en.wikipedia.org/wiki/C_to_HDL)

What does one lose if they do this? Why isn't this conversion more common?

~~~
GrumpyYoungMan
> _Is it impossible to do hardware development in C and automatically convert
> C to Verilog?_

Yes, it is. C can be converted to Verilog only if you structure the C code as
if it were Verilog.

You have to understand that, fundamentally, the CPU is a lie. It is an
illusionary abstraction layer fashioned out of bare transistors that pretends
to be a von Neumann architecture machine, with a nice set of registers, an
instruction pointer, and so forth, that chugs dutifully through assembly
language instructions, one after another. (Note that nothing about the von
Neumann architecture requires binary; for example, the Babbage Analytical
Engine computed in decimal.)

At the transistor / gate level, _everything_ is parallel until you impose some
order on it and build in clocks and flipflops and so forth to impose some sort
of structure and chronological ordering on things. Few, if any, high-level
languages are equipped to describe that in any sensible way.

Verilog, VHDL, and so forth are not "weird". These HDLs, while not without
their warts, describe the underlying reality of computing. It is assembly
language and the high-level languages that are "weird".

(edit: grammar fixes + clarifications)

~~~
joesavage
Perhaps I'm reading too far into what you wrote, but I'm not sure I agree that
the CPU is fundamentally different from other layers of abstraction in
computer systems.

All abstractions “lie” in the sense that they present a perspective of the
world that is slightly different to the reality — function calls “lie” about
the operations that are really being performed, the ISA “lies” about the
electronics of the CPU, and transistors “lie” about the underlying behaviour
of the universe.

~~~
moftz
With a CPU, everything is done procedurally. With an FPGA, all the code kind
of runs at once. You build your modules and wire them together. The state of
the HDL layout is static. There is no stack or heap. If a module is turned on,
it's doing whatever it's supposed to do all the time, whether you are feeding
a real signal into it or not. Imagine threading every possible function you
might need in a typical C program and "wiring" everything up with global
variables that cannot be initialized at startup. Printf is always printing
something and will print out some random garbage at startup unless you tell
it to print something else. Simulating larger HDL designs takes a lot of
memory because you have to model everything throughout the entire simulation.
Simulating a CPU in something like C is much less intensive since you can
call an instruction whenever you are ready for it.
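A tiny sketch of that "all the code runs at once" point:

```verilog
// Both always blocks describe hardware that operates on every clock
// edge, all the time, in parallel; neither "calls" the other, and no
// program counter decides which one "runs" next. Note also that the
// registers start out uninitialized (X in simulation), matching the
// "random garbage at startup" point above.
module parallel_demo (
    input  wire       clk,
    output reg  [7:0] count_up,
    output reg  [7:0] count_down
);
    always @(posedge clk)
        count_up <= count_up + 1;

    always @(posedge clk)
        count_down <= count_down - 1;
endmodule
```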

~~~
joesavage
I completely appreciate that “writing hardware” is a totally different problem
to writing software, and that hardware comes with its own set of quirks and
challenges. I'm just saying that I don't think the CPU is fundamentally
different than other abstractions in the stack.

C obviously isn't a good fit for a hardware language — it's designed for
software! That doesn't mean that there doesn't exist similarly abstract ways
of writing hardware though (that express the inherent parallelism, etc.). It
is likely that these “higher level languages for hardware” would result in
less efficient hardware solutions, but that's always a trade-off that is made
through abstractions. Writing a program in properly scheduled assembly code is
going to be a lot more efficient than writing the same program in C.

The difference with hardware in terms of language abstractions is not that it
behaves differently to software. We could easily define a programming language
that expresses parallelism in a way that would map nicely to hardware. The
problem, from my perspective at least (please correct me if you think I'm
wrong) is that hardware needs to be extremely efficient — particularly because
it cannot be easily changed. As a result, hardware languages don't tend to be
particularly abstract. But this doesn't mean that hardware languages couldn't
be more abstract!

~~~
moftz
Probably the highest level of abstraction for Verilog is using something like
Altera's IP cores. They are binary blobs intended for a specific FPGA model
that can be hooked up like any other module, but they are configurable for
things like bandwidth, latency, and various inputs. You can use things like
floating-point arithmetic modules, or cores that create things like
phase-locked loops (a way to convert an input frequency to a different
frequency, typically using a multiplier and divider). You don't need to know
how these modules work underneath; it's proprietary anyway, but there are
usually reference designs you can look at. For example, with a few clicks you
can create something like a VGA driver that could interface with a
synthesized CPU to create a terminal for it. You can do some neat things with
IP cores, but there are probably some issues with using them in a commercial
setting.

------
MichaelBurge
I remember taking a course in college on digital logic. I wanted a light to
blink on and off, so I toggled it every clock. Of course, that just looks
like a dim light, because it's toggling far faster than the eye can see.

But how do you then explain the button on my FPGA, not used anywhere in my
Verilog, that could switch the light off?

I think the answer ended up being that it was switching fast enough that
small amounts of leakage current were enough to register as a TRUE value. And
that kind of 'undefined behavior' is worse than in C.
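The usual fix, for anyone curious, is a free-running counter so the toggle
happens at a visible rate; the counter width and 50 MHz clock here are
illustrative assumptions:

```verilog
// Toggling an LED every cycle of a fast clock just looks dim.
// Dividing down with a counter makes the blink visible.
module blink (
    input  wire clk,    // assume roughly 50 MHz
    output wire led
);
    reg [25:0] counter = 0;

    always @(posedge clk)
        counter <= counter + 1;

    // Bit 25 flips every 2^25 cycles, giving a period of
    // 2^26 / 50e6, i.e. about 1.3 seconds.
    assign led = counter[25];
endmodule
```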

~~~
lfowles
There's a (decade-plus old) paper on using genetic algorithms to design an
FPGA for discriminating tones. The final design after hundreds of
generations? The input wasn't connected to the output! It relied on physical
details of the chip and, IIRC, even stopped working as the FPGA temperature
rose.

~~~
rtkwe
I believe this is what you're talking about?
[https://www.damninteresting.com/on-the-origin-of-circuits/](https://www.damninteresting.com/on-the-origin-of-circuits/)

Even funnier was that it wouldn't work on another FPGA. It was exploiting
weird defects in that particular FPGA.

~~~
lfowles
Yes, that's the experiment! Here's a more thorough article:
[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50....](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.9691&rep=rep1&type=pdf)

------
guard-of-terra
Seems like a lousy industry to be in. Lower wages, lower expectations,
shooting themselves in the foot.

Seems that unlike software devs, hardware devs don't have a say; they're
merely drones to their managers. In software, a lead dev will say that they
need this linter, and their superior will have to either provide it or argue.
But it seems that in hardware, management will just say "no", end of story.
You're not getting anywhere with this attitude.

This can also be so in software dev - in defunct bigcos.

~~~
user5994461
> lower expectations

I can live with not having to do 60 hours work weeks for tech companies that
didn't even give equity.

~~~
guard-of-terra
Frankly, I'm in software dev and I've never worked more than 45 hours/week.

------
garbage_stain
Verilog has many drawbacks, including no support for structured signals. On
the other hand, the other big language, VHDL, is really difficult to use for
"modular" projects. Is anyone here familiar with CHDL, a C++ hardware design
language?

[https://github.com/cdkersey/chdl](https://github.com/cdkersey/chdl)

Being a fan of C++ myself, the idea of using template metaprogramming to
represent hardware designs is something that I think is very cool.

------
Raed667
We have studied VHDL in school.

I can say with a fair certainty that 0% of my class uses it now in their jobs.
I wouldn't ever want to touch it again with a ten foot pole.

~~~
monocasa
Really? I'm one of those weird people that like VHDL more than Verilog.

~~~
user5994461
100% of my class wouldn't touch Verilog with a pole :D

Guess that's a classic case of USA vs EU standards.

------
eggie5
I designed a pipelined MIPS processor in Verilog during undergrad. It was
probably the hardest and most rewarding task I've completed in my life!

[https://github.com/eggie5/SDSU-COMPE475-SPRING13](https://github.com/eggie5/SDSU-COMPE475-SPRING13)

~~~
GrumpyYoungMan
HDLs and FPGAs, such luxury. In my undergraduate days, we had to design a
32-bit processor by laying out individual transistors by hand (n-wells,
p-wells, polysilicon gates, and metallization layers) in a 25 micron CMOS
process with two metallization layers in very little chip area and then prove
that it worked in simulation. It was like Tetris from hell; I'm still proud of
the compact barrel shifter design I came up with. We also had the option,
which I will forever regret not taking, to have the chip physically fabbed as
well, as long as we agreed to do the work to validate it afterward.

~~~
eggie5
haha, respect.

------
petra
If Chisel is lower level than SystemC or SystemVerilog, and those two have
much more acceptance, why go with Chisel ? Aren't modern systems complex
enough to justify abstraction ?

~~~
pjc50
I don't think they do have much adoption. SystemVerilog is still Verilog, and
most of the new functions are aimed at simulation and testbench use. SystemC
is rather painful as you have to write an artificially constrained subset and
style of C.

> "Vendors pushing high-level synthesis have a decades long track record of
> overpromising and under-delivering."

Abstraction is important, but in hardware detail matters. In software you can
afford to have a less efficient language if it results in faster development.
This is absolutely not true of hardware.

High-level synthesis has usually held out misleading promises of portability;
that you might be able to get some complex application working written by
(cheaper, easier to hire) software developers, then just drop it straight onto
a chip. That's simply not true because the constraints are different.

Chisel looks extremely promising, as it addresses a lot of what I'd say were
Verilog pain points:

      Algebraic construction and wiring
      Abstract data types and interfaces
      Bulk connections
      Hierarchical + object oriented + functional construction

~~~
tonmoy
SystemVerilog does seem to be used more and more. It is just Verilog from the
designer's perspective, but there is less chance of making mistakes, it is
easier to write, and the synthesis tool has more control over how to optimize
the design.

------
mars4rp
Oh, Verilog is so pretty, I missed coding in it so badly. Compared to VHDL it
is very easy and leads to fewer bugs; try a nested "if" without an "else" in
VHDL!

------
okket
(2013) see [https://danluu.com/](https://danluu.com/)

