Vendors make money selling silicon; they see the toolchain as a necessary evil.
There are tools released in 2018 that are still unable to fully support the VHDL-2008 standard. Many bugs reported years ago remain open. And the GUI-centric
approach often means you are left manually changing several GUI fields and ticking boxes ad nauseam, until you eventually determine the Tcl 'equivalent' to at least try to ease your pain (Tcl scripting is often another pain altogether, but arguably a lesser evil).
Also, specifically with Xilinx, they seem to have zero consideration for basic version control and create/duplicate/modify an explosion of files and cached versions of files.
Projects like GHDL and this are a breath of fresh air.
There is an ugly feedback loop here. HDL tools interact poorly with version control -> it is difficult to collaborate using those tools -> HDL engineers would rather silo up than fight the tools -> no pressure to improve tools' support for version control.
The problem again (in my very speculative opinion) is that you have a very limited set of vendors with a complete lock on their hardware.
Imagine a world without GCC or LLVM, where only Intel or AMD could provide you with a compiler. Imagine also that they provided this compiler for free, so that their business model was based entirely on selling their CPUs. How much pressure would they have to provide a high-quality toolchain?
It's not like their customers are gonna run away to nonexistent competition; they still need to buy CPUs, and they'd be stuck with the 'barely works' toolchain.
You can see the difference if you compare FPGA vendor software with HDL simulation software. Simulators are in the business of selling you software so they have pressure to actually deliver quality software. FPGA vendors are in the business of selling _silicon_ (@saagarjha) so they pretty much see the toolchain as an operational cost.
There are two things that amaze me:
1. That the dumpster fire is actually capable of synthesizing hardware at all without exploding
2. That their head of software still has a job.
My very favorite issue is with their IDE. The version I'm using has a broken search and replace: if the two strings have different lengths, the editor loses its place after the first replacement. They compute the locations of the matches and then just smash the replacement strings in, failing to maintain an offset as character positions change. Makes me scream.
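For what it's worth, the fix is a couple of lines. A minimal sketch in Python (not the IDE's actual code, just an illustration of the offset bookkeeping it skips):

```python
import re

def replace_all(text, pattern, replacement):
    """Replace every occurrence while correcting for length changes.

    The bug described above: match positions are computed against the
    ORIGINAL text, but every substitution of a different-length string
    shifts all later positions. A cumulative offset fixes it.
    """
    offset = 0
    # finditer scans the original string; positions are pre-edit.
    for m in re.finditer(re.escape(pattern), text):
        start = m.start() + offset              # corrected position
        end = m.end() + offset
        text = text[:start] + replacement + text[end:]
        offset += len(replacement) - len(pattern)  # the step the IDE forgets
    return text

print(replace_all("ab-ab", "ab", "XYZ"))  # XYZ-XYZ
```

Drop the `offset` bookkeeping and the second replacement lands in the wrong place, which is exactly the lost-its-place behavior described above.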
I see the pile of ideas they're trying to glue together, but it really just gets in the way more than anything else.
Now my process is: Verilator + CMake + CLion + the SystemVerilog plugin for CLion (excellent but $$$); then when it runs well in Verilator, I bring it over to Vivado, fix any warnings it gripes about, and try it on hardware.
"Replacing the 15 year old ISE with Vivado Design Suite took 1000 person-years and cost US $200 million."
We switched nearly all of our PCs to Linux, and it's ironic that it's our engineers' computers that still have to be on Windows, because of SolidWorks, MCU toolchains, Virtuoso, and other semi tooling.
Its file format works very well with version control systems, is human readable to an extent and can be opened with any text editor (so your data isn't 100% gone if one bit gets accidentally flipped).
I used to do mechanical design with Pro/Engineer for several years. Now I occasionally do some mechanical design in Onshape, and all I can say is that this is a tool designed by elite mechanical engineers and developed by elite software engineers.
Regarding document management it is like going from assembly to some high level programming language.
The only downside could be that it cannot run on-premises. Everything is in the cloud.
So I would not agree that non-software engineering is behind the times:-)
(I’m not affiliated with onshape)
Onshape was bought by PTC (the developer of Pro/Engineer). I hope they won’t screw it up:-|
If you want to invoke different tools for different types of files, you can either handle this in your script, or you can explicitly pass the --tool flag in cases where you want to use a non-default tool.
Writing the wrapper script is generally pretty straightforward - the only "gotchas" are making sure the file patterns can be handled by your merge tool, and making sure you write the result to $MERGED. Sometimes you need to have a "clean-up" step in your script that copies the merge tool output to $MERGED and maybe deletes tmp files that were created.
Binary file types that you can't view in some way would be challenging.
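A minimal sketch of such a wrapper, in Python. The registration command and the `schematic-merge` tool are placeholders, not real programs; the fallback assumes `diff3` is available for plain-text files:

```python
"""Hypothetical git mergetool wrapper that dispatches on file type.

Assumed registration (names are made up):
    git config mergetool.wrapper.cmd \
        'python3 merge_wrapper.py "$BASE" "$LOCAL" "$REMOTE" "$MERGED"'
"""
import subprocess
import sys
from pathlib import Path

# Extension -> merge command that prints the merged result on stdout.
# "schematic-merge" is a made-up placeholder for a format-specific tool.
TOOLS = {
    ".sch": ["schematic-merge", "--stdout"],
}

def merge(base, local, remote, merged):
    # Fall back to diff3, which does a plain 3-way text merge.
    tool = TOOLS.get(Path(merged).suffix, ["diff3", "-m"])
    out = subprocess.run(tool + [local, base, remote],
                         capture_output=True, text=True)
    # The "clean-up" step: copy the tool's output into $MERGED so git
    # picks up the result; a real script might also delete temp files.
    Path(merged).write_text(out.stdout)
    return out.returncode  # nonzero tells git the merge had conflicts

if __name__ == "__main__" and len(sys.argv) == 5:
    sys.exit(merge(*sys.argv[1:5]))
```

The exit code matters: git uses it (via `mergetool.<tool>.trustExitCode`) to decide whether the merge succeeded.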
Schematic design tools that make common tasks like drawing/connecting buses an obnoxious hassle: futzing with the mouse to make sure lines and nets line up, when you should just be able to describe the whole thing in a DSL and have the tool build an initial drawing for you.
FPGA "IDEs" with no auto-indentation, no refactoring/rename, no support for version control, arcane project structures that don't move easily between directories, non-modular tools that don't work well (or at all) outside the IDE, critical settings buried in dialogs, and 'drag and drop' authoring that takes you about 50% of the way to your solution and then falls over in a hot mess.
I had to apply some bathroom silicone sealant over the weekend so I just go ahead and blame it on that.
(before any of you start typing your answers: that's the joke)
I was sitting on my living room floor with my FreeBSD laptop punching out my project while others were waiting in line at 2am for lab time.
Wrote all of the components for a basic 8-bit ALU while watching The Mothman Prophecies and sipping some bourbon. Wrote a C++ program in that time, too, to generate exhaustive tests for all components. Icarus was fucking awesome for all this. As a commuter student my last two years, not having to hang around waiting for lab time was awesome. I got a lot more sleep, and a lot more done.
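The exhaustive-test idea is easy to sketch. This is Python rather than the original C++, and the opcode encoding is made up; the point is that an 8-bit ALU is small enough to enumerate every operand pair against a golden model:

```python
# Exhaustive expected-output table for a hypothetical 8-bit ALU.
# The opcode encoding below is an assumption, not the original design's.
MASK = 0xFF

def alu(op, a, b):
    """Golden model: expected result and carry/borrow for each opcode."""
    if op == 0:   r = a + b          # ADD
    elif op == 1: r = a - b          # SUB
    elif op == 2: r = a & b          # AND
    elif op == 3: r = a | b          # OR
    else:         r = a ^ b          # XOR
    return r & MASK, (r >> 8) & 1    # 8-bit result, carry/borrow bit

def vectors(op):
    """Yield one binary test vector per operand pair: 256*256 per opcode."""
    for a in range(256):
        for b in range(256):
            r, c = alu(op, a, b)
            yield f"{op:03b}_{a:08b}_{b:08b}_{c}{r:08b}"

# e.g. dump these lines to a file and load them in a Verilog testbench
add_vecs = list(vectors(0))
print(len(add_vecs), add_vecs[0])
```

65,536 vectors per opcode is nothing for a simulator, so "exhaustive" is actually practical at this width.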
I've not kept up with its development, but it's apparently now cross-platform and still under active development. Any amateur interested in Verilog should definitely give it a look.
It even worked on Windows.
And, come on, how can you argue with a coding team who will make you an honorary "Steve"? (Inside joke: for a while it seemed like everyone who worked on Icarus Verilog was named "Steve" or some variant--so we started joking that anyone not named Steve needed to get rechristened with a new honorary first name.)
Tangentially, working on FreeBSD with Icarus is what caused me to learn/love/prefer Vim. It was there, it worked, and I learned (some of) its quirks.
Processing video (especially if you have weird requirements like 14 bit greyscale input): https://gregdavill.com/blog/2018/9/9/boson-camera-project-pa...
Experimenting with non-mainstream CPUs:
Usually if you need extremely fast I/O, on the order of nanoseconds, then an FPGA is a common design choice. An example of this is GPON network switches.
As mentioned in another comment, video processing, or any signal processing where the algorithm benefits from heavy parallelization, is also an application where an FPGA is a good fit.
The caveat, though, is that FPGA/ASIC development is often expensive and slow, so a recent trend is System-on-Chip parts that combine an FPGA fabric with multicore microcontrollers. The idea is a hybrid design: an RTOS handles functionality where speed is not as critical, a custom design on the FPGA handles whatever bespoke application you need, and a memory interface sits between the two.
Upvote for this. When I used to think about PROM, I thought of it as a medium of data storage, or sometimes as a lookup table. But it's actually the simplest form of programmable logic device: a device that can transform x bits of arbitrary input into y bits of arbitrary output, so you can build any digital system that uses combinational logic out of a PROM (plus RAM for sequential logic), including a CPU. And since it's a PROM, you can reprogram it to implement a different logic device, simply by burning a new truth table.
After I realized this, reprogrammable hardware like FPGAs no longer sounds like magic to me. From this you can also see that a computer with finite RAM and ROM is not a Turing machine but a finite state machine.
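The idea is easy to demo in a few lines. A toy sketch in Python (the machine here, a 2-bit counter with an enable input, is just an example I made up): the ROM is a dict keyed by concatenated inputs, and a registered state fed back into the ROM makes it a finite state machine.

```python
# Any combinational function is a ROM lookup: address = inputs,
# data = outputs. Here the "burned" truth table implements
# next_state = (state + enable) mod 4 for a 2-bit counter; burning a
# different table would implement a different machine.
ROM = {}
for state in range(4):
    for enable in (0, 1):
        addr = (state << 1) | enable          # concatenate input bits
        ROM[addr] = (state + enable) % 4      # next-state truth table

def step(state, enable):
    """One clock edge: registered state feeds the ROM, output is latched."""
    return ROM[(state << 1) | enable]

s = 0
for en in [1, 1, 0, 1, 1]:
    s = step(s, en)
print(s)  # four enable pulses, mod 4, wraps back to 0
```

The `step` feedback loop is exactly the ROM-plus-register structure described above: combinational logic in the table, sequential behavior from the stored state.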
My current project is AR glasses. An FPGA is decoding a displayport signal and driving the display.
Because writing RTL is much more fun.
And if I happen to need a little CPU for some generic control operations, I just add a soft core to my design.
https://www.ecmwf.int/sites/default/files/gpsro_lecture_2015... is an overview.
There's rumors that a couple of the big cloud vendors use them on network cards for SDN.
Integration with non-standard peripherals like Ku-band radios on a satellite.
No rumours needed, the papers are out there in the open.
It's a great product.
There's definitely a quality gradient from mainstream OS development tools down to FPGA tools.