"Most importantly, we have to change the culture of hardware design. Today, we don’t have open sharing … "
This, to the 100th power.
The culture in the EDA industry is stuck in the 1950s when it comes to collaboration and sharing; it's very frustrating for newcomers and people who want to learn the trade.
As was pointed out by someone in another hardware-related HN thread, what can you expect from an industry that is still stuck calling a component "Intellectual Property"?
The un-sharing is built into the very names used to describe things.
It's not just the EDA industry, it's almost all of hardware including most of the embedded software people. Yeah, I get it - I too am sometimes stuck using some old proprietary compiler with a C standard older than some of the people I work with - but come on. On my last job I used a Qualcomm radio and they ended up giving me close to a thousand zip files of driver revisions going back a decade because their build process kept screwing up the output source code. All it took was running an open source static analysis tool and 200 man-hours of junior developer time to fix the root causes of 90% of their bugs - for a product that has made billions of dollars (and I'm not talking about the GSM/4G chips with crazy standards that require tons of real R&D).
You read that right. Their build system outputs source code, generated from another C code base, using macros to feature-gate bug fixes depending on who the customer is. The account managers would send a list of bugs that a given client had experienced, and the back-office engineers would make a build that fixed only those bugs.
Forget about collaboration and sharing. They haven't even figured out the basic business processes that many software engineers take for granted.
This decision was likely not made to squeeze out more money, but because of bad quality. If any bug fix might introduce two new issues, you don't deliver the fix if it isn't needed. The whole "if it ain't broke, don't fix it" saying has some of its roots in that environment, maybe even all.
This approach is wrong on many levels, but fixing the real cause is often a bit more work than "200 man-hours of junior developer time". In the meantime, you have to deliver to customers.
I think a reasonable explanation may be that other customers have possibly come to unknowingly rely on some of the bugs. Shipping a fix for all the bugs could actually break their code!
Add in qualification of specific hardware inside specific operating conditions.
The "move fast and break stuff" crowd forgets there's a market for "move slow and rarely break stuff" too.
So if you have qualified hardware w/ a certain driver version, you don't just ship a bunch of code fixes to a customer, you ship the bare minimum change set required to fix any specific issues they experience. Because you're sure as hell not recertifying every part revision with every driver version.
I am definitely not one of those "move fast and break stuff" people, especially since I often have only a few months to design something that is expected to be in service until oxidation is a legitimate concern. My personal ventures are largely web based so I've also become rather too personally acquainted with the other extreme - the madness that is node_modules/, implicit *.d.ts imports, and browser impedance mismatches.
The problem in the hardware industry is far more insidious. Unlike in software dev, there is no concept of "full stack" or "devops" in manufacturing. The entire field is so hyperspecialized that you've got people dedicated to tuning reflow temperatures to ±5 °C depending on humidity and local weather. No one person can have a proper big-picture view of the situation, so the entire industry is dominated by silos, each with its own micro-incentives and employees competing with each other for recognition.
That’s absolutely true. Once you’ve delivered hardware with certain bugs that can be worked around by firmware/software, that often means the firmware/software teams don’t need or even want those bugs fixed. Fixing the bug means they’d have to go back and change their workarounds, which not only costs time/money, but adds risk that there may be unintended consequences and new issues.
Yep, there is a method to the madness. Something complex like making an electronic product is bound to have tons of pitfalls and most of these behaviors have rational (albeit twisted by economic incentive) reasons for why the things are the way they are.
We were, however, an established client of Qualcomm working on a new design, so we explicitly asked for all bug fixes, and they refused except for the ones we could name. We got "lucky" in that no other client had needed the two bug fixes we did at the same time, and when the build system broke, the one guy responsible for it was MIA, so we got the whole source dump (they didn't even test their fixes together until a client found problems).
I've used other Qualcomm products with different engineering support teams that are much better at source control, testing, and even devops but it's a very rare sight within the industry.
Seems very similar to how restaurants and grocery stores throw food out instead of giving it to shelters and homeless people. It’s a common corporate policy in the US. The only justification seems to be that if they gave it away, they might miss out on some potential sales.
I think there's also liability issues, like if they give away food and someone gets salmonella poisoning or has an allergic reaction or something then they could get sued. It sucks but it's not completely arbitrary.
In addition to possible bad press, it’s also common for people to not know the law. And as I read news stories asking if the reader knew the difference between “best before”, “sell by”, and “use by”, I would also expect some to destroy the produce out of a misguided but kind desire to avoid harming those who they could legally and effectively help.
Throwing away food is also a common practice at production facilities, to keep prices stable. If the supply of a resource grows too large relative to demand, the price drops, and we aren't at the "pick an apple from a tree and sell it" level anymore: if the price drops too far, entire production or distribution companies can go bankrupt.
I designed the ABEL language back in the 80's for compiling designs targeted at programmable logic arrays and gate arrays. It was very successful, but it died after a decade or so.
It'd probably be around today and up to date if it was open source. A shame it isn't. I don't even know who owns the rights to it these days, or if whoever owns it even knows they have the rights to it, due to spinoffs and mergers.
I think you'd be surprised. I went to Caltech (graduated 2014), which is a fairly well known university for their Electrical Engineering program, and I learned ABEL in my Sophomore/Junior year. My instructor, an admittedly old school hardware engineer, was in love with the language and had it as part of our upper level digital design curriculum for a few labs. FWIW, I think it was super intuitive and a hugely valuable learning tool. I suppose that doesn't mean it isn't "dead" for professional purposes, though.
Walter - it appears Xilinx is the current copyright holder, and ABEL was last supported in the Xilinx 10.1 ISE toolset released circa 2008 (the current release is 14.7). An introductory guide can still be found here: https://bit.ly/2NfkLWq
> It was very successful, but it died after a decade or so.
Perhaps not entirely. We had one lab session dedicated to it during my junior year in college. That was ten years ago but apparently they haven't changed that[0] (course description in english at the bottom of the page).
Xilinx supported their flavor until ISE 10 which is still available to download. They also had a code converter to the HDLs so you could still theoretically target the latest FPGAs using ABEL with some scripting to orchestrate the mixed tooling.
Holy shit, around 2005/6 I took an EE class in programmable logic that used ABEL for the labs. I couldn't find any information on it anywhere and all we had to learn the language was an old single-page photocopy of an example that had been re-copied so many times it was barely readable. And the ends of the lines were cut off.
Needless to say, the ABEL projects were frustrating...indeed a shame that it's not open source.
"The ABEL concept and original compiler were created by Russell de Pina of Data I/O's Applied Research Group in 1981." This is false. I don't know what de Pina did, but ABEL was developed from scratch by the 7 member team listed in Wikipedia, and the grammar and semantics were designed by myself.
> I have an original manual around here somewhere, I wonder if anyone would shoot me if I scanned it and made it available :-/
Is there some "copyright" notice printed on it? Before 1989 a printed notice was apparently "required", and if one wasn't printed, it apparently matters whether "the author made diligent attempts to correct the situation":
"The exception is for materials put to work under the “fair use rule.” This rule recognizes that society can often benefit from the unauthorized use of copyrighted materials when the purpose of the use serves the ends of scholarship, education or an informed public. For example, scholars must be free to quote from their research resources in order to comment on the material. To strike a balance between the needs of a public to be well-informed and the rights of copyright owners to profit from their creativity, Congress passed a law authorizing the use of copyrighted materials in certain circumstances deemed to be “fair” — even if the copyright owner doesn’t give permission."
It's not simple, but maybe an interesting starting point...
Maybe, if it is not a product actually being sold or even used as such anymore, there's a reasonable chance that the copyright holder wouldn't be interested in enforcing protection of the "historical material"?
>> "Most importantly, we have to change the culture of hardware design. Today, we don’t have open sharing … "
I'll have popcorn ready for the eventuality where IP blocks are widely available under GPL type of FOSS licenses and Intel|AMD|ARM|TI|... is eventually found to include one or more of those open sourced blocks with incompatible license in their chips.
The exact same arguments were made ad nauseam by old-timers in the software industry in the 80's and 90's, when they clamored against open source as anti-capitalist and un-American. Not sure how or why H/W is different.
I was thinking this mostly from the standpoint of the viral nature of GPL-style FOSS licenses. E.g., somehow, somewhere, someone manages to inject a GPL-licensed IP block into a commercial CPU, after which, with my layman understanding, the chip manufacturer now owes everyone who happens to have purchased one of those chips, under the terms of the GPL, the full VHDL, Verilog, etc. source code of the entire chip.
In software you can always replace a library that had incompatible license with another. Not sure how this would work with a chip.
As for attempts to modernize the EDA industry, I welcome that with open arms. And if in the process we end up open-sourcing a lot of current IP blocks (think of USB, HDMI, etc. designs) - all the better.
Different level of liability. When GPL missteps are discovered in software, the offender can usually just re-release the software without the GPL code and move on. In hardware…it's soldered into a bunch of devices all over the place. Consider Intel taking a $475 million hit for the FDIV bug (requiring a hardware replacement) vs the invisible bugfixing through microcode (i.e., software) they do now.
Not sure what cores will be needed for you to break out the popcorn. But there are actually quite a few cores available under open and free licenses.
First off, the RISC-V community is based on the open ISA. There are several open implementations of the ISA, and the RISC-V community is making good headway in developing open tools, peripheral cores, etc.
An older project is OpenCores. OpenCores has been quite tightly tied to the OpenRISC CPU core and the Wishbone family of on-chip interconnect solutions, which have been used in many FPGAs and ASICs.
Then you have projects like Cryptech, which is developing a complete, totally open Hardware Security Module capable of doing certificate signing, OpenDNSSEC signing, etc. The Cryptech Alpha design is open from the PCB to the FPGA cores and SW, including PKCS#11 handling. The project has amassed quite a few cores. The PCB design is available in KiCad. (Disclaimer: I'm part of the Cryptech core team, doing a lot of the FPGA design work.)
Speaking of tools like KiCad, there are actually quite a few open tools for HW design. For simulation there are Icarus Verilog, Verilator, and cver, for example. They might not be as fast as VCS by Synopsys, but they do work. I use them daily.
For synthesis and P&R, the situation is worse. For implementation on Altera and Xilinx devices you currently have to use the cost-free tools from the vendors. But there is ongoing work to reverse engineer Xilinx Spartan devices. I don't know its current state, though.
What has been reverse engineered, though, are the iCE40 FPGA devices from Lattice. And for these you can use the open tool Yosys by Clifford Wolf (also mentioned below by someone else).
The sha256 and aes cores have been used in quite a few FPGA and ASIC designs. Right now I'm working on completing cores for the BLAKE2b and BLAKE2s hash functions.
I agree that we in the HW community are waaay behind the SW community in terms of open tools and libraries (i.e., cores). But it is not totally rotten, and it is getting better. RISC-V is really exciting to me.
There have been some attempts. E.g., the MIT project called sirus, which used Python 2.5 as a DSL to describe high-level components you could combine and reuse, then process to generate SystemC or Verilog.
Unfortunately, while the tool is pretty nice, it never saw major adoption (Qualcomm has some tool using it internally, and a few others do too), and we haven't seen the idea of making reusable libs and components flourish.
Somebody would need to find this project and port it to Python 3. With current tooling, it would make writing code in it really nice and ease the creation of reusable components.
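To make the idea concrete, here's a toy sketch of what "Python as an HDL front end" looks like. This is not the actual sirus API; all names here are invented. Components are ordinary Python objects you can compose and reuse, and the "compile" step just renders Verilog text:

```python
# Toy sketch of a Python DSL for hardware (invented API, not sirus):
# components are plain Python objects, and to_verilog() renders them.

class And2:
    """Two-input AND gate as a reusable component."""
    def __init__(self, a, b, y):
        self.a, self.b, self.y = a, b, y

    def to_verilog(self):
        return f"assign {self.y} = {self.a} & {self.b};"

class Module:
    """Composes components and renders itself as a Verilog module."""
    def __init__(self, name, inputs, outputs, wires=()):
        self.name, self.inputs, self.outputs = name, inputs, outputs
        self.wires = list(wires)
        self.parts = []

    def add(self, part):
        self.parts.append(part)

    def to_verilog(self):
        ports = ", ".join([f"input {i}" for i in self.inputs] +
                          [f"output {o}" for o in self.outputs])
        lines = [f"  wire {w};" for w in self.wires]
        lines += ["  " + p.to_verilog() for p in self.parts]
        return f"module {self.name}({ports});\n" + "\n".join(lines) + "\nendmodule"

# Reuse And2 twice to build a three-input AND.
m = Module("and3", ["a", "b", "c"], ["y"], wires=["ab"])
m.add(And2("a", "b", "ab"))
m.add(And2("ab", "c", "y"))
print(m.to_verilog())
```

The payoff of this style is that "libraries of components" become plain Python packages: you get functions, loops, and parameterization for free instead of reinventing them in a generator language.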
Every CAD system I know of supports ways of grouping circuits into modules and libraries for multiple instantiation. And those libraries are distributable.
I wonder how a system would turn out in which all electronics/software are forced to have both their diagrams/schematics and code published, but in exchange this is copyrighted for like ~7 years or so.