
Tech’s Next Revolution Might Be Open Source Semiconductors - jseliger
https://www.bloomberg.com/news/articles/2020-01-22/open-source-transformed-software-the-chip-industry-is-next
======
pepijndevos
Open source semiconductors is such a broad area, and sadly only a tiny speck of
it is open source: the HDL code that describes the digital circuit.

Everything else is _very_ experimental at best. These days the FOSS FPGA tools
are finally getting some traction, with Yosys and Nextpnr. But AsicOne tried
to make an ASIC with open source, and faced endless troubles.

For ASIC there is basically QFlow, which is quite old but has been used
successfully to tape out chips in the past, and there is OpenROAD, which is
very new, experimental, and ambitious. There are still major gaps in these
tools, so in the end you inevitably have to sign an NDA and use proprietary
tools and libraries.

And that's just talking about DIGITAL semiconductors where you compile HDL to
pretty much generate the transistors from foundry cell libraries. So you have
to sign an NDA to get the cell library, but you can at least release your
code.
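The compile-to-cells step can be sketched in miniature: synthesis rewrites generic gates into whatever cells the (NDA'd) foundry library provides and tallies their cost. A toy Python illustration; the two-cell library and its area numbers are entirely invented (real tools like Yosys do this over full netlists):

```python
# Toy "technology mapping": rewrite generic gates into a foundry cell
# library. The cell names and relative areas below are made up for
# illustration; a real library is the thing hidden behind the NDA.

CELL_LIB = {            # hypothetical foundry library
    "NAND2": 1.0,       # cell name -> relative area
    "INV":   0.5,
}

def map_to_cells(gate):
    """Express a generic gate using only NAND2 and INV cells."""
    rules = {
        "NOT":  ["INV"],
        "NAND": ["NAND2"],
        "AND":  ["NAND2", "INV"],          # AND = NAND followed by INV
        "OR":   ["INV", "INV", "NAND2"],   # OR  = NAND of inverted inputs
    }
    return rules[gate]

def total_area(netlist):
    """Sum the library area of every cell the netlist maps onto."""
    return sum(CELL_LIB[c] for g in netlist for c in map_to_cells(g))

netlist = ["AND", "OR", "NOT"]
print(total_area(netlist))  # 1.5 + 2.0 + 0.5 = 4.0
```

The point of the sketch: your HDL (the `netlist` here) is yours to release, but the `CELL_LIB` contents are the foundry's.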

For analog chips, you can't do anything. An analog design highly depends on
the parameters of the transistors you use, so before you even BEGIN designing,
you have to sign an NDA to get the transistor models and you can NEVER open
source an analog design.
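To see why the models matter so much: even the simplest hand-analysis model makes the drain current a direct function of foundry-measured parameters. A first-order square-law sketch, where the parameter values are textbook-style placeholders rather than any foundry's numbers:

```python
# First-order (square-law) MOSFET saturation current:
#   Id = 0.5 * k' * (W/L) * (Vgs - Vth)^2
# k' and Vth come from the foundry's measured transistor models --
# exactly the numbers that sit behind an NDA. Values here are invented.

def drain_current(w_over_l, vgs, vth, k_prime):
    """Saturation-region drain current in amps (square-law model)."""
    if vgs <= vth:
        return 0.0                      # transistor is off
    return 0.5 * k_prime * w_over_l * (vgs - vth) ** 2

# The same circuit, biased identically, behaves differently depending
# on the process parameters you were given:
for vth in (0.4, 0.5):                  # hypothetical threshold voltages
    print(drain_current(w_over_l=10, vgs=1.0, vth=vth, k_prime=200e-6))
```

A bias network sized for one `vth` is simply wrong for the other, which is why analog design can't even start without the models.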

The small dot of light at the end of the tunnel is projects like Minimal Fab,
which offer more accessible fabrication lines with open transistor models.

The crazy thing is that back in the day there were lambda rules: open rules
anyone could use to design and model with. But with sub-micron devices these
scalable rules no longer scale, so fabs started producing secret models for
their specific processes.
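The appeal of lambda rules was exactly that they were a pure scaling abstraction: every geometric constraint is an integer multiple of a single parameter λ, so one open rule deck serves any process just by changing λ. A minimal sketch (the rule multiples loosely follow the classic Mead-Conway style, simplified):

```python
# Lambda design rules: every geometry constraint is a multiple of one
# scale factor, lambda. Porting to a smaller process just means using
# a smaller lambda. Rule multiples below are simplified illustrations.

LAMBDA_RULES = {
    "min_poly_width":     2,   # in units of lambda
    "min_metal_width":    3,
    "min_metal_spacing":  3,
    "poly_gate_overhang": 2,
}

def rules_in_nm(lam_nm):
    """Instantiate the scalable rule deck for a process with lambda = lam_nm."""
    return {name: mult * lam_nm for name, mult in LAMBDA_RULES.items()}

# One open rule deck, two different processes:
print(rules_in_nm(600))   # ~1.2 um era (lambda = 600 nm)
print(rules_in_nm(250))   # ~0.5 um era (lambda = 250 nm)
```

Sub-micron effects broke the assumption that everything shrinks by the same factor, which is what pushed fabs toward per-process secret decks.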

I'm hopeful that after FPGA, and digital ASIC, analog will be next to be
revolutionized.

~~~
jleahy
> And that's just talking about DIGITAL semiconductors where you compile HDL
> to pretty much generate the transistors from foundry cell libraries. So you
> have to sign an NDA to get the cell library, but you can at least release
> your code.

Yes you can release your code, but you can't release your netlist or GDS-II.
There's no guarantee that somebody else will be able to take the same HDL and
close timing, even with the same foundry libraries (say if they are using a
different tool, or different options). You'll also need things like clock-
gating cells, memories, IOs (at a minimum) and those are foundry specific, so
those would need to be abstracted out in some way.
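The timing-closure point can be made concrete with a toy static-timing check: the same path either fits in the clock period or it doesn't, depending entirely on the per-cell delays, which vary between libraries, corners, and tool settings. All delay numbers below are invented:

```python
# Toy static timing check: a register-to-register path closes timing
# if the sum of its cell delays fits in the clock period. Real tools
# model load, slew, and corners; these picosecond values are invented.

def path_delay(path, delays_ps):
    """Total delay of a path, given a per-cell delay table."""
    return sum(delays_ps[cell] for cell in path)

def closes_timing(path, delays_ps, clock_ps):
    """True if the path's delay fits within one clock period."""
    return path_delay(path, delays_ps) <= clock_ps

path = ["DFF_CLK_TO_Q", "NAND2", "NAND2", "INV", "DFF_SETUP"]

# Two hypothetical cell libraries for the same logical path:
lib_a = {"DFF_CLK_TO_Q": 120, "NAND2": 40, "INV": 25, "DFF_SETUP": 60}
lib_b = {"DFF_CLK_TO_Q": 150, "NAND2": 55, "INV": 30, "DFF_SETUP": 70}

clock_ps = 300  # arbitrary target period
print(closes_timing(path, lib_a, clock_ps))  # True:  285 ps fits
print(closes_timing(path, lib_b, clock_ps))  # False: 360 ps does not
```

Same HDL, same netlist shape, different library: one closes, one doesn't. That's the reproducibility gap the parent describes.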

> For analog chips, you can't do anything. An analog design highly depends on
> the parameters of the transistors you use, so before you even BEGIN
> designing, you have to sign an NDA to get the transistor models and you can
> NEVER open source an analog design.

Now this is where I disagree. Sure you can't open source your analog GDS-II,
but maybe that's not the way to go. In my opinion what you want to do is build
a foundry independent PDK for a generic 28nm, 40nm or whatever node using PTM
models. A well designed analog circuit needs to be relatively independent of
specifics, otherwise it's not going to work across all corners (this is more
true for modern nodes than the kind of nodes the old textbooks talk about) and
it'll be difficult to port to another process. So there's a good chance that
analog circuits built for 'generic 28nm' or 'generic 40nm' could be ported to
any foundry's process (of course the PDK needs to be well designed). Yes you
won't be able to push things to the limit as the DRC will be wider, but analog
rarely needs to go to the limit. You could probably take the same approach for
digital, but that's a lot more open source stuff to build.
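The "work across all corners" requirement is something you can check mechanically: sweep the design over the PDK's corner parameter sets and require margin at every one. A minimal sketch using an RC time constant as a stand-in metric; the corner multipliers are invented, not taken from any PDK:

```python
# Corner-check sketch: a design is only portable if it meets spec at
# every process corner, not just at typical. The corner multipliers
# below are invented; a generic (e.g. PTM-based) PDK would supply them.

CORNERS = {               # multipliers on nominal R and C
    "ss": (1.20, 1.10),   # slow-slow: high resistance, high capacitance
    "tt": (1.00, 1.00),   # typical
    "ff": (0.85, 0.92),   # fast-fast
}

def tau_ps(r_nom_kohm, c_nom_ff, corner):
    """RC time constant in picoseconds at a given corner (kOhm * fF = ps)."""
    rm, cm = CORNERS[corner]
    return (r_nom_kohm * rm) * (c_nom_ff * cm)

def meets_spec_everywhere(r_nom, c_nom, max_tau_ps):
    """True only if the spec holds with margin at all corners."""
    return all(tau_ps(r_nom, c_nom, c) <= max_tau_ps for c in CORNERS)

print(meets_spec_everywhere(2.0, 50.0, max_tau_ps=140))  # True
```

A design that only passes at "tt" here is the analog equivalent of code that only compiles on one machine; the wider generic-PDK corners force the robustness that makes porting plausible.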

Check out OpenRAM and FreePDK45 for academic projects taking this approach.
Unfortunately FreePDK45 is only available to those with an academic email
(despite being called 'open source'), which makes me very sad.

~~~
pepijndevos
Yeah, I think this is an interesting approach. But the FreePDK45 slides
mention that it's not designed for manufacturing, and there does not appear to
be an easy upgrade path. So you could design a chip with FreePDK, release the
design, and then have to redo the whole thing in a vendor PDK.

I talked to someone who worked on AsicOne, and he said that even if you make
your own PDK and draw your own transistors and everything, you'll still have
to sign an NDA to do the sign-off and whatnot. I'm not intimately familiar
with the whole process myself, but from what I understand it is basically
impossible to have an open source analog design that you can actually
manufacture. (Sure, you can make a theoretical toy thing, but if you can't
manufacture it, who cares?)

~~~
jleahy
I'm quite familiar with the process and I believe it's entirely possible.

You will need to run foundry DRC decks, but the company you're taping out
through will do this for you (I presume that you're not big enough to deal
with TSMC directly). This is because a design that fails DRC could actually
break other people's chips if you're sharing a wafer.

Of course, if you really want to know that it'll work, you need to also run
foundry LVS and simulate corners with foundry SPICE models and foundry PEX.
But if you're gutsy you could skip this, if you believe you've put enough
margin into your PDK corners.

Certainly there is zero need to redraw your transistors. Transistors are
transistors: a few layers (od, poly, contact, implant, ...), no magic sauce.
The foundry wants a GDS file with overlapping rectangles (of some minimum
size), nothing more.
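That "overlapping rectangles" view is quite literal: a layout is rectangles on named layers, and a DRC deck is a set of geometric checks over them. A minimal sketch of a transistor's layers plus a single minimum-width rule (layer names echo the comment above; all dimensions and rule values are invented):

```python
# A transistor in layout terms: rectangles on a handful of layers.
# A DRC deck is rules over those rectangles; here, just minimum width.
# All dimensions (nm) and rule values are illustrative, not a real deck.

transistor = [
    # (layer, x0, y0, x1, y1)
    ("od",        0,   0, 500, 200),   # active (diffusion) area
    ("poly",    200, -50, 300, 250),   # gate crossing the active area
    ("contact",  40,  60, 120, 140),   # source contact
    ("contact", 380,  60, 460, 140),   # drain contact
]

MIN_WIDTH = {"od": 100, "poly": 90, "contact": 80}  # invented rules

def drc_min_width(rects):
    """Return violations: rectangles narrower than their layer's minimum."""
    errors = []
    for layer, x0, y0, x1, y1 in rects:
        width = min(x1 - x0, y1 - y0)
        if width < MIN_WIDTH[layer]:
            errors.append((layer, width))
    return errors

print(drc_min_width(transistor))  # [] -> clean
```

A real foundry deck has thousands of such rules (spacing, enclosure, density), which is why third-party DRC sign-off is the step that still drags in an NDA.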

------
Trisell
I think the bigger thing for open source chips is going to be the expiration
of a ton of the x86 patents next year. That is going to free up a whole host
of people to take a look at trying to build a better open x86 processor. Maybe
then we can finally get to having a truly open source server from chip through
software.

~~~
Eikon
What’s the point of open source for consumers, except maybe for education,
when you can’t produce something from the source yourself nor be sure the
product you buy is made from the source?

It’s _very_ cost prohibitive to run a batch through a semiconductor fab. You
are not going to request one-offs unless you have a few million to spare.

It’s not like running make after cloning some sources.

~~~
sliken
Well, the barriers are pretty low today. Even fairly small companies like Axis
(they make security cameras) or Ubiquiti (routers and access points) can work
with CPU vendors to license their tech and get a custom CPU. With open source
IP the barrier will be even lower, maybe even within range of a well-funded
Kickstarter.

Maybe a killer home router with a 100% open hardware+software platform that
can route 8 ports at 5 Gbit/port at line speed would be more trusted than
random commodity hardware.

Or something to handle say 8 security cameras and use machine learning on all
8 streams, not just for motion detection but also to identify what
person/object is in each stream.

I'd certainly pay a premium for smart devices in my home that I knew I could
trust, that weren't spying on me, didn't require cloud connections, used open
APIs/standards, and wouldn't die the next time the company dies, gets bored,
gets greedy, gets purchased, etc.

~~~
tomasato
I’m curious about the Ubiquiti custom CPU, can you share a link to more info?
Thanks

~~~
mikepurvis
"Custom CPU" seems like a bit of a stretch. They have ASIC people on staff (e.g.
[https://rocketreach.co/ya-chau-yang-
email_61973123](https://rocketreach.co/ya-chau-yang-email_61973123)), but
based on the fact that you can flash basically all their devices with OpenWRT
(including those disc-shaped APs), I think the "ASIC" is just a conventional
ARM or MIPS core with some peripherals geared toward hardware acceleration of
common network tasks— switch routing, vlan tagging, etc.

This page talks about the specific features available for hardware offloading:

[https://help.ubnt.com/hc/en-
us/articles/115006567467-EdgeRou...](https://help.ubnt.com/hc/en-
us/articles/115006567467-EdgeRouter-Hardware-Offloading)

------
wbhart
Is the real barrier to CPU production the open sourcing of the logic design,
or is it everything else: research into chip production and node design
(including hundreds or potentially thousands of process steps), maximizing
yields, segmentation of the market and other marketing issues, defining
standards, chipset design and integration, validation, the generally
prohibitive cost of building fabs and actually producing the chips, and
getting sufficient capital to make inroads into the marketplace, e.g. with
OEMs?

The idea that open sourcing the logic is going to make other companies
competitive with a financial behemoth like Intel is really a stretch.

~~~
ci5er
Thermal packaging!

------
throwGuardian
If silicon/chip IP becomes free, and the only major costs to tape-out are an
integration team and fab-related work, then China, not Silicon Valley, is the
real winner.

~~~
DivisionSol
All the more reason for companies to focus on true innovation instead of
milking IP licensing.

All the more reason to make it profitable for fabs to exist within the US.

~~~
javajosh
_> All the more reason to make it profitable for fabs to exist within the
US._

I'm particularly interested in this space. Anyone more familiar with it care
to comment? Is there a push for more US fab capacity? Any startups? Any
political push to change laws to make it more favorable to start one?

~~~
LatteLazy
I can't answer your questions but may I piggy-back?

What stops Fab Facilities existing in "western" countries at the moment?

Someone told me a major reason for the move was that the chemicals used in
top-tier facilities are basically banned in the West. Not technically banned,
but considered so dangerous (they're all carcinogenic) that the costs (health
and safety, insurance, compensation) make them uneconomic.

Is that right?

Does it matter whether fab facilities are near final manufacturing locations?
Will Apple/Foxconn buy chips from a San Francisco shop if they then have to be
shipped to China for inclusion in the device? Since most devices are put
together in SE Asia, the delay might be a killer...

------
Aardwolf
Open source, or open drain? :p

~~~
blululu
I think that depends on the gate.

------
blackrock
When will optical CPUs become a reality?

Instead of pushing electrons with a voltage difference to signify 0 or 1, a
light wave of red or blue could be used.

Anyone here a photonics expert?

~~~
beisner
[https://www.luminouscomputing.com/](https://www.luminouscomputing.com/)

There's at least some cool work going on in the GPU/TPU area.

~~~
jlokier
Thanks for that link, I hadn't heard of Luminous.

------
speedplane
This trend (if it's actually happening) is a symptom of the death of Moore's
law. When Moore's law was operating in full swing, it would rarely make sense
to build your own chip using open source. In the time that you could design
and build your chip, general processors would have doubled in speed and you'd
be better off waiting for the general processor.
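The "better off waiting" logic is simple compounding: under an assumed performance-doubling period, a multi-year design cycle hands off-the-shelf parts a large head start. A back-of-envelope sketch (the doubling periods are illustrative, not measured):

```python
# Back-of-envelope: how much faster general-purpose parts get during a
# custom chip's design cycle, under an assumed doubling period for
# performance. Both numbers below are illustrative.

def general_speedup(design_months, doubling_months=18):
    """Performance multiple gained by off-the-shelf parts while you design."""
    return 2 ** (design_months / doubling_months)

# During Moore's-law years, a 24-month custom design had to beat ~2.5x:
print(round(general_speedup(24), 2))
# If doubling stretches to, say, 60 months, the bar drops sharply:
print(round(general_speedup(24, doubling_months=60), 2))
```

The lower that bar falls, the more custom (and open source) silicon projects clear it, which is the mechanism behind the trend the comment describes.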

Here are a few predictions, some of which have already occurred, that may
result due to the death of Moore's law:

 _Already Happening:_

\- More custom chips (squeezing the last bit of performance)

\- More reliance on the cloud, to push off processing power where there are
more economies of scale

\- The rise of traditionally "second tier" processor manufacturers (e.g., AMD,
ARM) to be head-to-head with traditional leaders (e.g., Intel).

\- More chip manufacturer R&D dollars spent per dollar of revenue.

 _Starting to Happen_

\- China and developing countries catching up with chip technology (when the
leaders are no longer growing exponentially, it's easier to catch up)

\- Governments imposing their will on chip developers (when there is less
competition over performance, other factors like trust and national origin
will start mattering)

\- Trade secret theft, i.e., in the past if you stole Intel's designs, you
would get one good chip, but that theft would be obsolete in 18 months. Now,
it gives you a much longer advantage.

\- A societal shift from utility to branding, i.e., as all goods become
equal, branding is the main differentiator.

\- Living standards in developing countries catching up to Western and U.S.
standards.

 _Further Afield:_

\- No significant technological improvements for decades.

\- Stagnant per-worker/capita productivity.

\- Economic growth becoming far more tied to population than individual
productivity.

\- Economies/governments fighting to increase their population (e.g., through
legal immigration or by force).

\- Government's power and control becoming based more on population rather
than ideals or innovation (e.g., China).

\- More monopolies ... in a dynamic and innovative society, a small smart
company can defeat a larger slower one. In a stagnant society, that won't work
and the competitive advantage can only be obtained by consolidation and
economies of scale.

\- Social unrest ... in an exponentially growing society, there is room for
every generation to become wealthier than their parents, but in a flat
society, on average half will become richer and half will become poorer.

~~~
wuschel
I would argue there is more to an economy than processor speeds or the price
of transistors: hardware security, software security, data integration,
expansion of communication capabilities, not to mention other industries
(biotech, transportation, energy), etc. Although I must admit that I do not
know how much the increase in modern hardware capabilities contributes to the
productivity of the economy. I disagree with your statement regarding
significant future discoveries, as that is something that is very hard to
predict.

If CPUs and computer hardware become a commodity, a new era of tinkering
would certainly be upon us. Perhaps problems more pressing than the
computational power of CPUs and transistor cost will then be attacked.

~~~
speedplane
> there is more to an economy than processor speeds ... I frankly do not know
> how much increase in modern hardware capabilities contributes to the
> productivity of the economy.

In industrial times, you would be right. However since the "information
revolution", a huge amount of productivity gains can be directly attributable
to transistors. Sure it started with improving simple calculators, but then
came spreadsheets, industrial CAD models, instantaneous global communication,
remote teams, global branding and a whole lot more.

In fact, it's pretty fair to say that nearly every standard-of-living
improvement in Western countries since roughly 1980 can be attributed to the
transistor.

~~~
wuschel
> In fact, it's pretty fair to say that nearly every standard-of-living
> improvement in Western countries since roughly 1980 can be attributed to
> the transistor.

Very interesting statement.

Out of curiosity, is there any way you could back it up? E.g. chemical and
biological research was not very digital for a long time after the 1980s. Of
course, digitization was transformative at the end of the '90s and the
beginning of the 2000s, and it seems to me that it is just now that we are in
that exponential growth of exploiting IC technology.

I quickly checked your impressive background, and I guess you have seen a lot
of the IC sector in terms of innovation and technology. I believe that you are
making a sound call here.

In the end, everything is intertwined. Advances in materials science feed
back into IC design and production, and vice versa.

------
scarejunba
Hygon knows how to make Zen, dudes. This cat is out of the bag and running for
the door.

------
LargoLasskhyfv
Just a different pattern on the waffle iron. No mention of things like [1]
[https://www.minimalfab.com/en/](https://www.minimalfab.com/en/) or at least
[2] [https://www.wafertrain.com/](https://www.wafertrain.com/)

But hey, valley of silly cons, who cares?

