Semiconductor Startups – Are they back? (semiwiki.com)
71 points by breck 10 days ago | hide | past | favorite | 54 comments





While my experience is mainly in digital hardware, my tinfoil hat theory as to why the hardware startup scene is so lackluster is that the EDA tools are expensive hot garbage. Not only are the dev tools cost-prohibitive, but so are the necessary analog components like PLLs and PHYs (which are more or less required in pretty much every modern VLSI design).

The same companies that make the EDA tools also make IP blocks that semiconductor companies purchase, so it's in their best interest to prevent an open source hardware scene from thriving. I get that making masks for a chip is expensive for a startup. But if the design, verification, and layout steps were cheaper and more streamlined, I would bet money that the fab costs would work their way down in clever and creative ways.


The EDA tools are often given almost for free to startups, as the vendors recognise startups can't start without them, and if the tools are too expensive for a fledgling company the vendors are shooting themselves in the foot. Startups get the tools, but not the support. I'm in HW chip design and use EDA tools daily, and while they are not always great, I would not be as concerned about them for a startup as I would be about getting enough money to create the first prototype, fab costs, lab equipment costs, man-years of engineering, and of course the IP.

Though they are essentially given for free, stitching everything together is still a lot of work. Getting the tools as an individual is not really possible (with the exception of Mentor, sometimes). The open source software ecosystem needed around every tool is non-existent, which means that each company ends up recreating its own scripts and flows each time it wants to design a new IP.

The tools are still kind of rubbish in the sense that many software startups' edge could literally just be an extremely good internal development flow, whereas EDA stuff seems to be written solely to get from A to B.

I haven't seen the deep end of the software as you get closer to the fab, but I'm guessing it's worse.


A HW startup will not be suddenly successful because of their EDA flow. They could, however, fail badly if their EDA is set up incorrectly: if your chip's STA flow was wrong and you have hold-time issues on silicon, you just blew 6+ months and a few million. It would be critical to have an STA expert for that reason, as well as a solid sign-off process before tapeout.
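For a sense of why a broken STA flow is so fatal: the hold check a timing tool runs on every path boils down to simple min/max-delay arithmetic. Here's a toy sketch with invented numbers (this is just the math, not any vendor's actual flow):

```python
# Toy hold-time check, mimicking the per-path arithmetic an STA tool does.
# All times in nanoseconds; the numbers are invented for illustration.

def hold_slack(clk_to_q, data_path_min, clock_skew, t_hold):
    """Hold slack = earliest data arrival - latest capture requirement.
    Negative slack means new data races through and corrupts the
    capturing flop before it has latched the old value."""
    arrival = clk_to_q + data_path_min   # earliest the data can change
    required = clock_skew + t_hold       # capture flop still needs old data
    return arrival - required

# A path with almost no logic between flops and some clock skew:
slack = hold_slack(clk_to_q=0.10, data_path_min=0.05,
                   clock_skew=0.12, t_hold=0.06)
print(f"hold slack: {slack:+.2f} ns")  # prints "hold slack: -0.03 ns"
```

The reason this particular check is unforgiving: a setup violation can often be papered over after fabrication by lowering the clock frequency, but a hold violation cannot, so a sign-off mistake here really does cost a respin.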

You need the support though to know why the crappy tools are breaking your design and be able to hack around it.

Yes and no. If a startup has experienced designers, they can work around issues. I've often done this even though I have access to AEs, as getting an issue resolved externally takes more time than a bit of debugging to find out why the tool is not doing what you want. Rarely is the issue an actual tool bug; more often it's a tool quirk with some esoteric solution. The problem with most of these EDA tools is their million configuration variables, and that they rarely do something useful out of the box. An experienced designer knows these edges and will work fine without support.

I don my tinfoil hat with you. EDA tools are intentionally opaque to maximize IP capture. I have no skin in the game, nor do I have hardware/circuit development experience... but every time I want to dabble as a hobbyist it's clear: "You're not allowed to experiment."

Starting that train now is quite difficult. Lots of inertia to overcome with pre-existing closed source tools being chosen by default at colleges and companies.


God bless Claire Wolf

While EDA tools truly are ridiculously expensive garbage, their cost is pocket change in the age of billion-dollar startups without a selling product.

It's just that in semiconductors you actually have to make a working product; it's a substantial technical undertaking, and thus a big turn-off for the aforementioned billion-dollar internet startups.


The six-figure per-seat cost of Cadence is a lot, to the point of being unobtainable even for the early stages of what would become unicorns.

It's not like giant checks are generally being written to these startups that have no users and no engineering done yet.


Isn't it the case that, at least for digital applications, many chips can be replaced with an FPGA, a CPLD, or just a good old CPU? Only recently have you seen some companies reproducing audio ICs from the 80s, but not much original work. It seems like there is a huge niche in that space, but companies probably think it is too small to be profitable.

It was sad to see what happened to Touchstone Semiconductors. They made some really excellent components with no equal.

It seems like a VC pulled out, they were forced to sell to a big company at a terrible price, and then that company discontinued all their products shortly after.

https://www.edn.com/touchstone-semiconductor-falls-prey-to-h...

https://www.eetimes.com/touchstone-semi-sells-assets-to-sili...

https://www.silabs.com/documents/public/pcns/EOL-1609091-Tou...


It's sad that 'success' in a startup sometimes means that you get a payout, then everything you worked for over years gets put in a drawer somewhere and everyone gets let go. It's some sort of Pyrrhic success.

As an aside, it's a story I've heard over here (Australia) quite a few times as well. It makes me wonder if we are capable of playing in the big leagues (Atlassian being an obvious exception).


Realistically, what happens to the IP in this case? Can the original founders (or a new party) just decide to restart with the same IP and find new VCs? Is the IP locked up forever in the vaults of BigCo? This pattern of "acqui-killing" a company seems like a smelly, sometimes malicious one. I know that FAANG does this to social startups all the time.

All of Touchstone's patents would have been transferred to Silicon Labs, and they still own them despite discontinuing all the parts. The founders would probably find it very hard to try again.

There needs to be a "movers" clause. Maybe it should be illegal to sit on IP over time without provably making progress towards said thing, or there should be some decay factor where part of your ownership reverts to the public, where it can be bought by someone else. The smartest, most willing, and most capable people should be the ones with the IP, and allowing IP to sit unused is an awful waste.

Yep. Without a contract stating otherwise,[a] the purchase of a company generally transfers all IP of said company to the purchaser. This is especially true in bankruptcy proceedings where a company only buys a bankrupt one for the IP (branding/logos, trade secrets, etc.).

[a] For example, an inventor could license their IP (the patent or whatever) to a company they create with a clause that the license can’t be sold (this has varying success from my understanding)


I'm very bullish on the entire sector. One incumbent-vs-startup story to watch in the AI accelerator space is Nvidia vs Lightmatter. If they can realize the cost savings of photonic computing, it looks like a 5-7x improvement. Nvidia's Megatron trillion-parameter language model requires astounding compute capabilities: 3000+ A100 GPUs. And while I don't see GPU dominance retreating through 2024 at least, as we get into universal translation and global parallel corpora by the end of the decade, the limits become apparent. And it probably won't be talent, design, or money that becomes the bottleneck, but the relative difficulty of working with photonic crystals compared to the low-hanging fruit of silicon that has provided such a bounteous harvest for the last 70 years.

https://github.com/NVIDIA/Megatron-LM


It depends on the definition of "semiconductor startup".

Since 1990 there have been no foundry or IDM (vertical) startups funded by Sand Hill Road VCs. ZERO. Zip. Nada.

Fabless companies have been founded exclusively by VCs. But they are NOT the current bottleneck. It's foundries that are, e.g. TSMC, Samsung, GlobalFoundries, etc. And further, ANY future innovation will NOT come from fabless companies - it will come from semiconductor HW companies or their supply chains. Basically, Silicon Valley going all out into software/internet/social media has completely obsoleted the professional ecosystem there. And most of the people who were part of the prior ecosystem have been retiring or dying or have moved on to other things, never to return.

In general, to do anything OTHER than fabless, especially since 1995 (which is about the year that semiconductor companies started RACING out of 'Silicon Valley', never to return), you've had to get startup funding from anywhere BUT VCs generally, and usually only from angel/VC investors OUTSIDE OF California.

I know because I've founded such companies. I'm a HW guy with device physics, circuits and similar training with software added on. We had to bootstrap hard and then only ever got funding from foreign companies or non-California investors. My last startup was bought by a Taiwanese company.

So I'm dubious, and I also know well that the US no longer has the professional ecosystem to support semiconductor HW plays - the US literally needs to start from scratch (like it's 1950-1960) because it's already pissed away everything that would have allowed a quick and cheap re-entry. Honestly, the better investment is to do other things, like follow where the money went (e.g. like my last company).

People grandiosely wanting this don't even know what the right questions are anymore. You would have had to have "gone to Asia" (like I did) to keep up.

The best hope right now is TSMC in Phoenix and Samsung's planned expansion (location still TBD but it's between Phoenix, Austin and Malta apparently) plus GFs Malta expansion. Most importantly: Silicon Valley will never rise again in this market.


Can somebody more experienced than me chime in and say whether Bizen [1] could be worked into a viable alternative to CMOS, at least in some niches? While CMOS has accumulated decades of improvement, much of it, w.r.t. lithography, would transfer.

[1] https://www.wafertrain.com/index.php/theprocess/what-is-bize...


This would be really great, I think innovations with GaN transistors could be something that startups could tackle and I would love to hear more about it!

But a fab is extremely capital-intensive and complex to set up; I don't think this is the realm of startups anymore.


Take the story of Bethlehem Steel: an unassailable, entrenched incumbent in steel production due to high capital costs.

Then a completely new way of producing steel came about. There was nothing iterative about it: a completely different setup that came with its downsides but had better economies of scale.

If there is a change to come, it's not by building a bigger fab that has higher capital costs. It's by making the idea of a fab entirely obsolete: finding a way to bypass all the steps involved in the current manufacturing process and producing a product that is slightly inferior in some way but doesn't require a multi-month process with capital costs measured in tens of billions of dollars. Fabless manufacturing was a great boon in its time by changing the business model around fabs, but it didn't change the underlying engineering. Somebody, ultimately, owned a fab in the chain.

The real trick will be something that bypasses the whole idea of pure crystal ingots as the core of a process with hundreds of batch steps. Something more continuous and simplified would be revolutionary. Hypothetically, imagine that instead of growing a perfect crystal and etching or doping it, you grew an imperfect crystal whose growth was controlled in a direction that got you 80% of the way to the layout of the final design. Or imperfections were embraced and utilised as a feature for randomised bias in AI cores, rather than a flaw that can't be tolerated.

All of this, hypothetical of course, but my point is even if we are at the end of this fabbing game, the real trick is creating a completely new game entirely.


I like the idea of Yokogawa's minimal fab, which removes the necessity of a clean room. It still needs plenty of fabrication steps, so you still need plenty of different machines, and the entry ticket is ~$10M.

I'm convinced that there is plenty of design space that is not explored. In particular, the fabrication process today is mainly two-dimensional, with a very high need for control.

If you relax one of these strong constraints and build the system bottom-up instead of top-down, it reduces to a search for a chemical compound which can self-assemble into an interesting structure. It should look like a biological machine, but running at computer frequency.

Plenty of nano-computing units interacting. For example, today we can already build some silicon nanorobots ( https://www.electronicdesign.com/industrial-automation/artic... ). Give each one a small memory unit, a processor, and a radio link, and you can build tons of them on imperfect crystal using a standard process. Then your processor is a 3D volume of these bots instead of a 2D surface. You will still be constrained by heat dissipation, but if you put them inside a liquid it would be less problematic, though you'll have to compensate for nano-processor Brownian drift.


Yeah, too bad Mapper was acquired by ASML. It would be interesting to see what maskless fabbing with electron-beam lithography could do for small production runs. If you get the price for that down, designing and testing integrated circuits would be attainable for many more organisations. Maybe Multibeam can get it to work.

I'm not so familiar with steel – in the Bethlehem Steel analogy, is the newcomer you're talking about Nucor?

Yep, the mini-mill tech they adopted was pretty disruptive at the time because of its economics.

Based on random googling I found this PDF:

https://www.gammabeyond.com/wp-content/uploads/2019/05/Why-C...

Page 19 talks about the history of the mini mill in steel production.


If you know where the lattice imperfections are, you can maybe go around them.

Basically all of the chip startups from the last 20 years are fabless, including the ones this article is referring to ("high" capital costs in the tens of millions vs. billions of dollars).

No

- someone who worked at a semiconductor startup


I'm at FAANG hardware, and I never know when to leave to join a startup for a big payday. In hindsight.. the Nuvia exit was pretty obvious :(

Then again, so many other failures.


As someone who spent five years in Facebook doing hardware after an acquisition, my advice is to save enough to be comfortable and then apply your passion and knowledge to a startup doing something you believe is worthwhile. Worst case if it fails, you get to move the needle a bit in a space, make useful connections, learn a bunch, and then bounce back to a FAANG job.

FRL?

Exit before validation is so specious

Yeah, it's an expensive acquihire.

Expensive only means anything if you are not made of money: these megacorps are literally made of money. What's $1.4B to keep down a possible competitor? To keep down someone who may have made their chips generally usable? Who may have upstreamed support?

It's just unsatisfying to the world to have absolutely no new evidence, no data points, no benchmarks in capitalism, no way to figure out what to expect of a product, when interesting companies are acquihired and their efforts to do good pulled into the spider web of megacorp-ism. The world neither learns nor understands anything after Nuvia got bought. Whatever comes out will be a synthetic other, similar to so many other acquisitions. The backing of the larger company changes the effort itself and diminishes the competitiveness of the offering.

And the intent to support the thing well, the desire to form a solid community: dead. The new corporate largesse defines the environment for whatever might ship. There will be no radical new upstreamings, no new mainlinings. It will be business as usual, alas, most probably: closed, highly proprietary vendor drops. An anti-free, anti-accessible system of chipmaking, unsupportability writ large, reasserting itself over what might have been.

It seemed the world might get a sip of that sweet ambrosia, but now we all sip again, as we are permitted, from the great leaden chalices. Nothing new will be offered.


Anti-trust now (and yesterday too). I cannot believe that the DoJ ended the anti-trust suit against Qualcomm [1][2]. This is such capital evidence of how shitty chip-making is: everyone of potential value gets immediately snatched by a colossus. This system is poisoned. Why are the regulators doing nothing? Why is there no back-pressure against this system of everyone being acquired? Nothing new is ever permitted. Everything is hideously broken.

[1] https://news.ycombinator.com/item?id=26629492

[2] https://www.bloomberg.com/news/articles/2021-03-29/u-s-aband...


efabless.com

How much raw resources for semiconductors do we have left?

Silica, a precursor to silicon wafers, makes up about 10% of the Earth's crust by mass [1].

[1] https://onlinelibrary.wiley.com/doi/abs/10.1002/14356007.a23...


The raw consumption of resources is tiny: remember that semiconductors are tiny objects!

Of more concern are the energy and clean-water inputs, and the ability to cleanly dispose of used solvent. Many of the old Silicon Valley factories are now "Superfund" cleanup sites because of this.


Of all the natural resources we consume, semis would be the one I'd worry the least about. We'll likely run out of many other (more vital) things before semis make it to the top of the list.

Raw resources? You mean silicon? You mean 27% [0] of the planet's crust? Yeah, we have a bit.

[0] https://www.rsc.org/periodic-table/element/14/silicon


While the raw materials needed for silicon and silicon carbide are extremely abundant, the semiconductor industry needs many other elements that are much less abundant, e.g. gallium, germanium, arsenic, antimony or hafnium and also some that are extremely rare on Earth, e.g. indium, tellurium or gold.

The rare elements may be needed only as dopants, in very small quantities for each device, but those quantities add up to non-negligible amounts across the entire huge production of semiconductor devices.

However, the rare elements can also be the main constituents for the so-called III-V and II-VI semiconductors.

The fact that the other better semiconductors require large quantities of rare elements has been a very important reason that has prevented the replacement of silicon in a large number of applications where it is an inferior solution.

For example, the GaN transistors and the white LEDs need not only gallium but also relatively large quantities of the much less abundant indium, which might limit some time in the future the expansion of their applications.


> The fact that the other better semiconductors require large quantities of rare elements has been a very important reason that has prevented the replacement of silicon in a large number of applications where it is an inferior solution.

Maybe, but silicon is also damn good because of the adherence and dielectric nature of its oxide. Not to mention that several of the so-called superior semiconductors, even if you can get them in large quantities, are more difficult than silicon to crystallize without many defects. And silicon also conducts heat relatively well, meaning that it draws thermal energy away from hotspots within a chip better than some of the more expensive semiconductors.

Not disputing the larger point about rare metals. Just asking you to put some respect on the name of my boy Si.


You are right that silicon has many other advantages besides being abundant.

Also correct is that the price of a semiconductor material, in the form in which it enters a semiconductor plant, depends not only on the abundance of the raw material but also on how easily it can be purified and grown into crystals with very few defects.

The latter 2 operations are indeed much easier for silicon than for compound semiconductors.

Nevertheless, the choice of materials for any semiconductor device results from compromises between a very large number of properties and while silicon is good at some, it is worse at other properties, e.g. energy bandgap, breakdown electric field, velocity limits of charge carriers, electron mobility, leakage currents and others.

So there are applications for which silicon may be the best choice, even if the cost of the materials is ignored.

However, in more and more applications silicon continues to be used only because of its lower cost.

It is likely that the use of silicon for the active part of the semiconductor devices will continue to decrease and this trend will accelerate.

For example, to make faster CPUs, there are not a lot of remaining possible improvements.

Three-dimensional silicon devices are a possibility for increasing the multi-thread performance, but only if it would become possible to circulate some liquid coolant through channels in the device, to eliminate the heat.

Otherwise, the only chance is to use some other material than silicon for the active regions of the device.

Even if the active semiconductor devices would be made from other materials, it is likely that silicon crystals will continue to be used as substrates long after that, due to the 2 advantages that you have mentioned, i.e. very few crystal defects and high thermal conductivity.

For the record, I have worked for many years in a plant where silicon devices were made and I have handled thousands of silicon wafers, breaking just a few ;-( .

Therefore, I actually have a lot of respect for your boy Si !


Can that be readily converted to silicon wafers?

To be blunt, yes. It's mostly a function of how much you need: apply heat to burn off impurities and separate out by weight, suck out the liquid silicon layer, crystallize it, slice it. Boom, wafer.

Sand is used because of its high surface area, so less heat needs to be applied, but sand is ultimately just worn-down rock. Quartz, for example, is just pure oxidized silicon in a non-uniform crystal structure. And there's a lot of quartz. Don't like quartz? It's in the even more common feldspar too. We as a species will run out of water before running out of silicon.


No sand goes into silicon production directly.

You can heat up some rocks with carbon to get silicon and CO2, then use the Czochralski method, invented in 1915, to produce a single giant crystal by essentially dipping a stick into molten silicon and slowly pulling it out. Then you can slice that big crystal into wafers.

There’s a cool photo on Wikipedia of somebody just growing a crystal: https://commons.m.wikimedia.org/wiki/File:Silicon_grown_by_C...

There’s a bit of waste, but AFAIK it’s not too bad compared to other things.
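To get a feel for the scale of that slice-up step (and where the waste goes), here's a back-of-envelope sketch. The ingot length, wafer thickness, and kerf loss below are rough ballpark assumptions, not figures from any real fab:

```python
# Back-of-envelope: wafers per Czochralski ingot.
# All numbers are rough ballpark assumptions for illustration only.

ingot_length_mm = 2000    # assume a ~2 m monocrystalline ingot
wafer_thickness_um = 775  # typical 300 mm wafer thickness
kerf_loss_um = 150        # assumed material lost to the saw per cut

# Each wafer consumes its own thickness plus one saw cut of the ingot.
pitch_um = wafer_thickness_um + kerf_loss_um
wafers = (ingot_length_mm * 1000) // pitch_um
print(f"~{wafers} wafers per ingot")  # prints "~2162 wafers per ingot"
```

The waste the parent mentions shows up in the kerf term: under these assumed numbers, roughly a sixth of the crystal becomes saw dust between every pair of wafers.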


It's not the raw materials that make semis expensive; rather, it's the extensive processing they have to go through to be transformed into the final product. The biggest materials issues I'm aware of are often the 'conflict resources', which are in high demand but have large percentages of their supply mined in regions where social and/or geopolitical factors limit supply.

Yup

I am going to be that nitpicking guy.

Raw resources don't matter that much. We have a lot of oil in the ground, for example; what matters is how much human labor and energy it takes to extract it. Given enough energy and labor you can just recycle materials endlessly - you can take the CO2 in the atmosphere and turn it back into oil. The same applies to semiconductors. As others have said, the earth's crust is about 10% silica, but the real question is: how much does it cost (in energy and labor) to extract it?



