A guy in HK is making a 1um fab with his own process and open tooling.
These people are amazing and their efforts are seriously cool. Props to all of them.
For comparison, the 80386 was fabricated with a 1um process.
In case anyone else is getting slow response times loading.
More details: http://www.halfbakedmaker.org/blog/lmarv1-1
Here, I think you dropped a few zeroes.
There are plenty of similar guides to making silicon carbide LEDs. It's basically recreating Round's discovery.
The most complex parts of phones and computers (e.g., the CPU) are integrated circuits (ICs), which are (nowadays) billions of nanometer-sized transistors on top of a piece of silicon. The way these are made is arguably the most complex, high-precision manufacturing process in the world, and is usually done in multi-billion-dollar "fabs" (fabrication plants) by huge companies (e.g., Intel).
Even the most basic IC fabrication, like in the article, absolutely requires maybe ~5-10 complex tools (furnace, sputterer, etc.; $1000-$10k each at current eBay prices and very low quality, if you know how to rebuild/fix all of them) and a host of supporting equipment (fume hood for seriously dangerous chemical work, etc.). To get reasonable results, you also need to understand the device physics and then test multiple times to get the process right. I have some serious respect for Sam Zeloof of the article for getting this to work: it's at least an order of magnitude more difficult than other home manufacturing (3D printing, woodworking, welding, sewing...), even if you're already an industry expert. And his device used 6 transistors; you'd need to get the transistor manufacturing reliability up significantly to make a useful microprocessor (instead of small analog circuits), which probably starts at several thousand transistors.
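The reliability point above can be made concrete with a back-of-envelope yield calculation. This sketch assumes every transistor on a chip fails independently with the same probability (a simplification; real fabs reason in defects per unit area, and the 90% figure is just an illustrative guess at a hobbyist process):

```python
# Back-of-envelope chip yield, assuming independent, identical
# per-transistor failures (a simplification of real yield models).

def chip_yield(per_transistor_yield: float, n_transistors: int) -> float:
    """Probability that every transistor on the chip works."""
    return per_transistor_yield ** n_transistors

# A hobbyist process where 90% of transistors come out working:
print(chip_yield(0.90, 6))      # 6-transistor circuit: ~0.53, quite usable
print(chip_yield(0.90, 3000))   # 3000-transistor microprocessor: effectively 0

# For half of all 3000-transistor chips to work, per-transistor
# yield must reach roughly 0.5 ** (1/3000) ~= 0.99977:
print(0.5 ** (1 / 3000))
```

That jump from 90% to 99.977% per-transistor yield is why a working 6-transistor demo is still a very long way from even a tiny microprocessor.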
If (GP post) you want an automated device that makes an IC for you given a digital design file, well, hm. The closest things we have today are companies that manage and run the equipment for you (fabless semiconductor companies (Qualcomm, AMD...) give their chip designs to, e.g., TSMC to manufacture). Academic researchers often send parts in together to reduce costs, in which case you could get tens of identical (reasonably simple) chips for several thousand dollars. Someone linked to , which looks like an attempt at a more open, hobbyist-friendly version of the same thing. I did run across  once, which _does_ seem to be attempting to make an easier to use, very small, automated system. I've no idea what their status is.
A desktop device as simple to use as a 3D printer is barely even on the conceptual possibility level at the moment, and then only when people start talking sci-fi self-assembly and molecular nanomanufacturing and a century of R&D.
From all my experience, far and away, the biggest bottleneck and ball and chain on progress in computing is bad technical writing, bad documentation, poor ability to describe technical work; one of the biggest problems here is undefined jargon; and there one of the biggest problems is undefined acronyms.
In particular, even with my decades of experience, there is no good way for me to guess that the acronym "IC" abbreviates "integrated circuits". For an audience as broad as all of Hacker News, there is NO way to know.
In particular, writing
"Homemade Integrated Circuits"
is plenty short, so there is no good reason to write just the acronym instead. That acronym is undefined and obscure.
Technical writing in computing is awash in undefined and obscure acronyms, and that is part of the bottleneck and ball and chain on progress.
The situation in technical writing broadly, e.g., in math, science, engineering, is clear: When jargon is used for the first time, define it or at least give a link to a definition. With the Internet, giving links to definitions is especially easy and convenient.
For the sake of progress in computing, I urge computing and the Hacker News audience (i) to minimize use of obscure acronyms and (ii) on the first use of jargon always define or link to a definition. This advice is rock solid and just technical writing 101.
In the meanwhile, indeed, I'd be more interested in homemade ice cream than homemade integrated circuits -- even for the Hacker News audience, the acronym IC more likely abbreviates "ice cream" than "integrated circuits".
Other guesses at what "IC" might abbreviate:
IC to mean integrated circuits is a very common acronym, and right in the center of the HN focus.
I’ve literally never once seen “IC” used in any context to refer to any of the strained alternatives you propose.
It really feels like you had a canned rant that you've been waiting to find a place to use, and you decided to strain to make it seem like it fit the first thing that you felt like you might get away with stretching it to cover.
> even for the Hacker News audience, the acronym IC more likely abbreviates "ice cream" than "integrated circuits".
I disagree that this is true, either in general or in the specific context.
Wrong. IC is for hardware. The center of Hacker News is software, NOT hardware. Commonly the audience here works with software for hours a day but goes for weeks without ever seeing an IC or hardly even thinking about one.
You are just trying to pick a fight with me.
Again, once again, over again, yet again, in that title the acronym is more likely to mean ice cream than integrated circuit.
So, to pick a fight, you pick IC out of the context to say that generally IC means integrated circuit more than ice cream -- true but trivial, beside the point. Again, in that title there is no telling what the heck IC meant, and even ice cream, even at Hacker News, is more likely.
My point about the worst bottleneck in computing is rock solid and very important for computing and the Hacker News audience and fully appropriate. Your "rant" is insulting and provocative.
Your point that the title clearly meant integrated circuit is absurd, just deliberately insulting. You are just trying to pick a fight.
Resist all you want: It remains, computing has a severe bottleneck -- bad technical writing with undefined jargon and acronyms. And Hacker News titles make WAY too heavy use of acronyms. Disagree, fight, resist, object, all you want -- you are still wrong.
Here you are doing the usual for an angry person with weak arguments -- you are attacking the person instead of the ideas. That the acronym is obscure jargon is true beyond any question. So, you accuse me of a "rant": My original response was short. Then I got attacked.
You could have just asked for the title to be clarified to avoid a potentially unknown acronym.
Sure, I COULD have done lots of other things, but it is insulting and patronizing for you to suggest I SHOULD have done what you suggest.
I did NOTHING wrong.
My point about undefined jargon is rock solid. My point was clear enough in my original post. My responses are only to explain with grossly excessive clarity to defend myself against people who want to attack for whatever reason.
There are some very thin skinned, hostile people on Hacker News.
And my early statement is literally true: I'd be more interested in homemade ice cream than homemade integrated circuits and justifiably so.
I totally agree with reducing acronyms, but I think you're doing a really bad job at advertising that idea right now.
The people attacking me, usually personally instead of my ideas, are finding NOTHING wrong with what I stated but are embarrassing themselves.
It should be enough to be correct, and I am fully correct.
And why on earth would you even think of ice cream in the context of Hacker News?
"Homemade Ice Cream"
even on Hacker News is more likely the meaning than
"Homemade Integrated Circuits".
"IC Density and Moore's Law"
is clearly about integrated circuits and not ice cream.
Yup, a lot of people in computing really hold on to jargon like hugging mama.
What does the title mean? It might mean
Homemade Ice Cream
You want to say that IC usually abbreviates "integrated circuit" and nearly never "ice cream", and that is true but nearly irrelevant, since the issue is what "IC" means in that title.
Homemade Integrated Circuit
is tough to swallow because making integrated circuits usually takes $billions and lots of highly dangerous chemicals, all super tough to do at "home". So, the ice cream reading is actually MORE likely, even at Hacker News.
You can see this. It's grade school stuff. You are just having fun arguing against the obvious and are embarrassing yourself.
Why? Because a huge fraction, no doubt a huge majority, of the Hacker News audience concentrates on software.
In more detail, a lot of the audience uses laptop computers, WiFi hubs, smartphones, etc. and never sees an integrated circuit, not even in its plastic box on a circuit board.
I plugged together my most recent computer, so I saw the motherboard with its many integrated circuit packages, handled the integrated circuit of the processor, an AMD FX-8350, installed several adapter cards with their visible integrated circuit packages, etc., but still could not be sure about the meaning of "IC" in the title. Besides, making an integrated circuit at home is a rare and strange, also possibly interesting, thing.
Again, computing needs to work really hard to avoid use of undefined jargon.
Again, literally, even for the Hacker News audience,
"Homemade Ice Cream"
is more likely than
Homemade Integrated Circuits
Again, once again, over again, yet again, one more time, the biggest bottleneck, a real ball and chain, on progress in computing is bad technical writing, and undefined jargon is one of the worst parts. Again, ..., in essentially all the more important technical writing it is rock-solid, 101-level standard, always, no question, to expand acronyms.
Right, there are some exceptions -- HTTP, HTML, URL, but the full list is short, and IC is not on it. Neither are JS, ASIC, CSS, ACL, OO, JSON, RSA, LDAP, CMIS/P, SMTP, SNMP, ASP.NET, ADO.NET, and some hundreds more.
There used to be CICS, IMS, MVS, VTAM, ISAM, VNET, SNA, IPL, RACF, CP67/CMS, VM, DB2, etc., which were priesthood jargon for some years but are gone now.
This is just a rock solid technical writing lesson 101. Accept it or not as you wish.
Jargon is an insider thing, and everyone else gets irritated.
The title was obscure. Since I've also heard of TSMC (Taiwan Semiconductor Manufacturing Company or some such) and the $billions for making ICs, that an IC could be "homemade" was strange context.
Believe me or not, but I say again, once again, over again, yet again, computing desperately needs to avoid undefined jargon and acronyms, and in this case
"Homemade Integrated Circuits"
would have been much better.