How to make a CPU – a simple picture-based explanation (robertelder.org)
605 points by robertelder 69 days ago | 169 comments

If you ever code something that "feels like a hack but it works," just remember that a CPU is literally a rock that we tricked into thinking.

Not to oversimplify: first you have to flatten the rock and put lightning inside it.

Source: https://twitter.com/daisyowl/status/841802094361235456

> If you ever code something that "feels like a hack but it works," just remember that a CPU is literally a rock that we tricked into thinking.

Is the "rock" "thinking", though? What _is_ "thinking"? Another way to look at a CPU is as a rock that we "carved" to conduct electricity in a specific way. It doesn't think - we interpret how the electricity flowed, and "think".

I asked my thinkingrock if it could think and it said yes. Really though, "thinking" is very hard to define. Do animals think? Do insects think? Do bacteria think? I'm sure somewhere we could find an example of a "living" thing that people consider to have thinking capability whose abilities are substantially more primitive than a CPU. So... why not say a CPU can think?

It'd be really awesome if microprocessors, even at a low-end process node like 130 nm, could be made with room-sized machines or smaller. There's a lot of space for companies wanting to manufacture their own MCUs, for instance, without relying on massive supply chains.

I think this'll happen at some point, as silicon manufacturing hits final roadblocks and becomes increasingly commoditized, but it'd be nice if it were sooner rather than later.

(This would be nice for self-sufficient decentralized communities being able to produce their own microelectronics as well.)

Well, this chap managed to make his own chips at home: http://sam.zeloof.xyz/category/semiconductor/

Plus an electron microscope!

Sam is super impressive, but to be fair he bought premade wafers and bought a used SEM.

Jeri Ellsworth has gone "deeper" into the process and made a few transistors, although at a much larger scale:



To me this is far more impressive than any of the other stuff posted here. It's actually doing it rather than talking about it or faking the first 99.9% of it.

Did he do his own etching? I'd say buying blank wafers would be a perfectly reasonable place to start, but if you're buying printed wafers, then you might as well buy the whole chip.

Yeah, I don't think people will be pulling pure silicon crystals at home any time soon.

That’s actually quite doable in a garage, assuming you are satisfied with, say, two inch wafers. (Currently, they are approaching 18 inches!)

It’s much easier to produce silicon ingots than to do wafer fab, because of the currently tiny line widths.

But, it would make more sense to simply buy wafers (or epitaxial wafers).

If they do, I imagine they’d leave it at that instead of going on to make a chip.

I believe he bought them with the gates etched & FETs doped, and did his own metal. Something like that.

He used wafers with premade structures on them.

No, Sam Zeloof started from blank silicon wafers and did four layers of photolithography to produce PMOS gates. He used boric acid for the boron diffusion. He sputtered an aluminum layer on top for the metal and etched it with phosphoric acid.


So he "only" did metal deposition to make the interconnects? I haven't read the story, but I was thinking that there's no way you would handle some of the toxic chemicals used for doping.

They are not really that toxic. You do have to have a chemistry background and use a fume hood and a process sink with acid neutralization capability, like you find in many chem labs.

Good dopant choices would be phosphorus oxychloride and oxidized boron nitride wafers.

As I recollect, he started with silicon wafers with an epitaxy layer. He then did the usual oxidations, diffusions, and metallization, along with all the photolithographic steps. No small feat!

Ok. I was never a process guy, but worked with some back in the day and they told me that one of their phosphorus sources was phosgene gas.

Silane, phosphine, and diborane gases are used to form epitaxial layers on silicon wafers. Those are truly dangerous gases. However, a simple wafer process using the reagents I mentioned can produce very credible ICs, if one buys the epi wafers.

If you know enough VHDL, you can make your own digital chips, including CPUs, using off-the-shelf FPGAs...


Yeah, certainly true, but FPGAs are beholden to the same supply chain. This doesn’t fix the issue at all.

Helps a lot though, it's much easier to be a consumer of a commodity (the FPGA) than to make the supply chain produce your bespoke product.

how steep is the learning curve for VHDL?

I made a simplified MIPS (no FPU, 4-stage pipeline with a basic interrupt controller, and nothing out of order or the like) in VHDL for a uni course. We had some simpler exercises before that (e.g., a PS/2 keyboard controller), and with that I found it quite OK.

One just needs to avoid falling for the similarities with programming too much, and rather think in terms of signals and keep the clock in mind.
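To make that mindset concrete, here's a toy Python sketch (not HDL, just an illustration; the `tick` function and the two-stage shift register are made up): on a clock edge, every register samples its input simultaneously from the *pre-edge* values, so you describe the whole next state at once rather than a sequence of assignments.

```python
def tick(d_in, regs):
    """One clock edge of a two-stage shift register.

    Both flip-flops update at the same instant, each seeing the
    pre-edge value of its input -- unlike sequential software,
    where the second assignment would see the first one's result.
    """
    a, b = regs
    return (d_in, a)  # next (a, b): b captures a's OLD value

regs = (0, 0)
for bit in [1, 0, 1]:
    regs = tick(bit, regs)
# after feeding 1, 0, 1 the register pair holds (1, 0)
```

In VHDL, the equivalent would be two signal assignments inside one clocked process; the point is that both fire from the same snapshot of the signals.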

The worst thing is the tooling. E.g., I expected Quartus to crash or hang at any moment, so early on I wrote a TCL script to set up the project and FPGA pin mapping, so that I could scrap the Quartus project at any time and just recreate it in seconds without losing any work (or my mind).

There's GHDL (open source), which I found quite nice, but that's only for simulation and (at least back then) had no support for producing a bitstream to load onto a real FPGA. So I often used it for test benches and quicker development iterations, before actually loading the design onto the FPGA every so often.

I find VHDL unpleasant (and Verilog only slightly less so, even if it's the worse language) but the hard part for someone with a programming background isn't the language per se, it's learning how to design digital logic, which is very different from writing software.

Any university graduate-level fabrication lab class has the equipment to do this. A self-aligned, metal-gate process is relatively simple. You can probably build something akin to a 6502 in it.

The main issue is creating the masks. I'm not sure where you would get rubylith and the associated machinery for contact printing in this day and age.

As a side note: the thing stopping commodity VLSI is the CAD tools, not the silicon. Silicon runs are around $50K or so for an old node and Fab Shuttle/MPW(multi-project wafer) runs are often under $10K.

I don’t know if $60k sounds cheap to you, but I’d hardly call it a commodity.

Injection molds cost $50K all the time and are considered commodity.

Just because you, personally, aren't willing to spend the money doesn't mean something isn't a commodity.

The author misleads you on a point I brought up: it is not possible without globalization providing essential raw materials that have no replacement, unless they happen to be in your local environment. Spruce Pine has silicon that gives the US the microchip supremacy it has held for the entire duration of chip manufacturing. It will never be commoditized to the point where you can take off-the-shelf raw materials from anywhere, locally, and refine them. Within the structure of globalism, I linked to how Google is allowing home designers to produce older-technology chips.

https://www.electronicsweekly.com/news/business/diy-chip-10k... https://www.hackster.io/news/efabless-google-and-skywater-ar...

If you are interested in self-sufficient decentralized communities, microchips are not essential for a good society or a long life. https://aeon.co/ideas/think-everyone-died-young-in-ancient-s... They're useful for making non-specialized hardware that can run general programs that many people can support. Microchips do calculations and are more useful at scale; analog computers can take some roles, but it will be more wasteful to produce specialized hardware.

I don't know if vacuum tubes need globalization to make, but you are not going to make decentralized microchips with local goods; they are not fungible raw materials like food.

I don't think it's inherent that doped silicon will stay the dominant microelectronics substrate and I think it's plausible that people will find new ways to grow it that don't require excellent raw materials.

Where is the evidence for this change, and why is it plausible? Unless you have some information I don't, it has never been the case, and there isn't a reason to expect it. I hope it can happen, but manufacturing shows no signs that we may make chips from a more common material; if anything, fabs will use an even rarer, more difficult process, because purity of materials is very important, and the result of lower-quality silicon is so bad that nobody uses it. The chip shortage and the reliance on a few places isn't ideal, but there is no alternative except not to use chips, which is possible. There is a lot of interest in everyone having this, but it's not plausible; reliance can be minimized, but the refinement is very important, and there isn't any reason for a small community to stop buying globally: the highest-quality plants have better yields, better quality, and better energy efficiency. Homegrown parts will not even work, because the industry relies on economies of scale.

However, it is plausible not to rely too much on microchips in general; a potentiometer can control fan speed with older motors, though we lose brushless fans. You may really enjoy post-collapse guides if you are interested in what can be made locally.


There are great YouTube channels as well, like Primitive Technology, that make things from scratch. Even steel cannot be made locally without the proper materials, so there needs to be some sort of global or larger community, unless they live in a resource-rich environment (the US does not have metals the way Eurasia does).

Making chips at home, by building the required setup from generic parts, is feasible. So is achieving nuclear fusion at home, by building a fusor.

Both are educational and cool, and neither makes economical sense.

I wonder what other processes could be used instead of photolithography to etch the silicon, deposit metal, etc. If you had a machine that started with a doped wafer and worked one gate at a time, instead of a whole layer at a time, it would be way slower, but it could, in theory, be tested as it goes, and produce arbitrarily complex chips.

Also, I wonder what other materials might work, even if they are much slower, lower power, etc. Copper Oxide, for example?

I have similar questions. I wonder what wild alternatives can be substituted. Meaning, instead of thinking of making transistors on a silicon chip, can we make transistors on any other material or in any other fashion (say tapping into existing structures in nature) in a much easier way? Can we somehow exploit synthetic biology? Etc. etc. To replace the many assumed and developed processes and steps of the supply chain with much more feasible and easier approaches might result in a more efficient process in general.

Basically, we need a functional understanding of all components, and hierarchical functional equivalents/isomorphisms. Aka: "What is the end function of logic gates?" "To do X." "Are there simpler, easier, faster, more feasible structures that do X? Any way whatsoever? It does not need to relate to the current discipline or our established approaches." And iterate that question over all components and steps of the chip-making process.

yeah you are onto something. massive supply chains have a ton of carbon footprint, which for some people doesn't matter, but for someone like myself, i am a big fan of fewer carbon emissions. i wonder if there is a way we can build an etching machine and print chips somehow; the process seems a little clearer after watching this simplified video. i think it can be done. everything complex is just a bunch of simple steps: solve each step and get closer to the goal. might be fun to create a github-type community where people push their ideas to a source control platform where others can chime in and give their input. so, like, open source chip manufacturing, kinda like how 3d printers started out with makerbot and other open source printer projects.

i think it can be done and it would be fun. we have to filter out people who have a vested interest in chip manufacturers, because they may try to overcomplicate the process to protect their purse. so, like, a vouch system, where we know the people coming in have the right heart and won't purposely screw up morale

I think it's extremely unlikely that a small scale DIY process would be more energy efficient than using existing fabs.

> massive supply chains have a ton of carbon footprint

Are you talking about distribution? Because I would have thought the small size/weight of wafers would mean those costs would be fairly small. Maybe (silicon) packaging could be done locally, but even then packaged chips weigh little.

While energy, and thus CO2, is a significant part of solar cell production (because of the sheer scale), the attributed CO2 output of the chip industry is tiny compared to the commercial value of the chips produced.

so carbon tax would be effective in this scenario?

Current C-suite culture should suffice.

You can buy or rent Yokogawa modular machines for making chips. Fits in a room.


*I added a reference link

Any idea what something like that costs?

It'd be really awesome if it could be done at home. We need the ability to make hardware at home just like we can download a compiler and make software at home. Computing freedom depends on it.

Circuit designer here - feel free to ask any questions about the manufacturing process, design etc.

Having a thorough understanding of the process, I thought this was hilarious. But if you really want to understand the process, it's pretty terrible. It spends 10 steps on making a wafer, and then the bulk of the actual process is condensed to 16.

Suppose I want to make a custom ASIC with 20B transistors. I have a lot of money to spend but no semiconductor experience. How do I go about hiring good chip designers? How much $ should I realistically expect to spend on design, verification, and fabrication, respectively? I've heard $20M is the ballpark for a mask on a leading edge node. What is the marginal cost per CPU?

That sounds way too high for mask costs. You're definitely looking at 7 figures, but I don't think they've yet hit 8. Unfortunately the answer to all of your budget questions is: it depends. It's going to scale with the complexity of your requirements. Something like a custom ARM chip with heavily custom machine learning and signal processing, high speed clocking, etc, would probably need a decent sized team and a few years. On the other hand, there are tiny teams in Asia that crank out bitcoin mining ASICs like candy.

I'm going to guess at a number and say you're probably looking at $10-20MM all in.

As far as hiring good designers, you really need the relevant technical background to screen them. Assuming you need a team, I would suggest starting with someone in a director position at a large company with experience in your area of interest. They would be able to more concretely define the project and determine human resource requirements/allocation.

If your ASIC is just large but not complex (meaning lots of repeated structures), and you can get away with just a few strong designers, I would suggest hiring a consultant to help you define the project and screen candidates. EE profs at your local university might be a good start.

Feel free to shoot me an email (in my profile) if you want to describe your project in more depth and I'd be happy to offer what advice I can.

> That sounds way too high for mask costs. You're definitely looking at 7 figures, but I don't think they've yet hit 8.

7nm logic mask set costs are estimated at $10.5M. Of course you could always use a slightly less leading-edge node and save a lot of money.

Slide 13:


Huh you might be right, if only just. It says it's an estimate, but that probably means 3nm nodes will be >$10MM for sure

20B is slightly bigger than the Apple M1 (5nm) and slightly smaller than the 32-core Epyc (14nm), both of which are very substantial projects in terms of headcount, simulation, tools, licensing, etc.

To put it in perspective, a company a friend founded that is doing a very large but very regular ASIC for TSMC 10nm raised twice what you're ballparking and will end up doing another raise before tapeout. Chips nowadays are _expensive_ if you're anywhere near state of the art. The people who do very high performance chips are also very expensive.

But I'm also guessing that unless you're doing a bitcoin hashing chip, AI chip, or GPU, all of which are regular, your scale estimate is probably very off. If you're doing any of those, just don't.

That said, there are SOC vendors who do core plus, so it's possible that if this is just an accelerator, no matter how wide, you might be able to outsource the whole thing.

Do you think chips in the 10k-100k transistor range will some day be able to be produced by hobbyists? Or are the chemicals simply too dangerous and machines too expensive to be affordable at that scale?

No, I don't think so. But it's not because of the chemicals or machines. There's not really any demand for it. Most hobbyist "ICs" are fully digital and can already be realized on an FPGA. For simpler applications, you can probably program a microcontroller to do what you want.

Integrated circuits are appealing to industry because they're integrated - they can be smaller and they reduce cost (long term; still need the upfront investment). These are important things for many products, especially in RF, but they aren't really driving factors for hobbyists.

That being said, people are trying! http://sam.zeloof.xyz/second-ic/

I think it probably depends on what "produced" means here.

If it means designing inside an EDA environment, submitting the design, and having it show up realized, then that is possible now. And it's not all that expensive: eFabless chipIgnite is quoting ~$10K for 100 QFNs in 130nm. That's getting into beater-car territory. https://efabless.com/chipignite/2110C

If it means actually fabricate then I think there is no way because a DIY won't have the scale to compete on price and they won't be able to bring any custom processing step to justify being at boutique scale. Think about PCBs. We used to make them ourselves with chemical etch. Now I don't even use breadboards because a custom bare fab is $5 and the components cost more than that. I also get a much better electrical result, and it doesn't fall apart if I look at it funny.

The main problem with ASICs is the amount of skill/time that it takes to do it right. Floor planning, track planning, closing timing, etc. etc. on an ASIC is much harder than an FPGA. You don't even have to do half those things on an FPGA.

With an FPGA, one can almost get compile-and-go if you're willing to be loose on area and performance. ASIC CAD tooling is nowhere near that at the moment, closed or open source.

Yeah, I saw and am amazed by his work, which made me wonder whether chip manufacturing could get as streamlined, compact and mainstream as 3d printing is today!

I think the first step is making EDA (electronic design automation) more accessible. Right now, even if you could do the fabrication yourself, doing a real, useful design would be just as challenging because the tools are all proprietary and expensive.

What about someone buying old machines and providing it as a service like the low-end PCB manufacturers today? They seem to be doing okay.

It's possible. The chemicals are not too dangerous, and yes, you can train yourself. Maybe it wouldn't fly in the EU.

> Maybe it wouldn't fly in the EU.

May I ask why?

Because Europeans are afraid of everything. I was trying to avoid a comment that seemed politically charged but I don't know any other way to say it.

I was into hobby chemistry for a while -- people trying to do anything in the EU find it almost impossible. Poland and Eastern Europe are better about it, but eh.

Even in the US, chemistry is avoided and even deplored. Where are the Christmas chemistry sets of yore? People are scared to death by anything having to do with “chemistry”, yet they think nothing of jumping into brewing.

Being scared and being banned are two separate things. If you want to see how many dangerous chemicals are available online in US see Cody's lab or NileRed.

NileRed is particularly good.

I rather think that Cody is a bad influence, of the kind that could attract unneeded regulations.

How are the pn junctions created? The article says I can "optionally" dope the wafers. Why is that optional? Are they doped to create both p and n areas, or just one type? Or are multiple wafers used? Or are areas somehow doped after etching? Thank you for entertaining my dumb software person questions.

This is one area the article did a very poor job of explaining. In some processes, you treat the entire wafer before you get started with the rest of the procedure. One example that does this is SOI (silicon on insulator) processes. Others may not need this "global" doping, and from what I know most bulk silicon cmos processes do not do this.

Then you do the lithography (photoresist developing, etching, etc) to expose specific regions on the silicon that you want to dope to create devices like transistors, for example. So first you might expose all of the p-type transistor (PMOS) diffusion areas and dope them. Then you'd remove all the photoresist, repeat the procedure to expose n-type diffusions and dope that. And so on for the various needs of that particular wafer.

PN junctions are created simply by having p-type doped silicon adjacent to n-type doped silicon. The boundary between the two is the PN junction. In practice what I usually see is a square of one type with a ring around it of the other, but these devices are not frequently used.

Can you talk about whether the US has microchip supremacy due to raw materials from Spruce Pine's pure silicon, and whether there are any sources that are almost as good or being used instead? Is it almost like De Beers' diamond monopoly?

Do you use any countries or specific factories that do better refinement, or are the raw materials shipped directly to the manufacturing country? What do you usually make from the wafers? Are certain sizes much harder to make? I know, for example, that larger sensors for digital cameras are much harder to make. I have also heard of redundant circuits being used to increase the yields of chips; how often is this done, and when is it most useful?

The US may have technological supremacy when it comes to design and the resultant products, but it lags behind in terms of actually manufacturing semiconductors. The dominant players are in Asia (TSMC, Samsung, GF) and Europe (GF). Intel has their own fabs, but they are currently behind the competition, and up until very recently only fabricated their own products. There are a lot of other companies in the industry with fabs, but they're generally making their own products - discrete devices - rather than integrated circuits/systems-on-a-chip.

Another issue is the equipment used for manufacturing. It's very hard to come by, and the classic example is ASML (Netherlands), which dominates the market for lithography equipment.

I work on the design side, not in a fab, so I can't tell you much about sourcing or refining the silicon for wafers. Wafers are used to make every single microchip you can imagine. There has been a slow but continuous push towards using larger wafers, since it's more cost-effective. I imagine it's more difficult, but I couldn't tell you any specifics.

As far as manufacturing each individual integrated circuit: yes, larger is harder to manufacture because there is more physical space for a defect to occur. There are some design challenges as well when you get very large, but it's not a significant overhead because you're usually doing your design in sub-pieces anyways.
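The area/defect relationship described above is often approximated with the simple Poisson yield model, Y = exp(-A*D): if defects land randomly with density D per unit area, the chance that a die of area A has zero defects falls off exponentially with area. A quick sketch (the function name and numbers are illustrative, not real process data):

```python
import math

def poisson_yield(die_area_cm2, defect_density_per_cm2):
    """Probability that a die has zero randomly-scattered defects."""
    return math.exp(-die_area_cm2 * defect_density_per_cm2)

# At a hypothetical 0.1 defects/cm^2:
small = poisson_yield(0.5, 0.1)  # ~0.95
large = poisson_yield(2.0, 0.1)  # ~0.82 -- 4x the area, noticeably worse yield
```

Note the compounding: quadrupling the area raises the per-die yield to the fourth power, which is exactly why huge dies (big camera sensors, large GPUs) are so much harder to manufacture profitably.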

Some designs do use redundancy, as you mentioned. This is more often the case for very large, very uniform structures, like DRAM, flash, CPU cache, etc. But there's a tradeoff because you waste money on that redundancy for every chip that comes out with no defects. And there's overhead to actually testing the part in order to utilize the redundancy. In my experience, yields are targeted at the high 90%s these days, so the redundancy would have to be very cheap to be worth it. For almost all RF, analog, and mixed-signal circuits, there is no redundancy. I'd say most digital circuits, except the largest, also don't have any.
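The redundancy tradeoff can be put in numbers with a binomial model: a chip with n required blocks and s spares works if at least n of the n+s blocks are defect-free. A hedged sketch (function names and the per-block yields are made up for illustration):

```python
from math import comb

def yield_no_spares(p_good, n_needed):
    """All n blocks must be defect-free."""
    return p_good ** n_needed

def yield_with_spares(p_good, n_needed, n_spares):
    """At least n_needed of (n_needed + n_spares) blocks must be good."""
    n_total = n_needed + n_spares
    return sum(
        comb(n_total, k) * p_good**k * (1 - p_good) ** (n_total - k)
        for k in range(n_needed, n_total + 1)
    )

# 1000 identical blocks (think DRAM rows), each good with probability 0.999:
base = yield_no_spares(0.999, 1000)        # ~0.37 -- most chips fail
fixed = yield_with_spares(0.999, 1000, 4)  # well over 0.99 with just 4 spares
```

This is why redundancy pays off for very large uniform arrays (many blocks, so the no-spares yield collapses) but not for a small analog block, where the parent's point about wasted area and test overhead dominates.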

Thank you so much for this industry info. When you design, do you think Shenzhen is the best in terms of innovation in microchips? They have very interesting chips that are undocumented, underground, and seemingly random, like the ESP8266: it turned up in some consumer goods, people noticed it and hacked it to run custom software, and it's essentially a cheaper, more powerful (if higher-energy-usage) Arduino-type reprogrammable chip with WiFi built in. The company later released an SDK, and Arduino support was ported as well. I don't know if there are some wafers that cannot have any defects in them; camera sensors come to mind.

I’d never heard of Spruce Pine, interesting. I wasn’t able to read the wired article but another article said it wasn’t the silica/silicon that’s the world-beater there though, it’s the quartz for use in crucibles etc?

Also - does the US really have microchip supremacy? The highest tech fabs are non-us (Samsung and TSMC)

https://web.archive.org/web/20180808115837/https://www.wired... Here you go. Who gave them the technology in the first place, and who do they rely on? They did not make it on their own; the US's research, sphere of influence, and technology shared within the western world are why these countries were able to advance in electronics. Look at the founder's history in the US: https://en.wikipedia.org/wiki/Morris_Chang

It's hard to say what country multinational companies belong to. You can note their location, but does Toyota become Mexican if they assemble their cars in Mexico? Are they American if they assemble them in the US? Does Apple become Chinese if they use Foxconn, a Taiwanese company, in China? Going by fabrication location, when TSMC's 3nm plant opens in Arizona, or Samsung opens their 3nm fab in Texas, does that mean Taiwan or South Korea has supremacy, or the US? The US controls the supply of the best materials, who gets technology, and the sale of goods like military technology; it has led the semiconductor business, and I would say that without the ideas and materials from the US, it would not exist.

If I remember correctly, the silicon used to manufacture transistor-grade wafers is purified to roughly 10 impurities for every billion silicon atoms. Less pure starting stock might increase the cost of getting there, but probably not by too much.
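Those numbers are easy to sanity-check: crystalline silicon has roughly 5x10^22 atoms per cm^3 (a textbook approximation), so "10 impurities per billion atoms" still leaves on the order of 5x10^14 stray atoms in every cubic centimeter. A back-of-the-envelope sketch:

```python
SI_ATOMS_PER_CM3 = 5.0e22     # approximate atomic density of crystalline Si
impurity_fraction = 10 / 1e9  # "10 impurities per billion silicon atoms"

impurities_per_cm3 = SI_ATOMS_PER_CM3 * impurity_fraction
# ~5e14 per cm^3 -- relevant because intentional dopant concentrations
# often start around 1e15/cm^3, so the background must sit below that
```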

No, it is not possible. Refinement isn't changing to lower-quality inputs; if that were true, then every enemy of the US would be refining their own materials. It is in everyone's interest not to rely on a single place for every microchip material, but the fact that it hasn't happened speaks for itself. If it didn't increase the cost by much, why hasn't it happened with the US's global enemies? The USSR was never able to achieve equivalence even with the same blueprints, and its chips could not be made in bulk.

>Soviet computer software and hardware designs were often on par with Western ones, but the country's persistent inability to improve manufacturing quality meant that it could not make practical use of theoretical advances. *Quality control*, in particular, was a major weakness of the Soviet computing industry.


Not what you were offering so feel free to ignore this, but I'm super curious about RISC-V. Do you know if any serious attempts are happening to make RISC-V based systems? And if not, do you know why? Is it too raw/un-polished?

Yes, it has no licensing fees, and it's being used in pinepen and some small electronics like calculators. Do you mean dev kits like this? They are Arduino-like devices with low-power edge AI/neural capabilities. https://www.seeedstudio.com/sipeed

On Linux, they are making one with an Allwinner chip called the D1. https://www.hackster.io/news/sipeed-teases-linux-capable-64-...

RISC-V is not inherently better or more secure; it's a different instruction set with no fees, so anyone can make one, and it's possible for an implementation to be less secure.

RISC-V probably has the most "serious" effort of anything other than ARM and X86.

That's a fair estimate, but let's not forget IBM's Power architecture.

Power is very cool and I want a Power10 box a lot, but I'm not sure it's fair to decide in its favour, because POWER is basically unobtainable, whereas you can actually buy dev boards for RISC-V from SiFive.

There is also Sparc and to a lesser extent MIPS. They are more mature than RISC-V and have a decent software ecosystem.

Not my area sorry!

Have you ever used https://www.fossi-foundation.org?

If I had the skills I would immediately investigate how to couple a RISC-V CPU with some open GPU on that platform!

With most open source hardware efforts, like this, it's not really so much circuit design as it is just code. All of the open source efforts I've seen revolve around digital circuits, which are written in some kind of RTL and turned into an actual circuit by completely automatic processes. It's still great, and I fully support them, but there's a massive amount of non-digital that's crucial to getting these systems up and running. Not to mention that even if you have an open source CPU RTL, for example, you'll need access to a closed-source and often NDA-blocked PDK (process development kit), fabrication company, etc. I remember seeing some efforts at open sourcing the PDK part as well, but remember being unimpressed.

It's infotainment, not information. I really like the 'indistinguishable from magic' video:


lots longer, but far more informative and less fluffy.

When designing circuits, I cannot imagine you lay everything out by hand. Do you have specific sets of pre-made circuits/connections that you just kind of copy paste all over depending on the desired functionality of the chip?

Analog and RF circuits are laid out completely by hand! But there are often a lot of symmetries and repeated structures that mean you can re-use a lot of layout. Also, designs are often copied, reused, and repurposed, so those layouts just get modified to adjust, which saves time.

For large digital circuits (e.g. CPUs), it's all automated. There's a lot of human involvement, but ultimately a computer is placing all the transistors and wiring them together.

Is it possible to embed optic fibers and optoelectronic gates along with semiconductor gates? How is the fabrication process different? Probably there are size limitations due to fiber width and paths?

This is a very active area of research. It's quite technical though so it takes a good amount of study to get into it.

https://en.wikipedia.org/wiki/Silicon_photonics https://en.wikipedia.org/wiki/Integrated_quantum_photonics

I'm not very familiar with the details, but I know this is an active area of research. Generally you use a process thats conducive to optics (like SOI), you fab the regular silicon in some areas, and you have separate process steps to build the optical devices.

Hmm, I was fairly certain there were a few steps missing when the smashed rock suddenly transformed into an ingot.

If minecraft taught me anything it’s that ore doesn’t spontaneously turn into ingots.

What's the canonical book? I used to know this from my EE days, but it's been too long.

For circuit design or manufacturing?

I haven't been in school for a while, so I'm not sure what's current. I really liked Baker's book, CMOS Circuit Design. It had a decent overview of the manufacturing process from the perspective of a designer, as well as good introductions to major design topics.

Unfortunately, with modern processes, most of the textbook design equations no longer apply, so the work becomes as much an art as a science.

Both. Ah I think I might have had that book until my parents' basement flooded.

Yeah of course when it comes to how things are actually done it's hard to know without actually working in the field. But I just wanted an overview.

I think the Baker book is good for an overview. I also used Pierret's Semiconductor Device Fundamentals in undergrad. It goes deeper into device physics than a pure manufacturing text, but I recall it also had a nice overview.

Baker http://cmosedu.com

And Weste http://pages.hmc.edu/harris/cmosvlsi/4e/index.html

are both very good. Weste is used in more universities, but it also has a digital slant to it.

which company would you invest in?

I recall there being a game mentioned somewhere here on HN, where you start with basic logic gates (I think), and you build up a fundamental CPU at the end of the game, using the parts you discovered along the way.

Problem is, I don't remember what the game is called, and no amount of searching seems to help me. Anyone know what it was called?

- Turing Complete (2-dimensional circuit building, mission-based): https://store.steampowered.com/app/1444480/Turing_Complete/

- NandGame (2-dimensional circuit building, mission-based): https://nandgame.com/

- Logic World (3-dimensional circuit building, no missions/goals yet): https://store.steampowered.com/app/1054340/Logic_World/
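These games all rest on the same fact: NAND is functionally complete, so every other gate can be built from it. A quick Python sketch of the standard constructions (mine, not taken from any of the games):

```python
# NandGame's premise: every logic gate can be built from NAND alone.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):          # NOT is NAND with both inputs tied together
    return nand(a, a)

def and_(a, b):       # AND is NAND followed by NOT
    return not_(nand(a, b))

def or_(a, b):        # OR via De Morgan: a OR b == NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

def xor(a, b):        # XOR from four NANDs, the classic construction
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# A half adder, the first step toward an ALU, falls out immediately:
def half_adder(a, b):
    return xor(a, b), and_(a, b)   # (sum, carry)
```

From here the games walk you up through full adders, registers, and eventually a working CPU.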

Not the same game, but if you like that genre, Zachtronics is really good at logic games and has assembly programming. https://zachtronics.com/tis-100/

>Print and explore the TIS-100 manual, which details the inner-workings of the TIS-100 while evoking the aesthetics of a vintage computer manual!

>Solve more than 20 puzzles, competing against your friends and the world to minimize your cycle, instruction, and node counts.

>Design your own challenges in the TIS-100’s 3 sandboxes, including a “visual console” that lets you create your own games within the game!

>Uncover the mysteries of the TIS-100… who created it, and for what purpose?


>Build circuits using a variety of components from different manufacturers, like microcontrollers, memory, and logic gates. Write code in a compact and powerful assembly language where every instruction can be conditionally executed.

>Read the included manual, which includes over 30 pages of original datasheets, reference guides, and technical diagrams.

>Get to know the colorful cast of characters at your new employer, located in the electronics capital of the world.

>Get creative! Build your own games and devices in the sandbox. Engineering is hard! Take a break and play a brand-new twist on solitaire.

I didn't know an assembly game could be made; it's a pretty hard game that only programmers and very logical people would enjoy.

Well, since Zachtronics is mentioned, it's worth pointing out that the older games fit the CPU manufacturing theme better. He had a game literally called "Silicon Foundry" which used premade blocks, and then KOHCTPYKTOP, which got more into the nitty-gritty.

I think the problem with Zachtronics games is that many people feel "I would rather program at this point". Did you ever feel that way too?

Not a game, but https://www.nand2tetris.org/ is a similar concept.

Yes, that was exactly it! "Turing Complete" it's called apparently, available at Steam: https://store.steampowered.com/app/1444480/Turing_Complete/


That pixel-based logic simulator I made some years ago might also entertain you:


Silicon Zeroes [1] starts out slightly more abstracted than what you described (byte-level operators, register files, ALUs, that sort of thing) and builds up to a full CPU.

[1] https://pleasingfungus.itch.io/silicon-zeroes

I've always been curious how someone gets into this line of work. Is it all via college / postgraduate education, where you're directly recruited by these companies? Obviously, this is a very hard (impossible?) thing to teach yourself. I can't imagine more than a few universities offer this type of education. Where would you start / what path would you go down to be a chip designer / work for an Intel / AMD?

My brother and several of his friends do this for a living at Intel. Most of them have Electrical Engineering PhDs, though I believe one is a Chemical Engineer. My brother's thesis specifically was in 2d transistor design and worked under a Material Science professor. I believe most universities have professors who teach Semiconductors classes, whether it is under the name Computer Engineering, Electrical Engineering, Electronics Engineering, Chemical Engineering, or Material Science.

It would be difficult to learn on your own, as explained in the article: you need a lot of specialized equipment, a high class of clean room, and a lot of very dangerous chemicals. (My brother once described what the hydrofluoric acid he used semi-regularly does to a person and completely horrified our parents.)

Downside of this field is that there are very few job opportunities without relocating. If you're in the US, you can work at Intel... or Intel. Unless you're willing to move to Taiwan and work at TSMC.

Hydrofluoric acid and other "fluorinating" chemicals like chlorine trifluoride really are horrific. Most of my experience in chemistry is from a brief stint working as a student helper in an undergrad chemistry lab, and I (thankfully) never encountered HF, but we were told many times just how dangerous it is.

It's been a while, but I remember the biggest danger isn't the acidity itself (HF isn't even a strong acid), but fluorine's tendency to "deep dive". It just sort of slowly eats into things and creates layers that are comparatively hard to remove. So if you spill hydrochloric acid or whatever on yourself, you wash it off, maybe with some severe tissue damage, but it's localized and it does wash off.

On the other hand, the HF tends to stick around, and as a fun side-effect, the fluoride salts it creates are poisonous to the body. And HF is tame compared to some fluorine chemicals used in chip etching/production...

> Downside of this field is that there are very few job opportunities without relocating. If you're in the US, you can work at Intel... or Intel. Unless you're willing to move to Taiwan and work at TSMC.

Only if we're talking specifically about cutting edge logic. There's Texas Instruments and GlobalFoundries in the US on the trailing edge for logic. In memory, where the fabrication techniques are similar, there's Micron, IMFT in Utah, and Samsung in Texas. Not to mention that TSMC is building a leading edge logic fab in Arizona.

And then of course there are all of the capital equipment suppliers, where the US punches way above its weight. Applied Materials, Lam Research, and KLA are all headquartered in California and employ a lot of the same talent that Intel does.

I was a bit hyperbolic in my statement, mainly because I was parroting some dinner table conversations. However, your list is still fairly short, and few are in the same location. If you, say, get a PhD in the Midwest in EE specializing in novel transistor fabrication techniques, you will likely have to move states to get a job in your area of expertise. And if you are unhappy in your job for any reason, you will have to move again to work for a different "big name" company.

I guess the point I was trying to make in my trite (and I admit, inaccurate) statement was that it's not as accessible of a career path as, say, coding. It's more like becoming a rocket scientist: there are very few companies to pick from in that field. And they're not typically in the same place geographically.

There are other fabs in the US - Samsung, Global Foundries, NXP, and other smaller places.

Also, if you want to be a "chip designer" then you can work for a fabless company too. They're basically everywhere.

I got my electrical and computer engineering (ECE) degree in 1999 from UIUC and learned everything but the very lowest-level chemistry, because I specialized in the VLSI (circuit design) side instead of fab. At that point, stuff like MIPS and the DEC Alpha were popular, and computers were just breaking the 1 GHz barrier.

Unfortunately the dot bomb happened right after I graduated, and the anti-intellectual backlash of the early 2000s killed independent research during the outsourcing era, which never recovered.

Sadly from my perspective, very little has changed in 20 years. Computers only reached about 3-4 GHz, and kept doubling down on single-threaded performance for so long that companies like Intel missed out on multicore. Only Apple with their M1 seems to have any will to venture outside of the status quo. The future is going to be 256+ symmetric cores with local memories that are virtualized to appear as a single coherent address space. But that could take another 20 years to get here.

Meanwhile we're stuck with SIMD now instead of MIMD, so we can't explore the interesting functional paradigms. Basically I see the world from a formal/academic standpoint, so I think in terms of stuff like functional programming, synchronous blocking communication, ray tracing, genetic algorithms, stuff like that. But the world went with imperative programming, nondeterministic async, rasterization, neural nets.. just really complicated and informal systems that are difficult to scale, and personally I don't think much of them. Like with software, honestly so much is wrong with the hardware world right now that it's ripe for disruption.

Also hardware was a dying industry 20 years ago. We wanted fully programmable FPGAs to make our own processors, but they got mired in proprietary nonsense. There really isn't a solution right now. Maybe renting time at AWS blah.

I feel a bit personally responsible for the lackluster innovation, because I wasn't there to help. I wasted it working a bunch of dead end jobs, trying to make rent like the rest of you. And writing text wall rants on forums that nobody will ever read anyway. So ya, don't be like me. Get involved, go work for a startup or a struggling company that has the resources to fix chips, and most importantly, have fun.

> Meanwhile we're stuck with SIMD now instead of MIMD…

What about multi-core/multi-threading combined with massively out-of-order CPUs? Intel and AMD's chips have a dozen or so execution ports. So you can have your PADD running on one port, and a PMUL on another. It all just happens behind the scenes.

Intel tried a VLIW architecture with Itanium, but it was a flop for a variety of reasons. One of which was the lack of “sufficiently smart compilers”. There’s also the benefit to all the nuances of execution being in hardware: programs benefit from new CPUs without having to be recompiled. It has a much more intimate knowledge of how things are going than the software does (or even the compiler).

>Only Apple with their M1 seems to have any will to venture outside of the status quo

I find this an interesting opinion, considering that the M1 is really just "the same, but a bit larger", i.e. slightly higher performance at a higher cost.

What exactly do you see with the M1 that makes it so different?

Well you're right, the M1's unified memory is not technically that much different, but it's a start. And I don't like the mix of components either. They seem to be copying previous trends, like when FPUs were integrated on-chip. Eventually we'll have some kind of standardized SIMD unit like with Intel's integrated GPUs.

But I don't want all that. I just want a flat 2D array of the same core, each with its own local memory. Then just run OpenGL or Vulkan or Metal or TensorFlow or whatever the new hotness is in software. All Turing-complete computation is inherently the same, so I feel that working in DSLs is generally a waste of time.

Arm is a relatively simple core, so scaling an M1 to, say, over 64 cores is probably straightforward, at least on the hardware side. People complain that chips like that are hard to program, but that's only because we're stuck in C-style languages. GNU Octave, MATLAB, or any vector language is trivial to parallelize. Functional languages like Julia would also have no trouble with them.
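To make the parent's point concrete: an elementwise vector expression like a*x + y has no dependencies between elements, so splitting it across cores is mechanical. A rough Python sketch (the function names are mine, just for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy_serial(a, x, y):
    # Elementwise a*x + y: no loop-carried dependencies, so each
    # element can be computed on any core in any order.
    return [a * xi + yi for xi, yi in zip(x, y)]

def saxpy_parallel(a, x, y, workers=4):
    # The "trivial parallelization" of a vector expression: split the
    # arrays into chunks and compute each chunk independently.
    n = len(x)
    step = -(-n // workers)  # ceiling division
    chunks = [(x[i:i + step], y[i:i + step]) for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda c: saxpy_serial(a, *c), chunks)
    return [v for part in parts for v in part]
```

A vectorizing compiler or array-language runtime does this chunking automatically, which is exactly why languages that express whole-array operations map so naturally onto many-core hardware.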

Once we aren't compute-bound, a whole host of computer science problems become tractable. But we can't get there with current technology. At least not without a lot of pain and suffering. What we're going through now isn't normal, and reminds me a lot of the crisis that desktop software reached in the mid 90s with languages like Java just before web development went mainstream.

This doesn't really answer the GP and reads like sour grapes from a "formal/academic" type who missed the boat on the last 2 decades of advances in computing and AI.

> But the world went with imperative programming, nondetermistic async, rasterization, neural nets.. just really complicated and informal systems that are difficult to scale


Don't worry, we got you covered on the MIMD front ;)

> anti-intellectual backlash of the early 2000s

I've never heard of this. Could you elaborate, please?

This is not an elaboration, but a mere pointer based on one example: the sitcom Friends is subtly anti-intellectual, and it came out around that time.

Eh, Futurama was close, too.

I'm born in 1994, I would also like to hear about this!

Strange, because for me, the stark contrast between the 90s and 2000s is unmistakable. After the dot bomb and 9/11, the political climate in America went dark, and that's also something that's never recovered.

America decided to double down on neoliberalism with the war on terror, so we've had endless bizarre legislation like the DMCA and PATRIOT act coinciding with our exploitation of developing countries and fear of the other. But we've only had a handful of the really important innovations like blue LEDs, lithium iron phosphate batteries, and enough Moore's Law to miniaturize computers into smart phones. We needed moonshots for stuff like cheap solar panels and mRNA vaccines a long time ago. We needed pure research that we didn't have. Yes we have these things today, but to me, having to wait around seemingly forever for them when we had the technology for this stuff in the 1980s, that looks like 20-40 years of unnecessary suffering.

For example, academia warned about the dangers of GMO foods and unpredictable side effects like autoimmune disease. Nobody ever listens or cares. Nobody cared when they warned about global warming or leaded gasoline either. But I am hopeful that this prolonged period of anti-intellectualism is finally ending and maybe the people standing in the way of progress are finally retiring. I've largely given up on real innovation from the tech world, so I've got my attention fixed on solarpunk now.

Asking how to work in chips is kind of like asking how to be an engineer -- there are a million sub-specialties so lots of paths.

Generally the degrees are Electrical Engineering, with classes along the lines of https://ocw.mit.edu/courses/electrical-engineering-and-compu... (note that's from 2003, just an example)

There's also a ton of physics, and chemical and industrial engineering in the process steps.

This was about 20 years ago; things may be different now.

I got hired on the architecture side for GPUs after working a few years. My academic background was computer graphics with a focus on parallel algorithms and performance optimisation. After a couple of years working mostly on low-level code (C/C++ and assembly), I got a call from a recruiter.

The semiconductor industry is larger than just Intel and AMD. Like any job, taking some time to look around the career pages should give you a good idea what skills they are interested in.

https://www.nand2tetris.org/ is a nice introduction to how processors are put together. Book-wise, Hennessy and Patterson's Computer Organization and Design and Computer Architecture: A Quantitative Approach are good for background. I never did much on the layout side, but learning Verilog and/or VHDL would be helpful, though not essential.

Not trying to be rude, but you need to be motivated enough to seek the answers yourself if it's truly important to you. If you want it badly, you will find your answers. Here's a Google-sponsored initiative to help custom chip designers.

https://www.hackster.io/news/efabless-google-and-skywater-ar... https://www.electronicsweekly.com/news/business/diy-chip-10k...

Places like Shenzhen have a very good environment for this as well. https://www.youtube.com/watch?v=taZJblMAuko

> Where would you start / what path would you go down to be a chip designer / work for an Intel / AMD ???

Perhaps by studying Electronics Engineering (also called Computer Engineering, which is different from Computer Science).

At my university in USA, I remember recruiters from Intel setting up a stall or something to recruit students. It was in the building which mostly has computer science and computer engineering students, so I guess that's who they were looking for.

This was less than 5 years ago.

> this is a very hard (impossible?) thing to teach yourself

It isn't! The book "Code" by Charles Petzold is a great introduction to digital electronics and computer architecture. There's also the "Nand to Tetris" course (which I didn't take but people here are always recommending). You can build a simple CPU in a digital circuit simulator. If you're feeling adventurous you can write it in Verilog and simulate it, and even get it to run on a FPGA. This is all stuff you can teach yourself.

Of course this is not quite enough to make you a chip designer at AMD, but you'll know enough to get over the feeling that a microprocessor is an inscrutable artifact of alien technology brought from Alpha Centauri.

Nand2tetris, which is far more sophisticated than Code, would be a good background (highly recommended), but it has almost nothing to do with the technology actually used in a wafer fab.

The relevant disciplines are physics (mostly condensed matter), inorganic chemistry, industrial engineering, and electronics.

Find a school that teaches semiconductor engineering, take their courses through vlsi and ASICs.

Then land a job at a fab, and the rest is on-the-job training.

It’s like the difference between a PC board fab and an electronics design engineer, taken to the googol power.

At the public university engineering program I attended in 2000ish (UConn), we offered CompSci (mostly software focused), CompEng (mostly hardware focused), and CompSci+Eng (a balance of both). Either Eng program included several courses on hardware-level EE courses and circuit design, hardware engineering, etc.

One of the guys I did my senior design project with ended up working with AMD on processor stuff, so there are educational opportunities, I think you just need to be more on the CompEng/EE side of things and make it your focus.

I studied computer engineering (EE/CS) from 1996-2001. My senior year the college offered a minor in VLSI design, it was a 4 course series covering the very basics of semiconductor design and test and as one of the projects we actually paired up and designed a small chip, here's a photo of the finished die. Simple 2-layer metal 1 micron process - this is a serial multiplier for two sixteen-bit integers. https://vazor.com/drop/mulman.jpg

Most of the students in these classes were graduate students, so with our normal course load as seniors in engineering, this was a tremendous effort. For a four-credit class I would sometimes have to work 20+ hours a week just on one class.

But, it was a good stepping stone to get into the industry - my first job was at LSI Logic executing physical design, timing closure, etc for their customers. I learned a lot but eventually stepped away from it to focus on software and startups - I didn't want to die at that desk - the designs and teams were getting bigger and the design cycles longer. I did not relish the idea of working for 3 years on a single project.

I do look back on it fondly though as it was closer to what I consider 'real' engineering - we did a ton of verification work and if you screwed up, it might be a million in mask costs and 3 months of time to fix. We did screw up from time to time and the customer often had some fixes, so on a new design, there were expected to be a couple iterations of prototypes before you went to production. I think the last design I taped out was in the 110nm node - ancient by today's standards.

…electrical engineer?

If you're into this, Huygens Optics has an awesome series where he builds a wafer stepper at home. His is a little less advanced than Sam Zeloof's, but the videos are way better -- https://www.youtube.com/watch?v=_w0Z2Y5vaAQ

The hilarity illustrates an important point. We never just make a thing. A recipe or blueprint is a convenient fiction. Rather we participate in a dynamic evolving process which itself evolved through many cycles of copying, repetition and debugging. Even the first version wasn't strictly original because the idea was borrowed from elsewhere. 'Oh you work at the olive press. How would you like a job with this new-fangled printing machine?' And so on back to the initial and highly controversial creation of the Universe.

You might not do great starting with computers; you might do better to start from the dawn of civilization.


What I got from this is that I could make homemade coal by myself, MAYBE. I don't know if there's any climate on Earth where I could eke out a net energy return on primitive crops. If there was no one telling me what to do, I would surely starve in early agricultural times. But hey, that's what the Pharaoh's for, amirite?

Basic bronze tools are a mind-numbing mess to mentally process.

You might like this video "I tried blacksmithing and only got slightly burned": https://www.youtube.com/watch?v=e2HUg144liM

https://bootstrapping.miraheze.org/wiki/Main_Page also feels relevant

It's interesting to imagine the human experience of reality when living in such times.

Take a look at the Toaster Project if you want to see building an item from scratch carried through all the way: http://thetoasterproject.org/

For a real case of someone making integrated circuits at home, this guy actually did it: http://sam.zeloof.xyz/

Nice illustrations, though step 16 is pretty much "Draw the rest of the F** owl."

I'm pretty sure producing the monocrystalline ingot with nine 9's purity also counts as the "rest of the owl", along with basically every other step in this guide. I'm not even convinced that the sawing step is simple – thermal stresses, impurities, tool hardness and dimensional accuracy concerns, minimising material losses, etc. (I loved his butter knife).

Yeah fair. Pretty much every part of the process is bonkers. Just the light sources draw something like 1 megawatt and weigh over 100 tons (for extreme ultraviolet). EUV gets absorbed by everything including air, which is bad, but the light source requires plasma so somehow you need both the plasma and vacuum in the same machine without a window or whatever to separate them. Then the features on the chip are so small that random variance in the spatial distribution of landing photons causes defects. And that's just the light, it's amazing anyone can make the process work at all.

The knife was a great touch! It did look like he was lining it up carefully...

Also, it took me two passes through to notice the tooth brush in step 16.

As a film photography enthusiast, I love the parallels between making a chip and film development. I wonder if film developing was the original inspiration.

Photo etching is used for all kinds of things including making plates for printing, I'd guess that's the more direct ancestor.

The direct ancestor is probably the mask-and-tape PCB design process, where the optical step also serves to shrink the (hand-made) mask to the significantly smaller size of the final board. In fact, silicon masks were for a long time designed using essentially the same process.

In fact, if you want to make something mostly flat with small features, some variation of the photo-etching process tends to be the easiest and most repeatable way to go about it.

Here's how you make a vacuum tube:


After all that hard work, sell it at a few cents per chip.

I found this absolutely hilarious and I’m not entirely sure why, just the tone of the instructions as if it’s an ordinary thing to make in an afternoon. Brilliant.

The article doesn't correspond to reality.

>3) Now you have 98% concentrated silicon dioxide. Purify it to 99.9% pure silicon dioxide.

>4) Purify it further to 99.9999999% polysilicon metal.

>While cutting-edge nanometer scale features are not likely to be accessible for a hobbyist, micron-scale amateur chip fabrication does appear to be quite feasible. I have not tried this myself, but Sam Zeloof has, and you should definitely check out his YouTube channel. I think you could probably even build some basic chips with far less equipment than he has if you get the optics right. You could probably make it a hobby business selling cusom chips to other tech people!

>A Word Of Caution: In case it wasn't already clear, I don't advise that anyone actually attempt making integrated circuits in their apartment in the manner shown in this video. The 'photoresist' and 'developer solution' in this video is just a colored prop. The real chemicals are usually hazardous and you should only work with them with proper safety gear in a well ventilated area or in a fume hood.

It's outdated, and in reality you would go to Shenzhen or use a custom fab to make custom-designed chips, with raw materials sourced from special exotic suppliers that only make sense for a scaled operation.

I highlighted steps 3 and 4 because that's not how it's done at all. High-grade silicon is obtained in a pure state and doped for the chips, rather than obtaining random types and refining them.

It's not even easy compared to homemade nuclear reactors, which need a lot of natural uranium to enrich but can be done; the refinement described is more related to older germanium chips.

Somewhat related, it does not seem too expensive (~$100 per chip, minimum *50) to get an ASIC done (purely from a fab perspective and ignoring the large elephant in the room of coming up with a design in the first place), e.g. see https://www.zerotoasiccourse.com/ (no affiliation).

As for the design, one way is to re-use existing IP and join it together, e.g. see https://efabless.com/ etc.

He’s explaining, as crudely as possible, what’s going on in the semiconductor industry as a whole. Even Shenzhen and TSMC are only small parts of it.

High purity polysilicon is still produced by zone refining.

Yes, it is refined and doped. My point is that he leaves out the sourcing of essential materials that you don't refine but must obtain already pure; it has to be purer than any random silicon. It's like saying "just get any sand": people will assume beach sand works, as it does for construction, when it's essential that the grains are not worn or rounded. Except this is so specialized that it must come from one mine in the world. The limit was well understood in Soviet Russia: unable to source raw materials of comparable purity, the USSR could not produce comparable microchips even when the design was the same, because it lacked what is considered essential to chips of good quality. https://www.sciencedirect.com/science/article/abs/pii/S00652...

Even today China cannot match the quality of American chips, and it relies on US raw material from Spruce Pine for manufacturing them. It isn't optional; it's like saying you can make a 1:1 steak with nothing but beef bones or chicken. https://ashvegas.com/bbc-report-spruce-pines-high-quality-qu...

>This ultra-pure mineral is essential for building most of the world’s silicon chips – without which you wouldn’t be reading this article.

The BBC article is referring to high purity quartz, not silicon. It is used to make a quartz crucible - a large cup - of that high purity quartz. It’s essential that the crucible not contaminate the polysilicon.

(A note in passing: the semi industry doesn’t use hyperpure silicon. They use a lesser grade and add epitaxial layers.)

The crucible can stand the high temperature of molten silicon. The purified polycrystalline silicon is melted in the crucible. Then a single crystal ‘seed’ is dipped in the molten silicon and slowly withdrawn, while rotating. That’s how you make a high purity single crystal silicon ingot.

Before it’s zone refined, the poly is synthesized by reducing high purity silane gas (SiH4), which was in turn produced from quartz sand.

It would be interesting to know if the industry is still using natural quartz crucibles - the latest wafer size is now 450 mm - nearly 18 inches. Maybe someone else here can comment whether the traditional pulling process will be used at 450 mm.

You're right, I mixed up silica and silicon, my mistake. I found the Wired article, now paywalled, where I mixed those two up. https://web.archive.org/web/20180808115837/https://www.wired...

>It would be interesting to know if the industry is still using natural quartz crucibles - the latest wafer size is now 450 mm - nearly 18 inches. Maybe someone else here can comment whether the traditional pulling process will be used at 450 mm.

Is that why they're round even though chips are square? Are the round parts thrown away or do they use them somehow? Reminds me of chicago style thin crust. https://en.wikipedia.org/wiki/Chicago-style_pizza#Thin-crust...

I had not heard of Spruce Pine - thanks for the references! In the very interesting Wired article you linked, I notice that the quartz goes to GE, which spins it into crucibles. So, I guess the size is unlimited.

The current silicon ingots are amazing - cylinders a foot and a half in diameter and maybe six feet long. They are handled with cranes.

The ingots are so large and the chips so small that there isn’t much waste. The edges are often used for test patterns.

The chips don’t have to be rectangular, but it’s easier, because they are separated with diamond saws or wire saws, which cut straight lines.

Some companies make photosensors in weird shapes using ultrasonic cutters.

Even the traditional disco cutters are probably worth their own book

the smaller the heavier :)

For those who want more, the Wikipedia article on semiconductor manufacturing is actually pretty good: https://en.wikipedia.org/wiki/Semiconductor_device_fabricati...

It’s really really fun doing this in software. You should try it.

Fetch decode execute cycle. Registers. Memory. An instruction set. An assembler. And plug it all into an emulator to watch your factorial(n) at work!
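The whole recipe really does fit in a page of Python. A minimal sketch (the instruction set here is invented, not any real ISA):

```python
# A toy fetch/decode/execute loop over a tiny register machine.
def run(program, r0=0):
    regs = {"r0": r0, "r1": 0, "r2": 0, "r3": 0}
    pc = 0
    while pc < len(program):
        op, *args = program[pc]          # fetch + decode
        pc += 1                          # default: fall through
        if op == "li":                   # load immediate: rd = imm
            regs[args[0]] = args[1]
        elif op == "mul":                # rd = ra * rb
            regs[args[0]] = regs[args[1]] * regs[args[2]]
        elif op == "addi":               # rd = ra + imm
            regs[args[0]] = regs[args[1]] + args[2]
        elif op == "bgt":                # branch to target if ra > rb
            if regs[args[0]] > regs[args[1]]:
                pc = args[2]
        elif op == "jmp":                # unconditional branch
            pc = args[0]
        elif op == "halt":
            break
    return regs

# factorial(n) with n in r0: r1 accumulates the product, r2 is the counter.
FACTORIAL = [
    ("li",   "r1", 1),           # 0: acc = 1
    ("li",   "r2", 1),           # 1: i = 1
    ("bgt",  "r2", "r0", 6),     # 2: if i > n, goto halt
    ("mul",  "r1", "r1", "r2"),  # 3: acc *= i
    ("addi", "r2", "r2", 1),     # 4: i += 1
    ("jmp",  2),                 # 5: loop
    ("halt",),                   # 6
]
```

run(FACTORIAL, r0=5) leaves 120 in r1. An assembler on top of this is just text-to-tuples, and from there you can grow the ISA in whatever direction amuses you.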

Here’s one someone else made earlier:


It’s rubbish. So is yours, but you’ve got to build it first before you can brag.

Mentioned a couple other times in this thread but I don't think anyone has linked Sam Zeloof's video of essentially doing this: https://www.youtube.com/watch?v=IS5ycm7VfXg

It's fantastic and I highly recommend watching it.

So if I somehow time travelled to ~1800, there's no way I'm making my own modern computer?!

Jokes aside, it's fascinating to see how complex computers are under the many layers of abstraction that we've built on top of them.

You can start with figuring out how to make good insulated wire and then try to invent the generator and the transformer.

More info on homemade raw sources and processes:


I was successfully able to follow step 1 and 2. Grab a rock. Crush a rock. Easy peasy.

Gave up on step 3: purify. Glad I did: each step after got more and more ridiculous.

This is absolutely gold. Very informative and hilarious pics.

Anyone else stuck at step 3? :)

Such a fun article! Thanks

Looks like magic

Maxwell and Lord Kelvin would have no idea what it was all about.
