
I'd start with learning a hardware description language and describing some hardware. Get started with Verilog itself. I'm a fan of the [Embedded Micro tutorials](https://embeddedmicro.com/tutorials/mojo) -- see the links under Verilog Tutorials on the left (they're also building their own HDL, which unless you own a Mojo board isn't likely of interest). Install Icarus Verilog and run through the tutorials making sure you can build things that compile. Once you get to test benches, install Gtkwave and look at how your hardware behaves over time.
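To make that concrete, here's a minimal sketch of what the tutorials will walk you through: a tiny synthesizable module plus a test bench that dumps a waveform for GTKWave. (Both modules shown in one snippet for brevity; the filenames in the comments are just illustrative.)

```verilog
// counter.v -- a minimal synthesizable module
module counter (
    input  wire       clk,
    input  wire       rst,
    output reg  [3:0] count
);
    always @(posedge clk) begin
        if (rst)
            count <= 4'd0;
        else
            count <= count + 4'd1;
    end
endmodule

// counter_tb.v -- the test bench; simulation-only constructs are fine here
module counter_tb;
    reg clk = 0, rst = 1;
    wire [3:0] count;

    counter dut (.clk(clk), .rst(rst), .count(count));

    always #5 clk = ~clk;          // 10-time-unit clock period

    initial begin
        $dumpfile("counter.vcd");  // waveform file for GTKWave
        $dumpvars(0, counter_tb);
        #20 rst = 0;               // release reset after two cycles
        #200 $finish;
    end
endmodule
```

With Icarus Verilog that's roughly `iverilog -o counter_sim counter.v counter_tb.v && vvp counter_sim`, then `gtkwave counter.vcd` to watch the counter tick.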

You can think of "IP cores" as bundled up (often encrypted/obfuscated) chunks of Verilog or VHDL that you can license/purchase. Modern tools for FPGAs and ASICs allow integrating these (often visually) by tying wires together -- in practice you can typically also just write some Verilog to do this (this will be obvious if you play around with an HDL enough to get to modular design).
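The "just write some Verilog" version of that integration looks like the sketch below -- structural instantiation, tying module ports together with internal nets. (`some_core` and `another_core` are hypothetical placeholders for licensed cores.)

```verilog
// top.v -- wiring two (hypothetical) cores together structurally;
// this is essentially what the visual integration tools generate.
module top (
    input  wire       clk,
    input  wire       rst,
    input  wire [7:0] din,
    output wire [7:0] dout
);
    wire [7:0] stage;  // internal net tying the two cores together

    // instantiate by name, connecting ports explicitly
    some_core    u_core0 (.clk(clk), .rst(rst), .d(din),   .q(stage));
    another_core u_core1 (.clk(clk), .rst(rst), .d(stage), .q(dout));
endmodule
```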

Just writing and simulating some Verilog doesn't really give you an appreciation for hardware, though, particularly as Verilog can be not-particularly-neatly divided into things that can be synthesized and things that can't, which means it's possible to write Verilog that (seems to) simulate just fine but gets optimized away into nothing when you try to put it on an FPGA (usually because you got some reset or clocking condition wrong, in my experience). For this I recommend buying an FPGA board and playing with it. There are several cheap options out there -- I'm a fan of the [Arty](http://store.digilentinc.com/arty-a7-artix-7-fpga-developmen...) series from Digilent. These will let you play with non-trivial designs (including small processors), and they've got lots of peripherals, roughly Arduino-style.
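A classic example of the synthesizable/non-synthesizable divide, sketched below: `#` delays and `initial` blocks describe simulation events, not hardware, so the first module simulates happily and synthesizes into nothing useful, while the second derives its timing from a real clock.

```verilog
// Simulates fine, but is NOT synthesizable: the '#500' delay
// has no hardware meaning and gets optimized away.
module bad_blink (output reg led);
    initial led = 0;
    always #500 led = ~led;
endmodule

// The synthesizable version: divide down an actual board clock.
module good_blink (
    input  wire clk,   // e.g. a 100 MHz board clock
    input  wire rst,
    output reg  led
);
    reg [26:0] div;
    always @(posedge clk) begin
        if (rst) begin
            div <= 0;
            led <= 0;
        end else begin
            div <= div + 1;
            if (div == 0) led <= ~led;  // toggles every 2^27 cycles
        end
    end
endmodule
```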

If you get that far, you'll have discovered that's a lot of tooling, and the tooling has a lot of options, and there's a lot that it does during synthesis and implementation that's not at all obvious. Googling around for each of the phases in the log file helps a lot here, but given what your stated interest is, you might be interested in the [VLSI: Logic to Layout](https://www.coursera.org/learn/vlsi-cad-logic) course series on Coursera. This talks about all of the logic analysis/optimization those tools are doing, and then in the second course discusses how that translates into laying out actual hardware.

Once you've covered that ground it becomes a lot easier to talk about FPGAs versus ASICs and what does/doesn't apply to each of them (FPGAs are more like EEPROM arrays than gate arrays, and for standard-cell approaches, ASICs look suspiciously like typesetting with gates you'd recognize from an undergrad intro-ECE class and then figuring out how to wire all of the right inputs to all of the right outputs).

Worth noting: getting into ASICs as a hobby is prohibitively expensive. The tooling that most foundries require starts in the tens-of-thousands-per-seat range and goes up from there (although if anyone knows a fab that will accept netlists generated by qflow I'd love to find out about it). An actual prototype ASIC run once you've gotten to packaging, etc. will be in the thousands to tens of thousands at large (>120nm) process sizes.

A quick warning about Embedded Micro: do not use the "Mojo IDE". It's extremely barebones, and lacks many important features -- in particular, it has no support for simulation workflows or timing analysis.

If I correctly understand what you are saying, it could be possible to make a small custom chip for doing some crypto, capable of a little more than what those smartcards offer, for under $10k? That would be awesome from a trust perspective, at least if you could realistically compare the chip you get back with what you know to expect, using an electron microscope.

Not for an ASIC without spending a LOT on tooling, and really $10k is awfully optimistic even if you had all of that tooling (I probably should've just said tens of thousands).

For <$100k, yes, you can absolutely do a small run in that range.

Honestly, you might be better off just buying functional ICs (multi-gate chips, flip flops, shift registers, muxes, etc.) and making a PCB, though. Most crypto stuff is small enough that you can do a slow/iterative solution in fairly small gate counts plus a little SRAM.

If you do that, why wouldn't you use an FPGA or just a fast CPU? Microcontrollers and CPUs are blending in performance, and are cheap enough to plop on a board and call it done for many applications.

Sure, but if you really want to avoid trusting trust (and you're of the mind to build your own hardware), FPGAs and µCs offer a lot of room for snooping.

Given the GP's suggested use, it seemed trusting trust was not on the table.

Certainly even a tiny FPGA can fit pretty naïve versions of common crypto primitives, as can any modern micro-controller. Assuming you only need to do a handful of ops for whatever you're looking to assert/verify, that is by far simpler than building a gate-level representation :)
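For a sense of scale, a naïve primitive really is small: below is a sketch of one ChaCha20 quarter-round as pure combinational logic -- a handful of 32-bit adds, XORs, and fixed rotations, which fits comfortably even in tiny FPGAs. (This is an illustrative fragment, not a vetted crypto implementation.)

```verilog
// One ChaCha20 quarter-round, purely combinational -- a naïve
// example of the kind of primitive that fits in a small FPGA.
module qround (
    input  wire [31:0] a_in, b_in, c_in, d_in,
    output wire [31:0] a_out, b_out, c_out, d_out
);
    // 32-bit rotate left by a constant amount
    function [31:0] rotl(input [31:0] x, input integer n);
        rotl = (x << n) | (x >> (32 - n));
    endfunction

    // a += b; d ^= a; d <<<= 16;  c += d; b ^= c; b <<<= 12;
    wire [31:0] a1 = a_in + b_in;
    wire [31:0] d1 = rotl(d_in ^ a1, 16);
    wire [31:0] c1 = c_in + d1;
    wire [31:0] b1 = rotl(b_in ^ c1, 12);

    // a += b; d ^= a; d <<<= 8;   c += d; b ^= c; b <<<= 7;
    assign a_out = a1 + b1;
    assign d_out = rotl(d1 ^ a_out, 8);
    assign c_out = c1 + d_out;
    assign b_out = rotl(b1 ^ c_out, 7);
endmodule
```

Iterate that over the state in a small FSM and you have the core of the cipher in a few hundred LUTs plus a little SRAM.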

I was thinking about a chip with only SRAM for secret storage that could be bundled into an ID-1-sized card with a small energy store for the SRAM (there are affordable 0.5mm LiPo cells that fit inside such a card). The card would then also carry a small display capable of putting out a little data, as well as a touch matrix, possibly in a style similar to the carbon contacts on cheap rubber-membrane keyboards, but gold plated like the smartcard interface. But it seems you can't afford to store even one decompressed Ed25519 key, let alone RSA, so the idea is moot by virtue of requiring sub-100nm technology to fit at least some SRAM.

In this use case, keeping the SRAM contents inaccessible to probing is very important. The benefit of this over some µC or FPGA is that you can account for every speck under a scanning electron microscope, and on the high-resolution X-ray you made earlier while the chip was still in its package, which you can compare against all the chips you will use and ship. It is sadly easy to backdoor a chip with just a single hidden gate.

I actually want to understand the chip manufacturing process - design, prototyping (using FPGAs, etc), baking process in foundries. And also at least a basic understanding of how IP is managed in chip industry - like "IP core" is a term that I frequently hear but due to the myriad interpretations available online, I don't really understand what an "IP core" really means. Hoping to get useful advice from veterans in the chip industry.

I was about to write a thorough answer, but jsolson's very good answer mostly covered what I had to say. I'll add my 2 cents anyway.

- To understand the manufacturing process you need to know the physics and manufacturing technology. That's very low level stuff that takes years to master. There are books and scientific papers on the subject, but it's a game very few players can play (i.e. big manufacturers like Intel and TSMC, maybe top universities/research centres). You can try some introductory books like Introduction to Microelectronic Fabrication [1] and Digital Integrated Circuits [2] by Rabaey if you're curious.

- You can design ASICs without being an "expert" on the low level stuff, but the tools are very expensive (unless you're a student and your college has access to those tools). You need to learn VHDL/Verilog, definitely need the right tools (which, I'll say it again, are too expensive for hobbyists) and extra money to spend for manufacturing.

- FPGAs are different. No need to know the physics and you don't have to bother with expensive tools and foundries, the chip is already manufactured. My advice is to

(a) get an FPGA board. Digilent boards are a good choice as jsolson said, but there are other/cheaper solutions as well [3][4]. You'll still need software tools but (for these boards) they are free

(b) learn VHDL and/or Verilog. Plenty of books and online resources. A good book I recommend is RTL Hardware Design Using VHDL [5]

(c) I assume you know some basic principles of digital design (gates, logic maps, etc.). If not you'll need to learn that first. A book is the best option here, I highly recommend Digital Design [6] by Morris Mano

[1] https://www.amazon.com/Introduction-Microelectronic-Fabricat...

[2] https://www.amazon.com/Digital-Integrated-Circuits-2nd-Rabae...

[3] https://www.scarabhardware.com/minispartan6/

[4] http://saanlima.com/store/index.php?route=product/product&pa...

[5] http://eu.wiley.com/WileyCDA/WileyTitle/productCd-0471720925...

[6] https://www.amazon.com/Digital-Design-3rd-Morris-Mano/dp/013...

The website http://semiengineering.com/category-main-page-sld/ also has lots of interesting articles about the semiconductor industry.

