Hacker News
Learning hardware programming as a software engineer (athrunen.dev)
324 points by diffuse_l on Dec 25, 2019 | hide | past | favorite | 46 comments



That's a good article for the simple reason that it is written by someone who still had enough 'outsider perspective' that it can help guide other people new to the material in ways that old hands never could. It is impossible for me to tell a person new to this material what the pitfalls will be because most of what they will be struggling with is obvious to me and it would never even occur to me to explain it.

The site is a bit slow to load, but if you are new to working with Arduinos and other small boards like that and want to play around with them, it is worth the wait.

Title nit-pick: I'd have used 'low-level programming' rather than 'hardware programming'; programming hardware has a different meaning.


+1 for the nit-pick. Hardware programming makes me think of FPGA design, Verilog, etc. I'd call this 'embedded programming'.


Thank you, the surprising amount of attention this got kinda hugged my website to death earlier ^^

And TIL that there is a distinction between programming on the hardware and programming the hardware, might be something to look into


I actually thought this was about learning Verilog or another HDL. I was a little disappointed; maybe I should write a blog about my experience learning an HDL as a software engineer.

Nonetheless, their account of learning embedded systems programming is still an interesting read! Especially coming from a language like C#.


Yes, I ran into the same confusion. I was expecting a nice beginner's tutorial for HDL/FPGA development (would be nice to dump on some of my colleagues); instead this is more directed at newbies to microcontrollers. Still nice, but it perhaps needs a different title.

Aside: I'm curious what resources you used to teach yourself HDL/FPGA development. I have a formal background, so I usually give poor explanations to people without one; it's hard to figure out where to start when people ask questions.


Thank you, now I want to learn about HDLs. ^^

Language-wise, coming from C# and learning C or C++ for low-level programming was quite smooth; coming from Python was the hard part, as that meant giving up a lot of comforts.

But as I am also learning some micropython, I might write about porting stuff to it in the future.


Since you mention MicroPython, TinyGo (i.e. Go instead of Python) might be interesting as well. :)

https://tinygo.org

https://github.com/tinygo-org/tinygo


Yes, it is a different experience, because in software programming we let system programs like the OS handle the hardware and we don't manipulate it directly. And computer hardware is standardised.

Whereas when doing hardware programming, you necessarily have to understand the individual hardware to program and control it - there is often no abstract software layer in between to make the task easier. And the hardware is quite diverse.


For any high-level software engineer looking to get into embedded development: a lot of Arduino and ESP tutorials do not teach you fundamentals. I recommend spending some time (if not coding along, then at least watching) on this course: https://www.youtube.com/playlist?list=PLPW8O6W-1chwyTzI3BHwB...


> For any high level software engineer, looking to get into embedded development: A lot of Arduino, ESP tutorials do not teach you fundamentals.

Even books. I read a couple of Arduino books that would write something along the lines of "we need a pull-up resistor to keep the pin from floating" without ever explaining what floating pins, pull-up, and pull-down resistors are. The concepts are not all that hard; they are just often not properly defined or explained.


Thank you for sharing this resource. Arduino guides are so opaque without a foundation. Watching this series as we speak.


I can wholeheartedly recommend designing your own PCBs as well. It's a lot of fun, and routing the tracks on the PCB is surprisingly calming.


I am currently designing PCBs for my next post using EasyEDA, anything to pay attention to?

I'm especially concerned about track width, clearance, and so on. For 12V and, I think, a max of 8A, I calculated 2mm width, 0.6mm clearance, 1.5mm via diameter, and 0.75mm via drill diameter.


You can use this for calculating trace widths. I think 2mm is too narrow with 2oz copper and definitely too narrow with 1oz.

https://www.4pcb.com/trace-width-calculator.html

Most PCB houses can handle 8mil (0.2mm) traces and clearance with no problems.

The best way to handle vias in high-current designs is to avoid them: run the trace from point A to point B on the same layer. The second-best way is to use planes for distribution. Other things you can do: make sure there isn't a thermal relief, and use multiple vias or via arrays.


Make sure you have your fab's clearances entered in; I had PCBs come back shorted because my clearances were smaller than what the fab could deliver. 2mm sounds too narrow for 8A unless your trace length is small; my calculator says 5mm+.

Other than that, it's pretty foolproof, so enjoy the process!


This is why I went with a Computer Engineering degree instead of only a CS degree. I love software development, but I wanted to stay as close to the metal as possible.


Can't view the Web site (500 Bad gateway).

In any case, I sort of went the other way. I started in hardware, and gravitated towards software.

I think that each discipline could learn from the other, as most hardware these days relies on software as a critical part of its operation, and a lot of software is written as if the hardware on which it depends doesn't exist.

I look forward to reading it, once the site is back up.

Happy Christmas!


Just got around to reading it.

Good stuff. Welcome to my world, although I seldom work on any kind of firmware, anymore. I'm mostly about driver-level stuff, and I try to leverage platform transport, as much as possible.

I have blown up $40,000 (in 1987 dollars) devices with software errors.

FUN!


The largest negative ROI over time I've ever seen was when someone accidentally dropped a dime into a prototype image processing rig in the 80's. Months of work and very exotic hardware gone in a fraction of a second.


> The largest negative ROI over time

Why do you have to obfuscate the language so much with technical jargon? Just say 'the costliest mistake'.


Because 'costliest mistake' isn't equivalent, and it isn't nearly as funny that way.

Just imagine a whole team of people working on a combination of hardware and software, some of the hardware prototype level floating point gear obtained directly from the manufacturer under NDA (that's non disclosure agreement) and then someone decides to drop a dime into the machine.

The dime is the investment; the time it took to cause the damage is the factor you would normally use to indicate your gain on that investment. In this case, in a couple of milliseconds a few hundred thousand dollars was gone beyond recovery. Project down the drain.

I guess when you explain it, it is no longer that funny, and HN is more than capable of decoding the original.


One of my old bosses said he dropped a scope probe and shorted 120VAC onto the 5V bus - on a system with an 8008 emulator (cost: $25k) plugged into the processor socket. Smoke poured out of the emulator and the breaker tripped.

Fortunately, a heroic 7400 quad NAND gate sacrificed itself to save everyone. He said all that was left was the leads sticking out of the PCB.

My cousin hooked up VCC and GND backwards and blew up a prototype RISC processor. Only a dozen working ones in existence. Please sir, may we have another?


Reminds me of when rockets launching to space blow up instead, taking out their payloads (months/years of work). ;)


it's toxic but "(that's non disclosure agreement) " made me actually LOL


My favorite part of hardware development is that you learn the true meaning of "bricking". People often use the term when something is recoverable, having never felt the pain of 1-2 engineer years worth of assets turning into paperweights in an instant.

How on earth did you blow something up from software? I'm guessing you were touching power delivery from software for some reason?


Way back in the early 80s, if you programmed the wrong settings into a video chip, you could drive a monitor out of spec and severely damage it (I seem to recall several computers of this era were susceptible to this).


Yup. Those were the days before everything had switching power supplies.

I was designing ATE systems, and I accidentally delivered 240 volts to a 110 volt configuration, and discovered the fuse didn't work.


Excerpts:

"The most basic pins are digital pins, they can only either be on or off. You would, for example, use them to check if a button is pressed. Or if you use them as output, turning a led on or off."

[intermediate content deleted for brevity...]

"And while those pins cannot output a true analog signal, they can use a technique called PWM to approximate one by only switching the signal on for some time."

Fascinating! Knew about digital pins before this, and PWM generally speaking (with respect to PC fans, power supplies, etc.) -- and yet, I did not proverbially put "2 + 2 together" in my head! (For some inconceivable reason I always conceptualized PWM as analog/multiple waves per unit time, in nature...) Yup, makes a ton of sense!

Anyway, thanks for the great article!


Right now is a great time to learn about lower level, "on the metal" type programming! Has been for a while now.

People, like Ben Eater, are making great kits.

Components are not so expensive. You can (and will) fail. Get more, get gear, try again.

One can get good gear, scope, meter, solder station, etc... at good prices. Used pro gear is a good option too.

Making things, signal generator, logic analyzer, are also fun projects.

And there are many programming options from assembly to Python showing up.

I love this stuff and often couple it with retro computing. The speeds are low enough to make most things possible for relative newcomers.

Jump in. It is fun!

Great post OP.


https://blog.athrunen.dev/content/images/2019/10/ESP32-Pinou...

Weird, the breakout pin between IO21 and IO19 is silkscreened GND, but the label says NC, not connected. The dev board datasheet[0] says that pin is not connected, which would be a fun trap for someone who tried to use that pin as a ground, but a different schematic on the site[1] says it's tied to module ground. They don't seem to have a PDF of the actual PCB anywhere I can find.

0: https://www.espressif.com/sites/default/files/documentation/...

1: https://dl.espressif.com/dl/schematics/ESP32-Core-Board-V2_s...


Also, might wanna learn some basic electronics.

With very modular devices, it's easy to avoid that - since you're basically building stuff like with Lego bricks, and things are abstracted from you - but if you need or want to build your own sensors or customized setups, you'll need to know a thing or two about electronics / circuit analysis and design.


I’ve tried diving into this in my spare time. Where do you start if you have a web dev background?


Learning Arduino might be a good starting point, since the boards are easy to use and popular. You'll probably need to learn C, since Arduinos (or rather the microcontrollers on the Arduino boards) are programmed in C or assembly.

There is a cost for equipment and parts when working with hardware.

Creating effects in the physical world with your code feels awesome. Learning and building physical tech is just awesome all around.


See this thread for pointers: https://news.ycombinator.com/item?id=21871026


Any suggestions for self study?


For books, I’ve read “Make: Electronics” and “Teach Yourself Electronics”. Both were decent, but I’m currently going through MIT’s 6.002 class online to go more in depth.

I would also be interested in some good self study suggestions.


See this thread for pointers: https://news.ycombinator.com/item?id=21871026



Funny thing is, when "software engineer" came along as a term it was a bit fuzzy, but here in NZ a lot of people took it to mean someone who had an engineering background and understood something of electronics, mechanical, chemical, or civil engineering; a number of courses gave SE a common first year with the other engineering disciplines. That was more my background: I did software / electronic engineering and went into embedded systems for electronic/mechanical/software type systems. These days I do more "full stack" systems development, but I'm still heavily involved in embedded development.

But of course, software engineer has no particular definition and lots of people use the term to describe themselves, even if they have no real engineering background.


My advice is to read the board and MCU datasheets. Check the MCU's peripherals and how they can be used in your application. Some MCUs have a PPI (Programmable Peripheral Interconnect); it is clever to use it, as peripherals can then operate independently of the processor, especially combined with direct memory access (DMA). Learn bitwise operations, as you'll need them to configure the peripherals' registers. Use a logic analyzer to confirm the bitstream and pinpoint errors.

Lately I started experimenting with radios. It is difficult. Let me know if someone has a good guide they read.


I don't feel like creating a userid on his website just to tell him this, but one thing he seems to have missed that may have caused some of his problems is that the ESP32 I/O pins are 3.3V and NOT 5V tolerant. Elsewhere in the article he seems to assume 5V levels (when he's talking about using PWM to output 2.5V from a 5V pin). With an ESP32 or ESP8266, when you need to interface with 5V devices you should use a level shifter.


Thanks for that information, but I learned that the esp32 is not 5V tolerant the hard way. ^^

Will fix the bit about the PWM in a second


That might be something good to mention in the article as something to watch out for. You mention checking the specs, but not that kind of spec issue.


When I wanted to learn hardware programming for a similar small project, I took an online course [1]. It took a few months, but it gave me a pretty solid framework to start from.

[1] https://www.edx.org/course/embedded-systems-shape-the-world-...


All those multi-function pins! That implies a boot-time setup where you tell the system-on-a-chip what each pin is supposed to be doing. Lots of bit-field constants to construct. Often more trouble than hooking up to the pins.


This is such a gem of a post. I learned the hard way too and I wish this post was posted 5 years back! :) All the best to the author for his future projects.



