Maybe I was just looking at the wrong job listings, but the technical difficulty (relatively low-level programming, manual memory management, dealing with janky firmware, antiquated toolchains, and incomplete documentation) seemed way out of proportion to the compensation being offered.
At least in comparison to other types of coding you could be paid to do.
The only answer I could come up with was that unit profit margin * sales volume imposed a much lower cap, relative to pure software products?
To be functional and independent at it, you need:
- a very solid grasp of C/C++,
- a good understanding of the concept of a state machine,
- a decent grasp of digital electronics and communications protocols,
- some comfort in debugging (or at least diagnosing) electronics issues.
These skills take years to accumulate.
At the same time, I keep hearing the claim online and IRL (particularly among SWEs who try to get into embedded) that "it's just software - you don't need a good understanding of electronics" to be a competent embedded systems engineer. These are also the same people who get hung up on pretty fundamental problems with simple solutions, like:
- I2C buses missing pullups
- Power supplies not delivering enough current (Hi, everyone who's ever tried to drive a huge NeoPixel panel from a RasPi alone)
- IO logic levels not meeting input specs (e.g. driving a 3.3V input with a 1.8V signal)
Maybe I'm needlessly cranky but I've seen a non-trivial number of people go from "It's just software" to "OMG what do I do help" in the face of very basic hangups like these. It highlights to me that folks who have the breadth to do all of these things comfortably are extremely valuable, and hiring companies know this.
There is a "Computer Engineering" major in my university. C/C++ and state machines are taught in a freshmen year course. However, digital electronics / communication protocols / debugging is never taught in class. Practically everyone who know what SPI or I2C means is either an electronics hobbyist or member of a student competition team.
I personally am very interested in the "software for hardware" realm, but it is a little difficult to find people doing interesting approaches. It's usually all Verilog or all VHDL on FPGAs with a smattering of HLS, all C/C++ embedded software, all high-level programming languages with soft real-time on single-board computers or something similar, or somewhere in between with something like LabVIEW. But there's usually almost no overlap between the people that do work in each of those categories.
There are a lot of constraints in these systems, and almost all of the innovation has come at sort of the chip level in terms of hardware performance. The design and development process, tools, and experience have seen little innovation, as far as I can tell. Everything remains hard in embedded systems. I don't see any reason why it needs to be that way though.
Because nothing works by default on embedded systems, you have to implement it yourself most of the time. If you can clear that hurdle, your company is unlikely to be open source friendly. If you can clear that hurdle, people on other teams won't have time to adopt it for their systems, so it won't spread. If you can clear all of that, all you've won is a single tool. In the meantime, the rest of the industry has written half a dozen.
There are things you can do here, but... yeah.
Also, if CPU time is scarce, following pointers is slower than calculating offsets, but that's usually not an issue worth thinking about.
Another reason to keep memory contiguous is that fragmentation can be much more punishing (sometimes), especially when latency is important. On large devices it doesn't matter much if you're under-utilizing the CPU while waiting for RAM, because the OS will do other things and then let you do a lot of work all at once. Embedded processors will have nothing else to do.
It seems like allocating nodes out of a fixed size array wouldn't be too bad, though.
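For what it's worth, here's a minimal sketch of that idea in C (the pool size and payload type are illustrative): the nodes live in a static array sized for worst-case usage, and the free list threads through the unused slots, so there's no heap and no fragmentation.

    #include <stddef.h>
    #include <stdint.h>

    #define POOL_SIZE 32  /* illustrative; size for worst-case usage */

    typedef struct node {
        struct node *next;
        uint16_t     payload;  /* whatever the list carries */
    } node_t;

    static node_t pool[POOL_SIZE];
    static node_t *free_list;

    /* Thread the free list through the static array once at startup. */
    void pool_init(void)
    {
        for (size_t i = 0; i < POOL_SIZE - 1; i++)
            pool[i].next = &pool[i + 1];
        pool[POOL_SIZE - 1].next = NULL;
        free_list = &pool[0];
    }

    /* O(1) allocate: pop the head of the free list (NULL when exhausted). */
    node_t *pool_alloc(void)
    {
        node_t *n = free_list;
        if (n != NULL)
            free_list = n->next;
        return n;
    }

    /* O(1) free: push the node back onto the free list. */
    void pool_free(node_t *n)
    {
        n->next = free_list;
        free_list = n;
    }

Memory use is bounded at compile time, and exhaustion shows up as an explicit NULL instead of heap fragmentation.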
On devices with 1 GB of memory, the contiguous structures are usually better, but embedded developers often still reach for linked lists. On devices with 1 kB of memory, trained software engineers will scoff at linked lists and instead waste hundreds of bytes on large buffers sized for peak usage.
There are alternatives, of course. Your particular target might perform poorly with linked lists, so you might make your malloc use a different internal data structure. Or you might have plenty of flash but not much RAM, so you might trade code size for runtime overhead. Or... There are lots of alternatives, but a linked list is the simple default.
If you've got 6 bytes of free code space and/or you're scrambling for RAM, you look at everything for space savings.
If the hardware team comes to you apologetically and says, "We're sorry, but for cost reasons we have to /double/ the amount of DRAM and flash and give you a faster, more power efficient SOC" then you may not have to worry about this level of detail.
Both of these have happened to me on prior projects. There are no fixed answers.
(Whenever I've gotten one of these types of questions in an interview, I try to have a broader conversation on the subject rather than supply the answer they're looking for. If that goes badly, it's a big red flag.)
College curricula vary wildly in quality and content. I don't know of one that quite teaches you everything you need to know to be a functional embedded developer.
Is it because linked lists can be slow and can sometimes carry arbitrary data?
I feel like strong typing and limiting pointers in favor of runners combat this.
Most of the people I know in embedded development are greybeards. There aren't a lot of younger people going into the field. The schools don't seem to be prepping a lot of people for these kinds of jobs. (not sure what we're going to do because a lot of these guys are in their 60s and about ready to retire)
> These skills take years to accumulate.
Indeed. Which is why the low pay is all the more puzzling, but I have theories:
As mentioned above, most embedded devs are greybeards. They don't move around much. They probably aren't as familiar with the much higher pay people are getting at the FAANGs, etc. They aren't as likely to have a LinkedIn profile, or if they do, they don't spend much time there. As an older dev myself, it's not like I'm clamoring for higher pay: my mortgage payment is quite low compared to rent (and will be paid off completely next year), my cars are old but paid off, and I don't have any debt other than the soon-to-be-paid-off mortgage. All this to say that older devs maybe don't make as much money because they're not as aware of market rates, and they don't really feel like they need a lot more money if they're happy where they're at.
The pay is there. I'm not sure why there's this persistent notion that it's not.
I generally agree, but "cache line" is a pretty low bar though, no? I'm a web developer with no formal CS background and I've known what a cache line is since the first year I started to learn to code. Every textbook on my shelf that even tangentially relates to the kind of work embedded devs do makes reference to cache lines, and god knows how many articles I've read over the years.
Training people from what is probably nothing is a pretty big bet on the part of a team and employer...
I don't disagree with any of this, I just think there's a certain "base" of knowledge for certain jobs that needs to be so well understood that it's automatic. For an embedded engineer I think "cache line" is part of that base, in much the same way I would consider it a hard requirement for any incoming frontend web developer to know what the DOM is.
So if you're hiring for MCU work (bare metal or RTOS) the concept of a cache line isn't important. If you're hiring for application processor work (often Linux), it's very important.
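For anyone wondering why it matters on application processors specifically, here's a hedged C11 sketch of the classic failure mode, false sharing (the 64-byte line size is an assumption; check your core's documentation):

    #include <stdalign.h>
    #include <stdatomic.h>

    /* Assuming a 64-byte cache line, which is typical for x86-64 and
     * many ARM application cores, but not universal. */
    #define CACHE_LINE 64

    /* Two per-thread counters. Without the alignment they could land on
     * the same cache line, and two cores updating them would bounce that
     * line back and forth between caches ("false sharing"), silently
     * wrecking throughput even though there's no logical sharing at all. */
    struct counters {
        alignas(CACHE_LINE) atomic_ulong thread_a;
        alignas(CACHE_LINE) atomic_ulong thread_b;
    };

On a cacheless MCU none of this exists, which is exactly the point about the two kinds of hiring.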
I look around at the 6-figure web dev jobs and it honestly breaks my heart. I look at embedded and engineering roles (ones that interest me anyway) and it's rarely 6 figures. I just don't have the mental fortitude to deal with web dev monotony.
The only other time I got bad pay was game development.
So, my experience is: the more technical the job, the less you get paid (but it's more interesting/demanding). Look at middle management: why do they get paid better? (Btw, I've been a CTO too.)
If you move to be the guy that interfaces between developers and business people, it's not monotonous: each group is dysfunctional in its own unique way.
You either stress over it, or you don't take it too seriously and have fun :D
Any idea how you can do that? I've noticed companies are very picky about who they put in those spots, like needing you to have experience in the same industry or connections in the company.
Business analysts, solution architects, presales engineers, product owners, and sometimes even project delivery managers and developer team leads have to deal with this.
You might have better luck getting into one of these positions at a consultancy, because there each customer has a separate set of requirements and they often don't have enough of these people.
This is why some web devs have 3+ full-time jobs.
Not true at all. Pay has nothing to do with trendiness; it's the other way around: being a webdev is trendy because of the high pay, great benefits like WFH, and the low bar to entry. Meanwhile, embedded dev has a higher bar to entry due to the difficulty of the work, lower relative pay, and less flexibility in terms of WFH, making the jobs untrendy.
And no, the outsourcing of most HW-related dev work to China has had a big impact in lowering embedded dev wages in the west, coupled with the fall of the great electronics giants in the west. When I started uni to become an EE, there were Nortel, BlackBerry, Siemens, Ericsson, Nokia, Motorola, Sagem, Philips, etc. developing HW and mobile devices in Europe, or the west generally, and hiring like crazy. Fast forward six years to when I finished my Master's, and most of those companies had either gone bust, become just brand names for Chinese OEMs, or become sweatshops for a far-east workforce, keeping only some of their sales and senior management in the west.
The EE market in the west has gone way down in the last 10-15 years, in comparison to the web dev market, which went way up. The only western HW companies making insane profits are Apple and semi titans like ASML, Intel, Nvidia, AMD and Qualcomm, while the Japanese, Korean and Chinese companies fight for the rest of the scraps and most of the European ones have thrown in the towel completely. The commoditization of HW and FW dev has meant the commoditization of dev wages as well.
After graduation, some of my colleagues went into mobile app dev (the iPhone had been out for a few years, but smartphones were far from the norm) and are now making several times what I do as an embedded dev at the same level of YoE. Talk about betting on the wrong horse. I still can't stop kicking myself for choosing such a poor career path, and I wonder if I can still switch, as most companies seem reluctant to hire a thirty-something embedded senior to do junior web dev work usually done by a teen or twenty-something out of bootcamp.
If you are based in UK, drop me a line.
If you are an embedded developer with experience, you probably understand the difference between a linked list and an array; I am interviewing 'senior' web devs and half don't.
Many 'boring' companies, like corporate accounting or whatever, need developers badly and pay maybe 70-80% of what 'fashionable' ones do. It could be a good place to break in.
From your answer I take it you're well compensated. From another embedded dev who is not well compensated, may I ask what market you are in?
It was difficult to find a company that compensated well. A lot of the non-tech companies that rely on embedded developers like automotive and consumer electronics companies pay shit.
The large tech companies all have need for embedded developers, for example Apple, Google, Nvidia and Amazon.
That's absolutely insane to me. In EU I make 55K (considered a great mid-senior wage in my area) and could make up to 75K for senior, if I move to an expensive city in Germany and work at a top company but that's the top end of the market. There's no amount of interviewing or job hopping that could get you further as an embedded dev IC here. Higher pay is reserved for management positions which are tough to get. Even making six figures as an embedded IC here is impossible, let alone 200k+
When I was looking for a new job last year, I had a hard time finding any company that would even match my compensation. Luckily for me, large tech companies have an absurd amount of money these days and are paying absurd salaries.
Are you comparing apples to apples?
Does a mediocre Java developer make as much as a mediocre embedded systems developer?
Does an elite Java developer make as much as an elite embedded systems developer?
I find your excessive faith in businesses disturbing.
Markets aren't rational, and the wage-labour market isn't either. Sometimes there's no good reason for something, only a chain of consequences that don't necessarily have to make logical sense from a utility perspective.
Embedded systems are part of tangible products. Those have an order of magnitude greater non-recurring engineering costs, plus the cost of components, manufacture, packaging, inventory, distribution, and after-sales service and support. Due to the huge up-front costs, volumes are generally limited by capital funds. So net profits are likely to be in the 15-20% range.
When you look at the differences, it becomes clear why tangible product companies are unable / unwilling to pay like software-only companies.
In contrast, the scale, and thus the complexity and cost, of embedded tends to be limited by physical constraints.
Seems worth the investment from a rational standpoint.
I seriously doubt it. In all my EE gigs, untrained EEs weren't shipping anything; they were there learning and helping the graybeards ship. In fact, you need a lot of industry experience, and learning from the graybeards, to ship products that will actually sell for profit and not flop due to some weird edge case that didn't cross the junior's mind to look for before shipping. The answers to the most difficult issues are earned through years of blood, sweat and tears burning the midnight oil with an oscilloscope in hand, not a Google search away on Stack Overflow.
If you're into actual product development and not consulting work, getting untrained EEs to ship products is the shortest path to your company going bust in no time.
And this is not specific to IT - many academic researchers have to make ends meet with almost minimum wage.
I've also wondered how employers see them from the business perspective. It's hard to attach a 'business profit contribution' value to the 'quality of work' they produce, except that things don't break down and stuff gets done on time.
Furthermore, if the embedded code is the sort where once it's done, it's done, and little to no maintenance is required, the business people might see no need for the guy to be around. Maybe someone could clarify if this is true.
It's about in line with what electrical engineers make on average, slightly higher, at least in my location. The market places more value on people who can write code to give you a nice shopping cart interface over the ability to design digital control systems for engine control units to make sure your car can drive properly. It's just what we, as participants in a market, value as truly important. To give some more insight: look at the value of Twitter. A control system for a weapons-grade nuclear centrifuge - a weapon of mass destruction - is valued less than a 280-character limit.
I think there's been a lot of market distortion over the last 15 years or so due to the way money has flowed. VCs haven't been very interested in funding startups making hardware so they tend to be self-funded as in funded on a shoestring. There's not the money sloshing around for these companies.
So, from that huge pool (big supply) there's plenty for the few that are needed (little demand); enough even that will choose that domain over a better-paying one for interest's sake, so it ends up not having to pay as much.
Hardware eng. unsatisfied with $ -> ask their friends in the other 2 cos in the same niche how much they are being paid -> cry
Bootstrapping a software business is trivial compared to any other engineering business. That, plus the internet's existence, is what makes dev so well paid.
In hardware world that translates to contracting or hiring a Chinese team to clone your competitor and eat the lower-end of the market. This is especially true for development boards, hobbyist and supply chain stuff. Digikey vs LCSC, Sparkfun/Adafruit vs AliExpress/Taobao, Arduino vs clone boards.
Sometimes the dumb clone company gets big enough and becomes a legit player in the Anglosphere. Say, Seeedstudio or ALINX (FPGA dev boards).
Also EE here. Because in EE you need to invest a lot of funds into equipment, prototyping, manufacturing, RMA, shipping, logistics, customer support, etc. only to make pennies in profit when you actually sell your finished products. It's the opposite for SW dev work: developing and selling it costs you next to nothing (you just need a laptop and an internet connection), and once you've sunk the cost of developing it you can scale virtually infinitely and sell across the globe with low costs and high margins.
Long story short, it's a lot more profitable to make apps where you get consumers to click on ads, buy stuff, or invest their savings into meme stocks or crypto, than to build and sell physical products to them (unless you're Apple).
I think it’s because the software is generally simply considered part of the BOM and thus a cost to be reduced. Also, the hardware side typically rule the roost in such projects, and hardware engineers tend to look down on software developers because, after all, how hard is it to just type code in (if you’ve ever looked at code written by EEs…ugh).
As supporting evidence: the pay is better at companies that do take software seriously. Examples: at one class, Cisco; at another, FB, Google, Apple…
Additionally, I'd add that many embedded developers work for hardware companies that don't understand or care about software; why pay so much more for something whose value you don't see, especially when they already think they're paying "way more" (it's more than we pay EEs!)?
Finally, Embedded Software Engineers often do quite well. We don't pay people based on how challenging their jobs are.
With much of embedded you don't need to worry about updates, maintenance, logs, security, multiple cores.
The quality of the code submitted to me was atrocious.
Any halfway decent Chinese firmware/embedded developer is likely going to work for a bigco where they get paid well, like Tencent or Huawei.
So parent's point still stands that Chinese competitors along with customers in the west who prioritized low cost over everything else, destroyed the embedded dev market in the west.
I'd accept "undercutting", but that's a corporate/macro pricing trend, not a trend for salaries among embedded devs in China. I suspect many companies who are undercutting pricing are getting funded by the Chinese government. They're still paying people quite well, just because they need talented engineers just as much as their competitors do.
Maybe you can but they don't exist where I live in EU and that's the only market I can speak of.
Because there is no demand.
Because today, you get all the software along with hardware.
And that software is already written somewhere in China, by same people who make the hardware.
Google, Facebook, Snapchat, Microsoft and Amazon all run their hardware R&D in China, for the obvious reason that the hardware is already being manufactured there. They all copy Apple, and surely think they can become Apple if they do like them.
With this foundation, one can understand a microcontroller as a big digital circuit. A CPU executes instructions by using a digital circuit. The instruction's opcode selects the appropriate circuit to operate on the instruction's operands. Microcontrollers also have peripherals like timers, serial communications (SPI, I2C, UART), and analog-to-digital converters. All of these are digital circuits that are configured by writing bytes to registers. Those registers can be thought of as the inputs and outputs of those circuits, just as if they were built with discrete flip-flops and logic gates. In fact, many microcontroller datasheets provide diagrams of the logic circuitry for these peripherals.
Once you can truly understand how the hardware operates, writing the code to configure and operate that hardware is pretty straightforward.
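Here's a hedged sketch of what "writing bytes to registers" looks like in C. The peripheral, base address, and bit positions below are invented for illustration; on a real part they come straight from the datasheet's register map.

    #include <stdint.h>

    /* Hypothetical memory-mapped timer peripheral. The base address and
     * register layout are made up for this example. */
    #define TIMER0_BASE 0x40001000u

    typedef struct {
        volatile uint32_t CTRL;   /* control: enable and mode bits     */
        volatile uint32_t PRESC;  /* clock prescaler                   */
        volatile uint32_t COUNT;  /* free-running counter (read-only)  */
        volatile uint32_t FLAGS;  /* overflow/match flags              */
    } timer_regs_t;

    #define TIMER0 ((timer_regs_t *)TIMER0_BASE)

    #define CTRL_ENABLE (1u << 0)  /* hypothetical bit position */

    void timer0_start(uint32_t prescaler)
    {
        TIMER0->PRESC = prescaler;    /* divide the input clock          */
        TIMER0->CTRL |= CTRL_ENABLE;  /* set the enable bit: conceptually,
                                         flipping the flip-flop that gates
                                         the counter circuit */
    }

Each field in that struct is exactly the "input to a digital circuit" described above; volatile tells the compiler the hardware can change these values behind its back.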
Good stuff. I wonder if the author has recorded new lectures. Probably not, since the website still links to 1ed...
edit: oh, and there are new levels. There goes my afternoon...
I normally dislike videos and prefer books, but these were well done. They did an excellent job of pointing out the specific abstractions of each component as they are introduced.
I'm switching to full-time C++ backend and GUI development (so far it was 70% embedded and 30% this). I find that there is much more complexity in developing server-client systems, which optionally integrate with other systems, than there is in writing low-level code (maybe I'm just good at writing low-level code). It's more intellectually rewarding, and you get to use higher-level concepts. There's always room for advancement, and computer programming is deep in a way that embedded engineering never will be. I've worked on embedded systems for 6 years and I've tried pretty much everything there is to try, from the lowest level to graphics from scratch, RTOS development and networking.
Embedded pay isn't worth the pain of fighting hardware and working with people who don't know how to write code, writing code.
Can confirm, I'm one of the only two CS grads at our robotics company and I'm vaguely in charge of all the web and GUI related stuff that EE grads don't know much about.
Wages in other fields aren't that much higher here in Europe though.
Basically the job market consists of Apple, Lab126 and perhaps Logitech in the US. The rest of the job market is either in Asia or in niche industrial markets or early stage robotics companies.
That said - I do think having an embedded background - particularly embedded Linux - can actually improve your systems level knowledge - since you have to get into Linux drivers, the networking stack, package managers, etc.
It never dawned on me that different buses run at different speeds and you literally have to set up the clocks. Heh, and before that you have to write the assembly that jumps to the address where main() is located (basically your own boot loader). I wouldn't say it's harder than web dev per se, but it's just very different.
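For the curious, here's roughly what that pre-main() step looks like on a Cortex-M-class part, sketched in C. The linker-symbol names follow the common CMSIS/GNU convention, but they vary by toolchain:

    #include <stdint.h>

    /* Symbols defined by the linker script (names vary per toolchain). */
    extern uint32_t _sidata;  /* start of .data's initial values in flash */
    extern uint32_t _sdata;   /* start of .data in RAM                    */
    extern uint32_t _edata;   /* end of .data in RAM                      */
    extern uint32_t _sbss;    /* start of .bss in RAM                     */
    extern uint32_t _ebss;    /* end of .bss in RAM                       */

    extern int main(void);

    /* The first code that runs after power-up: before this, global
     * "variables" don't exist yet as far as C is concerned. */
    void Reset_Handler(void)
    {
        /* Copy initialized globals from flash into RAM. */
        uint32_t *src = &_sidata;
        uint32_t *dst = &_sdata;
        while (dst < &_edata)
            *dst++ = *src++;

        /* Zero uninitialized globals (.bss). */
        for (dst = &_sbss; dst < &_ebss; dst++)
            *dst = 0;

        /* Clock and bus setup typically happens around here, then... */
        main();

        for (;;) ;  /* main() should never return on bare metal */
    }

On Cortex-M the hardware loads the stack pointer and jumps here via the vector table, so no hand-written assembly is strictly required; on many other architectures you do write that jump yourself.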
Then I just bought a feather from Adafruit, started using micropython, and never looked back. Much better, although a buddy of mine holds his nose whenever I talk about it.
The first is "Can you build your code, for the correct target with the right cross compiler. Is your linker script set up correctly etc.". This isn't too bad once you've done it once, as long as your toolchain isn't proprietary and the target is sane.
But the second part is painful every time: "What code do I need to run at startup so that my hardware is correctly configured?" This is before you even write a driver. It is so specific to the particular chip, and even to what you're doing with it, that it always takes time to get right.
I'll tell you that as a "pro" in this field, any time I can skip the second part, I will. STM32 chips, for instance, need a lot of clock config and selecting of the right muxes, because the chips are very flexible as to which peripherals can come out of which pins. ST has a graphical tool called STM32CubeMX where you can select your chip, configure the function for each pin, and have it generate a full project with startup code. We threw away the project and just used the startup code, regenerating it occasionally when we changed something. It saved so much time and suffering, and it's a big reason why I'll always choose an STM32 if I can.
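To give a flavor of what that generated code replaces, here's roughly one pin's worth of clock-and-mux setup at the register level on an STM32F4 (written from memory, so double-check the bitfields against the reference manual for your exact part):

    #include "stm32f4xx.h"  /* CMSIS device header from ST */

    /* Route USART2 TX out of pin PA2: clock gate, pin mode, then mux. */
    void usart2_tx_pin_init(void)
    {
        /* 1. Ungate the GPIOA clock; peripherals are dead until clocked. */
        RCC->AHB1ENR |= RCC_AHB1ENR_GPIOAEN;

        /* 2. Put PA2 into alternate-function mode (2 MODER bits per pin). */
        GPIOA->MODER &= ~(3u << (2 * 2));
        GPIOA->MODER |=  (2u << (2 * 2));

        /* 3. Select AF7 (USART2) on PA2 via the alternate-function mux
         *    (4 AFR bits per pin; AFR[0] covers pins 0-7). */
        GPIOA->AFR[0] &= ~(0xFu << (4 * 2));
        GPIOA->AFR[0] |=  (7u   << (4 * 2));
    }

And that's one pin, before any baud-rate or system clock tree configuration. Multiply by every pin and peripheral in the design and the appeal of a generator becomes obvious.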
In production, where you have deadlines to build firmware that is set to ship on millions of units and your company's future is on the line, you just buy as many Windows Keil/IAR licenses as you need and get on with your work, regardless of how you feel about proprietary software.
I remember the lengths a former employer went to just to fit code into the 32 kB limit of a free dev tool because they didn't want to shell out for the unlimited paid version.
We had the opposite experience: we paid for IAR since it was the only compiler that generated a binary compact enough to fit in the 256 KB version of the chip and not have us spend extra to upgrade to the more expensive 512 KB SKU.
If you're laughing at this, do note that at scale, costs add up significantly even when we're talking about $0.50, and a license for a good compiler and debugger can have an amazing return on investment.
None taken. Not just poorly funded, but often penny-wise and pound-foolish. That's one of many reasons they're a former employer.
I do work in the field, yes; I've worked in automotive, semiconductors and IoT.
>Because in my experience, it's not like that at all.
In what way is it "not like that"?
That's because in automotive the microcontrollers are more complex, with multiple cores of different types doing different things, so the build process is more complex: the final binary goes through several steps, like patching in various calibration data, FPGA bitstreams, cryptographic keys, fuses, debug data, etc. But even so, those custom toolchains still call a commercial compiler via the command line, using some Perl/Python scripts, at the end of the day.
You can script the same steps with IAR or Keil if you wish; their compilers support the same command-line functionality. It's just that in IoT we mostly used their solution as-is, since it worked well with their debuggers out of the box, meaning quicker time to market.
Automotive development has a lot more cruft in general due to legacy, safety and legal requirements, plus BS like the entire AUTOSAR stack needing to be added to each build whether you need all of it or not, so a lot of inefficiencies build up over time through this inertia, leading to some insanely complex and lengthy build processes.
This board looks good for a beginner since it has interesting sensors built in:
Shopping for electronic parts is kind of involved actually. It seems like a kit with a bunch of basic electronic components would be a good idea.
I know a lot of people hate debuggers, but the information they provide is extremely useful.
You can also do real debugging on AVRs. I believe it's not actually JTAG, but you can still use breakpoints, watch variables, inspect memory, and step through code.
The official 328 breakout board uses debugWIRE, I think, but you get a nice USB interface. I recommend the dev boards; they're very cheap and quite capable (all pins broken out, prototyping areas, usually a few LEDs, and you can use the debugger). They also have a footprint for Arduino shields!
Watching variables is flaky in my experience, but hardware breakpoints and memory inspection work fine.
It seems like getting to "blink an LED" early is important for a reasonable "getting started" experience? I'd like to have a real debugger, but it looks awkward to set up compared to just plugging a microcontroller into a USB port.
While a debugger is helpful, you can still work at the register level without one. You can simply print the register values to the serial port. In fact, if you wanted to, you could create a serial console that would allow you to set the registers over the serial port.
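As a sketch of that approach on an ATmega328P (register names are from avr-libc; the baud divisor of 103 assumes a 16 MHz clock and 9600 baud, per the datasheet formula):

    #include <avr/io.h>

    /* Minimal polled UART so we can peek at registers without a debugger. */
    static void uart_init(void)
    {
        UBRR0 = 103;                             /* 9600 baud at 16 MHz    */
        UCSR0B = (1 << TXEN0);                   /* enable the transmitter */
        UCSR0C = (1 << UCSZ01) | (1 << UCSZ00);  /* 8 data bits, no parity */
    }

    static void uart_putc(char c)
    {
        while (!(UCSR0A & (1 << UDRE0)))  /* wait for an empty data buffer */
            ;
        UDR0 = c;
    }

    /* Print any 8-bit register's value as two hex digits plus newline. */
    static void dump_reg(uint8_t value)
    {
        static const char hex[] = "0123456789ABCDEF";
        uart_putc(hex[value >> 4]);
        uart_putc(hex[value & 0x0F]);
        uart_putc('\n');
    }

    int main(void)
    {
        uart_init();
        dump_reg(TCCR0A);  /* e.g. inspect timer 0's control register */
        for (;;) ;
    }

From there it's a short step to the interactive console mentioned above: read a register name or address over the UART, poke the value, print it back.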
For something leaner and more "clean", while staying with Adafruit, their take on the RP2040 might be good. The official Pico is still just $5, which might make it an easier buy for some.
For something even smaller, I rolled my own ATtiny214 board, but it's not presentable at the moment (and it's slightly more "hardcore" since it's not programmable over straight USB). Also, it of course drops back down to 8-bit land again.
There are also printed versions.
Getting started with embedded systems is being repeatedly told "you're wrong" or "that won't work" until you give up and do something else.
And indeed, just like "making games" can be attractive for young people but far less so as an actual job (from what I've heard), embedded can be fun when playing with IoT-like toys, but it actually needs a lot of patience and tenacity when you do it as a job (from experience).
No disdain here. Those MCUs and SoCs are used in actual products, but here they are not used "seriously".
I had no idea I was doing embedded systems for so long.