My Business Card Runs Linux (thirtythreeforty.net)
2595 points by rcarmo 25 days ago | 397 comments

Wellington Ma's business card was a rectangular slice of pink synthetic quartz, laser-engraved with his name, 'The Ma-Mariano Agency,' an address on Beverly Boulevard, and all kinds of numbers and e-mail addresses. It arrived by GlobEx in its own little gray suede envelope while Rydell was still in the hospital.

"Looks like you could cut yourself on it," Rydell said.

"You could, many no doubt have," said Karen Mendelsohn, "and if you put it in your wallet and sit down, it shatters."

"Then what's the point of it?"

"You're supposed to take very good care of it. You won't get another."

-- William Gibson, Virtual Light

Thought that was familiar.

Gibson's Law: If a shatterable business card is introduced in Act I, it will be sat upon and shattered by Act III.

I get serious Snow Crash vibes from this.

It's from Gibson's "Virtual Light", a vastly better book than Snow Crash. Highly recommended!

> a vastly better book than Snow Crash.

I've yet to read anything by Gibson that approaches anything by Stephenson

Other than the ending of every book Gibson has ever written, compared to the ending of any of Stephenson's works.

That's where I thought it was from, lol


Loved the article, but loved even more to learn about MicroPython.

As an amateur programmer who has never actually handled physical hobby microcontroller boards, but has read a lot about them, I was amazed to be able to try out the emulator at https://micropython.org/unicorn/

Make the LEDs blink, see how servo control works, etc., all in the browser, without having to actually buy any hardware! I had never seen anything like this.
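The sort of first program you'd try in that emulator is a blink loop. Just to illustrate the shape of it: on a real board (or in the emulator) you'd toggle a `machine.Pin`; here a plain list stands in for the LED so the logic runs under any Python:

```python
import time

# A list stands in for the LED; on MicroPython hardware you'd use
# machine.Pin(2, machine.Pin.OUT) and led.value(state) instead.
states = []

def blink(times, period=0.0):
    """Toggle the stand-in LED `times` times, recording each state."""
    state = 0
    for _ in range(times):
        state ^= 1          # toggle on/off
        states.append(state)
        time.sleep(period)  # on a board you'd sleep ~0.5s

blink(4)
print(states)  # → [1, 0, 1, 0]
```

The pin number and sleep interval are made up; the emulator's own examples show the exact `machine` calls for its simulated board.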

Would love it if others could point me to more advanced emulators of hardware microcontrollers (if such a thing exists!)

MicroPython is fantastic for education and rapid prototyping with hardware - definitely try it out! It can be difficult to build complex long-running applications because of heap fragmentation after GC and the VM's frequent dynamic allocations, but the Python syntax makes it all worthwhile. It also has a supportive, friendly community where it's easy to get help, and the official PyBoards are great.
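The usual way around the fragmentation problem is to allocate your working buffers once at startup and reuse them forever. A minimal sketch of the pattern (the buffer size and `fill()` helper are invented for illustration; the same code runs on CPython and MicroPython):

```python
import gc

# Allocate once at startup so the long-running loop never asks the
# allocator for fresh memory.
BUF_SIZE = 64
buf = bytearray(BUF_SIZE)
view = memoryview(buf)      # slicing a memoryview avoids new allocations

def fill(source, out):
    """Copy bytes into the preallocated buffer; return bytes copied."""
    n = min(len(source), len(out))
    out[:n] = source[:n]
    return n

n = fill(b"sensor reading", view)
gc.collect()                # collect at a quiet moment, not mid-loop
print(n, bytes(buf[:n]))    # → 14 b'sensor reading'
```

Calling `gc.collect()` at a point you choose, rather than letting collection fire mid-operation, also makes latency more predictable.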

MakeCode is also similar to what you describe - check out Adafruit's Blockly/JS simulator:


Note that CircuitPython is Adafruit's fork which focuses more on education than industrial prototyping. So even within the embedded Python ecosystem, you have options.

Embedded development is so easy these days, it is just fantastic. The barrier to entry, in both cost and complexity, is so low now that just about anyone can apply physical automation to the problems that they see in their life and community. I'm excited to see what emerges in the next couple of decades.

We've made some pretty capable hardware at my job on top of the pyboard/micropython. We've outgrown it in certain cases but it's really powerful being able to build something for potential critical business needs (like one off testing platforms) without spending weeks rewriting drivers.

Thanks for the Adafruit tip. My 9yo daughter is learning Scratch, so the block-based programming is going to be great to try out with her.

Not exactly what you're asking for, but you may enjoy Shenzhen I/O and other titles from Zachtronics: http://www.zachtronics.com/shenzhen-io/

yeah good tip =) i already knew them but thanks anyway!

Hardware is so cheap and accessible these days, just get an Arduino board. There are lots of kits available with sensors and actuators, and it's all so easy to use it's like building with Legos.

If you've never tried Python, with its REPL, for the kind of project you'd use an Arduino for, it's definitely worth giving it a shot. A different kind of game than C.

Today an ESP32 is more bang for your buck: a dual-core CPU at up to 240 MHz, Wi-Fi, Bluetooth, and so on.

Edit: and you can program it with the Arduino IDE

To me it's truly mind-boggling that a tiny $1.42 chip contains almost everything needed to boot Linux - a 500 MHz CPU, 32MB of SDRAM, a 2D GPU, SD/MMC support, and a USB controller - all inside a 10mm x 10mm package. It makes me really want to get into embedded development.

Calling this an embedded system feels like an insult to the spirit of 'running light without overbyte', because it's so comically capable.

My first PC had a 66/33 MHz 80486 processor, 8 MB of RAM, and a 320 MB HDD. You could run AutoCAD, play Doom and Civilization, and dual-boot Linux and Windows 3.11. It could compile Linux from source.

You were living large! My first Linux computer was a 20Mhz 80386SX with a 40MB hard drive partitioned half for Windows 3.0 and half for Linux (SLS) and swap. It had a whopping 4MB of RAM and I also compiled my own kernel making sure to trim down features as much as possible. It was magic seeing X11 come up for the first time just like the big-bucks Sun boxes!

Hah, you had it great! My first PC was a 286, which IIRC either had no HDD or a 20MB one (might be mixing it up with our next PC, which was a 386).

I'm pushing 40, and still shake my head in wonder sometimes about just how much is possible at such small scale and cost.

But this is now turning into "when I were a lad..." :)

Oh, you kids!

My XT had an 8088 @ ~4 MHz in it, and we pushed it uphill to school both ways, in the snow!

It was multi-functional though. The power supply was inefficient enough to heat the room, while simultaneously being heavy enough to bludgeon large creatures for sustenance.

Something inside me misses the radiant heat, sitting here on a cold day with my chilly metallic laptop.

My Heathkit H89 had a Z80, 16K of RAM and a built in cassette interface. And we liked it.

I just flipped a bank of 16 switches by hand with vacuum tube gates and core memory...punch cards are for sissies!

You had a computer? Back then I wrote C programs on paper, then executed them by hand...

One of the main causes of the fall of the Roman Empire was that, lacking zero, they had no way to indicate successful termination of their C programs.

C? All I had was a single book on MS-DOS, so I wrote batch files on paper, then executed them by hand.

(Not even joking, by the way - literally how I got into computers and programming.)

oh kids! i used to crank the wheel by hand on my analytical engine that my hacker friends and me built after there was a leak of babbage's plans! he never knew!

you kids and you petabytes of ram! back in my day....


(i'm looking forward for these comments to be commented on a certain website i'm not supposed to name)

you kids. We calculated with charcoal on the cave's wall while the adults were out hunting for wooly mammoths...

ZX81: Z80/1KB RAM... more recently though PIC12 (C501 IIRC) ...

I'm slightly surprised nobody has yet linked https://xkcd.com/505/

There's a website where people make fun of HN comments? Is it Twitter?

The prime directive of said website is to never name it on hacker news. But I'm pretty sure you can Google the prime directive itself and figure out where it comes from. As a hint, it starts with "ng".

more hints?

Webshit, once a week. I won't go further :)

Start a few Electron apps and it'll warm right up

Few? You can only run 1! We have yet to invent the technology to run two, let alone a few!

I honestly thought you were gonna reference the Four Yorkshiremen sketch by Monty Python after the second line!

Predates Python by a couple years. It’s from “At Last The 1948 Show” from the 60s.

Didn't know that. Thanks for sharing.

Run a threadripper with the window open!

laughs in 6502, dances in 68k, cries in 486

Mine was too, albeit with a 40MB HDD and a CD-ROM!

Ran Windows 3.1, Civilization (VGA graphics), Railroad Tycoon, Star Trek 25th Anniversary - and could have them all installed at the same time. Other programs included Championship Manager 93 and, I think, Day of the Tentacle.

But it wouldn't run Linux - that needed a 386 or above.

I was a kid when my father first acquired a similar computer.

We ran Windows 3.0 and my father configured a boot entry for running Doom or Ultima 7, which required so much memory that it wasn't possible to run Windows, only DOS.

I remember feeling a bit envious of my neighbor having the more powerful 486, which could run Doom much faster than our PC.

Made me laugh about "Dad configured a boot entry"... I remember the hours with my Dad trying to stuff whatever device drivers into that 640K via Autoexec.bat and Config.sys. Got the CD-ROM running, but damn if the sound didn't then stop working. Those were the days, trying to get (in my case) TIE Fighter to work.

Worst part was my Dad, in his wisdom, bought a family PC without a Pentium but with a Cyrix P166 instead. Floating point performance was hopeless, so it ran like a damn dog on any 3D game.

Any Brits out there might remember Time Computers. On every back page of any newspaper ever selling PoS computers with whatever subpar hardware they could cram into a big beige box ;-)

Not to brag but I got a 4.77MHz 8086 XT PC for my 8th birthday :) It had a whopping 256KB of RAM, a monochrome display (later upgraded to a 4 color CGA that blew my mind), and two (yes, TWO) Floppy drives.

My dad got one of those for us both to use...what was truly "whopping" about it was its price...

If I remember right it was almost $3k...but that included that blazing fast 300bps Hayes modem for wide-area networking.

And my mind was truly blown when we replaced the 2nd floppy with a more-memory-than-we-will-EVER-need 20MB hard drive...what could we do with all those bits??

Yeah the prices were crazy! My first was a 10Mhz AMD 286 clone with 1mb and a 40mb HDD. I recall it being $2600 with a 13” SVGA monitor.

Same! We got 640kb of ram and a mouse! The HDD didn’t arrive until we got a 286, tho.

And a decade before that, the distributor DigiKey was a one-column advertisement in Popular Electronics, selling 7400 DIP ICs for $0.29. Inflation-adjusted, that $0.29 now buys an IC that can run Linux.

> 8086 XT PC for my 8th birthday

Wow, I admire the far-sightedness of your parents, to give a child a computer at such a young age (and in those early days of PCs).

I was of a similar age when my dad brought home an NEC PC 9801 - with an 8086 CPU (8MHz), 640KB RAM, kanji (~3000 characters) font ROM, even a Japanese word processor. I think it ran MS-DOS 2~3.

"In 1987, NEC announced one million PC-98s were shipped."

That was a big wave, and I'm so glad my parents let me play with a computer as a "toy" - it was a huge influence on my mental development.

Kinda like the monolith moment in 2001: A Space Odyssey. :)

My parents had no idea, and they couldn't afford it anyway :) My dad asked his cousin, who happened to be working with PCs way, way back and recommended he get a PC instead of a C64 which is what everyone else had. My dad asked his dad and my grandfather forked over what must have been an insane amount back in 1980's Israel.

Aww, that's so nice that your dad asked around and pooled resources for you. And I'm sure grandfather knew it was a worthy investment in your future.

My friends around that age all had a wildly popular video game console called Nintendo "Fami-Com". https://en.wikipedia.org/wiki/Nintendo_Entertainment_System

My parents refused to buy it, and instead let me play with the PC-98, where I learned BASIC, Turbo Pascal, even some 8086 assembly language.

I suppose if I have children, I'd discourage mobile phones/apps and instead give them Raspberry Pi, microcomputers, sensors, devices they can build stuff with.

This time of year is ripe for nostalgia. I recall that it was just about this time in 1993 during Christmas break that I loaded Linux version 0.96(?) onto this 386SX machine. This involved taking a stack of 1.44MB floppies and driving to a place where I had access to the internet. I'd choose SLS packages and copy each to an individual floppy. Then I'd drive home and load them one by one until I had a bootable system. And of course with floppies, it was inevitable that you'd get an error now and then. So, back to the car, download, copy and repeat. All to get away from the limitations of Windows 3.0...

Talking about Christmas, we used to go to my aunt's on Christmas day and I remember the time they bought an Amiga for my cousins. We were very young so between games we used to have a good laugh with the program that did text-to-speech (can't remember the name right now), but it only spoke English and we were easily amused by typing phrases in Italian and having them read back as if it was English (so the pronunciation was all messed up).

Same here, a Packard Bell, but IIRC it was 20/40 MHz with 40 MHz being the turbo. Before that I did have a Zenith 8088 dual 5.25" floppy system that had to boot DOS from one disk and run programs off the other (mostly written in GW-BASIC).

Mine was a Mac IIsi - 20MHz 68030, 1MB RAM. To run Linux, you needed to install a daughtercard with a hardware FPU.

My first few PCs weren't even PCs, I didn't know what they were until I was a little bit older. Somehow I intuited them naturally though; I liked exploring and being curious about what I was looking at.

First one I remember was a Commodore 64, along with a 600 page book full of BASIC that you could type out and record on cassette to have your own game. The book was total gibberish to anyone else; it was just pure code with no explanation. But that's what the C64 gave you; an interactive environment on boot where you could program a new game and write it to a cassette. By default. If you wanted to play a game you had to type `RUN` or maybe one or two other things to set it up. But you wouldn't know that, because you just had an interpreter on a basic blue and white screen.

Worst bit was the 10 minutes of spasmodic strobe animations that showed you the game was loading. But also each game controlled those loading animations. You had to know what game you wanted to play, and be sure of it, or otherwise you could just flip to the B-side and get a different game.

After that I think we had a BBC Micro at school, but I'm not sure. All I remember is an adventure game and one of those classic 5.25" floppies. I still really love the look and feel of inserting a floppy and manually locking it in. Floppies, cassettes, VHS tapes, and MiniDiscs were truly fantastic for the time. They were still mechanical, unlike CDs.

Then on my dad's side I and my siblings got an Acorn PC and a bunch of random floppies. None of them made any sense but some of them did cool things when you ran them. I remember hiding from my family and putting in the floppies that made flashing colours and watching it until time passed.

Must have been 11 or 12 years old before we first got a PC and by that point I was utterly fascinated. It was some shitty off-the-shelf eMachines thing but it was the best we could get; I managed to retrofit a decent graphics card in it a little bit later.

There's embedded devices with an order of magnitude more CPU and two orders more RAM, if you're being strict about what counts as "embedded". If you're not, probably another order of magnitude on each.

For example Nvidia's Jetson AGX Xavier. 512-core NVIDIA Volta™ GPU, 8-core ARM, 16 GB RAM,...

Well now I feel old. I remember buying a 486 with 8MB and thinking "I'm really living in the future now!" The mhz were ridiculous and -- according to my memory -- most instructions ran in a single cycle too! (Warning: my memory is not a reliable gauge of the state of computing at the time.)

8MB was pretty extravagant but it turned out to be a good call even though it could be had for half the price within a few years.

'computing progress' seems mostly resolution increase.

Lucky! We had the same 486 system but only a 40MB disk. I had to spend some lawn mowing money to get a 400MB second drive so I could install Sid Meier’s Gettysburg. :)

Oh geez I can't imagine how long it would take to compile Linux on that slow of a processor.

Remember there was a lot less to the Linux kernel at the time, so it wasn't too bad. The single-threaded compile time for Linux has stayed fairly consistent for a while, since the tree grows about as fast as the average desktop improves. The big improvement in compile time has come from being able to compile with 8-32+ threads.

I once read somewhere that the time to compile Linux from source on high-end desktop hardware has remained remarkably stable over the years.

Yes, that's true. It has been O(1 hour) for pretty much the entire history of Linux. And if you think about it, this makes sense. If it were significantly less than that, the development cycle would speed up so it would be easier to add capabilities, which would slow down the build. If it were significantly longer, the addition of new features would slow down until the hardware started to catch up. So the compile time acts as a sort of natural control mechanism to throttle the addition of new features.

I'm not sure that is the primary reason -- many other projects have had their compile times explode over many years, even though the same logic should apply.

Not to mention, if you build the kernel regularly, you benefit from incremental compilation. If you change a few non-header files, the rebuild time can be as little as 2-3 minutes. Oh, and "make localmodconfig" will reduce your from-scratch compile times to the 5-15 minute mark. I highly doubt most kernel devs are building a distro configuration when testing (especially since they'd be testing either in a VM or on their local machine).

> many other projects have had their compile times explode over many years

Like, for example?

When I worked at Microsoft, Windows took far, far longer than an hour to build from scratch. I remember walking a few buildings over to the burn lab to pick up DVDs of the latest build. I don’t have any hard data, but running a full build on your dev box was very rarely done.

The dynamics of commercial projects can be very different from open-source. You're much more likely to tolerate sitting through long builds if you're being paid to do so.

Until management notices, does the X hours saved multiplication by number of developers, and the resources for improvement will be found soon.

Definitely not. People are far more expensive than machines.

Was it just the windows kernel? Or did it include all of the utilities in Windows?

It's a lower bound for a highly active project.

I'm convinced that compiler speed is all a matter of how it was initially designed.

Take, for example, Delphi, which can compile millions of lines of code in under a minute or something ridiculous. Then we have D, Go, Rust, and such: they compile, in far shorter spans of time, rather large codebases that would take C++ a good 30 minutes on today's high-end hardware (I'm not as familiar with how fast Rust compiles, but I know Go and D do pretty nicely, especially if you enable concurrency for D) - and they'd probably still manage those 30 minutes on high-end hardware from about a decade ago.

Sadly, from what I have heard, C/C++ compilers have to run through source files numerous times before finally compiling them into anything meaningful. Facebook had a preprocessor written in D to speed things up for them; it technically didn't have to be coded in D, but Walter wrote it for them, so he got to pick the language.

The C++ language cannot be parsed with a context-free parser. A large part of the build times, however, is due to optimizations.

In the early days, clang was much faster in compilation than gcc. Over the years, it has improved on optimization output but as a consequence has lost the compilation speed.

There are many examples for https://godbolt.org/ to show how much work the optimizer does. As example, the http://eigen.tuxfamily.org library relies on the optimizer to generate optimized code for all sorts of combinations of algorithms.

Rust doesn't compile very fast unfortunately, but it's being worked on. General wisdom says it's about as fast as Clang but comparing compile speeds across languages in a meaningful way is difficult

Fair enough, thanks for that - I wasn't sure; all my Rust programs have been small enough that I haven't noticed. I wonder if it would have made sense for Rust to consider this earlier on, the same way Pascal did. I believe I heard Pascal was designed to be parsed easily by machine code.

Parsing is not the reason why it takes a while to compile.

"Parsing" was probably not the best choice of word on the GP's part, but they meant that Pascal was specifically designed to be implementable with a naive single-pass compiler; of course that would exclude many optimizations we take for granted these days.

It's the layout of the code that allows Pascal (/Delphi) to compile everything in a single pass. By starting with the main file you easily build the tree of deps and public interfaces/functions vs the private & implementation details.

C and C++ even though they have header files, make no restriction on what goes in a header file, so it takes a bit to figure out the dep graph.
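The single-pass idea above can be sketched as a walk of the unit graph starting from the main file, visiting each unit exactly once with its interface dependencies first. The unit names here are invented purely for illustration (and real Pascal forbids circular interface `uses`, so no cycle handling is shown):

```python
# Toy unit graph: each unit lists the units its interface uses.
units = {
    "main": ["gui", "net"],
    "gui": ["util"],
    "net": ["util"],
    "util": [],
}

def compile_order(root, graph):
    """Depth-first walk from the main file: deps before dependents."""
    order, seen = [], set()
    def visit(unit):
        if unit in seen:
            return
        seen.add(unit)
        for dep in graph[unit]:
            visit(dep)       # interface deps get compiled first
        order.append(unit)
    visit(root)
    return order

print(compile_order("main", units))  # → ['util', 'gui', 'net', 'main']
```

Because each unit's public interface is declared up front, the compiler never needs a second pass to discover what a dependency exports - which is exactly what C-style headers fail to guarantee.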

The best part about Facebook's preprocessor (warp) was that it was slower than Clang (https://news.ycombinator.com/item?id=7489532).

But the preprocessor isn't what makes compiling C++ slow. It's a combination of the complex grammar, and ahead of time optimization of a (mostly) static language. Turns out you can compile code really fast if you don't actually try to optimize it.

My first Linux, Slackware, took me a week to download and was on something like 12 disks. Compiling a kernel took two days.

That was on a second-hand Pentium-S. I probably wasn't doing it right.

Linux in the 90's wasn't as bloated as it is now. It definitely took well under an hour to compile a kernel. My first Linux box was a 386SX-16. I later upgraded to a 486DX4-100.

That doesn't sound right - it's still not bloated. The majority is drivers, then the CPU arch folder takes the next biggest cut. You can check yourself with cloc:


Linux 1.0 was 1 meg, gzipped and less than 200K lines of code, total.

Linux 4.x is over 150 megs, gzipped. Just the lines in the arch/x86 folder are more than the total lines in Linux 1.0.

It used to take me just over two hours to compile a (very minimal stripped down) kernel on my 25 MHz 80486 with 4 MB of RAM (in the late 2.0.x/early 2.2.x days).

Long. I remember it being something like an hour or so to compile the kernel, but the kernel was also much smaller then. I specifically remember Linux 2.0.0 coming in a 2MB tar ball. Because it probably took me longer to download with my 28800 baud modem from an over saturated FTP server than it did to compile. 8)

depending on how you configure your kernel and what storage you're using, it might be doable.

I mainly work in web dev, but embedded systems are kind of my hobby. It's refreshing to work on bare metal, optimizing down to single bytes of data, when I'm usually working on layers and layers of abstraction every day.

I just finished Michael Abrash's Black Book. Lots of assembly. You may like it. Like 30 hours of reading.

http://www.jagregory.com/abrash-black-book/ is an excellent online version of that amazing book that I believe is hosted with Michael Abrash's permission.

Where does a web dev turn to get started in this as a hobby?

Start with an Arduino and the Adafruit tutorials. Very gentle learning curve. https://learn.adafruit.com/category/learn-arduino

From there, you can decide where your interests might lie. To continue with Arduino hardware but without the IDE, I'd recommend _AVR Programming_ https://www.oreilly.com/library/view/make-avr-programming/97...

For other low-level programming, Atmel's AVR Studio and their dev boards are incredibly newcomer friendly. Their source-level debugger (with the debug coprocessor) is a miracle.

If you'd like to get into "big iron", embedded Linux is amazing. Raspberry Pi is a great start (lots of GPIOs programmable from userspace). To get into embedded Linux kernel programming, start with a loadable module on the Ras-Pi. Also, build/load your own Ras-Pi kernel.

Other folks suggested the ESP8266 which is also great fun to use.

Edit: Learn C. Lots and lots of C. Embedded dev is almost all C. A teeny bit of ASM, sometimes C++. But almost all of it will be in C.

Can recommend https://buildroot.org/ for building complete Linux system images. Been using it on multiple embedded projects, both personal and professional.

Somewhat related, I've also used buildroot in both AWS and GCP to run workloads from read-only system images. Quite liberating in my opinion. No ssh, no ansible, etc. Build the image, launch it and off it goes. GCE even allows you to use the same boot disk, if mounted read-only, for multiple instances, perfect for these type of images.

I can second this as well, Arduinos have a bad rep for being too beginner-focused but AVR is a super solid platform and you can do a ton with them. I'd pick Arduino over RasPi for a starter project anytime.

Very helpful, thank you!

I'm a React Developer in my day job, but I'm disillusioned with the web as a whole and I've always had an interest in low-level stuff.

At the moment, I'm doing a structured C book course (C: A Modern Approach), and I've also signed up for the edX course "Embedded Systems - Shape The World: Microcontroller Input/Output"

It uses the TM4C123 board, which from a look around seemed to be a decent enough board for a beginner. I'd seen complaints of Arduino, but I'm not experienced enough to know the validity of their claims.

Either way, I'm having fun. Not sure if I'd switch career as Web Dev is full of decent paying jobs but it's nice to have a hobby that's different from the day job.

FWIW, those TI boards have tons of hardware bugs. Way more than you'd even typically expect from embedded hardware.

And if you don't use the arduino ecosystem, but use an RTOS on an Arduino, it's a perfectly valid dev board for learning real embedded systems development.

I've heard about the hw bugs a little but couldn't find much information about them.

I'm mostly just using it for the course which was highly recommended in multiple places, so I hope I won't encounter many of these bugs.

To be honest, the whole picking a board thing was quite overwhelming, with many recommendations, boards, variations, etc. Hopefully once I've finished the course I'll have some more knowledge to help me pick the next board.

I think Micropython is fantastic and you can get an ESP8266 + FTDI cable for around $25. You can make this a web server and do fun hardware integrations with it like flashing LEDs for successful CI builds. Here's a good tutorial: https://docs.micropython.org/en/latest/esp8266/tutorial/inde...

ESP8266's can also run NodeMCU, which is Lua but node.js stuff translates super easily: https://nodemcu.readthedocs.io/en/master/

I highly recommend the ESP8266 and ESP32 once you're a little more familiar with the arduinos.

They have Wifi, and the ESP32 is actually very powerful and has two cores. They make for great DIY home automation/IOT devices and are pretty easy to work with.

> ESP8266

Second this. The ESP8266 (e.g. on a NodeMCU board) is about as powerful as an Arduino Uno, is about half the size of a credit card, is very thin, and costs around $3.

The wifi part is the biggest advantage. You can send sensor data directly to your server via simple http requests.
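On the device you'd typically fire such a request with MicroPython's urequests module; the URL-building half is plain Python either way. A sketch, with a hypothetical server address and field names:

```python
from urllib.parse import urlencode

# Hypothetical logging endpoint on your LAN; on the ESP8266 you'd
# then do urequests.get(url) instead of a desktop HTTP client.
SERVER = "http://192.168.1.10:8000/log"

def reading_url(sensor_id, temp_c):
    """Build the GET URL that reports one sensor reading."""
    return SERVER + "?" + urlencode({"id": sensor_id, "temp_c": temp_c})

print(reading_url("kitchen", 21.5))
# → http://192.168.1.10:8000/log?id=kitchen&temp_c=21.5
```

A plain GET with query parameters keeps the device-side code tiny; anything that can parse a query string can receive the data.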

The ESP8266 is way more powerful than an Arduino Uno. It's got a 32-bit proc at 80 MHz (overclockable to 160) with ~80K of RAM and Wi-Fi, often coupled with 4MB of flash. The Uno has 32K of flash, 2K of RAM, and (from memory) a 16 MHz 8-bit CPU.

The Wemos ESP8266 was a fun start. The only thing I found a bit tricky, IIRC, is the limited number of pins and figuring out the mapping from D0..8 to ESP8266 pin numbers. This answer helped: https://community.blynk.cc/t/solved-wemos-d1-not-working-wit...

No matter what you do, you NEED tools. I cannot stress this enough. If you can spend $250-$500 on a nice desktop scope, awesome. If you only have $80, you can get a serviceable Hantek USB scope that will at least give you an idea of what you're doing. If you can only spare $20, you can at least get an el cheapo JYETech scope. In addition, you'll want to pick up a multimeter and probably a logic analyzer. Again, no need to go all out; a cheap $20 meter and a $15 Saleae knockoff will get the job done when you're just starting out. DO NOT SKIP OUT ON THESE. Embedded development without being able to see and know what you're doing is miserable, so unless you're doing this because you want to be frustrated, just buy the tools you need upfront.

As for what microcontroller to actually learn on, I would say the MSP430 is a very good starting point. It's a fairly mundane 16-bit RISC microcontroller series with very forgiving electrical design requirements, very good documentation, and very good community support. They make a devboard (TI calls them Launchpads) for the MSP430G2553 that's more than enough to get a beginner started. When you need a little more power, you can either opt to invest in learning the ARM ecosystem, or go for something a little more exotic. Just about every manufacturer makes an ARM microcontroller of some sort, so if that's what you're interested in, take your pick and go with it. If you're looking for something else, the Renesas RL78 and RX series provide a lot of functionality if you're willing to deal with how stodgy Renesas can be.

Some important notes:

1.) Don't bother with Arduino. It was a much more compelling product 15 years ago, when you had to pay thousands in tools and compiler/environment licensing to get in on embedded development. Today, what Arduino nets you is a painfully barren environment that abstracts away a lot of what you're trying to learn when you're starting out. Losing out on the debugging, profiling, tracing, disassembly, memory usage statistics, etc. that modern development environments give you will do nothing but stunt your growth, especially if you're used to having all these tools while writing desktop software.

2.) Be careful with (or preferably just avoid) starting with embedded Linux; it's pushing the limits of "embedded". You're going to miss out on a lot of important knowledge and insight jumping straight into using an operating system (and a very heavy one at that), and for many applications, it is MASSIVE overkill. When you start, you're not going to need an RTOS. When you need an RTOS, you're going to reach for something more reasonable, like FreeRTOS. If FreeRTOS doesn't cut it, then you can start looking at Linux.

3.) Don't get tangled up with Raspberry Pis; the microprocessor on these is complex and the documentation is severely lacking/nonexistent. RPis are much closer to a desktop computer than they are an embedded system.

If you really want to get it, I would say one of the most useful exercises is implementing your own microcontroller/processor. You can pick up an FPGA devboard for fairly cheap, and there are plenty of textbooks (Harris & Harris' Digital Design and Computer Architecture comes to mind) that will get you through most of the key concepts. Once you've done this, a lot of the gaps in understanding when dealing with a microcontroller will be filled in. This exercise isn't strictly necessary, but I don't know anybody who has done it that wasn't better off for it.

My final note is to buy or "acquire" Horowitz and Hill's The Art of Electronics. Embedded development is inseparable from electrical engineering, so even if you don't read it front to back, there are some sections that you will definitely be visiting if your background isn't in electronics.

>Don't bother with Arduino

I wouldn't say that. Yes the Arduino "libraries" abstract away a lot of the complexities and hinder true understanding. However for a beginner it is a perfect platform simply because of the huge community i.e. tutorials, code and projects. Once they gain confidence from using this, they can move on to "traditional embedded development" by learning to program the underlying AVR directly i.e. use the toolchains installed by the IDE, delete bootloader and upload your program using the ISP interface (see the book; "Make: AVR Programming"). This gives you the best of both worlds using the same development platform. Another advantage is that many other MCU families also provide a "Arduino-like" interface (eg. Energia for MSP430) and thus all the skills learnt on one can be transferred to another.

My take is driven by me finding the Arduino platform too easy. My first exposure to embedded development was Energia, and I was so underwhelmed that I didn't bother with it again until years afterwards, when I downloaded CCS and got to play in the "big leagues". I know multiple people that had that same experience.

If you're already writing software, you're not going to be struggling with having to learn programming. Translated to Arduino, this is great for engineers who don't write software and just need to get simple shit done fast, but for those with a background in software, it feels equal parts mundane (I called analogRead and... got an ADC value, just as expected) and magic (what did analogRead actually do?). Granted, you can go look at the source for these libraries, but a lot of them make very generous use of the preprocessor for the sake of supporting multiple products in one codebase, and it's often not pleasant to read. Working with an MSP430 (or basically any architecture if you're using more capable tooling, like you mentioned with AVR) has a certain clarity to it that the Arduino ecosystem just doesn't seem to capture.

I would make the same argument for AVR and the "real deal" tooling as I do the MSP430; why bother with Arduino when you're probably coming in with decent programming skills?

>why bother with Arduino when you're probably coming in with decent programming skills?

Pure software programming skills are NOT enough when it comes to embedded programming. You need to know the logical interface to the HW and the EE/electronics behind the HW itself. This is a huge challenge for people who have only worked with applications software over layers of abstraction. This is where the Arduino shines: hide the overwhelming HW complexities and provide a simple API for software guys to get their job done. This allows them to slowly gain HW knowledge and transition to "traditional embedded programming" as needed. Also, many experienced embedded developers use the Arduino as a rapid prototyping platform to implement an idea quickly and get clarity on a project before implementing it on the actual target by traditional means. Learners can do both on the same Arduino platform.

So here is my recipe for somebody wanting to learn Embedded Development:

1) Get an Arduino Uno, a couple of extra ATmega328Ps, an AVR programmer, and an electronic components kit.

2) Get a book on Arduino programming and AVR programming. I can recommend "Exploring Arduino" and "Make: AVR Programming". You also need a book on Electronics and "Practical Electronics for Inventors" is a pretty good one.

3) Install the Arduino IDE and go through some blinky tutorials. Do some projects from "Exploring Arduino".

4) Now move on to direct AVR programming. The Arduino IDE has already installed the required toolchains. Set up a Makefile environment using them, following the instructions in "Make: AVR Programming". Do some blinky and other projects from this book. This gives you a command-line approach to AVR programming.

5) Next, repeat the above using a proper vendor-supplied IDE, e.g. Atmel Studio for AVRs. These IDEs are pretty complex but quite powerful, and used extensively in the industry. Go through some tutorials and redo the blinky and other projects using the IDE.

6) Get some test equipment to look into the inner workings of the system. I recommend the multi-functional "Analog Discovery 2", which has lots of learning tutorials.
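As a minimal sketch of step 4's command-line flow: the avr-gcc/avr-objcopy/avrdude tools are the ones the Arduino IDE bundles, but the file names and the usbasp programmer here are illustrative assumptions, not prescribed by the book. The commands are composed as strings so the recipe can be read and checked anywhere; paste them into a real shell once the toolchain is on your PATH.

```shell
# Illustrative command-line AVR build/flash flow (assumed file names and programmer).
compile="avr-gcc -mmcu=atmega328p -DF_CPU=16000000UL -Os -o blink.elf blink.c"
tohex="avr-objcopy -O ihex -R .eeprom blink.elf blink.hex"   # ELF -> Intel HEX
flash="avrdude -c usbasp -p m328p -U flash:w:blink.hex:i"    # write over ISP
printf '%s\n' "$compile" "$tohex" "$flash"
```

avrdude's `-c` selects the programmer and `-p` the part; swap `usbasp` for whatever ISP programmer you bought in step 1.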

Congratulations; you are now a bare-metal "Embedded Developer"!

To enter the "big boys league", move on to ARM Cortex boards.

Finally you get to "Embedded Linux" and become a "Master Super-Duper Embedded Developer" :-)

Another good book is https://www.goodreads.com/book/show/4073235-msp430-microcont..., which I'm sure you can find online for free somewhere.

A great book, particularly for learners/students, is "Introduction to Embedded Systems: Using Microcontrollers and the MSP430" by Jimenez, Palomera, et al. For every feature of embedded programming it gives an illustrated and platform-independent explanation, followed by how it is realized on the MSP430. This cements understanding like nothing else.

Good advice, but I actually found FPGAs very hard or very expensive to get started with. Maybe you can give more hints.

FPGA is pretty challenging to get started with. If you decide to go in on it, you're going to spend quite a bit of your time at the beginning getting your ass kicked. Vendor tooling blows, open source tooling is playing continuous catch-up or is much more focused on VLSI, and resources for beginners feel pretty anemic compared to what you're probably used to coming from the software world.

I can't tell you if this is "the way", but I can tell you how I started. I flipped a coin to decide between Altera and Xilinx and started with a Terasic DE2 board (an Altera Cyclone II devboard) that I borrowed from my university. I don't recommend using this board or something like it (it has actually been superseded by a nearly identical board with a Cyclone IV in place of the dated Cyclone II); the extra peripherals are a headache more than anything, and the simple breakout pins on the Terasic DE0-Nano are greatly appreciated. As for the environment, you can go and download Quartus from Altera basically no-questions-asked.

After pissing around with the board for a bit, I decided to pick up a book. Rapid Prototyping of Digital Systems by Hamblen, Hall, and Furman is good, if a bit out of date. I had this book to thumb through rather than read front-to-back. It is written for older versions of Altera's Quartus software, but with a little bit of exploring the UI I was able to find just about everything I needed. It makes a decent quick reference for Verilog and VHDL syntax, has quite a bit of info on interfacing various things to an FPGA (think PS/2 mouse, VGA, etc.), and a couple chapters on microcontrollers and microprocessors.

An important bit to know is that a lot of the usage of an FPGA (at least in the way I use them) happens in two phases: the actual logical design, which you'll do on your computer via simulation, and the actual usage of that design on the FPGA. The logical design happens largely in ModelSim (with Quartus only being used to generate libraries at this stage), or some other Verilog/VHDL simulation tool. Altera ships ModelSim free-of-charge with Quartus. It is your run-of-the-mill Tk-based clunker of an EDA tool, but it works and I haven't had it crash on me yet, so I can't really complain, even if I have some gripes with it. This is where most of the work/magic happens, at least for a beginner; you write your logic in your editor of choice, write testbenches, and simulate those testbenches in ModelSim. Having read the Harris and Harris book mentioned in my initial post in the past, I took a quick look at a processor called "Ahmes" on GitHub, and just had a go at it. After getting a primitive processor working/executing instructions, adding some peripherals like some timers is where things started to come full circle, and I started realizing both the power of the FPGA, and why things are the way they are on a microcontroller/processor.
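The write-a-testbench-and-simulate loop described above can be sketched like this. The two-input AND gate and its file names are made up for illustration; `vlib`/`vlog`/`vsim -c` are ModelSim's actual batch-mode commands, printed rather than executed here since ModelSim won't be installed everywhere.

```shell
# Write a trivial design under test and a self-checking testbench.
cat > and2.v <<'EOF'
module and2(input a, b, output y);
  assign y = a & b;   // trivial design under test
endmodule
EOF
cat > and2_tb.v <<'EOF'
module and2_tb;
  reg a, b; wire y;
  and2 dut(.a(a), .b(b), .y(y));
  initial begin
    a = 1; b = 1; #1;
    if (y !== 1) $display("FAIL"); else $display("PASS");
    $finish;
  end
endmodule
EOF
# ModelSim batch invocation (run on a machine with Quartus/ModelSim installed):
printf '%s\n' "vlib work" "vlog and2.v and2_tb.v" \
              "vsim -c and2_tb -do 'run -all; quit'"
```

Edit the design, re-run the simulation, repeat; only once the testbench passes do you bother synthesizing and putting it on the board.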

The bit where you actually put it on the FPGA hardware is largely uneventful, or at least it was for me. I didn't have multiple clock domains or anything like that, so I didn't suffer from any timing issues, and basically had the good fortune of just mapping pins and letting it rip. In theory, actually translating the design to the chip doesn't do much, but in practice, a design that exists only on your computer isn't terribly useful. Actually using the hardware adds that "visceral feeling" and lets you play with what you've spent so much time doing, along with getting you familiar with the process if/when the day comes that you actually need to do it. You also get to enjoy/experience the power of bolting on arbitrary hardware to your jalopy processor for the sake of VGA output, GPIO, timers, SPI communication, etc.

I wouldn't consider myself terribly talented or knowledgeable, so if you just throw enough time at it, you can probably end up having just as much fun as I did.

>Don't bother with Arduino

What alternatives are there then?

Buy an Arduino and don't use the provided std lib functions

I'd recommend an ESP32 (or its slightly less capable little brother, the ESP8266) using the Arduino environment to get started. If you prefer JavaScript, you can program it that way using something like Mongoose OS (NodeMCU offers a similar experience in Lua). Once you get comfortable with that, if you want to get a little lower level you can use PlatformIO with the vendor's SDK directly.

I like the ESP boards from Wemos. The US made ones from Sparkfun and Adafruit are also really good. The advantage of the Wemos ones (and the many counterfeits you'll see on sites like Aliexpress) is that they're so cheap you don't even have to feel bad if you fry one.

I’d highly recommend Espruino — a microcontroller that runs JavaScript http://www.espruino.com/

For anyone interested in embedded JavaScript but wants broader hardware support and/or modern JavaScript support[1], the XS JavaScript engine is microcontroller-independent[2], and Moddable provides supported ports for ESP8266, ESP32 (the successor to ESP8266), and the Gecko series of processors. (They also sell some hardware kits for folks who want that: https://www.moddable.com/product.php)

[1] Espruino supports ES5, XS supports ECMAScript 2018

[2] https://www.moddable.com/faq.php#microcontrollers

Thank you! Moddable looks like a dream come true for my embedded systems needs. Did not know they existed.

Check out nand2tetris !

I wonder how many Linux instances are running in an average modern automobile.

The roadblock has always been the MMU. Vanilla Linux won't work without it, and uClinux is always in some strange phase of support for newer architectures, if it's supported at all.

I found a decent port of uClinux to the Cortex-M3 once, but the host core was 120 MHz and it was pretty much a science experiment and not much more.

I can't tell from the datasheets and the description. Does this ucontroller have an MMU?

It’s an A9, same as on Xilinx Zynq 7000 series, 32-bit ARM core with full MMU.

No, the Zynq chips have a Cortex-A9, which is ARMv7; this is ARMv5. It still does have an MMU.

Oh, my bad, I misread the datasheet, read "A9" where it says "ARM9". Thanks for the correction.

For comparison, the Raspberry Pi 1 had an ARMv6 core.

I expect that, it being an M3, it had an MPU but not an MMU.

Any numbers on current consumption of F1C100s?

Not to mention that a tiny chip in a micro-SD case can hold a library of a hundred movies.

Same. Anyone got a guide for getting it to run Linux? (except this article!)

I wish raspberrypi could release something similar. Their product are still too powerful for my taste.

Raspberry Pi's goal is computer science and coding education. They aren't going to make anything that isn't beginner friendly, and can't at least run a Linux GUI, have basic networking, support standard peripherals etc.

I don't understand how that's relevant. You can still release something smaller and less powerful, and still be accessible to beginners.

Why are you arguing about having standard peripherals? It doesn't let people learn about I/O.

When I was in high school, we were given embedded hardware and were able to program it in assembly and upload a program to it.

If you give students some very minimal embedded hardware with wifi and some terminal, they should be able to learn using that. Let them install things, maybe setup a 2D interface...

The RPi educational value is only enabled by I/O pins. I'm not sure that's really worthwhile.

The RPi is interesting because it's a powerful but cheap computer. I just wish there was much cheaper hardware to show people you can also do things with smaller stuff.

Reminds me of the (now defunct) CHIP for $9[1].

Have you checked out some of the alt pi boards like Orange Pi or Onion Omega2Plus?

1. https://www.kickstarter.com/projects/1598272670/chip-the-wor...

There are a lot of tiny travel routers that run Linux. Mostly little MIPS boards inside.

Why do you feel that a hardware product being too powerful is a problem?

Because it also means increased power consumption. If you're trying to run something off battery or solar, not using hardware that's more powerful than you actually need can be necessary for the project to be viable.

I think it's fantastic. I don't typically hand out a physical business card often, so the recipient would be someone that I feel would generally find the project exciting and intriguing, and it provides a great conversation piece. It's simple enough to discuss it briefly and mention that any security-conscious individuals can check out the GitHub repo stated on the card and see the code, getting the same insight if not more than if they had just plugged in the device. For anyone else, the information they need from a business card is all still nicely stated, in a design that's enticing even without the functionality. The production cost is easy to justify given the chance of achieving whatever your goal may be, whether it be securing a job offer or generating new business.

Maybe I'm twisted, but my first thought is that if I were a pen tester I would want business cards exactly like this. And yes, they would run Linux... and some other things. :-)

Ha someone on Reddit suggested this. See the thing is, they have my contact info printed on them, so I'm really easy for the police to find after everything is hacked!

If I were a pentester, I'd hand them out with your name and readme on it ;)

If he only hands out a few, they'll still lead back to you, but you'll make the police work for it too.

I do wonder how easy it would be for someone who has your business card to frame you... Is it possible to overwrite the OS on these with something malicious?

It is absolutely possible to overwrite the operating system. You have full root access when you log in. You can dd right over the flash device /dev/mtd0 with whatever you want.
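A sketch of the mechanics, simulated against a scratch file so it can be tried without a card: on the real device you would point `of=` at `/dev/mtd0` as described above, and the image name here is made up.

```shell
# Sketch of overwriting the card's firmware (assumed image name).
# On the real card you'd target /dev/mtd0; a scratch file stands in here
# so the commands can be tried safely on any Linux box.
MTD=/tmp/fake_mtd0                                       # stand-in for /dev/mtd0
dd if=/dev/zero of="$MTD" bs=1K count=128 2>/dev/null    # pretend 128 KiB flash
head -c 4096 /dev/urandom > new_os.img                   # pretend OS image
dd if=new_os.img of="$MTD" conv=notrunc 2>/dev/null      # write it in place
cmp -n 4096 new_os.img "$MTD" && echo "image written"
```

`conv=notrunc` matters: it overwrites in place without truncating, which is the behavior you want when writing into a fixed-size flash device.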

As to framing me, well... I don't think it's much easier than framing me using some other method. And I think the cross section of people that want to frame me, and people who know how to compile Linux, is hopefully very small.

Your real contact info?? :)

Kevin Mitnick famously has detachable lockpicking tools in his card.


My thought too. Some years ago there was a trend of making business cards that could fit in a CD player.

I had the same thought back then !

Or on a pen instead of a card!

While it's a fascinating project I'd have some security concerns about plugging someone's USB business card into my computer.

Well, it is appropriate as a business card for an embedded systems engineer, not so much for a system security one!

More seriously, I never managed to understand the "obsession" with RoHS (this here is just an example):

>I made sure to have a lead-free process—the boards, parts, and solder paste are all RoHS—so that I wouldn't feel bad about giving them to people.

I mean, you are creating something that (besides being very nice) is not needed, that very likely will end soon in the trash (and that contains anyway a few grams of "far from clean" materials) and you worry about the tiny amount of lead in the solder tin?

Industry wide, it's about removing a highly toxic substance from the supply chain. I've met a child whose father worked with lead and would hug him when he came home. The child ended up with permanent brain damage from ingesting traces of leaded dust on the father's clothing. In this case, it's more likely about keeping lead out of something likely to get crunched up in a parent's wallet and then shown to their child.

Lead causes permanent brain damage in children. There is no detectable amount of lead in the blood of children which is considered safe. While an adult body absorbs roughly 95% of lead into its bones, children's bodies end up storing lead in blood and soft tissue where it causes damage. When untreated, it can take months for lead to exit a child's bloodstream.

I would never consider reducing the lead used in something I've built to be wasted time.

Until what age?

Is it safe for children to solder with leaded tin? (Many people here did so as a child, I suppose)

I have no idea. I'm not a doctor, I just read a lot.

I used lead based solder as a child, and I turned out okay. That said, people used to use pewter plates and got sick from lead poisoning after eating tomatoes (the acid leached the lead from the plate); Roman soldiers would keep their wine in a lead container to sweeten it; pretty sure I've heard something about people painting their houses with leaded paint as well. On the one hand, society survived all that. On the other hand, if I have the opportunity to not risk poisoning a child by spending a couple extra dollars buying solder, that seems like a pretty easy decision to make.

I don't get this lead-free hatred on internet forums. People say they can't solder with lead-free, that it is terrible, and that the joints look dull. So much hatred that I actually tried it out. And I found the process is basically just the same and there is nothing to be afraid of. Just set the soldering iron temperature a bit higher and that's it. Works like a charm.

My note was not at all about lead-free hate; actually, for several years now, everyone in the EU (and I believe in the US as well) has been using lead-free solder (RoHS is from 2003, if I recall correctly).

As you say it is just a matter of slightly increasing the soldering iron temperature, but it doesn't end there and there may be issues further on (JFYI):


It seems like lead-free alloys tend to be more problematic in these - fortunately rare - cases.

But because it has been more than 15 years since RoHS came into force, I read "and I made it lead-free" in 2019 like I would read "and I buckled my safety belt" or "I put my helmet on", etc. I see it as "the normal" way, nothing worth a mention (nowadays).

I took a quick look at a rework station in an aeronautic company in France, all leaded solder.

>I took a quick look at a rework station in an aeronautic company in France, all leaded solder.

Very possible, there are a number of pretty wide exemptions even in the latest version:


I've heard about whiskers, but isn't it a mostly solved problem now with low Ag alloys like SAC0307?

Bismuth is used as a replacement for lead, if you want lower temperature solder. 58Bi/42Sn alloy is cheap and about as good as Pb/Sn, but is more susceptible to thermal fatigue. I used 57Bi/42Sn/1Ag to solder heatpipes. Works great!

I think it's very considerate to not add lead to a business card people may put in their pockets.

Lead is restricted because of what happens to it when PCBs are landfilled, not because it is harmful to touch.

Washing your hands after handling solder seems to be a fairly universal recommendation. It's not a concern for normal products because the users do not touch the bare PCB.

Lead is harmful to touch and absorb into your skin, so unless you're counting the PPMs coming off into your skin, you're wrong.

You misread. I didn't say it isn't harmful to touch. I said that isn't the reason it was restricted.

In any case it is safe to touch, despite the people downvoting me:


Sure, the lead may cross one or two layers of cloth and migrate from the pocket to your skin and bite you.

More considerate would be to provide the card in a protective sleeve (that would protect from all the bad, bad substances it may contain) or just use a good ol' plain paper business card, possibly printed with "bioethical" ink.

>Sure, the lead may cross one or two layers of cloth and migrate from the pocket to your skin and bite you.

You get it on your hands and then ingest it when you eat. (People don't usually wash their hands after touching a business card.)

>You get it on your hands and then ingest it when you eat. (People don't usually wash their hands after touching a business card.)

I was referring to the parent poster that talked of people putting them in their pockets.

Anyway, RoHS (or lead-free) is a good thing, but it is not like you get lead poisoning (or saturnism) because once in your lifetime you touched something containing lead and then, if you eat with your hands, you managed to ingest it:


You would need to drink tap water carried by lead or lead-soldered pipes, or water otherwise contaminated by lead, to be poisoned by it.

In the case of the Linux business card, the exposed surface that would eventually contain a minimal amount of lead - pre-RoHS soldering tin, the common eutectic Sn-Pb alloy, contained 37% lead - is below 1 square mm.

You would need to actually lick hundreds or thousands of such cards to ingest any meaningful amount of lead.

And lead-free soldering tin may contain (in minimal amounts) bismuth and antimony.

I wish I could find a source on this, but when my 12mo got his routine lead test, his lead levels were elevated. It was then that I learned that even "safe" levels of lead can (it appears, inconclusively) result in long term permanent changes: reduced IQ, increased aggression.

I will forever feel guilt about letting him play in the yard as a baby (my best guess as to where it came from) and take other efforts to reduce the lead in his environment.

Depending on your life situation, it may be impractical if not impossible to reduce the lead in your environment. I am open to being further educated on this subject, but it seems like reducing lead in all products that will be handled by people is a net benefit for getting to home environments that will not cause brain damage to the infants that live in them.

> I was referring to the parent poster that talked of people putting them in their pockets.

Right, but how are you going to put it in your pocket without touching it with your hands?!

>You would nead to actually lick hundreds or thousands of such cards to ingest any meaningful amount of lead.

Lead poisoning is cumulative, so you want to avoid ingesting even very small amounts. I doubt that good data is available regarding the amount of lead that would be ingested in this scenario or its potential effects. Best to be cautious. Lead free solder is not expensive.

Avoiding leaded solder is trivial, and it is something that's hopefully handled a bunch, so why not avoid lead? I didn't know "following best practices you have to follow in your daily work anyhow" is "obsessing".

It’s actually probably harder these days to build something that isn’t RoHS compliant as a hobbyist than one that is. The vast majority of what you can buy from Digikey and Mouser and whatnot will be compliant, PCB shops will mostly do lead-free HASL (though you’d want ENIG anyway if you’re going to use a stencil), and lead-free solder pastes are more common than leaded ones. Basically you’ll probably have to either use old stock parts or pull some really old solder out of the fridge as a hobbyist to end up with lead.

Most of the Chinese board fabs will sell you prototype-quality boards for a lower price if you use a leaded HASL finish (e.g. https://jlcpcb.com/quote#/?orderType=1&stencilWidth=100&sten...)

HASL is perfectly usable with stencils, in my experience.

Were I a system security engineer, I should think I'd have a sanitized way to plug in a USB device.

How else would one do forensics?

I meant - jokingly - the opposite, i.e. you call in a security engineer for a consultancy, as you are interested in increasing/bettering the digital security level of your business; he produces the Linux USB business card, you proceed to insert it in your computer, and he says:




In fact, a USB condom would be a nice variant of this project for a security engineer's business card, although the female USB portion might be a bit bulky for practical use.

I spent a month visiting a PCB factory in China. I met the people working with lead soldering.

Seriously, please choose lead-free.

Pregnant women weren't allowed to eat in the cafeteria due to the high number of miscarriages. The people inhaling the stuff all day... let's just say they had a reputation among the other factory workers for being a bit mentally challenged.

This is true, but since I'm the one creating it, why _not_ use a lead-free process, especially given that lead is known to be very soft and toxic?

Hey you are the one that made that nice card!

Good to see you here, congratulations for your idea and creation.

I replied to that in another post: RoHS is 15 years old or more, and most available parts and solders are RoHS compliant. I believe you would have had a tough time intentionally making something that was entirely non-RoHS compliant.

It would be pretty sweet if it was Bluetooth instead. A Bluetooth serial console would be pretty kickass.

If their business card hires them, they’re hired!

Is that really the responsibility of the person making the card?

This is the responsibility of the creators of the USB hardware, firmware, motherboard firmware, electrical design, OS developers / driver developers.

It should be safe to plug in a USB device by default

The whole beauty of USB ports is that we have one unified interface for practically everything. It can't be safe without giving that up, adding some friction to user interaction, or doing some dark art involving centrally signed devices that we expect won't be broken. If anybody can produce storage and input devices that plug into a universal port and just work, malicious input devices can be made to physically look like storage devices in the eyes of the user. I'm not saying we shouldn't make more security conscious tradeoffs, I'm just saying that there are tradeoffs to be made, and most users probably won't appreciate the added friction.

should be safe, in theory, sure. But in practice?

Then we should stop any form of communication.

In practice it can be safe on multiple levels. He could've also just put some malware in a PDF or Word document, as Word is what recruiters want.

OTOH, a security conscious engineer could exploit this to have the business card call home and automatically block communications from the company so one does not waste time with them ;)

I'd say that's a business opportunity lost.

It would be funny to have the business card list known exploits on their system -- whether this would be taken lightly by the hiring manager is another question.

Pentest-by-business-card-as-a-service. Awkward acronym, probably not going to be a commercial success.

And just like that, you committed a felony.

Probably your second today, to be fair.

Wrong. Second this minute. (420 my man - fuck da US Gov in this respect - they do not have sovereignty over your body or mine)

Not a felony if it transmits over a cell network that the usb interface was activated.

Sorry, but unless you provide other data I still believe this would run afoul of the Computer Fraud and Abuse Act (CFAA, 1986).

That’s if you even have a USB A socket anymore!

I don't see why you were downvoted. I fully get why the blog writer didn't include USB-C with this awesome project. But there's now a growing group of people who simply can't use USB-A anymore with their laptops. Most will have an adapter, but not all.

I've played around putting USB connectors directly on PCBs, and USB-A is way more lenient than USB-C. With USB-A, you can use basically any PCB thickness between the standard 1.6mm and 2.2mm, and it'll work well enough.

USB-C's midplate is specified at 0.7mm, and all of the cheap chinese PCB manufacturers only offer 0.6mm and 0.8mm. 0.6mm is too thin to make reliable contact with most of the cables I've tried: I haven't gotten around to ordering another batch at 0.8mm, but plugging the corner of a 0.8mm M.2 SSD into a USB-C cable, it's pretty (i.e. probably slightly too) snug. Thin PCBs are also extremely flexible, which isn't great for brittle solder joints, ceramic capacitors, etc.

While this is generally true, I think it's less true for embedded systems. All of the tools I use at work for embedded systems engineering (JLink, serial adapter, oscilloscope, logic analyzer, nearly all MCU dev boards and FPGAs) are USB-A. I use an MBP and simply couldn't do any work whatsoever without a USB hub. I think it's fair to assume anyone in the industry can use USB-A.

In the industry, yes, I agree. Except for maybe managers but even they probably have a dock on their desk that includes USB-A.

Edit: I remember something, having done work related to embedded stuff. About 6 years ago, we still had 3.5" floppy drives in a couple of scopes in the lab. We replaced those with a device that internally had the interface of a 3.5" floppy drive, but on the outer faceplate it had a USB-A connector for thumb drives.

I would assume someone doing embedded work would not only have to have USB, but serial. Is that incorrect? It seemed like that was even included in your list?

I usually have two or three USB to serial converters plugged into my computer at work at any time. I expect most embedded people would do similar (or use serial console servers over a network).

A-female to C-male adapters exist, but C-female to A-male adapters are forbidden by the spec.

That means you need to create A peripherals if you want the largest audience.

Are you sure this is the case? I bought a couple of these to have newer peripherals work on my older laptop.

Yeah, they sell them but they're not supposed to. It's because such an adapter effectively puts an A plug on each end of a C cable, and connecting two A hosts together can destroy both hosts.

Practically, since you can find them as you say, the upshot is that as a peripheral vendor in the West you can't ship a C device with a C-to-A adapter, because you'll lose your license.

What an absurd tragedy. Everyone was just fine with usb-A and ethernet ports and hdmi but no, now these laptops need dongles for literally everything.

USB-A is not 'just fine'. The connector isn't reversible, which is an insane design for something being constantly plugged in and out. It's also low-bandwidth and low-power. Most devices these days have Bluetooth, so it isn't really needed except for power and monitors. USB-C is a step up in everything and I'm happy with only USB-C.

Not everything needs to be constantly plugged. My desktop doesn't even have Bluetooth in it. I'm quite happy with the USB A ports on my PC when it comes to plugging in mice, keyboards, USB storage, etc. If you use USB devices on a regular basis, you'll remember which way USB ports are oriented, and you can just look at the end of the USB cable to see which way it needs to go in.

Having said that, I will say the USB C connector on my phone is far superior, as I plug/unplug it daily.

Power and bandwidth aren't even related to connector type.

> If you use USB devices on a regular basis, you'll remember which way USB ports are oriented, and you can just look at the end of the USB cable to see which way it needs to go in.

You shouldn't need to remember orientation or look at it! You're apologising for a bad design.

> Power and bandwidth aren't even related to connector type.

Not true! For example, the max bandwidth you can currently put through USB-A is 10 Gbps with USB 3.1. Through USB-C it's up to 40 Gbps with Thunderbolt 3.

The asymmetry of the USB-A connector is at worst a minor inconvenience. Definitely not in itself a good reason to break compatibility. And most USB devices are not frequently plugged in and out (indeed, for those that are, Bluetooth is a better choice!)

Growing pains, we'll get over it. USB-C fixes the problem everyone complained about forever of turning the plug over three times before getting it to be rightside up, and gives us enough contact points for modern video streams. I feel like making it compatible with the older USB standards by just connecting a subset of pins is super useful, and it's about as much backwards compatibility as we can expect without major sacrifices in other regards. I've got a phone with two USB-C ports, a hub that gives me two USB-A ports, an Ethernet port, and another USB-C, and a separate USB-C to HDMI adapter. I wouldn't trade either of my USB-C ports for a dedicated USB-A.

Absurd tragedy? Maybe for you. For me, it's been a good life since the 2016 MacBook Pro. I got a decent dock, and connect everything with a single cable.

I would assume you paid a good $150 for one, given that the others either don’t supply enough power or can’t do 4K@60.

More. I paid 250 euros for a CalDigit TS3 Plus. Thing is amazing.

When I first got my MacBook Pro, I just got USB-C cables for my external monitor and my (USB-A) hub. That was very cheap, but I love the convenience of Thunderbolt and shelled out the cash.

I am struggling to find good info - does usb-c support more than one display on one cable at 4k@60?

Yup, all Thunderbolt 3 docks support two 4K displays @ 60 Hz.

Then build an embedded system to plug it into ...

That was my first thought. If someone gave me this at defcon I would not use it.

Would you use anything given to you from a random individual at defcon?

In an airgapped VM, on someone else's hardware? Probably not even then!

Haha very good point! I would barely feel comfortable about a regular business card.

Ethernet on a business card is just not as viable.

Perhaps you could do something like this, incorporating the jack into the PCB instead of an elevated component? https://upload.wikimedia.org/wikipedia/commons/d/df/XJACK_ne...

I never understood why Apple and other thin laptop manufacturers didn't do this. I remember X-Jacks fondly, and would love to have a super-thin laptop with Ethernet.

Some ultrabooks have ethernet sockets that fold out. In my experience the reliability is ... mixed.

From the project webpage:

> It has a USB port in the corner. If you plug it into a computer, it boots in about 6 seconds and shows up over USB as a flash drive and a virtual serial port that you can use to log into the card's shell.

The parent of this thread has a valid point here, and the security implications of this are nefarious if one isn't careful.

While it is a technically cool, fun project, I can imagine a bad actor taking this to DEFCON/CCC or even a tech-centric conference and mass-producing something that emulates a USB drive but is also a keylogger or a remote access tool of some sort, which reminds me of the nightmares of BadUSB.

You and your parent clearly show the current boring, uninspiring state of the HN community. Well done.

Now, where is the predictable sandboxing guy to counter your predictable "security implications" argument?

Sort of agree, but the "well done" part in your reply also represents a sad aspect of HN, in my view.

Yes, you're right. Can't edit it out, sorry.

I'm worried that the potential employer infects the card, uses it to take over their competitor's personal computer network, and appends "Don't forget! The S in IoT stands for Security!" to all of their documents and audio files. Then the traffic lights after we colonize Mars will be like "It is safe to cross. Don't forget!..." and some ATM on the Moon will be like "Thanks for using the Bank of the Moon. Don't forget!..."

While I'm impressed by how low the bill of materials is, I can't help but put into perspective the kind of skills the author of the post has, and how mind-bogglingly expensive they must be. Contrary to what the media says, it seems that the digital era has made (some) human beings more essential than ever...

The bias of HN towards web dev is pretty funny sometimes. I'd definitely be very impressed if I received these business cards; it's more novel than other PCB business cards I've seen, and it took a lot more initiative and effort, with all the sourcing and research required. But the design and integration work is something I'd expect any embedded systems engineer to be capable of, and the vast majority of those engineers are making less than all of you guys at FAANG talking about Kubernetes or whatever it is that goes on there.

After reading the post, I was under the impression that the author really just did some shopping for major components and assembled everything into a fun shape. Obviously I know nothing about electronic board design, so could you explain which extraordinary skills this required?

EDIT: OK, I missed the part where he actually designed the whole board, obviously...

Designing the layout of the PCB. Populating it with surface-mount components using home-made solder paste stencils and an oven. Selecting the MCU. Figuring out what kernel drivers are needed and what can be stripped out. Searching the web for datasheets and drivers that others have created/ported. Building the userland. Debugging and testing.

Like many pieces of technology, once you lay out the steps it starts to look as though each one, and therefore the whole, are achievable to a highly-motivated but somewhat typical person. And it's true! But even so, few people develop the whole set of skills to build something from start to finish, even notwithstanding the incredible help you can get nowadays from open-source software, consumer access to services such as PCB manufacturing, and online documentation. Even in a high-achieving community focused on building stuff, like this one, I doubt many posters here have completed as impressive an individual project.

Yes, these components literally exist to make small embedded systems.

For a few years now, there have been entire Linux systems on USB drives, for example.

Embedded systems engineers are (usually) paid a lot less than full stack engineers.

Edit: I seem to be getting (a lot!) of downvotes on this. I thought it was quite well known, but apparently not.

Full Stack Engineer salary: $93,598 [1]

Embedded Software Engineer salary: $74,758[2]

[1] https://www.payscale.com/research/US/Job=Full_Stack_Engineer...

[2] https://www.payscale.com/research/US/Job=Embedded_Engineer/S...

Disclaimer: I haven't read those links.

In my experience both as an embedded systems engineer and as a hiring manager, the salary sites miss a lot of nuance.

A large share (probably most?) of embedded systems positions are at the lower end and don't require a lot of experience, so that is reflected in the lower salary numbers. At the higher end, where more niche skills and higher levels of complexity and system integration are needed, you'll see the kind of salaries you are more familiar with.

It's also very industry dependent. I've always found it surprising that Factory Automation engineers are paid generally below a typical web development salary, but that's where a lot of embedded engineers end up working and get the lower pay that goes along with the job.

When I worked in medical devices, programmer salary didn't make a distinction between embedded, database, or UI work.

In addition, the positions aren't always broken out that way. In my day to day work I do embedded systems programming, Windows desktop stuff, some network (IoT) stuff and even a small amount of web dev when needed. My job title has never reflected any distinction in the kind of programming I did.

LinkedIn reports embedded software engineer wages are a bit higher than the number Payscale gives - https://www.linkedin.com/salary/embedded-software-engineer-s... - still lower than the full stack engineer level but certainly within the same ballpark.

I suspect the number is biased to the type of website that's gathering the data. Maybe employees of higher paying companies don't report their salaries on Payscale.

One of the big caveats is that embedded engineering is very location dependent.

Barcelona (where I live) has big demand for web/app software engineers, but embedded engineer jobs are way rarer.

For embedded engineers, getting >40k€ is just not a thing around here. In the meantime, my friends with backend experience (2y) get that kind of offer on a regular basis.

> Embedded Software Engineer salary: $74,758

> Individuals Reporting: 97

> Full Stack Engineer salary: $93,598

> Individuals Reporting: 138

I think you're going to need more than 100 or so people's self-reported salaries before you can reasonably claim what pay is like across the professions or in comparison to each other.

True. This is what I dislike about the "market". There is no reward for knowledge, hard work, solving complex problems, etc. Just "commoditization" and "supply and demand dynamics".

The five following are the principal circumstances which, so far as I have been able to observe, make up for a small pecuniary gain in some employments, and counterbalance a great one in others: first, the agreeableness or disagreeableness of the employments themselves; secondly, the easiness and cheapness, or the difficulty and expense of learning them; thirdly, the constancy or inconstancy of employment in them; fourthly, the small or great trust which must be reposed in those who exercise them; and, fifthly, the probability or improbability of success in them.

-- Adam Smith, An Inquiry into the Nature and Causes of the Wealth of Nations, "Chapter 10: Of Wages and Profit in the different Employments of Labour and Stock"


Information and skill aren't rewarded in themselves, for much the same reason that information wants to be free. If the skills themselves are both rare and valuable, that tends to change, effectively serving as a type of rent (in the economic sense). Effort in acquisition, risk in employment, trust, and the underlying attractiveness of the activity all factor in.

Smith's analysis doesn't hold in all cases, but is a good first approximation. It's remarkably poorly known, even among market advocates.

Nice! However your last two paras are a little too succinct; can you elaborate a little bit more?

There's almost certainly a book or two in there, to do it justice.

Tackling the latter: I recall listening to a BBC interview of a banker, probably in the wake of the 2007-8 global financial crisis, though the example's really an evergreen. The presenter asked how the executive justified his (it was a he) income. He responded based on value creation. Oddly, that's not a basis for compensation in a competitive market -- price should be set equal to marginal cost. Of course, if bank-executive services are a form of rent, he might be correct, but that's an interesting argument on its own.

It was a few years later when I finally got to actually reading Smith that I came across his "five following" discussion. Which a true market-capitalist banker really ought to have fully internalised. It made the interview response all the more curious.

I've also kept this in mind as other discussions of wages (particularly concerning the tech world, but also other worker classifications) come up.

On the "information wants to be free" comment: skill itself is (in part) an informational good, though one that lacks information's fundamental fungibility (we can transfer information from storage device to storage device across channels and networks, less so skills).

But like information, skill is difficult to both assert and assess. Beyond zero marginal costs, a chief reason information markets behave so poorly is that it's hard to assert high-quality information, and expensive to assess it. If you're reading a book, or article, or ... lengthy and/or cryptic HN comment ... somewhere in your head is the question "is this even worth my time?"

This is what makes tech recruiting, from both sides of the table, such a PITA. The front-line people writing ads and screening calls ... have all but no capability to assess skills. Responses are all over the map, from the absolutely unqualified to domain specialists and experts. "Expertise" itself is almost certainly a misnomer as so much information has so short a half-life -- it's generalised problem-solving and learning which are most required. And the tasks and projects to which talent is applied can itself be called into question. Take the 737 MAX development -- Boeing almost certainly would have been better for not trying to drag out the lifespan of that airframe, a decision which likely should have been made in the mid-1990s. Full costs (or benefits) of decisions cannot be known when they're made, and only manifest over years, decades, or even centuries (fossil fuel use).

Some coverage of that here: https://news.ycombinator.com/item?id=21864949


The notion of "manifest vs. latent" properties or consequences is one I've been looking at. Some earlier work by Robert K. Merton and others.

"The market" rewards short-term, highly-apparent, risk-externalising, liquidity-generating behaviours. There are inherent distortions to all of this. Skills and competence (as well as skills development, mentoring, training, preservation, etc.) are poorly served.

There's also some interesting discussion relating to this in Alvin Toffler's Future Shock, from 1970, which I'm reading for the first time. Bits drag, but there is much that's prescient.

You make some very interesting points. The Adam Smith quote points out the various axes to consider when looking at the "viability" of something in the market, which is quite correct. But there is another wrinkle added by the "modern" world. I presume you are aware of the works of Nassim Taleb. He makes a distinction between "Extremistan" and "Mediocristan" (and lambasts economics and economists :-) which I feel is very applicable to understanding "The Market". With the current "Information/Data" revolutions, the frequency and impact of "Extremistan" events have increased dramatically, and existing economic models are no longer sufficient to explain them. For example, what exactly is the "marginal cost" of software? How do we explain the rise of "The Internet" as a major economic driver? I lean towards the viewpoint that this dichotomy is applicable to the whole Software/IT industry. Thus the "Embedded Systems Industry" has moved from "Extremistan" to "Mediocristan" (commoditization, cost reduction, large supply pool, etc.) while "Web Development" is still in "Extremistan", and this gets reflected in their respective salaries.

Thanks for the extremistan/mediocristan reference. I've read some Taleb, but not much and not deeply, and that's a valid point.

Financialisation and capital accumulation allow tremendous inequalities in allocation. These have always been present to some degree, but recent history (say, 1800 - present) seems exceptional. Note that arguably the Gilded Age personal fortunes were probably even more extreme than those of today.

Unless I'm missing something, the marginal cost of software publishing is very near zero. Development is ... more expensive.

The Internet's role is a whole 'nother discussion, but pulling in a notion of an ontology of technological mechanisms (see: https://ello.co/dredmorbius/post/klsjjjzzl9plqxz-ms8nww), it's a network, with network effects, as well as a system. These ... tend to define general parameters and considerations.

Where specific industries fall on the mediocristan / extremistan scale .... is an interesting question. Though I'd argue that systems/networks elements tend toward extremistan. Whether hardware is/isn't systems/network (as I'm using the terms) is another question.

For successful fundamental component manufacturers (e.g., Intel, AMD), hardware certainly resembles extremistan.

Thanks for the ideas and the discussion.

Just FYI: you might want to look into the works of Jacques Ellul on technology and society if you have not done so already.

PS: Also a good starting point: https://en.wikipedia.org/wiki/Technology_and_society

This is true; I had to turn down various jobs in big cities due to very low pay. Guys looked like I'd just slapped them when I named my price (having previously done full-stack).

Also, rocket scientists are paid less. It's a supply and demand market. There are simply more people wanting to build rockets and IoT gadgets than there are people willing to do a CRUD website for an adtech company.

Don't know why the downvotes either. It's not intuitive, but it's a sad fact of life and just the way supply and demand works.

Yeah, maybe it's a "I don't like this" vote. I tend to agree - I find embedded software engineering skills quite intimidating. Nevertheless, the market demand just isn't as high.

Where's the mystery? Someone looks at what this guy did for his business card and, to paraphrase, says "Wow, look at this guy's skills. He must charge a very, very non-average amount of money for them."

Your response to this is "But actually, according to payscale.com, an average embedded software engineer only makes X."

1. Nobody likes but-actually posts.

2. Quoting average salary when someone is commenting on how non-average a guy's skillset is isn't really relevant.

I get the "but actually" thing, but I don't think it was that.

The OP said the skills are "mindbogglingly expensive". From personal experience they aren't (I worked in a team with embedded engineers with similar skills to those required for this).

I thought this was well understood - it's been discussed here a number of times.

I added the average salaries after getting voted down to -2 and that turned around the voting.

While there are a number of issues with average salaries I think it's notable that no one is claiming the opposite.

> but a sad fact of life and just the way supply and demand works

It's only hubris that makes us discount the importance and difficulty of, say, web client design and implementation.

How many developers on HN pooh-pooh "UX" yet don't even have a single fleshed-out README on any of their products because it's "not important"?

Turns out raw low-level tech skills aren't the be-all of software and business.

I am sorry, you are very wrong.

A complete "Embedded System" (HW+SW) is orders of magnitude more complex than "Web Client Design and Implementation" (unless you approach Google/Facebook scale). The malleability of SW means you can quickly try out various techniques without too much "learning overhead". With all the existing frameworks/libraries/etc., it becomes more of a "plug and play" proposition. Not so with embedded systems. You have to know EE/electronics, programming interfaces to HW, the bootloader and OS kernel (if using one), complex toolchains/ICE/simulators/debuggers, and finally the application. Each is a huge domain in itself and has a significant learning curve. To have all of them in "one head" is remarkable and non-trivial.

Be careful with that statement.

An "embedded system" can be as simple as "when this button is pressed for 3-5 seconds, turn on this light for 30 seconds." That hardly requires much of a skillset.

And as far as having all the things you mention in "one head," every member of my team can handle that easily and we're not particularly noteworthy!

I think in a sense you are proving my point. Real knowledgeable people have done the hard work to simplify much of technology so that it is easy to get started on. But the moment you are past the "commoditized" part, the "learning ramp" for an embedded system becomes exponential. Not so with pure higher-layer software apps. There is also the fact that an "embedded system" spans a very broad domain and hence requires a broad spectrum of knowledge and skills, e.g. developing a network appliance vs. a bread toaster.

In my own experience moving from pure application software development (though not web development) to network appliances to lower-level embedded development, I have been amazed at all the "hidden complexity" you suddenly become exposed to. And this is just the software part. If you get into the hardware part, you have a whole other world of knowledge to learn. Merely doing something without understanding it is the difference between a "Technician" and an "Engineer/Scientist".
