"Looks like you could cut yourself on it," Rydell said.
"You could, many no doubt have," said Karen Mendelsohn, "and if you put it in your wallet and sit down, it shatters."
"Then what's the point of it?"
"You're supposed to take very good care of it. You won't get another."
I've yet to read anything by Gibson that approaches anything by Stephenson
As an amateur programmer who has never physically handled hobby microcontroller boards, but has read a lot about them, I found it amazing to be able to try out the emulator at https://micropython.org/unicorn/
Make the LEDs blink, see how servo control works, etc., all in the browser without having to buy any hardware! I had never seen anything like this.
I would love it if others could point me to more advanced emulators of hardware microcontrollers (if such a thing exists!)
MakeCode is also similar to what you describe - check out Adafruit's Blockly/JS simulator:
Note that CircuitPython is Adafruit's fork which focuses more on education than industrial prototyping. So even within the embedded Python ecosystem, you have options.
Embedded development is so easy these days, it is just fantastic. The barrier to entry, in both cost and complexity, is so low now that just about anyone can apply physical automation to the problems that they see in their life and community. I'm excited to see what emerges in the next couple of decades.
Edit: and you can program it with the Arduino IDE
My first PC had a 66/33 MHz 80486 processor, 8 MB of RAM, and a 320 MB HDD. You could run AutoCAD, play Doom and Civilization, dual-boot Linux and Windows 3.11, etc. It could compile Linux from source.
I'm pushing 40, and still shake my head in wonder sometimes about just how much is possible at such small scale and cost.
But this is now turning into "when I were a lad..." :)
My XT had an 8088 @ ~4 MHz in it, and we pushed it uphill to school both ways, in the snow!
It was multi-functional though. The power supply was inefficient enough to heat the room, while simultaneously being heavy enough to bludgeon large creatures for sustenance.
Something inside me misses the radiant heat, sitting here on a cold day with my chilly metallic laptop.
(Not even joking, by the way - literally how I got into computers and programming.)
you kids and your petabytes of RAM! back in my day....
(i'm looking forward to these comments being commented on at a certain website i'm not supposed to name)
ZX81: Z80/1KB RAM... more recently though PIC12 (C501 IIRC) ...
Ran Windows 3.1, Civilization (VGA graphics), Railroad Tycoon, Star Trek: 25th Anniversary - and could have them all installed at the same time. Other programs included Championship Manager 93 and, I think, Day of the Tentacle.
But it wouldn't run Linux - that needs a 386 or above.
We ran Windows 3.0 and my father configured a boot entry for running Doom or Ultima 7, which required so much memory that it wasn't possible to run Windows, only DOS.
I remember feeling a bit envious of my neighbor, who had the more powerful 486, which could run Doom much faster than our PC could.
Worst part was that my dad, in his wisdom, bought a family PC without a Pentium, but with a Cyrix P166 instead. Its floating-point performance was practically nonexistent, so it ran like a damn dog on any 3D game.
Any Brits out there might remember Time Computers: on the back page of every newspaper, selling PoS computers with whatever subpar hardware they could cram into a big beige box ;-)
If I remember right it was almost $3k...but that included that blazing fast 300bps Hayes modem for wide-area networking.
And my mind was truly blown when we replaced the 2nd floppy with a more-memory-than-we-will-EVER-need 20MB hard drive...what could we do with all those bits??
Wow, I admire the far-sightedness of your parents, to give a child a computer at such a young age (and in those early days of PCs).
I was of a similar age when my dad brought home an NEC PC-9801 - with an 8086 CPU (8 MHz), 640 KB RAM, a kanji (~3000 characters) font ROM, even a Japanese word processor. I think it ran MS-DOS 2~3.
"In 1987, NEC announced one million PC-98s were shipped."
That was a big wave, and I'm so glad my parents let me play with a computer as a "toy" - it was a huge influence on my mental development.
Kinda like the monolith moment in 2001: A Space Odyssey. :)
My friends around that age all had a wildly popular video game console called the Nintendo "Famicom". https://en.wikipedia.org/wiki/Nintendo_Entertainment_System
My parents refused to buy it, and instead let me play with the PC-98, where I learned BASIC, Turbo Pascal, even some 8086 assembly language.
I suppose if I have children, I'd discourage mobile phones/apps and instead give them Raspberry Pi, microcomputers, sensors, devices they can build stuff with.
First one I remember was a Commodore 64, along with a 600-page book full of BASIC that you could type in and record on cassette to have your own game. The book was total gibberish to anyone else; it was just pure code with no explanation. But that's what the C64 gave you: an interactive environment on boot where you could program a new game and write it to a cassette. By default. If you wanted to play a game you had to type `RUN` or maybe one or two other things to set it up. But you wouldn't know that, because you just had an interpreter on a basic blue and white screen.
Worst bit was the 10 minutes of spasmodic strobe animations that showed you the game was loading - and each game controlled its own loading animation. You had to know what game you wanted to play, and be sure of it; otherwise you could always flip to the B-side and get a different game.
After that I think we had a BBC Micro at school but I'm not sure. All I remember is an adventure game and one of those classic 5.25" floppies. I still really love the look and feel of inserting a floppy and manually locking it in. Floppies and cassettes and VHSs and MiniDiscs were truly fantastic for the time. They were still mechanical, unlike CDs.
Then on my dad's side I and my siblings got an Acorn PC and a bunch of random floppies. None of them made any sense but some of them did cool things when you ran them. I remember hiding from my family and putting in the floppies that made flashing colours and watching it until time passed.
Must have been 11 or 12 years old before we first got a PC and by that point I was utterly fascinated. It was some shitty off-the-shelf eMachines thing but it was the best we could get; I managed to retrofit a decent graphics card in it a little bit later.
8 MB was pretty extravagant, but it turned out to be a good call even though it could be had for half the price within a few years.
Not to mention that if you build the kernel regularly, you benefit from incremental compilation. If you change a few non-header files, the rebuild time can be as little as 2-3 minutes. Oh, and "make localmodconfig" will reduce your from-scratch compile times to the 5-15 minute mark. I highly doubt most kernel devs are building a distro configuration when testing (especially since they'd be testing either in a VM or on their local machine).
Like, for example?
Take Delphi, for example, which can compile millions of lines of code in under a minute or something ridiculous. Then we have D, Go, Rust, and the like: they compile rather large codebases - ones that would take C++ a good 30 minutes on today's high-end hardware - in much shorter spans of time (I'm not as familiar with how fast Rust compiles, but I know Go and D do pretty nicely, especially if you enable concurrency for D), and would probably take those same 30 minutes on high-end hardware from about a decade ago.
Sadly, from what I have heard, C/C++ has to run through source files numerous times before it finally compiles the darn thing into anything meaningful. Facebook had a faster preprocessor written in D to speed things up for them; it technically didn't have to be written in D, but Walter Bright wrote it for them, so he got to pick the language.
In the early days, clang was much faster at compilation than gcc. Over the years it has improved its optimization output, but as a consequence has lost much of that compilation-speed advantage.
There are many examples you can try on https://godbolt.org/ to see how much work the optimizer does. As an example, the http://eigen.tuxfamily.org library relies on the optimizer to generate efficient code for all sorts of combinations of algorithms.
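Even a trivial loop makes the point (my own toy example, nothing to do with Eigen): paste it into godbolt and compare -O0 with a higher optimization level, and you'll typically see the loop unrolled and often vectorized.

    /* Toy example for the compiler explorer: at -O0 this is a naive
       scalar loop; at -O2/-O3 most compilers unroll it and often
       vectorize the reduction. */
    int dot(const int *a, const int *b, int n) {
        int acc = 0;
        for (int i = 0; i < n; i++)
            acc += a[i] * b[i];
        return acc;
    }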
C and C++, even though they have header files, place no restriction on what goes in a header file, so it takes a while to figure out the dependency graph.
But the preprocessor isn't what makes compiling C++ slow. It's a combination of the complex grammar, and ahead of time optimization of a (mostly) static language. Turns out you can compile code really fast if you don't actually try to optimize it.
That was on a second-hand Pentium-S. I probably wasn't doing it right.
Linux 4.x is over 150 megs, gzipped. Just the lines in the arch/x86 folder are more than the total lines in Linux 1.0.
From there, you can decide where your interests might be. To continue with Arduino hardware but without the IDE, I'd recommend _AVR Programming_ https://www.oreilly.com/library/view/make-avr-programming/97...
For other low level programming, Atmel's AVR Studio and their dev boards are incredibly newcomer friendly. Their source level debugger (with the debug coprocessor) is a miracle.
If you'd like to get into "big iron", embedded Linux is amazing. Raspberry Pi is a great start (lots of GPIOs programmable from userspace). To get into embedded Linux kernel programming, start with a loadable module on the Ras-Pi. Also, build/load your own Ras-Pi kernel.
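A first loadable module really is tiny - something like this canonical hello-world (build it out-of-tree against the Pi's kernel headers with a one-line "obj-m += hello.o" Makefile, then insmod/rmmod it and watch dmesg):

    /* hello.c - minimal out-of-tree loadable kernel module */
    #include <linux/init.h>
    #include <linux/kernel.h>
    #include <linux/module.h>

    static int __init hello_init(void)
    {
        printk(KERN_INFO "hello: module loaded\n");
        return 0;
    }

    static void __exit hello_exit(void)
    {
        printk(KERN_INFO "hello: module unloaded\n");
    }

    module_init(hello_init);
    module_exit(hello_exit);

    MODULE_LICENSE("GPL");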
Other folks suggested the ESP8266 which is also great fun to use.
Edit: Learn C. Lots and lots of C. Embedded dev is almost all C. A teeny bit of ASM, sometimes C++. But almost all of it will be in C.
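To give a flavour of what that bare-metal C looks like, here's a rough blink for an ATmega328P (assuming the on-board LED on PB5, i.e. Arduino pin 13, and a 16 MHz part - F_CPU would normally live in the Makefile rather than the source):

    #define F_CPU 16000000UL        /* 16 MHz clock, as on an Uno */
    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= (1 << DDB5);        /* PB5 (Arduino pin 13) as output */
        for (;;) {
            PORTB ^= (1 << PORTB5); /* toggle the LED */
            _delay_ms(500);
        }
    }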
Somewhat related, I've also used buildroot in both AWS and GCP to run workloads from read-only system images. Quite liberating in my opinion. No ssh, no ansible, etc. Build the image, launch it and off it goes. GCE even allows you to use the same boot disk, if mounted read-only, for multiple instances - perfect for these types of images.
At the moment, I'm doing a structured C book course (C: A Modern Approach), and I've also signed up for the edX course "Embedded Systems - Shape The World: Microcontroller Input/Output"
It uses the TM4C123 board, which from a look around seemed to be a decent enough board for a beginner. I'd seen complaints about Arduino, but I'm not experienced enough to judge the validity of those claims.
Either way, I'm having fun. Not sure if I'd switch careers, as web dev is full of decent-paying jobs, but it's nice to have a hobby that's different from the day job.
And if you don't use the Arduino ecosystem, but instead run an RTOS on an Arduino, it's a perfectly valid dev board for learning real embedded systems development.
I'm mostly just using it for the course which was highly recommended in multiple places, so I hope I won't encounter many of these bugs.
To be honest, the whole picking a board thing was quite overwhelming, with many recommendations, boards, variations, etc. Hopefully once I've finished the course I'll have some more knowledge to help me pick the next board.
ESP8266s can also run NodeMCU, which is Lua, but Node.js stuff translates super easily: https://nodemcu.readthedocs.io/en/master/
They have WiFi, and the ESP32 is actually very powerful and has two cores. They make for great DIY home automation/IoT devices and are pretty easy to work with.
Second this. The ESP8266 (e.g. on a NodeMCU board) is about as powerful as an Arduino Uno, is about half the size of a credit card, is very thin, and costs around $3.
The wifi part is the biggest advantage. You can send sensor data directly to your server via simple HTTP requests.
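That request really is just a few lines of text over a TCP connection. Purely as an illustration, here's the idea in plain C with POSIX sockets (the server address and /log path are made up); on the ESP8266 itself the SDK or Arduino libraries wrap this part for you:

    /* Illustrative only: send one sensor reading as an HTTP GET.
       192.0.2.10 and /log are placeholder server details. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int send_reading(float temperature)
    {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0)
            return -1;

        struct sockaddr_in server = { .sin_family = AF_INET,
                                      .sin_port   = htons(80) };
        inet_pton(AF_INET, "192.0.2.10", &server.sin_addr);

        if (connect(fd, (struct sockaddr *)&server, sizeof server) < 0) {
            close(fd);
            return -1;
        }

        char request[128];
        snprintf(request, sizeof request,
                 "GET /log?temp=%.1f HTTP/1.1\r\n"
                 "Host: 192.0.2.10\r\n"
                 "Connection: close\r\n\r\n", temperature);
        write(fd, request, strlen(request));

        close(fd);
        return 0;
    }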
…and read this: https://docs.rust-embedded.org/book/
As for what microcontroller to actually learn on, I would say the MSP430 is a very good starting point. It's a fairly mundane 16-bit RISC microcontroller series with very forgiving electrical design requirements, very good documentation, and very good community support. They make a devboard (TI calls them Launchpads) for the MSP430G2553 that's more than enough to get a beginner started. When you need a little more power, you can either opt to invest in learning the ARM ecosystem, or go for something a little more exotic. Just about every manufacturer makes an ARM microcontroller of some sort, so if that's what you're interested in, take your pick and go with it. If you're looking for something else, the Renesas RL78 and RX series provide a lot of functionality if you're willing to deal with how stodgy Renesas can be.
Some important notes:
1.) Don't bother with Arduino. They were a much more compelling product 15 years ago, when you had to pay thousands in tools and compiler/environment licensing to get into embedded development. Today, what Arduino nets you is a painfully barren environment that abstracts away a lot of what you're trying to learn when you're starting out. Losing out on the debugging, profiling, tracing, disassembly, memory usage statistics, etc. that modern development environments give you will do nothing but stunt your growth, especially if you're used to having all these tools while writing desktop software.
2.) Be careful with (or preferably just avoid) starting with embedded Linux; it's pushing the limits of "embedded". You're going to miss out on a lot of important knowledge and insight jumping straight into using an operating system (and a very heavy one at that), and for many applications, it is MASSIVE overkill. When you start, you're not going to need an RTOS. When you need an RTOS, you're going to reach for something more reasonable, like FreeRTOS. If FreeRTOS doesn't cut it, then you can start looking at Linux.
3.) Don't get tangled up with Raspberry Pis; the microprocessor on these is complex and the documentation is severely lacking/nonexistent. RPis are much closer to a desktop computer than they are an embedded system.
If you really want to get it, I would say one of the most useful exercises is implementing your own microcontroller/processor. You can pick up an FPGA devboard for fairly cheap, and there are plenty of textbooks (Harris & Harris' Digital Design and Computer Architecture comes to mind) that will get you through most of the key concepts. Once you've done this, a lot of the gaps in understanding when dealing with a microcontroller will be filled in. This exercise isn't strictly necessary, but I don't know anybody who has done it that wasn't better off for it.
My final note is to buy or "acquire" Horowitz and Hill's The Art of Electronics. Embedded development is inseparable from electrical engineering, so even if you don't read it front to back, there are some sections that you will definitely be visiting if your background isn't in electronics.
I wouldn't say that. Yes, the Arduino "libraries" abstract away a lot of the complexities and hinder true understanding. However, for a beginner it is a perfect platform simply because of the huge community, i.e. tutorials, code and projects. Once they gain confidence from using this, they can move on to "traditional embedded development" by learning to program the underlying AVR directly, i.e. use the toolchains installed by the IDE, delete the bootloader and upload your program using the ISP interface (see the book "Make: AVR Programming"). This gives you the best of both worlds using the same development platform. Another advantage is that many other MCU families also provide an "Arduino-like" interface (e.g. Energia for MSP430), and thus all the skills learnt on one can be transferred to another.
If you're already writing software, you're not going to be struggling with having to learn programming. Translated to Arduino, this is great for engineers who don't write software and just need to get simple shit done fast, but for those with a background in software, it feels equal parts mundane (I called analogRead and... got an ADC value, just as expected) and magic (what did analogRead actually do?). Granted, you can go look at the source for these libraries, but a lot of them make very generous use of the preprocessor for the sake of supporting multiple products in one codebase, and it's often not pleasant to read. Working with an MSP430 (or basically any architecture if you're using more capable tooling, like you mentioned with AVR) has a certain clarity to it that the Arduino ecosystem just doesn't seem to capture.
I would make the same argument for AVR and the "real deal" tooling as I do the MSP430; why bother with Arduino when you're probably coming in with decent programming skills?
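To illustrate the "what did analogRead actually do?" part: on an ATmega328P it boils down to roughly this register dance (a simplified sketch of the idea, not Arduino's actual source, and without the pin-number mapping):

    #include <avr/io.h>
    #include <stdint.h>

    /* Single-ended read of ADC channel 0..7 against the AVcc reference. */
    uint16_t adc_read(uint8_t channel)
    {
        ADMUX   = (1 << REFS0) | (channel & 0x07);             /* AVcc ref, select channel */
        ADCSRA |= (1 << ADEN)
                | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0);  /* enable ADC, clock/128    */
        ADCSRA |= (1 << ADSC);                                  /* start conversion         */
        while (ADCSRA & (1 << ADSC))                            /* wait until it finishes   */
            ;
        return ADC;                                             /* 10-bit result, 0..1023   */
    }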
Pure software programming skills are NOT enough when it comes to embedded programming. You need to know the logical interface to the HW and the EE/electronics behind the HW itself. This is a huge challenge for people who have only worked with applications software over layers of abstraction. This is where the Arduino shines: it hides the overwhelming HW complexities and provides a simple API for software guys to get their job done. This allows them to slowly gain HW knowledge and transition to "traditional embedded programming" as needed. Also, many experienced embedded developers use the Arduino as a rapid prototyping platform to implement an idea quickly and get clarity on the project before implementing it on the actual target by traditional means. Learners can do both on the same Arduino platform.
So here is my recipe for somebody wanting to learn embedded development:
1) Get an Arduino Uno, a couple of extra ATmega328Ps, an AVR programmer and an electronic components kit.
2) Get a book on Arduino programming and one on AVR programming. I can recommend "Exploring Arduino" and "Make: AVR Programming". You also need a book on electronics, and "Practical Electronics for Inventors" is a pretty good one.
3) Install the Arduino IDE and go through some blinky tutorials. Do some projects from "Exploring Arduino".
4) Now move on to direct AVR programming. The Arduino IDE has already installed the required toolchains. Set up a Makefile environment using them, following the instructions in "Make: AVR Programming". Do some blinky and other projects from this book. This gives you a command-line approach to AVR programming.
5) Next, repeat the above using a proper vendor-supplied IDE, e.g. Atmel Studio for AVRs. These IDEs are pretty complex but quite powerful and used extensively in the industry. Go through some tutorials and redo the blinky and other projects using the IDE.
6) Get some test equipment to look into the inner workings of the system. I recommend the multi-functional "Analog Discovery 2", which has lots of learning tutorials.
Congratulations; you are now a bare-metal "Embedded Developer"!
To enter the "big boys league", move onto ARM Cortex boards.
Finally you get to "Embedded Linux" and become a "Master Super-Duper Embedded Developer" :-)
I can't tell you if this is "the way", but I can tell you how I started. I flipped a coin to decide between Altera and Xilinx and started with a Terasic DE2 board (an Altera Cyclone II devboard) that I borrowed from my university. I don't recommend using this board or something like it (it has actually been superseded by a nearly identical board with a Cyclone IV in place of the dated Cyclone II); the extra peripherals are a headache more than anything, and the simple breakout pins on the Terasic DE0-Nano are greatly appreciated. As for the environment, you can go and download Quartus from Altera basically no-questions-asked.
After pissing about with the board for a bit, I decided to pick up a book. Rapid Prototyping of Digital Systems by Hamblen, Hall, and Furman is good, if a bit out of date. I had this book to thumb through rather than read front-to-back. It is written for older versions of Altera's Quartus software, but with a little bit of exploring the UI I was able to find just about everything I needed. It makes a decent quick reference for Verilog and VHDL syntax, has quite a bit of info on interfacing various things to an FPGA (think PS/2 mouse, VGA, etc.), and a couple of chapters on microcontrollers and microprocessors.
An important bit to know is that a lot of the usage of an FPGA (at least in the way I use them) happens in two phases: the actual logical design, which you'll do on your computer via simulation, and the actual usage of that design on the FPGA. The logical design happens largely in ModelSim (with Quartus only being used to generate libraries at this stage), or some other Verilog/VHDL simulation tool. Altera ships ModelSim free of charge with Quartus. It is your run-of-the-mill Tk-based clunker of an EDA tool, but it works and I haven't had it crash on me yet, so I can't really complain, even if I have some gripes with it. This is where most of the work/magic happens, at least for a beginner: you write your logic in your editor of choice, write testbenches, and simulate those testbenches in ModelSim. Having read the Harris and Harris book mentioned in my initial post in the past, I took a quick look at a processor called "Ahmes" on GitHub, and just had a go at it. After getting a primitive processor working/executing instructions, adding some peripherals like timers is where things started to come full circle, and I started realizing both the power of the FPGA and why things are the way they are on a microcontroller/processor.
The bit where you actually put it on the FPGA hardware is largely uneventful, or at least it was for me. I didn't have multiple clock domains or anything like that, so I didn't suffer from any timing issues, and basically had the good fortune of just mapping pins and letting it rip. In theory, actually translating the design to the chip doesn't do much, but in practice, a design that exists only on your computer isn't terribly useful. Actually using the hardware adds that "visceral feeling" and lets you play with what you've spent so much time doing, along with getting you familiar with the process if/when the day comes that you actually need to do it. You also get to enjoy/experience the power of bolting on arbitrary hardware to your jalopy processor for the sake of VGA output, GPIO, timers, SPI communication, etc.
I wouldn't consider myself terribly talented or knowledgeable, so if you just throw enough time at it, you can probably end up having just as much fun as I did.
What alternatives are there then?
I like the ESP boards from Wemos. The US made ones from Sparkfun and Adafruit are also really good. The advantage of the Wemos ones (and the many counterfeits you'll see on sites like Aliexpress) is that they're so cheap you don't even have to feel bad if you fry one.
 Espruino supports ES5, XS supports ECMAScript 2018
I found a decent port of uClinux to the Cortex-M3 once, but the host core was 120 MHz and it was pretty much a science experiment and not much more.
Why are you arguing about having standard peripherals? It doesn't let people learn about I/O.
I was in high school and we were given embedded hardware and were able to program it by writing assembly and uploading a program to it.
If you give students some very minimal embedded hardware with wifi and some terminal, they should be able to learn using that. Let them install things, maybe set up a 2D interface...
The RPi's educational value is only enabled by its I/O pins. I'm not sure that's really worthwhile.
The RPi is interesting because it's a powerful but cheap computer. I just wish there was much cheaper hardware to show people you can also do things with smaller stuff.
Have you checked out some of the alt pi boards like Orange Pi or Onion Omega2Plus?
As to framing me, well... I don't think it's much easier than framing me using some other method. And I think the cross section of people that want to frame me, and people who know how to compile Linux, is hopefully very small.
I had the same thought back then!
More seriously, I never managed to understand the "obsession" with RoHS (this here is just an example):
>I made sure to have a lead-free process—the boards, parts, and solder paste are all RoHS—so that I wouldn't feel bad about giving them to people.
I mean, you are creating something that (besides being very nice) is not needed, that will very likely end up in the trash soon (and that in any case contains a few grams of "far from clean" materials), and you worry about the tiny amount of lead in the solder?
Lead causes permanent brain damage in children. There is no detectable amount of lead in the blood of children which is considered safe. While an adult body absorbs roughly 95% of lead into its bones, children's bodies end up storing lead in blood and soft tissue where it causes damage. When untreated, it can take months for lead to exit a child's bloodstream.
I would never consider reducing the lead used in something I've built to be wasted time.
Is it safe for children to solder with leaded tin? (Many people here did so as a child, I suppose)
I used lead based solder as a child, and I turned out okay. That said, people used to use pewter plates and got sick from lead poisoning after eating tomatoes (the acid leached the lead from the plate); Roman soldiers would keep their wine in a lead container to sweeten it; pretty sure I've heard something about people painting their houses with leaded paint as well. On the one hand, society survived all that. On the other hand, if I have the opportunity to not risk poisoning a child by spending a couple extra dollars buying solder, that seems like a pretty easy decision to make.
As you say it is just a matter of slightly increasing the soldering iron temperature, but it doesn't end there and there may be issues further on (JFYI):
It seems like lead-free alloys tend to be more problematic in these - fortunately rare - cases.
But because RoHS came into force more than 15 years ago, in 2019 I read "and I made it lead-free" the way I would read "and I buckled my seat belt" or "I put my helmet on", etc. I see it as "the normal" way, nothing worth a mention (nowadays).
Very possible, there are a number of (pretty much wide) exemptions even in the last version:
In any case it is safe to touch, despite the people downvoting me:
More considerate would be to provide the card in a protective sleeve (that would protect from all the bad, bad substances it may contain) or just use a good ol' plain paper business card, possibly printed with "bioethical" ink.
You get it on your hands and then ingest it when you eat. (People don't usually wash their hands after touching a business card.)
I was referring to the parent poster that talked of people putting them in their pockets.
Anyway, RoHS (or lead-free) is a good thing, but it is not like you get lead poisoning (or saturnism) because of the one time in your life you touched something containing lead and then - if you eat with your hands - managed to ingest it:
You need to drink water from lead or lead-soldered pipes, or water otherwise contaminated by lead, to be poisoned by it.
In the case of the Linux business card, the exposed surface that would contain a minimal amount of lead - pre-RoHS solder, the common eutectic Sn-Pb alloy, is 37% lead - is in the below-1-square-mm range.
You would need to actually lick hundreds or thousands of such cards to ingest any meaningful amount of lead.
And lead-free solder may contain (in minimal amounts) bismuth and antimony.
I will forever feel guilt about letting him play in the yard as a baby (my best guess as to where it came from) and take other efforts to reduce the lead in his environment.
Depending on your life situation, it may be impractical if not impossible to reduce the lead in your environment. I am open to being further educated on this subject, but it seems like reducing lead in all products that will be handled by people is a net benefit for getting to home environments that will not cause brain damage to the infants that live in them.
Right, but how are you going to put it in your pocket without touching it with your hands?!
>You would need to actually lick hundreds or thousands of such cards to ingest any meaningful amount of lead.
Lead poisoning is cumulative, so you want to avoid ingesting even very small amounts. I doubt that good data is available regarding the amount of lead that would be ingested in this scenario or its potential effects. Best to be cautious. Lead free solder is not expensive.
HASL is perfectly usable with stencils, in my experience.
How else would one do forensics?
Seriously, please choose lead-free.
Pregnant women weren't allowed to eat in the cafeteria due to the high number of miscarriages. The people inhaling the stuff all day... let's just say they had a reputation among the other factory workers for being a bit mentally challenged.
Good to see you here - congratulations on your idea and creation.
I replied to that in another post: RoHS is 15 years old or more, and most available parts and solders are RoHS compliant. I believe you would have had a tough time intentionally making something that was entirely non-RoHS-compliant.
This is the responsibility of the creators of the USB hardware, firmware, motherboard firmware, electrical design, OS developers / driver developers.
It should be safe to plug in a USB device by default.
In practice it can be safe on multiple levels. He could've also just put some malware in a PDF or Word document, as Word is what recruiters want.
USB-C's midplate is specified at 0.7 mm, and all of the cheap Chinese PCB manufacturers only offer 0.6 mm and 0.8 mm. 0.6 mm is too thin to make reliable contact with most of the cables I've tried. I haven't gotten around to ordering another batch at 0.8 mm, but plugging the corner of a 0.8 mm M.2 SSD into a USB-C cable, it's pretty (i.e. probably slightly too) snug. Thin PCBs are also extremely flexible, which isn't great for brittle solder joints, ceramic capacitors, etc.
Edit: I remember something, having done work related to embedded stuff. About 6 years ago, we still had 3.5" floppy drives in a couple of scopes in the lab. We replaced those with a device that internally had the interface of a 3.5" floppy drive, but on the outer faceplate it had a USB-A connector for thumb drives.
That means you need to create Type-A peripherals if you want the largest audience.
Practically, even though you can find them as you say, the issue is that as a peripheral vendor in the West you can't ship a C device with a C-to-A plug, because you'll lose your license.
Having said that, I will say the USB C connector on my phone is far superior, as I plug/unplug it daily.
Power and bandwidth aren't even related to connector type.
You shouldn't need to remember orientation or look at it! You're apologising for a bad design.
> Power and bandwidth aren't even related to connector type.
Not true! For example, the max bandwidth you can currently put through USB-A is 10 Gbps with USB 3.1. Through USB-C it's up to 40 Gbps with Thunderbolt 3.
When I first got my MacBook Pro, I just got USB-C cables for my external monitor and my (USB-A) hub. That was very cheap, but I love the convenience of Thunderbolt, so I eventually shelled out the cash.
> It has a USB port in the corner. If you plug it into a computer, it boots in about 6 seconds and shows up over USB as a flash drive and a virtual serial port that you can use to log into the card's shell.
The parent of this thread has a valid point here, and the security implications of this are nefarious if one isn't careful.
While it is a technically cool, fun project, I can imagine a bad actor taking this to DEFCON/CCC or even a tech-centric concert and mass-producing something that emulates a USB drive but is also a keylogger or a remote-access tool of some sort, which reminds me of the nightmare of BadUSB.
Now, where is the predictable sandboxing guy to counter your predictable "security implications" argument?
EDIT: ok, I missed the part where he actually designed the whole board, obviously...
Like many pieces of technology, once you lay out the steps it starts to look as though each one, and therefore the whole, are achievable to a highly-motivated but somewhat typical person. And it's true! But even so, few people develop the whole set of skills to build something from start to finish, even notwithstanding the incredible help you can get nowadays from open-source software, consumer access to services such as PCB manufacturing, and online documentation. Even in a high-achieving community focused on building stuff, like this one, I doubt many posters here have completed as impressive an individual project.
For a few years now there have been entire Linux systems on USB drives, for example.
Edit: I seem to be getting (a lot!) of downvotes on this. I thought it was quite well known, but apparently not.
Full Stack Engineer salary: $93,598 
Embedded Software Engineer salary: $74,758
In my experience both as an embedded systems engineer and as a hiring manager, the salary sites miss a lot of nuance.
A large share (probably most?) of embedded systems positions are at the lower end and don't require a lot of experience, so that is reflected in the lower salary numbers. At the higher end, where more niche skills and higher levels of complexity & system integration are needed, you'll see the kind of salaries you are more familiar with.
It's also very industry dependent. I've always found it surprising that Factory Automation engineers are paid generally below a typical web development salary, but that's where a lot of embedded engineers end up working and get the lower pay that goes along with the job.
When I worked in Medical Devices, programmer salary didn't make a distinction between embedded or database, or UI work.
In addition, the positions aren't always broken out that way. In my day to day work I do embedded systems programming, Windows desktop stuff, some network (IoT) stuff and even a small amount of web dev when needed. My job title has never reflected any distinction in the kind of programming I did.
I suspect the number is biased to the type of website that's gathering the data. Maybe employees of higher paying companies don't report their salaries on Payscale.
Barcelona (where I live) has big demand for web/app software engineers, but embedded engineering jobs are far rarer.
For embedded engineers, getting >40k€ is just not a thing around here. Meanwhile, my friends with backend experience (2 years) get that kind of offer on a regular basis.
> Individuals Reporting: 97
> Full Stack Engineer salary: $93,598
> Individuals Reporting: 138
I think you're going to need more than 100 or so people's self-reported salaries before you can reasonably claim what pay is like across the professions or in comparison to each other.
-- Adam Smith, An Inquiry into the Nature and Causes of the Wealth of Nations, "Chapter 10: Of Wages and Profit in the different Employments of Labour and Stock"
Information and skill aren't rewarded in themselves, for much the same reason that information wants to be free. If the skills themselves are both rare and valuable, that tends to change, effectively serving as a type of rent (in the economic sense). Effort in acquisition, risk in employment, trust, and the underlying attractiveness of the activity all factor in.
Smith's analysis doesn't hold in all cases, but is a good first approximation. It's remarkably poorly known, even among market advocates.
Tackling the latter: I recall listening to a BBC interview of a banker, probably in the wake of the 2007-8 global financial crisis, though the example's really an evergreen. The presenter asked how the executive justified his (it was a he) income. He responded based on value creation. Oddly, that's not a basis for compensation in a competitive market -- price should be set equal to marginal cost. Of course, if bank-executive services are a form of rent, he might be correct, but that's an interesting argument on its own.
It was a few years later when I finally got to actually reading Smith that I came across his "five following" discussion. Which a true market-capitalist banker really ought to have fully internalised. It made the interview response all the more curious.
I've also kept this in mind as other discussions of wages (particularly concerning the tech world, but also other worker classifications) come up.
On the "information wants to be free" comment: skill itself is (in part) an informational good, though one that lacks information's fundamental fungibility (we can transfer information from storage device to storage device across channels and networks, less so skills).
But like information, skill is difficult to both assert and assess. Beyond zero marginal costs, a chief reason information markets behave so poorly is that it's hard to assert high-quality information, and expensive to assess it. If you're reading a book, or article, or ... lengthy and/or cryptic HN comment ... somewhere in your head is the question "is this even worth my time?"
This is what makes tech recruiting, from both sides of the table, such a PITA. The front-line people writing ads and screening calls ... have all but no capability to assess skills. Responses are all over the map, from the absolutely unqualified to domain specialists and experts. "Expertise" itself is almost certainly a misnomer as so much information has so short a half-life -- it's generalised problem-solving and learning which are most required. And the tasks and projects to which talent is applied can themselves be called into question. Take the 737 MAX development -- Boeing almost certainly would have been better off not trying to drag out the lifespan of that airframe, a decision which likely should have been made in the mid-1990s. Full costs (or benefits) of decisions cannot be known when they're made, and only manifest over years, decades, or even centuries (fossil fuel use).
Some coverage of that here:
The notion of "manifest vs. latent" properties or consequences is one I've been looking at. Some earlier work by Robert K. Merton and others.
"The market" rewards short-term, highly-apparent, risk-externalising, liquidity-generating behaviours. There are inherent distortions to all of this. Skills and competence (as well as skills development, mentoring, training, preservation, etc.) are poorly served.
There's also some interesting discussion relating to this in Alvin Toffler's Future Shock, from 1970, which I'm reading for the first time. Bits drag, but there is much that's prescient.
Financialisation and capital accumulation allow tremendous inequalities in allocation. These have always been present to some degree, but recent history (say, 1800 - present) seems exceptional. Note that arguably the Gilded Age personal fortunes were probably even more extreme than those of today.
Unless I'm missing something, the marginal cost of software publishing is very near zero. Development is ... more expensive.
The Internet's role is a whole 'nother discussion, but pulling in a notion of an ontology of technological mechanisms (see: https://ello.co/dredmorbius/post/klsjjjzzl9plqxz-ms8nww), it's a network, with network effects, as well as a system. These ... tend to define general parameters and considerations.
Where specific industries fall on the mediocristan / extremistan scale .... is an interesting question. Though I'd argue that systems/networks elements tend toward extremistan. Whether hardware is/isn't systems/network (as I'm using the terms) is another question.
For successful fundamental component manufacturers (e.g., Intel, AMD), hardware certainly resembles extremistan.
Just FYI: you might want to look into the works of Jacques Ellul on Technology and Society if you have not done so already.
PS: Also a good starting point: https://en.wikipedia.org/wiki/Technology_and_society
Your response to this is "But actually, according to payscale.com, an average embedded software engineer only makes X."
1. Nobody likes but-actually posts.
2. Quoting average salary when someone is commenting on how non-average a guy's skillset is isn't really relevant.
The OP said the skills are "mindbogglingly expensive". From personal experience they aren't (I worked in a team with embedded engineers with similar skills to those required for this).
I thought this was well understood - it's been discussed here a number of times.
I added the average salaries after getting voted down to -2 and that turned around the voting.
While there are a number of issues with average salaries I think it's notable that no one is claiming the opposite.
It's only hubris that makes us discount the importance and difficulty of, say, web client design and implementation.
How many developers on HN poo-poo "UX" yet don't even have a single fleshed out README on any of their products because it's "not important"?
Turns out raw low-level tech skills aren't the be-all of software and business.
A complete "Embedded System" (HW+SW) is orders of magnitude more complex than "Web Client Design and Implementation" (unless you approach Google/Facebook scale). The malleability of SW means you can quickly try out various techniques without too much "learning overhead". With all the existing Frameworks/Libraries/etc it becomes more of a "plug & play" proposition. Not so with Embedded Systems. You have to know EE/Electronics, programming interfaces to HW, Bootloader and OS kernel (if using one), Complex Toolchains/ICE/Simulators/Debuggers and finally; The Application. Each is a huge domain in itself and has a significant learning curve. To have all of them in "one head" is remarkable and non-trivial.
An "embedded system" can be as simple as "when this button is pressed for 3-5 seconds, turn on this light for 30 seconds." That hardly requires much of a skillset.
And as far as having all the things you mention in "one head," every member of my team can handle that easily and we're not particularly noteworthy!
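For what it's worth, that button/light example is roughly this much logic - a sketch with the hardware access factored out, so the function name and the millisecond-tick input are my own hypothetical choices, and it can even be unit-tested on a desktop:

    #include <stdbool.h>
    #include <stdint.h>

    /* Call once per loop with the current time in ms and the debounced
       button state; returns whether the light should currently be on.
       A press held for 3-5 s turns the light on for 30 s. */
    bool light_logic_step(uint32_t now_ms, bool pressed)
    {
        static uint32_t press_start_ms, light_off_ms;
        static bool was_pressed, light_on;

        if (pressed && !was_pressed)                  /* press began */
            press_start_ms = now_ms;

        if (!pressed && was_pressed) {                /* press ended */
            uint32_t held = now_ms - press_start_ms;
            if (held >= 3000 && held <= 5000) {       /* held 3-5 s  */
                light_on = true;
                light_off_ms = now_ms + 30000;        /* on for 30 s */
            }
        }

        if (light_on && (int32_t)(now_ms - light_off_ms) >= 0)
            light_on = false;                         /* 30 s elapsed */

        was_pressed = pressed;
        return light_on;
    }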
In my own experience moving from pure applications software development (though not as a web developer) to network appliances to lower-level embedded development, I have been amazed at all the "hidden complexity" you suddenly become exposed to. And this is just the software part. If you get into the hardware part, you have a whole other world of knowledge to learn. Merely doing something without understanding it is the difference between a "Technician" and an "Engineer/Scientist".