It may be impossible to know everything, just as one cannot claim to know "all" of history, but one can make a bloody good go of it.
The main problem I've encountered personally is simply limits on human memory. Things fall out of your brain.
I studied semiconductors, electromagnetism, quantum mechanics, etc. as part of my Physics degree. I've written emulators, webapps, backend code, etc.
There are gaps: not conceptual gaps, but gaps in implementation.
Could I write a top to bottom tutorial? Not without research, and undoubtedly subject matter experts would pick at it, but I believe I have some understanding about every level of the stack and an idea of where to pick that knowledge back up if I need to.
Is it useful?
I mean, surely that's for an individual to decide. Financially perhaps you'll be better off learning the hot new framework of the day and hobnobbing with the cool kids.
But then you could also get rich making wanky adtech optimisation. Or just starting a run-of-the-mill exploity corp.
Me? I just want to know how it works, in endless fractal beauty.
I don't live my life based on what others consider to be "useful".
Not just in programming, either.
Not everyone may want to know all the gory details, and nobody wants to be the one called on to explain them, but it can be, and should be, done as a matter of course.
The issue is that the industry will kick and scream to keep from being held accountable for giving the customer a way to parse that information, and no one wants to hire people to document, because documentation isn't seen as positive value creation in today's corporate climate. It won't ever happen short of a legal requirement to do so, and that legal requirement will be fought tooth and nail in the name of protecting trade secrets.
Nothing keeps a device from having its actions fully elucidated except an unwillingness to abolish information asymmetry.
Combine that four year degree with a computer science degree (so eight years total), and I think you'd have a pretty in-depth overview of the principles of every part of the stack from React down to silicon, boron and phosphorus, albeit not all the implementation details.
One can pretty easily understand each of the sub-systems in isolation (well, it takes more than a couple of years!), but complex systems taken in total are more than any one mind can reason about. Too much for one brain to contain the dependency graph.
I'll never forget the day that my chemistry professor explained how a transistor worked... no quantum mechanics required.
Sure, I'd have to look some details up, but if I can learn it, so can you.
Not only is it possible, but it is important to demand.
NAND to Tetris (www.nand2tetris.org) seems like a pretty good take on the concept.
Another one is Wirth's Oberon system, which runs on his RISC architecture, simple enough for students to implement in Verilog (or another HDL such as VHDL or Wirth's Lola) and synthesize onto an FPGA.
MIPS is also simple enough to implement in Verilog over the course of an academic term and also has good tool support for languages like C and C++.
Personally I find understanding low-level system behavior to be interesting, fun, and helpful for understanding many irritating application behaviors (often errors and slowdowns) which are caused by interactions with the OS and hardware.
And for a lot of projects today, "the bottom" ends with C for precisely this reason - while the fact that C is the chosen language for this task is completely arbitrary, as arbitrary as the status quo hardware paradigm of "x86 on desktops, ARM on mobile". I could imagine a universe where it was an extended form of Basic or Pascal that ruled the world instead.
Likewise, I was also looking into extending Lua with a small DSL interpreter that could accelerate certain byte-level tasks: copying around blocks of bytes, running some common algorithms quickly, defining spaces for variables, applying some parameters. At first I thought of this language as a "VLIW" assembly language with a particular focus on having a big bag of tricks, but when I compared my semantics with actual hardware assemblers, I found that having no registers or stack manipulation, only direct memory addresses, changed its character so much as to make it a different beast, one more like the "autocoders" of the 1950s: not quite ALGOL, and yet clearly headed in that direction.
As far as operating systems go, Rust does seem to be a better entry point, given Redox OS and the general community around it.
A corollary reason is to improve your "mechanical sympathy" when writing high-performance code.
Many of those we once called "application devs" -- now more commonly "full-stack devs" -- will never have to develop real mechanical sympathy, which is fine. But there is a glass floor for such people.
There is one main reason to learn assembly: things WILL eventually go wrong, to the point that you need to debug at the machine level in order to understand what's going on.
Also, many debug tools such as ptrace are primarily accessed through C, and require a basic understanding of the assembler view of the machine.
I would also argue that C is pretty essential for anyone who wants to use eBPF/IOvisor.
I would point out that the vast bulk of software in the entire world is vertical-market business software. Vertical market meaning for a limited kind of user, versus horizontal software such as Word or Excel that everyone might use. Vertical-market software is literally everywhere and largely invisible. Your public library has custom software. Your hospital. Ever go to get a blood draw and notice the operator is using specialized software? The place that changes the oil in your car. A lawyer's office. Cabinetry makers have custom specialized software. Cities use utility billing software, which keeps track of gas meters (models, serial numbers), whether and where they're installed, tracks meter usage, and generates bills. Rental car companies and airlines have custom software. Hotels have custom software. Restaurant software may have a map of the tables and what "state" they are in, with a list of customers in line. Some of these categories, like city, healthcare, and school-district software, are entire industries unto themselves, with huge software companies that build software just for those specific industries.
Much of this software is now web based, because web based means centrally controlled: zero install at the workstation and zero maintenance across all the workstations. All you need is an OS and a browser. (Hint: Chromebooks in some cases, and iPads, etc.)
Is it any surprise that languages like Java have been the top languages year after year? It's where the jobs are.
It's not the same thing as microcontrollers. Microcontrollers are in lots of things around us, but they get built once and replicated millions of times. Vertical-market business software has thousands of specialized categories, all different, with ever-changing regulatory requirements (federal and state), reporting requirements, etc. An accounting system with a large, complex payroll module probably pays your paycheck. And is integrated with a human resources system. And benefits system.
All this software is written at a higher level of abstraction, some distance from the hardware: transactions, currency amounts and conversion, databases. Yes, it may sound boring, but it's what makes the world go around, and it's mostly invisible.
It's also very stable. Like decades of employment stable.
Never seen this.
I learned it out of curiosity.