And once you think you've pushed the hardware to its limits, there are plenty of demos and examples around to blow your mind, watching what people have been able to do with these machines (often in obscure ways).
Indeed, a whole book was written about it: https://en.wikipedia.org/wiki/Racing_the_Beam
The 2600 is fascinating and fun to code for, but for asm newbies I'd recommend the C64 or NES...
The Gameboy Advance is a marvelous little platform. It runs a modified ARM7, so if you learn its machine language a lot of that knowledge will transfer into the industry. It runs at a brisk 16 MHz, which is fast enough that it can run compiled C code quite comfortably, but slow enough that performance still matters very much once you try to push the system's limits. Even if you program it in C rather than assembly (perhaps ideal for a true beginner), the whole machine is chock-full of bare-metal standbys. Most of the graphics and sound hardware is memory mapped, requiring very specific bit patterns written to special addresses to tell the rest of the hardware what to do. Working with graphics requires an understanding of interrupts and synchronization. Finally, being a games system with a screen, buttons, and a speaker built in, there are a lot of projects you could build... but the easiest is to make a game! And what a blast that system is to make games for. Powerful enough to do all sorts of compelling things, but simple enough to program (even as a beginner) that you can easily have backgrounds and sprites onscreen in about a day of effort.
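To make "memory mapped" concrete, here's a minimal C++ sketch (the addresses are the documented GBA hardware registers; the macro names are just the usual homebrew convention, nothing official):

    // Mode 3: 240x160 bitmap, 16-bit BGR555 pixels, drawn on background 2.
    #include <cstdint>

    #define REG_DISPCNT (*(volatile std::uint16_t*)0x04000000)  // display control register
    #define VRAM        ((volatile std::uint16_t*)0x06000000)   // start of video RAM

    int main() {
        REG_DISPCNT = 0x0003 | 0x0400;   // video mode 3, background 2 enabled
        VRAM[80 * 240 + 120] = 0x001F;   // one red pixel (BGR555), mid-screen
        for (;;) {}                      // bare metal: nowhere to return to
    }

Writing the right bit patterns to the right addresses is the entire "API"; there's no OS call anywhere in sight.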
While the processors are simple, making non-trivial programs is hard, because the machines as a whole have a lot of limitations, making programming very intricate compared to more modern systems (say, 16-bit 80x86, but I guess even the Amiga and so on).
If the goal is challenge for the sake of challenge, then nothing makes those machines special; I mean, one can code directly in machine code if that's the intention :-)
We had to simulate everything, and the final project was to take the professor's Basic program and execute it correctly, and compile his other program and then execute that correctly.
That class was the absolute best thing I ever did; I learned so much in it. It was the only class that I went to every day and took seriously, and I got the only A in the class.
I had started programming back in the mid-'70s, and by the time I got to college in '82 I had a bunch of side jobs writing code. I didn't focus too much on my college career; I was working my side programming gigs, and I was tutoring other students. That class got my attention and I loved it.
MIX was, in theory, really supposed to be the opposite: so universal/generic that you'd ignore the language and focus on the lesson. Similar to the motivation for using Scheme in SICP. Obviously times have changed :-).
Sort of the difference between putting on eyeglasses to see the world better (MIX) or putting on eyeglasses to learn about how lenses work.
Check out this assembly programming series; it takes less assembly code on the Amiga to get something on screen or to read mouse/keyboard input than it does with high-level languages and APIs/frameworks today:
> I think the opposite is true because those machines were built around assembly programming.
This isn't true. It was extremely simple to get something working based on that tutorial, and to move on to something more complex.
Doing that on an '80s 65xx-based machine would have been significantly more difficult, because that generation had significant limits in its coprocessors (AFAIK, the C64 could play music in games, but it was a workaround; it wasn't designed to do so).
Certainly the Amiga is easier, as its coprocessors were significantly more capable, but the parent based the discussion on C64 development.
> While the processors are simple, making non-trivial programs is hard, because the machines as a whole have a lot of limitations, making programming very intricate compared to more modern systems (say, 16-bit 80x86, but I guess even the Amiga and so on).
Programming in those times was an art. Now some people have 32 GB of RAM and are not able to use it efficiently.
> If the goal is challenge for the sake of challenge, then nothing makes those machines special; I mean, one can code directly in machine code if that's the intention :-)
I've seen OSes that, at 20 MHz with 4 MB of RAM, did things that Windows, Linux, or Mac OS X cannot do today with 1000 times more resources. It is really a shame.
Because that was all the processor was doing. No virtual memory, no disk, no IO, no graphics. Just feeding the DSP data on one side and NAT routing on the other.
Under the same line conditions, the PII running Windows was useless at 450 MHz. The mouse would barely respond, the keyboard was lagging, the screen wouldn't update.
Like you said, depending on what you are trying to achieve, you can perform near miracles on confined hardware if you have confined demands.
It's the same as learning on GW-BASIC or equivalent: You learn some bad habits due to the environment which you must unlearn the moment you move on to a real system.
The 6502 was a real system. Conversely, I'd say Visual Basic is a terrible environment and I have a few friends that made millions using it, so, not sure who gets to be the judge.
What blew my mind was using a modern C++ compiler to build optimized software for that old hardware. Here's an example with the Commodore 64.
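Something along these lines (a hedged sketch of the idea, assuming a 6502 back end such as llvm-mos; 0xD020 is the VIC-II border colour register on the C64):

    #include <cstdint>

    int main() {
        // The VIC-II border colour register, memory mapped at $D020.
        volatile std::uint8_t* const border =
            reinterpret_cast<volatile std::uint8_t*>(0xD020);
        for (std::uint8_t c = 0;; ++c)
            *border = c & 0x0F;   // cycle through all 16 border colours
    }

The compiler boils the abstractions away, and what's left is a handful of 6502 stores.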
Hence microcontrollers like the ESP32 are much more powerful than many realize.
When the CPU starts, it will start reading instructions from a hard-coded address on the memory bus / EPROM somewhere, right? How can I directly control these bytes?
I don't want some proprietary firmware sitting between me and the CPU.
If it's not possible on hardware because of "secure boot" or whatever, this should at least be possible in emulators like QEMU.
Does anyone know how to do that? ... or clear up my misconceptions? :)
A physical machine will still, despite everything, start executing at FFFF:0000 in "real mode", and the code run there will be found in a physical EEPROM. Some of these are socketed (although this is less common these days). So you can get in there and fiddle with the pre-boot code.
There is no way round the Management Engine, a source of distress to some. Oh, and you won't have any DRAM until you've run the DRAM training and turned that on, the early BIOS gets to use the cache as a scratchpad instead. See https://blog.asset-intertech.com/test_data_out/2014/11/memor...
If you like bare metal work with decent processing power ARM is probably the place to start.
Manpage currently claims, "QEMU uses the PC BIOS from the Seabios project and the Plex86/Bochs LGPL VGA BIOS." But it also looks like that's as easy to replace as passing `-bios` to qemu-system-
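For example (the firmware image name here is made up):

    qemu-system-x86_64 -bios myfirmware.bin

and QEMU maps your image at the top of the address space in place of SeaBIOS, so the very first instructions the virtual CPU fetches are yours.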
I learned assembly on TI's AM335x "Sitara" series and it was great, mostly because of the BeagleBone: it has onboard JTAG-over-USB, meaning you can single-step your way through your bare-metal code, set breakpoints, etc.
There is a bunch of peripheral setup, but it can be done from C.
You can't really start completely from scratch in an understandable way on Intel platforms, and it's iffy on ARM, because setting up the DRAM requires some insane magic, and it doesn't really help much even if you can see the source.
This simply isn't true - while UEFI firmwares do offer BIOS emulation, there's no "BIOS" underneath them on most modern boards.
As far as I am aware, UEFI is a replacement for BIOS.
No, you do.
A significant part of what that firmware does is initializing low-level hardware, like the system's memory controller. Replicating that is probably well beyond your abilities.
If you really want to understand and control a processor when it boots from nothing, you should look into an FPGA RISC-V development board.
You are generally correct in your assumption: once the CPU comes out of reset, it will reach for a particular memory address to begin execution. Some will directly begin execution from a fixed address. As a sibling comment from pjc50 mentions, on x86 the CPU will be in 16-bit real mode and begin fetching instructions from FFFF:0000. Other architectures work slightly differently: the Motorola 68k fetches the initial stack pointer from the first four bytes at 0x00000000 and the initial program counter from the next four, then jumps there to begin execution.
As you saw, the child of pjc50's comment explains how to pass your code directly to the beginning of the CPU's execution in QEMU. If you want to do this on actual metal, various classic chips of yore (Z80, 6502, 68k, etc.) and their documentation are relatively easy to get. A nice thing about those older CPUs is that their memory interfaces are rather simple compared to today's. You can wire up a very basic system with the CPU, an EPROM, an SRAM chip, and maybe a couple of other chips as glue logic, all on a basic breadboard. And then you really can control those first bytes of executed code, on actual metal.
In the embedded world, "bare metal" means you control the first byte executed by the CPU when it comes out of reset, and aren't using any kind of operating system or proxy loader in between. But it gets kinda fuzzy, because an RTOS is still "bare metal" and you have full access to the source code.
I have still yet to really need an RTOS, even for BLE.
I am very glad that in my bachelor program our microcontrollers class actually made us hand-enter hex codes on a kit. It got tedious after a while (maybe it should have been for only a few weeks, not the whole semester), but it gave me a weird sense of being one with the machine. And it is an awesome icebreaker when talking with older programmers; for some of them I am the only one of my age they have met who has ever done this. Another thing is it helped me sort of see the flow of it and encouraged optimization.
(I don't want to give too much credit to my college; they did it not as some great pedagogical trick but to save money, and out of laziness.)
That is to say "no layers" or "right on the substrate".
Just a guess though.
Ok, I feel very pedantic saying this, but the substrate is the thing that is mostly semiconductors :P.
But don't take this so seriously, I work in semiconductor manufacturing and even I don't take it that seriously.
All possible abstraction removed = bare metal
On x86/x64, the chips long ago migrated to an internal RISC-like architecture, with a machine-code => micro-op translation step in the decode unit.
Yes, assembly is a (mostly fairly thin, on non-microcoded hardware) abstraction over machine code.
Unfortunately we now have a culture that views this type of knowledge acquisition as "gatekeeping"
I'm in the opposite boat. I've learned so much about the theory of computing that I almost can't program anymore because most of what we do today feels like a waste of time to me. It's all convention and application now, with so many barriers to entry that I feel like 95% of what I do on any given day is setup. The dreams I had for how computing might evolve and lead us to the endgame of AGI feel more distant to me now than ever before. It will likely happen through the biggest players, using whatever proprietary tech they come up with, and leave garage hackers in the dust. I don't have a good feeling about whatever artificial agents arise from that.
So there is a lot of survivor bias in programming today. I feel like Obi-Wan Kenobi, beat down by the industry, marooned on some distant planet. Meanwhile the youth subscribe to empire quickly, because all they see is glorious rewards. Seeing haggard old graybeards like me fall from such early potential makes them rightfully skeptical of the gatekeeping you describe, the adherence to the old religion of computer science.
Or I'm just full of it. I don't even know anymore. I wish I was part of something bigger again.
I keep coming back in my thoughts to an old carpenter who was making a truly wonderful kitchen in a strange corner of an old building. None of the walls, ceiling, or floor around it were straight, and it had tons of weird niches. I asked him how he could attack such a problem with such confidence; I would have spent days pulling my hair out just making a drawing. He said carpentry is roughly 300 methods, of which you only need 120 to 140 to do any job. The rest are just tricks that you don't really need, but they are impressive to those who know the problem.
I keep thinking of that in a programming context for some reason. Nowadays you just order a plug-and-play kitchen that fits exactly; a novice can IKEA it into place, everything works, and it looks fantastic. Programming will get there one day. Until it does, it will just look really weird to the old carpenter. So you grind the wood down to particle board, you glue plastic on it that looks like wood, then it gets moist and you replace the entire kitchen? .....!
A few years into my dev career, I adapted to make maintainable stuff. Because I learned that in 6 months I'd have to fix my own bugs.
Now it seems most code is throwaway, one-off, write only.
I haven't been able to "let it go". I still obsess over making my code correct. No one else seems to care. They get rewarded for fixing their own bugs ("velocity!") which I mostly avoid. So my KPIs look terrible by comparison.
I still can't figure out how programming computers (the most complex task a human can undertake) managed to become singled out as the only profession in the history of humanity that is simultaneously assumed by so many to be something you need only a cursory understanding of to be proficient at.
Now, this is not to say that at its frontiers it's not very complicated. But so is every other field. Ever thought about plumbing a space station? Or designing a rapid-deploy bridge? You can't compare the frontiers of one field with the middle of another.
Both fields have strict educational and credential requirements to practice professionally. Both are strictly - dare I say it - gate-kept.
While the exam can be avoided if one isn't into signing contracts and taking legal responsibility for project execution, the order still has the final word on which universities are allowed to claim that they grant engineering titles in Informatics.
You do need a fair amount of education, though - for every meaningful profession _except_ programming.
This can be traced to the 1970s.
Computer Science is only half of a field. Semi-arbitrarily splitting it off from EE harmed both fields. Instead of one complex field, there are two very shallow fields.
> I still can't figure out how programming computers (the most complex task a human can undertake)
Neuroscience, making pizza (or most cooking, really), marketing, high-speed motorsport, psychology, most weapons development (outside of guns, which are fundamentally simple), writing mass-market books and drug development all seem to have programming beat in terms of complexity for what ninety-nine percent of professional programmers do.
Jobs worded it well in an interview at one point. Something along the lines of, "I knew there was a market for people who would never be able to design hardware or put a kit together but who still would love to write their own software," in the context of why the Apple II was successful. It works just as well to show why the field is the way it is.
Most computer programmers don't have a clue how the hardware works. That used to be an essential part of it. It's not any longer. The bar has gotten lower and lower, and that's not necessarily a bad thing. Python can be learned in an hour, so why not? They can still make useful things, so there's no problem with it.
Just like in any other field, the bar for "proficient" is low compared to average, but the bar for "exceptional" is high.
The bar has multiplied into multiple bars, each at its own level and in a different dimension.
You can have an innate understanding of everything from a single transistor, to how individual frames are handled by an Ethernet PHY, to how the packet scheduler will interact with your userspace app, and be an "exceptional" developer.
But the moment someone asks you to write an Android app that hits an endpoint and displays the data with some formatting, none of that matters if you've never written an Android app.
Nothing is exceptional in a vacuum, and "programming" is so vast with so much space between disciplines it might as well be a vacuum.
Writing an Android application is not something that requires anyone to learn an entirely new skillset. At most, it's a programming language and a toolkit of difference for any experienced programmer.
How is learning a new programming language and new toolkit not learning a new skillset???
Even in a literal sense of the two words:
Skill: "a particular ability."
Tool: "a piece of software that carries out a particular function, typically creating or modifying another program."
How is learning "a piece of software that carries out a particular function" not "a particular ability."
In a non-literal sense, learning to write an Android App is a new skill if you didn't know how to do it before. Like you could happen to have Java experience so it's less new to you, but either you knew how to do it before, or you didn't and now you do... so you learned how to do it.
Are we literally at the point of gatekeeping what it means to learn how to do something??
Riding a Mongoose bike is not substantially different from riding a Schwinn bike. Riding a pink bike is not substantially different from riding a green bike. Writing Java is not substantially different from writing any ALGOL-derivative.
It's like saying writing a program for FreeBSD requires a different skillset than doing so for NetBSD. It doesn't.
Not everything is gatekeeping, and it's disingenuous to claim so.
But riding a unicycle is somewhat different than riding a bicycle, even though they look a bit similar and even share some of the same features. You’re drastically underestimating the amount of time it takes to write a new language proficiently: perhaps you’re confusing the ability to read a language with the ability to write it?
You literally reduced language design to changing colors on a bicycle.
Are you joking?
You have no idea what Android is if you think its development process vs Java (which Java? Embedded Java? Desktop with JavaFX? Server side?) is like FreeBSD vs NetBSD.
Maybe more like FreeBSD vs Windows and you’re trying to write a UI application, but technically you can use C on both platforms so it’s the same right?
Thanks for the chuckle in these dreary times...
By the way, do you actually think syntactic differences are all that separate languages, so once you know the general syntax you pretty much know the language, or are you pretending not to know how programming in multiple languages actually works to prove a point?
A: I've written a couple.
B: Anyone who knows any ALGOL (exception: ALGOL 68), Pascal, Oberon, and so forth, will get Java in minutes.
You're demonstrating reasoning flaws, alongside misinterpreting my words. Writing an application for, say, NeXT, back in the day, isn't different at all from writing a modern Mac application. What you do carries over. It's the same basic steps every time. Write a desktop Linux application, write a desktop Windows application, write a Mac application, write an iOS application, write an Android application. You'll have to use a few different wrappers or libraries, but it's not a new skillset.
You claimed that learning a new language is giving yourself a new skillset. It's not with the majority of languages, especially languages like Java, which introduce little compared to their immediate predecessors outside of syntax changes.
This is a frankly ridiculous comment. ALGOL-derivatives grab more from ALGOL than syntax, and some don't borrow syntax at all. Any Pascal programmer can go from writing Pascal to Oberon to Java to ALGOL to Go with ten minutes per language, in any order, despite the differences in syntax. The languages are not differentiated strongly enough to matter; that C programmers could go to writing Java in the span of a day was a major selling point that Sun used, and C and Java are more different (though again, not that different) than any of the previously-listed languages.
Any J programmer can go from J to APL to K to Nial trivially as well, despite vast differences in syntax. Knowing one or the other doesn't mean you have a differing skillset.
The same is true for most Lisps (Connection Machine Lisp being an exception, as a counter-example; despite that, knowing any of these isn't a new skillset).
Just because something requires you to do something slightly and superficially different from what you were already doing doesn't mean it's magically a new skillset. Defining finding new libraries as "new skillsets" is just silly, and erodes the meaning of the term.
I don’t know if you actually believe this, or you’re defining "write" as in literally typing letters that compile, instead of being able to write useful, productive code in each.
The rest of your comment is more wtfs kind of like that one.
This is not a productive use of my time because either you have no idea what you’re talking about, or you do but you’re intentionally throwing basic reasoning skills straight out the window and leaning heavily into playing games with semantics to support your point at all costs.
Now I’ll be charitable and assume the latter, but if that’s your goal, then what more is there to say?
Yes, Android vs Java is a pink bicycle vs a green one. Pat yourself on the back for that revelation.
Bootcamps demonstrate that not knowing assembly is not a barrier to earning a decent software engineer salary.
EDIT: It was an 8085 kit, not 8086.
Also, I think TIS-100 from Zachtronics (the assembly language game you never asked for!) made assembly-style programming less intimidating.
    org 100h                 ; DOS .COM file, loaded at offset 100h
    mov dx, msg              ; DS:DX -> '$'-terminated string
    mov ah, 9                ; DOS "print string" service
    int 21h
    mov ax, 4c00h            ; DOS "exit", return code 0
    int 21h
    msg: db "Hello, World$"
Edit: Geezus. It's just an example of how accessible it was to get something running in assembly language, compared to all the QEMU and UEFI stuff in the article.
A BIOS CALL. :)
Still not bare metal.
(I'm totally gatekeeping for laughs: write your own BIOS you noob!)
You can still use INT 10 for BIOS video services.
Or just write into the text framebuffer directly; it's at B800:0000 in colour modes and B000:0000 in monochrome modes (it's surprising how much I still remember from when I exclusively did x86 PC Asm, ~3 decades ago.)
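For the flat-address-space version of that trick: B800:0000 is physical 0xB8000, and each character cell is a 16-bit word of attribute and character. A minimal C++ sketch, assuming a bootloader has already put the CPU in protected mode and jumps to a hypothetical kmain:

    // Print "HI" in the top-left corner of the 80x25 colour text screen.
    extern "C" void kmain() {
        volatile unsigned short* const vga =
            reinterpret_cast<volatile unsigned short*>(0xB8000);
        vga[0] = (0x0F << 8) | 'H';   // attribute 0x0F = white on black
        vga[1] = (0x0F << 8) | 'I';
        for (;;) {}
    }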
It looked like this 1200-page book, but this isn't it (also a book I used):
I also had the Apple ][+ ROM listing in one of those spiral-bound books from Apple circa 1980, but that too has been lost, sadly.
https://sites.google.com/site/pcdosretro/ibmpcbios has the source code for various models typed up.
I'm not sure it would work if you stuck this in a boot sector.
Why do I even remember this stuff? I haven't used any of this since the early 90s! Sheesh!
IIRC, BIOS was 10 and 13.
I have a copy I'd sell, although you can find it for free, legally, online. It's about a thousand pages...
Back when I got started with computers it was 1982, and the available programming languages were BASIC and Z80 assembly language. A lot of people growing up around that time did assembly language coding, because they needed the speed, or because (like me) they wanted to hack games for either infinite lives, or to ease copying.
Of course assembly language is still used in some areas; it used to be necessary for optimization far more than it is today, due to evolving general-purpose hardware. But it is still useful for embedded and microcontroller programming.
These days I don't often touch assembly because I need to; rather, I do it for fun, because it reminds me of simpler times and fewer abstractions. E.g., a recent project:
(Of course these days, with microcode and complex processors, it's not really bare metal anymore. Not like it used to be. But try telling kids today that, and they won't believe you!)
I think it's deeper than that. I started coding professionally just before the internet took off. Before that I didn't have much interest in computer networks and seemingly overnight I was dealing with a huge new vocabulary of unfamiliar terminology like "port", "peer", "client/server", "protocol", "router", "bridge", "firewall". This wasn't just academic arcana either: whenever I had a problem, the network types would start asking me to check to see if the "firewall" was blocking my "client" from accessing the "port". I found myself able to muddle through by following their instructions, but I felt like a foreigner in a foreign land whenever anything didn't work the way it was "supposed to".
Web programming documentation wasn't helpful in navigating this territory; it assumed you either knew all this stuff or somebody else was taking care of it for you. One day I decided to start from "the beginning" and read TCP/IP Illustrated, and within a few chapters it all came together; I found myself able to troubleshoot problems that sometimes even the networking "experts" couldn't resolve.
Learning and understanding assembler is like that: once you understand EXACTLY how a computer works, there's no mystery in whether you coded it wrong or you didn't understand what to expect.
Because there are some engineers, perhaps at ARM, Apple, Microsoft, Intel, or Google, who deal with compilers or operating systems and the UEFI boot process of devices that you're probably currently using to build software or even reply to this thread, and who still take an interest in reading blog posts like this.
> I don't see pursuit of this topic happening in my anecdata of programmers I interact with in day to day life.
Maybe not for you, but it's the reason why you are able to post your message here and browse the web, or build software faster, thanks to the engineers who did this sort of programming. Sure, general end users shouldn't care, but it gives me confidence that there are some engineers out there who understand some OS internals to 'make' things happen, even at FAANMG companies, rather than doing generic web apps all day long.
P.S. And if a low-level programmer goes into web, stuff like this happens :-)
Oh right, not everyone is in my field.
> Seeing a program you wrote running directly on the bare metal is deeply satisfying to anyone who enjoys writing software.
This is it. I have been on this site for over a decade now, and most of the obscure stuff is popular here because it satisfies this criterion for some people. It's actually different for different people: for some it's bare-metal programming, for some it's obscure languages like K, and for yet others it's certain libraries.
Maybe you don't do any of this stuff, but there are plenty of engineers that do this daily.
What kind of dev are you, and what kind of dev do you usually interact with?