One example to illustrate how simple the Game Boy is: There is no operating system at all, only a 256 byte boot ROM whose sole purpose is to display the Nintendo logo from the cartridge and halt if it does not checksum correctly (for both cartridge integrity and legal reasons).
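The header checksum the boot ROM verifies is documented in Pan Docs; a sketch of the documented algorithm in Python (the byte ranges are the standard cartridge header layout):

```python
def header_checksum(rom: bytes) -> int:
    """Game Boy cartridge header checksum over bytes 0x134-0x14C,
    as documented in Pan Docs. The boot ROM locks up unless this
    matches the byte stored in the header at 0x14D."""
    x = 0
    for i in range(0x134, 0x14D):
        x = (x - rom[i] - 1) & 0xFF
    return x
```

(The logo check is separate: the boot ROM compares the 48 logo bytes at 0x104-0x133 against its own internal copy.)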
This entertaining 33c3 talk manages to describe the Game Boy hardware practically in its entirety (and with some previously unknown details, actually): https://www.youtube.com/watch?v=HyzD8pNlpwI
You find quickly that it is full of excruciatingly complex edge cases that aren't documented, and aren't well understood by anyone.
We're still making significant progress on it to this day.
And even in the early stages, the Z80-like Sharp LR35902 is quite a hassle. If you want an educational project, consider an NES emulator instead. Still has lots of edge cases, but the documentation and reverse engineering work is far ahead of the Game Boy.
Some reading if you like:
Once you want to do "serious" emulation that runs a wide variety of games and reproduces them accurately in every aspect, things become much less simple. It might also lead you to completely abandon initial approaches, e.g. because your straightforward but non-cycle-accurate CPU main loop doesn't cut it anymore.
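To make that concrete, here is a minimal sketch (with hypothetical stub CPU/PPU objects, not anyone's real implementation) of the instruction-stepped loop that eventually stops cutting it:

```python
class StubCpu:
    """Hypothetical CPU stub: every instruction happens to take 4 cycles."""
    def step(self) -> int:
        # Fetch/decode/execute one whole instruction; return cycles consumed.
        return 4

class StubPpu:
    """Hypothetical PPU stub that just counts the cycles it is ticked."""
    def __init__(self):
        self.cycles = 0
    def tick(self):
        self.cycles += 1

def step_instruction_level(cpu, ppu) -> int:
    """Instruction-stepped main loop: run one whole instruction, then
    tick the PPU by however many cycles it took. Simple, but every
    memory access within the instruction lands at the wrong moment,
    which is exactly what breaks cycle-sensitive games."""
    cycles = cpu.step()
    for _ in range(cycles):
        ppu.tick()
    return cycles
```

The cycle-accurate alternative interleaves the PPU tick inside the CPU's own memory accesses, which usually means restructuring the CPU core entirely, hence the abandoned first approaches.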
Compare this, however, with a more modern system that has an MMU, where you have a long, long road of busywork ahead before you can even get past any reasonable definition of "booting". For the Game Boy, even a shoddy first-draft implementation might at least get you into a game's initial menu screen, complete with working buttons and all.
And strictly speaking, a conceptually more complex system will have more edge cases and implementation details to deal with, although I think it often also means that those implementation details are less exploited, as developers stay at a higher level and don't try as hard to squeeze out every last bit of the "simplistic" hardware.
Your links are a good example, I love reading about the complex implementation details in emulators of superficially simple hardware.
...to the point where there are transistor-level simulations of the CPU and PPU:
The only other game console I'm aware of a public transistor-level simulation for is the much more limited Atari 2600:
The NES's PPU is simple in operation, but many cartridge types (MMC3 and MMC5 in particular) relied on the precise behavior of the PPU's reads and writes to time some of their extra features, often something like a scanline counter. This makes the complete system much more difficult to emulate, even though the base features of the console are straightforward to grasp.
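A rough sketch of how an MMC3-style counter works: it watches PPU address line A12, which rises predictably once per scanline during rendering because of how pattern-table fetches are laid out. (Hypothetical interface; real hardware adds A12 edge filtering and has revision quirks this sketch ignores.)

```python
class MMC3IrqCounter:
    """Sketch of an MMC3-style scanline counter, clocked by rising
    edges on PPU address line A12. Hypothetical interface; real MMC3
    chips filter rapid A12 toggles and differ slightly by revision."""
    def __init__(self):
        self.counter = 0
        self.reload_value = 0
        self.reload_pending = False
        self.irq_enabled = False
        self.irq_line = False
        self.prev_a12 = 0

    def clock_a12(self, a12: int):
        # Count only on a rising edge of A12 (roughly once per scanline).
        if a12 and not self.prev_a12:
            if self.counter == 0 or self.reload_pending:
                self.counter = self.reload_value
                self.reload_pending = False
            else:
                self.counter -= 1
            if self.counter == 0 and self.irq_enabled:
                self.irq_line = True
        self.prev_a12 = a12
```

This is why the cartridge side cares about exact PPU fetch timing: get the A12 pattern wrong and every MMC3 game's raster effects break.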
What makes the NES such an attractive target for a first emulator is that, due to this quirk and the system's popularity, its hardware is just about perfectly understood. Take a look at the Nesdev Wiki's article on rendering, which has an extremely thorough breakdown of the timing, including the bugs in the sprite rendering subsystem and how to correctly emulate them:
At the very least, you might be able to badly emulate most Game Boy games, whereas a mapperless NES emulator will only even be able to attempt the first generation of games for the console.
The most advanced mapper is the MMC5; that one can take some doing. And the VRC7's OPLL-derived FM audio is also a nightmare (yet it's used in only one game). But beyond that, it's all really simple, usually around 10-30 lines of code per mapper. It's mostly very well documented, with the exception of how mappers detect scanlines (important for accurate emulation, not so much for full compatibility; you can just cheat and give them the PPU scanline counter).
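For a sense of scale, a simple discrete-logic mapper like UxROM really is just a handful of lines. A sketch (function name and signature are mine, not from any particular emulator):

```python
def unrom_read(rom: bytes, bank: int, addr: int) -> int:
    """UxROM-style PRG banking: $8000-$BFFF is the switchable 16 KiB
    bank, $C000-$FFFF is fixed to the last bank. On real hardware,
    writes anywhere in $8000-$FFFF select `bank`."""
    if addr < 0xC000:
        return rom[bank * 0x4000 + (addr - 0x8000)]
    last = len(rom) // 0x4000 - 1
    return rom[last * 0x4000 + (addr - 0xC000)]
```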
But, maybe I'm biased. I've emulated thirteen systems so far, and the NES was one of my favorites to work on due to the documentation and general simplicity. I know the rabbit hole goes on forever even with the NES, though.
But really, if you've had a computer architectures class (or do equivalent research on your own), it's doable to just start looking up info about a specific system, and digging down as necessary.
The NES is a different beast; nametables _alone_ will give an early NES author headaches, and they're incredibly important to get right, otherwise games won't be able to scroll properly. Even simpler games like Mario and Metroid rely on basic nametable support being there. In this case I think it's the limitations of the NES itself that led to the mappers doing so much more externally; game authors had no other choice if they wanted to pull off advanced tricks. This trend continued with the Super Nintendo; remember the SuperFX chip? It almost turns the Super Nintendo into a dumb terminal at that point, generating entire screens worth of graphics in the gamepak directly.
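The nametable headaches mostly come down to mirroring: mapping four logical 1 KiB tables onto the console's 2 KiB of internal VRAM. A sketch of the address fold (my function name, simplified to the two common hardwired modes):

```python
def mirror_nametable(addr: int, vertical: bool) -> int:
    """Map a PPU address in $2000-$2FFF onto 2 KiB of internal VRAM.
    Vertical mirroring pairs $2000/$2800 and $2400/$2C00 (used for
    horizontal scrolling); horizontal mirroring pairs $2000/$2400 and
    $2800/$2C00 (used for vertical scrolling)."""
    offset = addr & 0x0FFF       # position within the 4 KiB nametable space
    table = offset // 0x400      # which of the four logical tables
    if vertical:
        return (table & 1) * 0x400 + (offset & 0x3FF)
    return (table >> 1) * 0x400 + (offset & 0x3FF)
```

Cartridges can also override this (four-screen VRAM, mapper-controlled mirroring), which is part of why the mappers ended up doing so much.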
Agreed. The Game Boy's ability to generate interrupts at various times in the scanline, and access to the current scanline number, make timing things with vanilla hardware much easier on the Game Boy than the NES.
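Concretely, a Game Boy game sets LYC to the line it cares about and takes a STAT interrupt when LY reaches it. A sketch (simplified to the LYC source only; the real STAT register has three more interrupt sources):

```python
def advance_scanline(ly: int, lyc: int, stat_enable_lyc: bool):
    """Sketch of the Game Boy's per-scanline LYC compare. LY counts
    0..153 (144 visible lines plus 10 lines of VBlank); when it
    matches LYC and the STAT LYC source is enabled, a STAT interrupt
    is requested. Games use this to time mid-frame effects without
    counting cycles themselves."""
    ly = (ly + 1) % 154
    irq = stat_enable_lyc and ly == lyc
    return ly, irq
```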
If it's that then my hat goes off to the devs who find it, timing issues are second to none in difficulty.
I just wanted to write that I'm really happy this post showed up on the HN front page. It's been more than a year since I wrote the post, and looking back I think it was the most rewarding project I've ever done. It doesn't matter that there are plenty of more mature, accurate, user-friendly emulators: the "journey" was a reward in itself. If you're thinking about creating your own emulator, I highly recommend it.
I'm going into the Game Boy Advance now. Even just efficiently decoding ARM opcodes is an adventure on its own. There are different timings and bus widths in different parts of the memory map... but the map is large enough that you don't have to worry about banking at all. Although I suspect it's fast enough that it might start making sense to look at dynarec/JIT techniques.
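One common decoding trick (an assumption about this particular project, but widely used in GBA emulators) is to index a 4096-entry handler table by twelve instruction bits:

```python
def arm_dispatch_index(instr: int) -> int:
    """Build a 12-bit index from bits 27-20 and 7-4 of a 32-bit ARM
    instruction. Those bits are enough to distinguish most ARM
    encodings, so the fetch loop becomes a single table lookup
    instead of a long if/else chain."""
    return ((instr >> 16) & 0xFF0) | ((instr >> 4) & 0xF)
```

For example, 0xE0800000 (ADD r0, r0, r0) has 0x08 in bits 27-20 and 0x0 in bits 7-4, so it lands at index 0x080.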
I'm reminded of the days I was writing a Z-machine emulator in Haskell, ostensibly as a uni class project but also as a way to teach myself Haskell and have fun; I had the infrastructure in place and was trying to run Zork, implementing opcodes, one by one, when finally I saw the familiar message "You are standing in an open field west of a white house, with a boarded front door. There is a small mailbox here." The sense of accomplishment was rewarding and elating.
2004. Good times.
Ahh, memories of the good old days... before I got married...
"[...] if I have 5 minutes of a spare time, I was trying to fix this weird GPU or sound bug. I guess it’s not too healthy (especially if you have a family) [...]"
Oh, crap, this guy apparently has family, too! Shame on me! :-)
Just wait until you have a baby... ;)
I think I'd probably finish a lot more projects if I wasn't a professional software developer -- I don't want to work in my free time as well.
What has helped me is to get better about goal setting. For me this means having some large goal that is easily associated with personal desire. Then decompose that into smaller goals. These smaller ones are one step removed from the base desire, and as such need a little effort to figure out how to attach the goal to desire.
I keep breaking down the goals until they are a size that I can accomplish in a day. Often that's too small a timeline but that's what I aim for.
At each step of the break down, I make an effort to attach that goal to personal desire. Whether the desire is related to the overarching desire for the top goal doesn't matter so much, but it does have to have a desire component.
Sometimes the desire attached to a goal is to end the day feeling like I have made progress. That's it. The whole desire for that goal is in the achievement.
And frankly, achievement is a huge motivator for me. If I feel like I am spinning my wheels, it makes me want to stop and do something that isn't wasting my time.
Breaking the goals down into smaller goals that work on shorter timelines allows you to get that dopamine hit from achievement. You can end the day feeling pleased and relaxed. And you can wake up excited that you have an achievable goal to work on for that day.
Simple life hack: Next time you're planning on starting a project like that, declare that the goal in advance.
If you're finishing projects at work, you know you could finish, if you wanted to, but whether you "finish" a project should be subject to cost/benefits analysis just like anything else. Your $COOL_PROJECT isn't a failure if it doesn't attain 15,000 stars on GitHub, unless you chose to make that a goal.
There are so many things I would do before I'd do something unrewarding.
I've also found that how intellectually rewarding my day job is (and how much agency I have) has a direct effect on my hobby development. I have a lot of creative freedom and interesting projects at work right now, so I care less about coding in my spare time. I only have so much free intellectual energy to devote to something.
But somehow I have successfully done that for my project https://github.com/joyread/server when I launched v1.0 and it is indeed rewarding.
One thing that sometimes helps is starting a couple projects in parallel. That way, if you get tired of one, you can switch to the other. Eventually one might take hold, and the other gets abandoned.
I have only "finished" 1 project enough for re-use (publishing) when I didn't have an audience. The docs, examples, build, etc. took as long as the actual code.
When I'm doing things for myself, thinking about solving it and then actually solving it is like doing it twice, and I'm already bored. Once I prove (or disprove) my notion, hunch, whatever, I generally move on to the next shiny object.
Suboptimal, I know.
I'm terrible at finishing personal projects. For paying projects it's much easier ($$$).
For work there is a deadline and the compromise to meet expectations. And being paid, of course.
Hobby projects creep along at a snail's pace through the years.
I look at some repo that is a mess, but I remember making widget X work and how great that felt.
When someone says they created something I always think about what they must have learned to do that something, and not so much about the actual something.
I get a bit frustrated when folks critique something about "Yeah but this other thing" and so forth. For me that's almost never the point...
I'm also mystified that there doesn't seem to be a .SAV format that's cross-compatible between emulators, which is required for basically the entire purpose of the app. (Users would work on a project online on any device, then transfer the .SAV to real hardware for recording the tune.)
If anyone has advice on these topics I'd love to talk!
Maybe I should take on the task myself like the OP did...
I can empathize; I think lots of us have GH profiles littered with projects we've hit real finish lines on but that may not be useful to others. But I actually think it's healthy, especially if you have a family (assuming it's only your spare time). It provides sanity and brain maintenance simultaneously. Even better if you have an employer that doesn't track your every hour and recognizes the value of these kinds of things on down time.
> Right now I don’t have any obvious ways to move the app forward, so the addiction dissipated ;)
Run through https://github.com/konsoletyper/teavm to WASM and abstract graphics calls and what not :-)
I agree with the post that it was a very rewarding (if frustrating) project. And once you see that Nintendo logo come down the screen for the first time, it's almost like magic.
It starts out with just wires, switches, and relays, and as the book progresses he goes through the process of building up a simple CPU and RAM one step at a time. The book even walks you through coming up with opcodes and assembly language.
Even if you already know this stuff, I found the book was helpful in developing an intuitive feel for how everything works and fits together.
After reading it, you'd probably have a good mental model of how you'd want to approach writing an emulator.
The Nand to Tetris courses and their accompanying textbook would probably be helpful here too.
Once you start on the emulator, the first thing you'll do is emulate the CPU, which can be done as a big switch statement inside of a loop. Knowing how to code each instruction is easy when you have an instruction listing handy.
Here's a quick example of how I would write an emulator, with 2 opcodes already implemented (6502 assembly): https://pastebin.com/raw/PhCEqh35
And the instruction listing I used: http://obelisk.me.uk/6502/reference.html#AND
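In the same spirit as that pastebin, here's what the switch-in-a-loop shape looks like with just two opcodes implemented (LDA and AND immediate; the memory layout and function shape are my own sketch, not the pastebin's):

```python
def run(memory: bytearray, steps: int) -> int:
    """Minimal fetch-decode-execute loop for a 6502-like CPU: a big
    dispatch inside a loop. Only two real 6502 opcodes are handled,
    purely as a sketch of how the core grows one case at a time."""
    a, pc = 0, 0
    for _ in range(steps):
        opcode = memory[pc]; pc += 1
        if opcode == 0xA9:        # LDA #imm: load accumulator
            a = memory[pc]; pc += 1
        elif opcode == 0x29:      # AND #imm: accumulator &= operand
            a &= memory[pc]; pc += 1
        else:
            raise NotImplementedError(f"opcode {opcode:#04x}")
    return a
```

Running `bytearray([0xA9, 0xF0, 0x29, 0x3C])` for two steps leaves 0x30 in the accumulator; each new instruction from the listing is just another branch.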
The compiler probably optimizes the condition check out, so I doubt it really matters.
I've seen it said that "The most optimizing compiler is the most normalizing compiler."  The infinite loop structure is pretty easy to normalize.
The only real difference that I saw was that some compiler front ends would produce warnings that the condition in the while loop was always true, where they seemed to assume the empty for must be intentional.
The "for(;;)" loop is easier and faster for the human to parse. It stands out in source code.
With the "while(true)", you have to pay attention. It could be "while(tnie)" or "while(trua)". Whenever you see it, you have to read carefully.
In addition to having source code for several different emulators, they've written a ~150 page document about emulator development. It's easily the best resource I've seen on the subject.
It seems like people have put together a lot of good documentation for old systems though, which could at least be a good place to start reading:
It has enough for doing 2D games bare metal on ARM.
And then you can jump into PiFox.
ARM Cortex-M cores are also a bit different from the Application-class processors used in boards like the Raspberry Pi. They are closer to an Arduino than a "real computer".
That is how we did 8 and 16 bit coding on home micros back in the day. :)
My favourite SOC is the ESP32, but I guess it already has too much hardware to emulate and isn't an ARM anyway.
Yeah, I've wanted to learn more about the ESP32 - it seems like people have done some really cool stuff with it and it looks like you get a lot of power and connectivity.
But I'll bet you can empathize with not having enough hours in the day!
The downside is that you'll need to emulate 4 cores plus a GPU. Guest software (the stuff being emulated) may fail if the cores aren't decently fast. You may also need to do USB, audio, wireless LAN, and Ethernet.
Hence the idea of learning ARM Assembly, I guess I got it wrong.
Cool idea using VGA as the output; was that hard, or could you just use the DACs normally? I was looking into parallel LCD interfaces for awhile - some F4 cores have a 24-bit RGB LCD peripheral - but they look complicated and I'm still puttering around with SPI OLED/TFT displays.
Anyways, looks like a really cool project!
I wrote a 6800 emulator just from having this background knowledge and no reference, back in my college years. I did it because I didn't want to go to the labs early Saturday morning to fight over the few 6800 boards. I could write my code, run it, debug, show up just as the lab was ending, hand in my work, and go home.
https://github.com/Klaus2m5/6502_65C02_functional_tests is what I used.
This resonated strongly with me. I did a study of architectures in college where I built a bunch of emulators of real and made-up chips, and fiddling with the details of an emulator is incredibly engrossing.
WL4 is my favorite hand-held platformer of all time. Love that game!
I've played 3 too, but it just didn't grab me.
I treated it as a programming riddle (or a series of riddles), quite similar to those you can find on the Project Euler.
It's like a lust to solve a problem. Idk, maybe it IS ADD; I'm there too, but sometimes there are software problems that just completely captivate me in a way that nothing else does.
Time melts away, and I'm just typing.
Y'all know that feeling, that 6am,"oh shit I work in three hours but I was hacking on this project all night and it almost works if I could just get this thing over here to- Oh, work, real world.... Right."
That mental mode of concentration is elusive and indeed a bit habit forming. There are moments where I'll be washing my car or cleaning the house and I have that "I should be in vim right now" thought and the world just seems grayer and dumber until I survive past it and sit back at my keyboard again.
Said like that, that is weirdly analogous to when I used to drink.
Is programming a drug? Are some addictions healthy?
But the more I learned, the bigger the problems I want to solve, (and create, in some cases) and that drove me to become more dedicated to the discipline of it all.
I'm blessed to have the opportunity to do contracts or work a 9-5 with my skillset, and I can basically live whatever lifestyle I can afford that way.
But yeah, even if I was getting paid 1/5 of what I have now, as long as I can spend my days browsing through disassembly, outsmarting someone, I'd still show up.
I was once taking a shower when I cried out in joy as I had just figured out how to shave off an extra fraction of a micron from the side of a large state machine (we had a last minute change order and it hogged a bunch of area on the chip). I grabbed some clothes and jumped into my car (didn't think to rinse the shampoo out of my hair) and floored it down the freeway to the office (VPN? hah!)
I mean, I think so anyways. Idk anything hardly when it comes to hardware but I can use my imagination.
Do you think barbers or chefs or fishermen have the same drive to push forward?
But that raises the question of whether this whole thing is just an emotional response.
MLKJ: “No work is insignificant. All labor that uplifts humanity has dignity and importance and should be undertaken with painstaking excellence.”
If a person is passionate enough about their work, maybe they have these same feelings and senses.