> Also, I'm curious - did you find yourself having to constrain your use of C in order to make sure that the compiler could compile itself? Or does it implement everything you would use naturally anyways?
That would be the "bootstrapping" process. Nearly a half-century ago I took a compiler lab class where we were given a working, but slightly lame, compiler and were tasked with adding new, less lame, language features by bootstrapping. That is: 1) implement the new feature without using it; 2) using the compiler that results from step 1, re-implement the feature using the new feature and compile again; 3) repeat with more features until the end of the semester for the best grade.
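As a concrete sketch of steps 1 and 2, suppose the new feature is C's ternary `?:` operator (the feature choice and function names here are my own invention, purely for illustration): the stage-1 version avoids the feature so the existing compiler can build it, and the stage-2 rewrite uses it once stage 1 works.

```c
#include <stdio.h>

/* Stage 1: written WITHOUT the new feature (pretend the current
 * compiler doesn't accept ?: yet), so the existing compiler can
 * still build this source. */
int max_stage1(int a, int b) {
    if (a > b) return a;
    return b;
}

/* Stage 2: once the compiler built from stage 1 understands ?:,
 * the same function can be re-implemented using the new feature
 * and compiled again. */
int max_stage2(int a, int b) {
    return a > b ? a : b;
}

int main(void) {
    printf("%d %d\n", max_stage1(3, 5), max_stage2(3, 5)); /* 5 5 */
    return 0;
}
```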
Assembly? Ha! In my day, we would have considered ourselves lucky to have had assembly, let alone a C compiler. No, we had to bootstrap our computers using switches and paper tape. You probably want a terminal too!
In all seriousness, the bootstrap process is fascinating to me. At some point, it did all start with switches and manually loading commands directly into memory. And over time, we’ve slowly kept this thing going and growing. Also… I’m old enough to have seen an Altair with toggle switches, but I’m not old enough to have had to toggle in a boot loader on one. :)
- Breadboarding a very, very minimal CPU. More like a simple add-and-multiply machine with four registers or so, built from a few premade components.
- Eventually moving up to microcoding a simulated CPU
- Eventually writing binary code to control the microcoded CPU
- At that point I kinda cheated and wrote my own assembler, because I got sick of checking so many bits (see the sketch after this list).
- The project then stopped at the assembly level.
- But then we implemented our own ML-variant interpreter, and later on an ML-variant compiler, in OCaml
- And later still we had that ML compiler compile itself, and extended it from there.
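Since the assembler step above is where the bit-checking tedium usually ends, here's a minimal sketch of that idea in C. The 8-bit instruction format (a 4-bit opcode plus two 2-bit register fields, matching a four-register machine) and the mnemonics are entirely made up; a real homebrew encoding would differ.

```c
#include <stdio.h>
#include <string.h>

enum { OP_ADD = 0, OP_MUL = 1, OP_LOAD = 2, OP_STORE = 3 };

/* Encode "OP rd, rs" into one byte: oooo ddss (a made-up layout). */
static unsigned char encode(int op, int rd, int rs) {
    return (unsigned char)((op << 4) | (rd << 2) | rs);
}

static int opcode(const char *mnemonic) {
    if (strcmp(mnemonic, "ADD") == 0)  return OP_ADD;
    if (strcmp(mnemonic, "MUL") == 0)  return OP_MUL;
    if (strcmp(mnemonic, "LOAD") == 0) return OP_LOAD;
    return OP_STORE;
}

int main(void) {
    /* "ADD r1, r2" then "MUL r0, r1", printed as hex bytes
     * instead of being checked and toggled in by hand. */
    printf("%02X\n", (unsigned)encode(opcode("ADD"), 1, 2));
    printf("%02X\n", (unsigned)encode(opcode("MUL"), 0, 1));
    return 0;
}
```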
Pick your poison as to where to start. Just be aware that breadboarding becomes very... messy very quickly[1]. And I do include hardware in this, because you needed simpler CPUs to design more complex CPUs.
In practice, you want to pick a familiar language that is as high-level as it can be. Back in the day, though, that would've been assembly, or even raw binary. In the case of C, the lineage ran from BCPL, in which a compiler for B was written; in B, a compiler for New B (NB) was written; and NB then turned into C.
It depends on your definition of the full bootstrapping process. At some point, someone had to bootstrap in raw assembly, but we don't need to do that in 2024. If I write a bootstrap compiler in C on a machine that can already run C, I can write an alternate backend for any assembly language (one that supports the necessary compilation features) and thus produce a compiler implemented in that assembly language without directly writing any assembly.
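Here's a minimal sketch of that "alternate backend" idea, assuming the compiler core calls through a table of emit functions; the names and the two toy instruction sequences are hypothetical, but the point is that retargeting means supplying a new table, not writing assembly by hand.

```c
#include <stdio.h>

/* The compiler core only knows this interface. */
typedef struct {
    void (*emit_load_const)(int reg, int value);
    void (*emit_add)(int dst, int src);
} Backend;

/* An x86-64-flavored backend (AT&T syntax). */
static void x86_load_const(int r, int v) { printf("    movq $%d, %%r%d\n", v, r + 8); }
static void x86_add(int d, int s)        { printf("    addq %%r%d, %%r%d\n", s + 8, d + 8); }

/* A RISC-V-flavored backend. */
static void rv_load_const(int r, int v)  { printf("    li a%d, %d\n", r, v); }
static void rv_add(int d, int s)         { printf("    add a%d, a%d, a%d\n", d, d, s); }

static const Backend x86   = { x86_load_const, x86_add };
static const Backend riscv = { rv_load_const, rv_add };

/* The "compiler": code generation for the expression 2 + 3,
 * generic over whichever target backend it's handed. */
static void compile_two_plus_three(const Backend *b) {
    b->emit_load_const(0, 2);
    b->emit_load_const(1, 3);
    b->emit_add(0, 1);
}

int main(void) {
    puts("# x86-64:");
    compile_two_plus_three(&x86);
    puts("# RISC-V:");
    compile_two_plus_three(&riscv);
    return 0;
}
```

The same front end then emits whichever target's text you ask for, which is the sense in which you never write the assembly yourself.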
Taking this process to the extreme, no one could ever bootstrap anything because eventually you're bootstrapping mineral extraction.
Oh, and to the OP, well done!