Writing a C compiler in pure shell is one of those projects that sounds absurd until you think about bootstrapping. If you want to compile C on a system where you literally have nothing but a POSIX shell, this is exactly what you need. The fact that the parser itself is BNF-generated from shell modules makes it even more interesting as a study in how far you can push shell scripting before it breaks. Would love to see this evolve into a proper repo with tests so it can actually serve as a minimal bootstrapping tool.
It's not just a toy or a fun hobby project; there's potential for practical use as a step in bootstrapping an entire software stack from human-verifiable artifacts.
A shell is almost always used to set up the bootstrap environment, so the dependency on a shell is more or less always there anyway.
Beyond that, something special about POSIX shell is its large number of independent implementations, which makes it an ideal starting point for diverse double-compilation (https://arxiv.org/abs/1004.5534). The idea is to bootstrap the same toolchain from multiple compilers (shells, in this case) and compare the results to verify that no single shell introduced a trusting-trust attack.
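A toy sketch of that comparison step, under the assumption that each shell performs an identical "build" (the shell names and the build command here are illustrative, not from the article):

```shell
#!/bin/sh
# Diverse double-compilation, reduced to its essence: run the same build
# step under several independent shells, then verify the outputs are
# bit-for-bit identical by comparing checksums.
set -eu

# A stand-in "build step": in reality this would be the compiler bootstrap.
build='printf "int main(void){return 0;}\n" > "$1"'

for sh in sh dash bash; do
    command -v "$sh" >/dev/null 2>&1 || continue   # skip shells not installed
    "$sh" -c "$build" build-step "out-$sh.c"
done

# Count distinct hashes; any value other than 1 signals divergence.
sha256sum out-*.c | awk '{print $1}' | sort -u | wc -l
```

In the real scheme the "build" is a full compiler bootstrap and the comparison happens on the final, self-compiled binaries, but the trust argument is the same: every independent implementation must converge on the identical artifact.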
The shell.c ouroboros is really cool. Being able to bootstrap trust through an entirely different language family (shell → C → shell) adds genuine value to work on the trusting-trust problem, beyond the technical novelty.
The fzf integration is a really nice touch here. Half the battle with dev tool management isn't installing things; it's remembering what you installed, and how, six months later. I know everyone's going to recommend Nix (and they already have), but there's something to be said for a solution where the entire logic fits in your head on first read. I've had a similar Makefile-based setup for years, and the biggest win is onboarding new team members, who can just read the targets and immediately know what's available.
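The pattern described above can be sketched in a few lines of shell. The Makefile contents and target names here are made up for illustration, and the TTY check lets it fall back to a default when run non-interactively:

```shell
#!/bin/sh
# Hypothetical sketch: list Makefile targets and pick one with fzf.
# Write a demo Makefile so the script is self-contained.
printf 'install-go:\ninstall-node:\nupdate-tools:\n' > Makefile.demo

# Extract target names (lines that start with an identifier and a colon).
targets=$(grep -E '^[a-zA-Z0-9_-]+:' Makefile.demo | cut -d: -f1)

# Pick interactively with fzf when available and on a terminal;
# otherwise fall back to the first target.
if [ -t 0 ] && command -v fzf >/dev/null 2>&1; then
    target=$(printf '%s\n' $targets | fzf)
else
    target=$(printf '%s\n' $targets | head -n 1)
fi
echo "selected: $target"
```

In a real setup you'd replace the final `echo` with `make "$target"`; the point is that the whole discovery mechanism is one readable pipeline.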
100x is a bold claim, but the Zig approach to optimizing hot paths in Bun makes a lot of sense. There is so much low-hanging fruit when you actually dig into how package managers interact with git under the hood. Nice writeup; the before/after benchmarks are convincing.
But then there's this: "When evaluating the complete bun install improvements, it came out speed-wise to about the same as the existing git usage (due to networking being the big bottleneck time-wise despite more cases being slightly faster with ziggit over multiple benchmarks). Except, it's done in 100% zig and those internal improvements pile up as projects consist of more git dependencies. All in all, it seems like a sensible upstream contribution."
Sooo, after burning 10k+ tokens' worth of compute, we find out that it's sensible to use it because the language (Zig) feels good, as opposed to git itself, which now has 20+ years of human eval on it. That seems... well. Yeah...
The original target was Bun because it itself is written in Zig, not because of anything specific to the language.
When it became clear that there were benefits to filling in more of git's capabilities (e.g. targeting WASM), I went and implemented more git features.
It's by no means a universal win across the board, but it does have notable wins, like git operations running 4-10x faster than git itself on ARM-based MacBooks.