At this point, optimizing runtimes and build artifacts to include only the functions actually used is pretty much a solved problem. For example, azer has a random-color library and an rng library. Having both of those as azer-random or something means that someone automatically gets all the dependencies, without having to make many requests to the server. This makes builds faster and a lot simpler.
Sometimes, in order to best optimize for small size, you have to accept some parts that are big. Libraries tend to be one of those cases where a few good, big libraries lead to a smaller footprint than many tiny ones.
> makes security auditing more difficult
What? If you go all the way, you just review all the dependencies too. And if they have a good API, it's actually much easier: for example, if your only source of filesystem access is libfilesystem, you can quickly list every module that has any permanent local state.
Splitting huge libraries into well-designed categories would make a lot of reviews easier.
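To make that concrete, here's a minimal sketch of what that audit pass could look like. It assumes a hypothetical gateway module named `libfilesystem` (the name from the example above, not a real library) and a vendored tree of Python dependencies; any project that imports the gateway is flagged as able to touch permanent local state.

```python
import ast
import pathlib

def modules_using(dep_root: str, gateway: str = "libfilesystem") -> set:
    """Return the set of source files under dep_root that import `gateway`.

    `gateway` is the single sanctioned filesystem-access module (a
    hypothetical name here). If all file I/O really goes through it,
    this one pass over import statements finds every dependency with
    permanent local state.
    """
    hits = set()
    for path in pathlib.Path(dep_root).rglob("*.py"):
        tree = ast.parse(path.read_text(encoding="utf-8"))
        for node in ast.walk(tree):
            # `import libfilesystem` or `import libfilesystem.submod`
            if isinstance(node, ast.Import):
                if any(alias.name.split(".")[0] == gateway
                       for alias in node.names):
                    hits.add(str(path))
            # `from libfilesystem import open_file`
            elif isinstance(node, ast.ImportFrom):
                if node.module and node.module.split(".")[0] == gateway:
                    hits.add(str(path))
    return hits
```

The point is not this exact script but the shape of the review: a well-factored API turns "read every line of every dependency" into "list the importers of one module, then read only those."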
> Having both of those as azer-random or something means that someone automatically gets all the dependencies, without having to make many requests to the server.
Also disagree. One-off builds shouldn't make a real difference, and continuous builds should have both a local mirror and local caches.