Separation of Concerns in Cross-Compilation (nixcademy.com)
13 points by todsacerdoti 3 months ago | 5 comments



> So why does it turn out to be hard whenever it’s being done for huge real-life projects?

I mean, given that this is essentially an ad for Nix, of course it's going to come to the conclusion that the answer is to use Nix. But my experience is that the answer given here is the wrong answer.

No, the real reason why huge real-life projects have extra complexity in cross-compilation is that huge real-life projects tend to build tools to generate the code that needs to be built. This means that you have to keep track, in your build system, of what's being built for the host and what's being built for the target. And some things have to be built for both, which means only being built once when host == target.
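
(To make that bookkeeping concrete, here's a rough sketch in Nixpkgs terms, since that's what the article is about. The package name and myCodegen are made up; nativeBuildInputs and buildInputs are the real mkDerivation attributes for "runs on the build machine" vs. "ends up in the output".)

    # sketch of a cross-friendly package; mylib and myCodegen are hypothetical
    { stdenv, cmake, zlib, myCodegen }:
    stdenv.mkDerivation {
      pname = "mylib";
      version = "1.0";
      src = ./.;
      # tools executed during the build: compiled for the build machine
      nativeBuildInputs = [ cmake myCodegen ];
      # libraries linked into the output: compiled for the platform the output runs on
      buildInputs = [ zlib ];
    }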

Oh, and this also means that when you're probing your toolchain to figure out what flags you need to throw on, you have to avoid doing such probing in a way that requires actually executing the results of a configuration test, because that's not possible.
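
(One common workaround, sketched here in Nix to stay with the article's context: pre-seed autoconf's cache variables so ./configure takes your word for it instead of trying to run a cross-compiled probe. The package itself is hypothetical; ac_cv_func_malloc_0_nonnull and ac_cv_file__dev_urandom are real autoconf cache variables whose checks would otherwise execute a test binary.)

    # hypothetical package; the point is the pre-seeded cache variables
    { stdenv }:
    stdenv.mkDerivation {
      pname = "example";
      version = "1.0";
      src = ./.;
      # answer configure's "does malloc(0) return non-NULL?" and
      # "does /dev/urandom exist?" probes up front, because the
      # cross-compiled test programs can't be run on the build machine
      ac_cv_func_malloc_0_nonnull = "yes";
      ac_cv_file__dev_urandom = "yes";
    }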

And maybe your project wants to have top-level builds automatically hook up emulation or something similar during the regular high-level test commands for developer convenience.
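
(On NixOS, one way to get that convenience is to register qemu-user through binfmt so foreign-architecture test binaries run transparently; this is the stock boot.binfmt.emulatedSystems option, shown here as a minimal configuration snippet.)

    # NixOS configuration sketch: run aarch64 binaries via qemu-user emulation
    { ... }:
    {
      boot.binfmt.emulatedSystems = [ "aarch64-linux" ];
    }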

Overall, cross-compilation for huge projects is difficult for build systems because huge projects in general are difficult for build systems and cross-compilation is an extra layer of pain that has to infiltrate every layer of the build system, and be specifically considered at every layer. If your build system is more complicated than "compile this list of files and link the result," then introducing cross-compilation is necessarily going to be more complicated than just "point the build system at the cross compiler toolchain"--and that is pretty much the definition of "huge real-life projects."


I suspect the real pain is because someone bolted on cross compilation support after the project was developed. If they started that way from the beginning, it would be straightforward.

In ancient times, people were more focused on compiling for their current platform, not a different one.

The rise of Linux on embedded devices changed things. Embedded devices are not just simple microcontrollers running bespoke code anymore. For example, today nobody wants to run their Linux build system on a Raspberry Pi using an SD card for storage!


AFAIK, most distributions do exactly what you're describing: cross-compiling packages is enough of a pain that the policy is that packages must be built on the target hardware. Fedora had a cluster of Raspberry Pis for this at one point (and maybe still does). Only the dedicated embedded tools like Yocto and Buildroot go for cross-compilation by default.


I don't believe there is a single real reason for complexity in building software. Nor was the reason given in the article fake, for that matter.

In my experience, pkgsCross in Nixpkgs actually does make cross-compiling a whole lot easier compared to other solutions. That's because a lot of the time, the techniques explained in the article allow existing packages to be cross-compiled without any modifications. Even in cases where modifications are required, it saves a ton of work. And given the number of packages in Nixpkgs, there are far fewer dependencies that you have to deal with.
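
(For anyone who hasn't tried it, the basic usage is a few lines of Nix; hello is just the stock example package, and aarch64-multiplatform is one of the predefined pkgsCross attribute sets.)

    # default.nix sketch: cross-build GNU hello for 64-bit ARM Linux
    let
      pkgs = import <nixpkgs> { };
    in
    pkgs.pkgsCross.aarch64-multiplatform.hello

Running nix-build on that produces an aarch64 hello without touching the package definition itself.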


It doesn't help that build systems often don't have tools for maintaining this distinction: CMake, as far as I know, doesn't have any way to separate host and target, so you wind up losing a lot of the infrastructure it gives you on one of the two.



