Nothing about 'zig cc' is related to language semantics; it is entirely a toolchain feature. Language semantics do have some bearing on cross-compiler difficulty, but IMHO it's not a major factor, especially among languages like Rust and Zig. The largest factor I can think of (compile-time code execution) exists in both Zig and Rust. It's possible I'm forgetting something, though.
It doesn't exist in Rust because nobody has proposed it or implemented it, simple as that.
I think Zig does cross compilation (ignoring, for the moment, projects with C dependencies) better because it comes out-of-the-box with the libraries and headers needed to target a lot of systems. In Rust, you typically get additional toolchains via rustup, whereas Zig works on all of its targets with a single install.
The whole toolchain for Zig, which appears to include complete cross compilation, is less than 40MB compressed[0]. I would hardly call that a "monster download".
I think developers can spare that much space. Besides, storage space is scaling much faster than the number of available targets, so I don't see how this could be "not particularly scalable" unless you either have a weird definition of "target" or predict there's going to be a sudden explosion in the number of targets.
Go also makes cross compilation to any supported target a breeze in the default toolchain download, and it is an incredible convenience for the developer.
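For illustration (the values here are just examples), cross compiling in Go amounts to setting two environment variables:

    # build a Windows binary from a Linux or macOS machine
    GOOS=windows GOARCH=amd64 go build -o app.exe ./...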
The Xcode download, which is the only officially-supported way to develop native apps on macOS and iOS, is 9.8 GB and gets larger with every update. The Windows 10 SDK is 4 GB and likewise is only getting larger. Sure, you can develop some apps with only a subset of the SDK. But, as a language project, do you really want to be in the business of optimizing and subsetting every single OS SDK?
I think a lot of the issue here comes down to targeting server workloads and command-line tools vs. apps using more of the OS libraries. In practice, Go focuses on the former kinds of apps, so shipping all targets is a more viable approach for them. But when you start getting into apps that want to use a larger swath of the platform facilities, then the size of the platform SDKs starts to become an issue.
No one (that I've seen) is suggesting we include those SDKs, which you probably don't even have the rights to redistribute anyway.
I provided both Zig and Go as examples of what people are looking for, and their downloads are neither obscenely large nor do they include those massive SDKs.
So yes, you have a weird definition of "target" that no one else here is using, and it derails the conversation. rustup can already add additional targets after installation; are you suggesting that this downloads 10GB SDKs? It doesn't, so that point is irrelevant. (Even once you add a target this way, the toolchain experience is still not as good as what Zig or Go offer.)
Compare what Zig and Go do to what Rust does, not some strawman argument that would require users to download 10GB of SDK with the Rust toolchain. Alternatively, find me someone who is saying they think these multi-gigabyte SDKs should be included, because I don't see that anywhere in this conversation.
People here are complimenting what Zig currently does, which takes less than 40MB compressed. Zig can cross compile to Windows just fine. There are limitations to everything, but what Zig and Go offer is strictly better than what Rust offers in terms of the out-of-the-box cross compilation experience, and it isn't unduly burdensome on developers like your proposed 10GB download would be.
My nightly-x86_64-unknown-linux-gnu lib directory is 148MB. This is because it includes several libraries in both rlib and .so form (static vs. dynamic linking), asan/lsan/msan/tsan variants, and so forth. You could easily imagine that ballooning to 1GB if we shipped all tier 1 targets. This is what I mean by shipping all targets not being scalable: it might have seemed so in the early days, when the libraries were smaller and the targets fewer, but not now.
Yes, the Zig (and Go) developers have clearly put a lot of effort into ensuring the size of their toolchain remains reasonable. I fully believe the Rust developers could achieve similar results as well, if they really wanted.
Until then, the lack of a great cross compilation experience out of the box is just a limitation of the Rust toolchain. It's an acceptable limitation in many situations, but I don't buy your repeated arguments in this HN discussion that it's unclear whether this is desirable to fix. Clearly many people in this discussion alone disagree with your position, and my own anecdotal conversations with other developers in real life align with this discussion. YMMV, obviously.
> Yes, the Zig (and Go) developers have clearly put a lot of effort into ensuring the size of their toolchain remains reasonable. I fully believe the Rust developers could achieve similar results as well, if they really wanted.
I strongly disagree. These are different languages. Rust leans heavily on generics and monomorphization. It implements most language operations, like ptr::offset, in the language itself, increasing the size of library metadata. It supports tools like the sanitizers. It has a rich serializable MIR format so that generics can be embedded at a higher level than just machine code and a lower level than source.
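To make the monomorphization point concrete, here's a minimal sketch (my example, not from the comment above): every concrete type a generic function is used with gets its own compiled copy, so library artifacts have to carry enough metadata to stamp out new copies in downstream crates.

    // One generic function, two machine-code instantiations:
    fn largest<T: PartialOrd>(a: T, b: T) -> T {
        if a > b { a } else { b }
    }

    fn main() {
        println!("{}", largest(1, 2));     // instantiates largest::<i32>
        println!("{}", largest(1.5, 2.5)); // instantiates largest::<f64>
    }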
It is absolutely not true that nobody "really wants" smaller binaries. Rust already did a fair bit of experimentation with running crate metadata through gzip, etc. years ago (turns out there are some thorny tradeoffs around compilation time vs. on-disk storage when you go that route). I can't speak to Zig, knowing less about it, but with Go there were conscious language design decisions that favor binary size over runtime performance (e.g. hash table lookups all going through a single function instead of being specialized). This is fine! But it's contrary to the idea that Rust could achieve smaller binary sizes if we "really" wanted to.
> Until then, the lack of a great cross compilation experience out of the box is just a limitation of the Rust toolchain.
Rust has a great cross-compilation experience. It's as simple as:
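    rustup target add <target>
    cargo build --target <target>

That's it. One more command than Zig or Go, to install the toolchain you need, and then you're off to the races.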
The difference is really not just one command. I suggest you go out and actually try Zig cross compilation or at least read more about it. Andrew Kelley has written a lot about it. The work impressed me so much that it was one of the things that provoked me to become a financial supporter of Zig. (Even though I haven't written a line of Zig yet.)
Cross compilation in Rust is better than C or C++. But it's still a big pain.
I really think you are underestimating what the Zig folks are doing. Please investigate more deeply.
In particular: "Zig does not ship with any pre-compiled libraries; instead it ships with source code, and builds what it needs on-the-fly."
Obviously I don't. And that's totally consistent with learning from Zig about how to improve the cross compilation story for Rust. (Trivially: compile some things from source, but not other things.)
I'm honestly pretty disappointed at your participation in this thread. It seems like you're going out of your way to assume the worst possible interpretation of what folks are saying.
If you use the build-std nightly feature, you can compile the entire standard library, even for a custom JSON target file, and it goes pretty quickly. With a bit of attention, that feature would be just fine at filling in the gaps in rustup’s prebuilt targets.
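For reference, a sketch of that workflow (the flags are nightly-only and subject to change; the musl target is just an example):

    # build-std compiles std from source, so it needs the source available
    rustup component add rust-src --toolchain nightly
    cargo +nightly build -Z build-std=std,panic_abort --target x86_64-unknown-linux-musl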
But Zig’s approach isn’t actually about making everyone compile more things from scratch. It doesn’t compile libc from scratch every time you select a new target; reading the article, you can see the extreme lengths it goes to in order to avoid this.
The main thing that’s missing from Rust is target-specific libc.so/etc, and an appropriate linker for every target. If you try cross compiling anything, you will soon run into this problem; your system will not have the correct target libc or an appropriate linker. You can generally only change one part of the triple before the experience starts falling apart. I can cross compile for iOS/tvOS/etc on my Mac; that’s about it.
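Concretely, the usual workaround today is to install a full cross GCC yourself and point Cargo at it; a sketch, with example package and target names:

    # after e.g. apt-get install gcc-aarch64-linux-gnu
    # .cargo/config.toml:
    [target.aarch64-unknown-linux-gnu]
    linker = "aarch64-linux-gnu-gcc"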
Zig manages to solve this for all its targets by pre-processing simplifications of the various libcs, which are then bundled into the Zig binary (as 3 very small files). It generates a useless but linkable .so file from the preprocessed files for any target on the fly. It gets you the correct libc headers. Then it uses LLD instead of making you hunt around on Ubuntu forum posts finding and downloading the correct GNU linker for your specific architecture. So you don’t link to a real libc, but you don’t have to compile one either. (Until you want to execute a binary with QEMU, but usually non-simulated target machines have a libc.so already.)
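The payoff is that the whole dance above collapses into a single command (example target):

    # cross compile and link a C program for 64-bit ARM Linux with glibc
    zig cc -target aarch64-linux-gnu -o hello hello.c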
(Aside: Golang solves this by not depending on libc at all, by reimplementing most of it including the syscalls in Go, and I believe by using its own multi-target linker. Many pros and many cons, but an approach Google is happy to sponsor.)
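(To illustrate that Go approach, with example values: disabling cgo yields a libc-free static binary, so no target libc and no external linker are needed at all.)

    CGO_ENABLED=0 GOOS=linux GOARCH=arm64 go build -o app ./...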
If you’re saying this approach would be infeasible for Rust std/core, then yes obviously. It’s statically linked, you can’t get away with a fake .so file.
But that was never really the issue. Rustc can build std’s rlib files for any target effortlessly. And yet cross compiling is still a pretty poor experience. The state of the art for Rust developers at the moment is rust-embedded/cross, which solves the same libc+linker problem that Zig does, naively, by literally using Docker and per-target Ubuntu images to download prebuilt GCC and libc packages from Apt. Basically we can do way better than that, because Zig showed us how.
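For anyone who hasn’t tried it, cross is a drop-in replacement for the cargo command (the target here is an example):

    cargo install cross
    cross build --target aarch64-unknown-linux-gnu  # runs in the matching Docker image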
> That's it. One more command than Zig or Go, to install the toolchain you need, and then you're off to the races.
Except it's not the same. Zig also cross-compiles C code beautifully, and lots of Rust code depends on C libraries. That "one more command" does not solve cross compilation to the degree that Zig does.
Go code tends to avoid C because of the heavy penalty that CGo imposes, so the cross compilation experience is usually good there, but for different reasons than Zig.
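As a rough sketch of how Zig can already be pressed into service for this today (the environment variables are my assumptions about cc-rs and Cargo conventions, not something prescribed in this thread): first, a hypothetical two-line wrapper script, zcc, so one tool can stand in wherever a C compiler or linker binary is expected:

    #!/bin/sh
    exec zig cc -target aarch64-linux-gnu "$@"

Then C build scripts and the final link both go through Zig:

    CC="$PWD/zcc" \
    CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER="$PWD/zcc" \
    cargo build --target aarch64-unknown-linux-gnu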
I see. I'll let the real experts add some insight, but I don't think language complexity has much to do with it, considering Rust compiles to an IR first anyway. And I would say Rust has a good cross-compilation story as it is. In 2016 (!) I managed to get Rust to compile my SDL2 game for a mipsel device that used a custom GCC toolchain (http://www.gcw-zero.com/develop).