I summarized this excellent talk here [1], but one of the main points is that compatibility with existing systems is important for adoption. (They learned that the hard way -- by having their entire project cancelled and almost everything thrown out.) He advocates unit-by-unit rewrites rather than big-bang rewrites, just like Kell does in this conference article.
And compatibility with C in Windows should be easier than it is in the Unix world, because the whole OS is architected around a binary protocol AFAIK -- COM.
My sense is that Rust may not have thought enough about compatibility early in its life. Only later when they ran into adoption problems did they start talking more about compatibility.
Also, it seems Rust competes more with C++ than C, and there seems to be very little attempt to be compatible with C++ (although perhaps that problem is intractable.)
Personally I don't think Rust will be a successful C replacement. It will have some adoption, but the Linux kernel will still be running on bajillions of devices 10 years from now, written in C. And in 20 years, something else will come along to replace either C or Linux, but that thing won't involve Rust.
> My sense is that Rust may not have thought enough about compatibility early in its life. Only later when they ran into adoption problems did they start talking more about compatibility.
Of course Rust thought a lot about compatibility with C in its early days. I remember fast FFI was in Graydon's very first presentation about the language in 2010. Almost everything about the language changed, but that focus did not.
> Also, it seems Rust competes more with C++ than C, and there seems to be very little attempt to be compatible with C++ (although perhaps that problem is intractable.)
Rust has gone pretty far in wanting to be compatible with C++, with the C++ stuff added to bindgen for Stylo. We've gone further than most other languages. It's not fair to say there's been "very little attempt": we literally couldn't have shipped Stylo to Nightly Firefox without doing the work to bridge C++ and Rust.
From your other post, it seems that one of your main complaints is that Cargo exists instead of having Rust use Makefiles. All I can say is that the reaction to Cargo from Rust programmers is overwhelmingly, almost universally positive, and abandoning Cargo in favor of Makefiles would instantly result in a fork of the language that would take Rust's entire userbase. Not solving builds and package management is not a realistic option for a language in 2017.
Following the logic of the article, Rust has made the exact same mistake every other language has made, which is to conceptualize compatibility with the C ecosystem as merely an issue of FFI. Rust is hardly the first language to focus on easy FFI from day 1, but according to the article that's not nearly sufficient. And like most other modern so-called systems languages, Rust hasn't gotten around to committing to a stable, exportable ABI. In fact, I think that, much like Go, the general sentiment is that this is largely undesirable, as stable ABIs can cripple evolution of the implementation, especially for languages that rely on sophisticated type systems.
> And like most other modern so-called systems language, Rust hasn't gotten around to committing to a stable, exportable ABI.
That's not true. The C ABI is stable and exportable, and you can opt into it on a per-function basis. We do that for integration with existing projects all the time.
Again: All of you are talking as though the idea of integrating Rust into a large C++ project is some far-fetched theoretical idea, and that we made some obvious mistakes that make this goal impossible. In fact, we're shipping an integrated Rust-C++ project today: stable Firefox, used by millions of users.
I'm not arguing that it's too difficult to integrate Rust with C or C++ projects. I'm simply trying to get at the distinctions the article is making, which are rather subtle.
One aspect of Rust that fits well, IMO, with the characteristics the article argues are underappreciated is its emphasis on POD -- objects as compact, flat bytes. That puts Rust much closer to achieving what C does best (again, according to the article), which is first-class syntactic constructs over memory -- namely, pointers. But it falls short in the sense that to _export_ Rust objects (rather than import alien objects into Rust) you have to do so explicitly. And presumably the author would argue that Rust is significantly undervaluing the benefit of a stable ABI that would allow other applications to import Rust objects without an explicit language-level construct (i.e. explicitly annotating APIs with no_mangle).
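To make that concrete, here's a minimal sketch of what that explicit opt-in looks like (the `Point` struct and `point_norm` function are my own invented examples, not from any real codebase):

```rust
// Exporting a Rust object for consumption by other languages requires
// an explicit opt-in: #[repr(C)] pins the struct to the C field layout.
// Without it, rustc is free to reorder fields however it likes.
#[repr(C)]
pub struct Point {
    pub x: f64,
    pub y: f64,
}

// #[no_mangle] + extern "C" makes the symbol name and calling
// convention visible to C, C++, or any language with a C FFI.
#[no_mangle]
pub extern "C" fn point_norm(p: Point) -> f64 {
    (p.x * p.x + p.y * p.y).sqrt()
}

fn main() {
    let p = Point { x: 3.0, y: 4.0 };
    println!("{}", point_norm(p)); // prints 5
}
```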
Obviously when you're building a large application, cathedral style, the requirement to explicitly annotate is not only less burdensome, but quite useful (for many reasons). But in a larger, heterarchical ecosystem of software, that's actually quite limiting. Our first instinct is to argue that permitting such unintended peeking behind the curtain is dangerous and unnecessary, but the article speaks directly to that.
Imagine a Rust with a stable ABI that was exported via Sun's CTF format. CTF is like DWARF but much simpler (and thus little incentive to strip it), and it's being integrated into both OpenBSD and (I think) FreeBSD to facilitate improved dynamic linking infrastructure. Rust could even, theoretically, continue randomizing member fields. And this data could be consumed by any language's toolchain, not simply Rust's toolchain. That sort of language-agnostic, holistic approach to interoperability is largely what I think the article is getting at.
I'd be all for a standard language agnostic ABI. I'm not on the language design team anymore, but I suspect you wouldn't have any trouble convincing them to get on board with such a thing either. The ones you'd need to convince would be the C++ folks, I suspect :)
Of course they do. That's why Rust has a sophisticated tool, bindgen, which is used in production right now in Nightly Firefox (among other places) to export complex C++ interfaces in both directions across the language boundary.
> And I was also referring to the similar issue in Go where calling C -> Go and Go -> C isn't symmetrical. Not sure if that's true for Rust or not.
It's not. You just write `#[no_mangle] extern "C"` on your function, and C can easily call it, with a stable ABI.
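And the other direction -- calling C from Rust -- is symmetrical: you declare the foreign function in an `extern` block. A small sketch, using libm's `cbrt` since it's linked in by default on most platforms:

```rust
// Calling C from Rust: declare the foreign function in an extern "C"
// block. cbrt() comes from the platform's C math library, which the
// Rust standard library already links against on common targets.
extern "C" {
    fn cbrt(x: f64) -> f64;
}

fn main() {
    // unsafe because the compiler cannot verify that this declaration
    // matches the real symbol's signature.
    let v = unsafe { cbrt(27.0) };
    println!("{}", v);
}
```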
In order to meaningfully criticize Rust's FFI, you need to be aware of how it works.
Well, just saying it has fast FFI doesn't tell me much. Being able to wrap something like sin() was possible back in Python 1.0, but most applications need more help than that. There have been 5+ popular systems since then trying to make the experience better... and it's still barely solved.
That said, I admit I'm more on the pessimistic side. Having touched Go before its open source release in 2009, I didn't think they thought enough about integration either. I think it was worse than Rust, because you couldn't call Go from C or C++ unless the main program was in Go.
Also their build system isn't used inside Google. And they do nontrivial stuff with signals and threads.
But Go does seem to be getting adopted. However, there is an important distinction: everybody in the open source world is rewriting new versions of Google-style servers, but all the stuff inside Google is still in C++.
So I think nobody ever rewrites old software. They write new versions of similar things, and then hopefully those new things get adopted. But the old thing will probably be around for a long time too.
And to be fair C didn't replace Fortran or Cobol either -- scientific applications still use Fortran and old banks (apparently?) still use Cobol on mainframes.
Maybe that's the most you can expect. But in that case there still does need to be a "plan" for making existing C code like the Linux kernel and OpenSSL safer. I think my issue is that some people apparently think that plan involves Rust when it doesn't. Maybe the core team has never pushed that idea but some other people seem to be under that illusion.
-----
This is a different argument, but a language only needs to "solve" package management if it always assumes it has main(). I was looking for something more humble that you could stick in a file in an existing C or C++ project, e.g. for writing a safe parser.
Also the 5+ different Python + C/C++ solutions now need a Python + Rust analogue. For a language at the Rust layer, there's this O(m*n) problem or strong network effect to deal with.
Actually that was the thing I was thinking while reading this PDF -- a lot of it can be boiled down to "C and C++ have network effects". Particularly C++.
Asking Rust to break the network effect is like asking Apple to break the Windows monopoly with Mac OS X. That didn't happen -- they built the new thing iOS, and beat Windows with that. So then the question is if Rust is more like OS X or iOS.
> So I think nobody ever rewrites old software. They write new versions of similar things, and then hopefully those new things get adopted. But the old thing will probably be around for a long time too.
That's very true. The most we can hope for is that Rust and other languages, such as Go and Swift, continue to chip away at the market share of C and C++. It'll be a long process.
I'm not a "rewrite everything in Rust" booster; as much as I would like to, that won't realistically happen. Instead, I see Rust as another player in the "programming language Renaissance" that has been going on since the mid-2000s. C and C++ are losing their dominance and instead are becoming part of a broad ecosystem of languages. And that's great: the fact that we have so many choices in languages now has been a very good thing for productivity and security.
> Actually that was the thing I was thinking while reading this PDF -- a lot of it can be boiled down to "C and C++ have network effects". Particularly C++.
I agree. That's why I think this paper overanalyzes the success of C and C++. They became dominant because of network effects: simple as that.
I think the article helps to explain why C was able to leverage network effects so well. Neither C nor Unix came out of the gate in a dominating position. Indeed, it's arguably only in the past 20 years that C clearly dominated. Fortran, Pascal, and a bevy of other languages were at times much more widespread and influential. Even today C isn't the most used language, and yet its influence continues to be outsized.
C isn't just a language, it's an entire ecosystem of toolchains and software that facilitate network effects. "Chance" is far too convenient an explanation. No doubt chance had a significant role, but if C were as useless, unsafe, and devoid of redeeming qualities as many people argue, then I don't see how C could have benefited so strongly from network effects.
C wasn't useless and unsafe at the time it became dominant. It was quite state-of-the-art at the time. We've just learned more about what works well and what doesn't in programming languages since 1978, which is why C is no longer as dominant as it once was.
"Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."
This was in 1981, and "language designers and users have not learned this lesson" was a jab at C.
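For what it's worth, this is one lesson the newer languages did absorb: Rust keeps subscript checks on even in optimized release builds. A small illustration of my own:

```rust
fn main() {
    let xs = [10, 20, 30];
    let i = xs.len() + 2; // an out-of-range index computed at runtime

    // xs[i] would abort with a deterministic panic rather than read
    // out of bounds -- in release builds too. The .get() form makes
    // the same check explicit and recoverable.
    match xs.get(i) {
        Some(v) => println!("got {}", v),
        None => println!("index {} is out of bounds", i),
    }
}
```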
Also Xerox and ETHZ were busy using safer systems programming languages.
ESPOL and NEWP, which already used UNSAFE blocks, were the state of the art in systems programming in the '60s.
Most of the criticisms you hear about C today (type safety, memory safety, no garbage collection) were criticisms that C got when it was first invented.
In fact there are fewer criticisms of C than there used to be: a lot of the early criticism centered around syntax, but the C syntax kind of won, so you don't hear that anymore.
> All I can say is that the reaction to Cargo from Rust programmers is overwhelmingly, almost universally positive
You can prove anything when you introduce a sampling bias that large.
> Not solving builds and package management is not a realistic option for a language in 2017.
Package management was solved decades ago: my OS manages the packages, and a lean and mean system is the result. The Rust solution results in massive binary sizes for simple command line tools. This is fine if their goal is to replace Java, but not if they want to replace C.
> If it was, there wouldn't be new package managers popping up all the time; it's a non-trivial problem. They're not created for no reason.
Notice how all those package managers are for platforms, or create platforms in their own right. Rust is meant to be a systems language; that means its platform is the OS, and it doesn't get to be a world unto itself like Java.
So if you jump through a million hoops and limit yourself to C libraries, you can produce small executables. At that point it's more complicated than just writing the app in C in the first place.
I'm not interested in what it can technically do, though; I'm interested in what is practically happening. In practice most Rust programmers seem to be writing apps for the Cargo platform. In practice Rust developers are producing huge executables. In practice Rust has no stable ABI, so all Rust libraries get statically compiled. In practice this is incompatible with the LGPL. In practice a security vulnerability means every app using the library has to be recompiled to be secure.
> Rust is meant to be a systems language; that means its platform is the OS, and it doesn't get to be a world unto itself like Java.
Rust is meant to be a cross-platform systems language, and sadly there does not exist a cross-platform package manager. Until one exists (and I'm not holding my breath here), every language which intends to be cross-platform will continue inventing its own package management.
> You can prove anything when you introduce a sampling bias that large.
I'm confident that programmers don't want to be writing Makefiles. We don't have to take formal surveys to observe the obvious trend away from raw "make" that has been occurring for decades.
Besides, if Rust programmers really had a problem with Cargo, they would tell us. Programmers don't suffer in silence.
> Package management was solved decades ago, my OS manages the packages and a lean and mean system is the result.
I'm glad you like your package manager. Most programmers, including me, don't want to have to deal with it when the goal is simply to put a Rust project together. Besides, we ship desktop software on Windows: we cannot tell our users "sorry, you need to install Ubuntu".
> The Rust solution results in massive binary sizes for simple command line tools. This is fine if their goal is to replace Java, but not if they want to replace C.
The Rust solution is customizable. You can use dynamic libraries if you like, and earlier prerelease versions of Rust did in fact do that. Dynamic libraries are a single rustc flag away.
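Concretely, that flag is `--crate-type=dylib` on rustc; the Cargo equivalent is a `crate-type` key in the manifest:

```toml
# Cargo.toml -- build this library as a dynamic library instead of
# the default rlib. Caveat: the resulting .so/.dll uses an unstable
# ABI, so it is only compatible with the exact rustc that built it.
[lib]
crate-type = ["dylib"]
```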
The feedback we got was that people preferred the convenience of a single standalone binary to the complexity of dynamic linking managed by the OS.
> I'm glad you like your package manager. Most programmers, including me, don't want to have to deal with it when the goal is simply to put a Rust project together.
This is the same attitude that makes Electron so attractive. As a user, I don't care what makes your life easier as a developer; I care that I'm getting a more bloated and less secure result. This is an awful attitude that's crept into software development lately.
> Besides, we ship desktop software on Windows: we cannot tell our users "sorry, you need to install Ubuntu".
So bundle them on Windows; bloated installers are the norm there already. You're probably going to have to include an auto-updater and a lot of other stuff that Windows doesn't provide as well. Not having to deal with that stuff is part of why I use Ubuntu in the first place.
> The Rust solution is customizable. You can use dynamic libraries if you like, and earlier prerelease versions of Rust did in fact do that. Dynamic libraries are a single rustc flag away.
Until there is a stable ABI that isn't a solution because you have to distribute those libraries with the app.
I'm dealing with the result of this attitude on my phone right now. The end result is I can't even install your app because I'm out of space on my phone. I'm out of space because every other app maker favored developer productivity over being conservative with users resources.
Seems like you are shifting the goal posts. If I'm building something to run on resource constrained devices, then it makes sense to value use of resources more highly! But otherwise, most of your comments just seem to repeat the same old dynamic vs static linking debate that had been hashed out already for decades. There is no one right answer. Trade offs abound.
People who expect a stable ABI from Rust such that normal Rust libraries can be dynamically linked like you would C libraries would do well to adjust their expectations. It isn't happening any time soon.
> most of your comments just seem to repeat the same old dynamic vs static linking debate that had been hashed out already for decades. There is no one right answer. Trade offs abound.
Rust doesn't let me make that trade off, it's made the decision for me.
> People who expect a stable ABI from Rust such that normal Rust libraries can be dynamically linked like you would C libraries would do well to adjust their expectations. It isn't happening any time soon.
I think it's the Rustaceans that need to adjust their expectations. As long as this holds, Rust won't be a real systems language; it stands a better chance of unseating Java than C.
> Rust doesn't let me make that trade off, it's made the decision for me.
Umm, right, exactly: the state of having a stable ABI is one set of trade offs, and even if that were an option, electing to use it for dynamic linking is another set of trade offs. I feel like I was obviously referring to the former, but if that wasn't clear, it should be now. An obvious negative is exactly what you say: you can't use standard Rust libraries like you would C libraries. That's what I meant by trade offs. But there are plenty on the other side of things as well.
> I think it's the Rustaceans that need to adjust their expectations
Sure! We do all the time! I'm just trying to tell you the reality of the situation. The reality is that Rust won't be getting a stable ABI (outside of explicitly exporting a stable C ABI) any time soon. If that means flukus doesn't consider Rust a systems language, then that's exactly what I meant by adjusting your expectations. But don't expect everyone to agree with you.
From personal experience, a lot of folks don't care nearly as much as you do about things like "the binary is using 2.6MB instead of the equivalent C binary which is using only 156KB." Now if you're in a resource constrained environment where that size difference is important, then that's a different story, and you might want to spend more effort to use dynamic linking in Rust, which you can do. You won't get a stable ABI across different versions of rustc, but you can still get the size reduction if that's important to you in a specific use case.
I've previously suggested here that OSes and OS manufacturers should test and rate apps for tightness, and punish apps that aren't tight by handing them fewer resources - running them noticeably slower.
https://www.youtube.com/watch?v=CuD7SCqHB7k
[1] https://www.reddit.com/r/ProgrammingLanguages/comments/6y6gx...