A brief history of Rust at Facebook (fb.com)
234 points by tomduncalf on April 29, 2021 | 178 comments



I'm curious what people think about Rust as a long-term investment from an employment/job perspective. Is it going to pay the bills for the next 10 years? I've been using PHP for the past 10 years, and I've never run out of work. I can't say the same about Elixir, which I very much like, but the job market proved to be scarce. Just wondering what Rust's future looks like. Thanks for your opinions

A note (since I notice my comment is getting downvoted): I've actually been spending time regularly for the past few weeks reading multiple tutorials and getting familiar with Rust's concepts, and I really like the language and can see that a lot of work has been put into developing it. But my concern is that there are other languages with a solid background as well (OCaml and Haskell, to name a few) that are still niche languages and suffer from the same scarce job market. Which shows that adoption does not always depend on how good a language is.


Disclaimer: Embedded dev here, working with 8-, 16- and 32-bit microcontrollers. Almost exclusively C.

In Europe (and especially in Germany where I live) I don't see myself writing Rust in my day job for years to come (already twenty years in). Not because it's technically impossible or Rust is inferior, but simply because it seems to take decades for European/German companies to accept or at least try new ways of doing things.

Mind you, I'm not talking about startups (we don't have THAT many in Europe/Germany in the first place). Also, those are (mostly) web stuff companies.

Warning, unpopular opinion following.

I think HN is not the right place to ask this question. You will find a dev crowd here that is nothing like the big, quiet dev community out there in the real world. Most people here love trying new things (languages, frameworks, you name it, I've tried it).

So don't base your conclusions on those answers alone. Talk to regular devs (those who say "Hacker News? Never heard of it."). Because those people set what will be used in companies.

EDIT: Corrected spelling of quiet.


I'm leading the software team at Northvolt Systems AB. Half the team is composed of embedded software engineers. We are currently using C to develop our battery management system. But I definitely think Rust has a bright future for that kind of application. The reason is that functional safety is a major concern in battery systems. And it's really hard to achieve this in C, for the same reasons it's hard to write secure C code. The main obstacle to Rust adoption in embedded systems with functional safety requirements is that the toolchain is not qualified according to the functional safety standards we have to follow. If we could have something like the Arm Compiler for Functional Safety, but for Rust, then I think we could start using it. I'd be really interested to investigate this, in collaboration with other teams sharing that interest.

https://developer.arm.com/tools-and-software/embedded/arm-co...


Someone is already working on this apparently: https://ferrous-systems.com/ferrocene/


Also, since it was raised above: this company is based in Berlin :)


I can only fully agree on this.

In my specific case it would already be enough if the two big ones (Keil & IAR) would provide a compiler.

If this obstacle vanished, the situation would be totally different here.


ARM should jump on the opportunity and support Rust in their qualified toolchain (since LLVM can target ARM).


> (those who say "Hacker News? Never heard of it.")

"learn to administer microsoft tools, like active directory and exchange."


Or SAP and any other ERP, or Salesforce and any other CRM, or any other kind of enterprise or banking software (Java, .NET, C++), VBA and "small" internal applications.

Probably 60-80%+ of the software developers and sysadmins out there :-)


A friend of mine uses Rust at Alstom. He wasn't the one who pushed for it.

I am not sure about what you said. Big companies are risk-averse, but that could play in Rust's favor. Some employees who are a few years short of retirement might push back on it, though.


Peek & Cloppenburg was looking for C++ devs for their NRW sites a couple of months ago, with Rust experience being a plus.

So at least some German companies are slowly looking into it.

However I fully agree with your post.


I've been "a rust dev" for 4 years, and I honestly can't see Rust not becoming a majorly used language. It's just so _nice_ to use. And I don't mean just "write" also mean "refactor" and "read" which honestly have been some of the hardest things to do in C (the last major language I, I dunno, specialized in(?)). Between the compiler pointing out when you're too sleep deprived to be writing what you're writing or the general "write what you mean, even if it takes a few more keystrokes and the compiler yelling at you a time or two" attitude that seems to pervade the rust community, it's just a nice language to continue to develop in.

Which, again, this is maybe self-serving, but it really feels like an incredibly good long-term investment to learn. Even if you don't end up using it professionally (somehow), the compiler teaches you how to write better low level code.


I just got done cowboy coding 2500 lines of unchecked input validation code in Rust. While I didn’t particularly enjoy writing Rust — the syntax has taken some getting used to — I also don’t expect to enjoy writing anything in the domain of C or C++. What I did enjoy was correcting the hundreds of errors which had accumulated over the course of several days without checking the code at all, thanks in large part to rust-analyzer and VSCode.

On the cmdline, rustc displays gorgeous error messages — easily the best compiler error messages I’ve seen from any programming language — and Cargo is a dependency manager and build tool without equal. The Rust ecosystem in general is already quite large. Although I’m a neophyte at coding Rust, I haven’t run into an error message yet that StackOverflow or Reddit didn’t end up resolving.

On paper, Rust saves companies money by eliminating the memory safety bugs which would otherwise show up in C/C++ code. At the same time, Rust’s developer tooling ends up being so ergonomic to work with, and the language sufficiently expressive and flexible, that I can see Rust competing in areas beyond those reserved for C/C++.

IMO Rust is basically OCaml for the 21st century. My bet is on Rust usurping C/C++, and I expect Rust’s rise will be nothing short of meteoric over the coming years. The involvement of Facebook, Amazon, Microsoft and Google in the Rust Foundation bodes particularly well for this, as do informal surveys of developers which consistently show tremendous grassroots enthusiasm for the language. Just my 2c: but bet on Rust, bet heavily on Rust, and bet on Rust immediately, without delay.


It still can't match OCaml's compile times, though.


I've never thought of myself as a C++ dev (or a PHP dev, or Rust dev, or Haskell dev).

I write computer programs. I don't "invest" in a language and only take jobs that use that language.

Learning more languages is almost always good. Not because you're learning more languages, but because you're learning more modes of thinking which will apply to any future language you learn.


I agree in the abstract, but Rust is a step-function improvement compared to most other mainstream languages. It is hard to go back to writing, say, C or C++ once you have a few years of Rust experience—not because you've lost competence in them but because Rust offloads most of the complexity onto the compiler in a way that's simply not possible with older languages. You really start leaning on the affine types, the compile-time mutability xor aliasing checks, the pervasive checking of thread-safety...
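
To make that concrete, here's a minimal sketch of the mutability-xor-aliasing rule in action (the strings are just illustrative):

    fn main() {
        let mut s = String::from("hello");

        // Any number of shared (read-only) borrows can coexist...
        let r1 = &s;
        let r2 = &s;
        println!("{} {}", r1, r2);

        // ...but a mutable borrow must be exclusive while it is live.
        let m = &mut s;
        m.push_str(", world");

        // Uncommenting this line makes the build fail with roughly:
        // "cannot borrow `s` as mutable because it is also borrowed as immutable"
        // println!("{}", r1);

        println!("{}", s);
    }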


> It is hard to go back to writing, say, C or C++ once you have a few years of Rust experience ... You really start leaning on the affine types, the compile-time mutability xor aliasing checks, the pervasive checking of thread-safety...

C++ has semi-official Core Guidelines, issued by prominent members of the ISO WG21 standards committee, that describe how to express these things idiomatically. Of course Rust can actually check these things automatically and achieve something close to actual memory safety, but there are ways to at least make a meaningful effort in C++.


The key part is "Rust offloads most of the complexity onto the compiler".

Yes you can make a meaningful effort in C++, but one of the great benefits of Rust is that effort is handled by the compiler instead.


The burden isn't all that great, as most good C++ code ends up looking like how you'd write the Rust code anyways.


New C++ code that your team has written, sure. What about all the libraries that you integrate with, that all have different models for allocating/using/freeing buffers? Or system APIs? (At best for those you can build a wrapper layer, and the wrapper layer is subject to the same errors as C/C++ classically is.)

One of the things that struck me as a major advantage about Java and garbage collected languages is that you can simply combine code from different libraries without worrying about what's freeing the objects, because that's the GC's job. With Rust, the memory management model takes over this responsibility with no runtime cost (or minimal cost -- perhaps freeing something from the heap, if you allocated it there, but definitely no GC).
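
A minimal sketch of what that deterministic, GC-free cleanup looks like (the `Buffer` type here is purely illustrative):

    struct Buffer {
        name: &'static str,
        data: Vec<u8>,
    }

    // Drop runs deterministically when the owning value goes out of scope --
    // no GC pause, and no explicit free call at every call site.
    impl Drop for Buffer {
        fn drop(&mut self) {
            println!("freeing {} ({} bytes)", self.name, self.data.len());
        }
    }

    fn process(buf: Buffer) {
        println!("processing {}", buf.name);
    } // `buf` is owned here, so it is freed right here, at a known point

    fn main() {
        let a = Buffer { name: "a", data: vec![0u8; 1024] };
        process(a); // ownership moves into `process`
        println!("back in main");
        // Using `a` here would not compile: it has already been moved (and freed).
    }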

With C++, unless the library you're using uses the same smart pointer library as you, from the same language version, aren't you stuck writing wrappers around the libs you want to use?

Types like std::async seem to have changed in every recent revision of C++ (11, 17, 20). Can you reliably find multiple open source libraries that you could integrate into a codebase that uses futures and async/await?

(It's been a long time since I've done much C++ professionally. I'm asking skeptically but genuinely.)


While true, I don't need to rewrite those libraries and can just use them right away, and this is why C++ keeps being my go-to language for native libraries, despite some of its flaws.

I can spend all my development budget on the problem instead of sacrificing part of it to building an ecosystem.


Yeah, but that's a losing race. Considering how much investment is happening with Rust right now (when Amazon, Facebook, Google, Microsoft, a ton of smaller companies are investing), Rust will become "good enough™" at some point.

I'd imagine that we're at most 3 years away from it having mature libraries for 99% of tasks you'd want to do as a professional developer.


C++ has hardly replaced C in some domains after 40 years of trying, and it is mostly compatible with C at the source level, built exactly to fit into existing C toolchains.

Some of those companies, like Google, Microsoft and NVidia, are heavily invested into C++, ISO C++ and selling C++ based products.

To use a common example that one puts against any technology that Microsoft touches, when will Office use Rust on its foundation and extension APIs?

Are you aware that Apple created their own safe C dialect for the iBoot firmware rather than using Rust?


A big company can push forward on many fronts in parallel :-)


Indeed, as long as it makes sense in the long run.

If we can have Singularity/Midori back, with a microkernel in Rust, it would be quite nice, but yeah, I guess that is more of wishful thinking.


Sure, they end up looking similar, but the point is that with C++ the developer carries the burden of making sure everything is correct, and nothing prevents you or other developers from stepping outside the bounds of what is "good" - either on purpose, or accidentally, e.g. when refactoring.

With Rust, the burden of checking all of that is shifted to the compiler, and that reduces cognitive load for the developer.


While Rust is a big help regarding memory corruption errors, you still need to watch out for the remaining 30%.


Rust is not a panacea for all bugs, but making memory problems and data races a compile time problem rather than a runtime problem is a significant benefit.
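
A small illustration of that, in case it helps (the vector contents are arbitrary): the naive racy version simply doesn't compile, and the compiler steers you toward explicit synchronization.

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // The racy version is rejected at compile time instead of failing at runtime:
        //
        //     let mut data = vec![1, 2, 3];
        //     thread::spawn(|| data.push(4)); // error: closure may outlive the function, but it borrows `data`
        //     data.push(5);                   // error: cannot borrow `data` again while the thread still holds it
        //
        // The compiler pushes you toward explicit shared ownership plus a lock:
        let data = Arc::new(Mutex::new(vec![1, 2, 3]));

        let d = Arc::clone(&data);
        let handle = thread::spawn(move || d.lock().unwrap().push(4));

        data.lock().unwrap().push(5);
        handle.join().unwrap();

        println!("{:?}", data.lock().unwrap()); // e.g. [1, 2, 3, 5, 4]
    }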


Indeed, and if that is what makes the industry rediscover Ada and Modula-2, all the better. However, care needs to be taken with how it is sold, and it also needs to be understood that this is going to take several decades.

Anyone that cares about memory safety for userspace applications has already moved on to other languages; the problem is the foundation and those that resist no matter what.

Also, the data races prevented by Rust's type system only apply to multithreaded code accessing in-memory data structures; there are plenty of other data races in multiprocessing scenarios.


Your “only” clause captures the entire definition of a data race, as compared to the broader category of race conditions [0].

Safe rust prevents data races at compile time. It doesn’t prevent other race conditions.

And “only” preventing data races, is still a non-trivial achievement!

0: https://blog.regehr.org/archives/490


No need for the exclamation point.

Yes, it is a non-trivial achievement; it's just that I get the feeling the advocacy for it tends to forget about the other ones.

By the way, John Regehr's example won't work if the transferXXX functions are using SQL statements without transactions across threads each with its own connection, instead of in-memory variables.


The Core Guidelines are great, but don’t attempt to go as far as Rust.


Microsoft and Google are taking care that it does.

I would watch the upcoming VC++ static analysis talk regarding its progress.

https://visualstudio.microsoft.com/de/pure-virtual-cpp-event...


I don’t know what specifically you mean by “it does” but if you mean “goes as far as Rust,” that’s simply not true. It’s never even been a goal of the project.


Indeed, the goal has been to be as close as possible to Rust, given the constraints of C++ semantics.

Note I mean the work being done by Microsoft and Google on top of the Core Guidelines in their static analysers, which in Microsoft's case are active by default as a background process that validates your code as you type.

It will never be as good as what Rust is able to do, but in some domains a "worse is better" solution will already be better than nothing, because they aren't going to port their code to something else.


I have been following such efforts (esp. of MS) in C++ for years - progress is so slow. It feels like a slug that halves its speed regularly and thus will never arrive at the finish line.


Me too, however I am of the opinion that I'd rather have something that improves the status quo than nothing at all, especially because there are millions of C and C++ lines of code that, no matter what, will never be rewritten.

I will consider Microsoft is fully into Rust when Azure Sphere supports Rust instead of being a C only SDK, despite its security marketing.

https://techcommunity.microsoft.com/t5/internet-of-things/de...

The first step in reducing those 70% is to actually use the tools that exist, but unfortunately the macho attitude is still quite prevalent.

https://isocpp.org/files/papers/CppDevSurvey-2021-04-summary...

Those that are open to embrace Rust, are most likely already using static analyzers.


Learning a language in depth along with its extensive standard library can buy you months or even years of expertise. This is especially true in languages like C++ (a language with massive depth, such as template metaprogramming) or C# (a language with a massive set of libraries, ASP.NET alone can take years to truly master).

This is what companies are paying big bucks for, immediate competency and productivity.


The extensive standard library and the big libraries or frameworks that are the gold standards for their niches, in that language's ecosystem.

For example Hibernate for Java ORM, Spring as the big Java "LEGO brick bucket", Rails for Ruby web development, etc.


If you ever need to work with PHP, use PHP/Hack developed by Facebook: https://hacklang.org/

It's a compiled language built partially in Rust, being rewritten entirely in Rust, JIT compiled using HHVM: https://hhvm.com/

I don't believe it's compatible with existing PHP codebases, but you could probably convert one over, catching safety issues while you did it.

Hack is a modern and highly productive language that's in many ways nearly as sophisticated as Rust. (How many other languages have both opaque and transparent type aliases, for example? https://docs.hhvm.com/hack/types/type-aliases ). It has a mature async/await system, for another example. One feature it's missing is the macro system that Rust has, though.


I agree but many job postings contain stuff like "3+ years of professional experience in language x".


Honestly as someone (on the team) hiring for a job where someone else wrote the job posting, ignore the number. Just read that as "do you think you enjoy maintaining a codebase written in X".

Job postings are the fucking worst.


Yes, but with time you gain experience in one or more languages and you can choose a job with the ones you like (or tolerate more). You don't need to pick or specialize in one to apply for this kind of posting.


I've never actually found that to be enforced. None of my jobs have shared a tech stack, yet all of them included that kind of language in the posting. YMMV of course.


They don't enforce the number of years, but they could ask you quite deep questions about the language.

Even if you're a senior Python/Javascript developer, with experience in C# in the past, etc., if they start grilling you about Ruby, you probably won't come across as equally productive as a similarly smart dev who is an actual senior Ruby dev.


I think your statement implies that you and I do not agree at a very fundamental level.


I'm saying that the world is a big place.


> Learning more languages is almost always good

Learning for the sake of learning and betting on it for employment and professional use are very different beasts.


Rust has been picked up at Amazon, Google, Facebook and Microsoft. I think that gravitational pull is going to keep Rust a going concern for a long time.

That being said, it's an investment in the future mostly. There aren't anywhere near as many Rust jobs as PHP, Java, or Python jobs.


Rust at Google is in a bit of an awkward state. There's a bunch of teams/projects that have used it, but we haven't yet staffed a team to manage an internal Rust toolchain/ecosystem.

The most recent one was Kotlin, in large part because our Android ecosystem was increasingly having to play catch up, so we'll see if/when a Rust team gets staffed.


Rust has been adopted by multiple teams at Palantir. The bootstrap launcher for Palantir's ancient government software was rewritten in Rust to avoid having to ship Java 6. And some small piece of the custom software that Palantir wrote for BP to manage oil wells was written in Rust, because the single dev on the project was also a maintainer of some Rust crates and could get away with it.


While it has been picked up at Microsoft, its relevance in the Windows developer ecosystem is quite tiny.

It remains to be seen when it will even have a place in the VS installer.


As a long term investment, I think it'll be great. I expect it'll be another 5-10 years before it's as employable as C++, but it'll last at least another 20 beyond that.


Once/if it reaches that point, and considering its main niche (systems software), it will outlive roaches.


I don't personally write Rust for a living, but it's a chicken and egg problem.

The industry wants to start using it but can't find devs with x years of commercial experience to have the confidence to invest in it, and developers can't get professional experience because they can't find a job in rustland yet.

This doesn't happen in PHP due to the age and spread of the language (for better or worse).

So I think it's a timing issue. If the language is solid and solves a problem uniquely and is well received where it is used (and not just some freak example in a company somewhere with a blog post "we use this and it's the best!") - it will pierce through after X years.


I believe that Rust is positioned to be the Next Big Language for server-side and application programming, taking over for C/C++/Java.

I'm not sure if there's a .NET-compatible implementation or interop, but if so, and if Microsoft decides to invest in it, Rust would likely take over C# as well. They have pretty good languages like F# already, but Rust is designed around zero-cost abstractions and avoids the garbage collection penalty that comes with running on the JVM or .NET.

Rust offers performance, type safety, memory safety, and metaprogramming that are provided by few other (largely academic) languages, while permitting multiple programming styles and escapes from those restrictions when you need them.

On the other hand, I don't think the Web-oriented languages will go away unless someone develops a variant of Rust or Rust framework intended for building websites. However, for everything else that's not running inside a web page or supplying the content in a web server, I think we'll see a gradual transition to Rust; the web server will be in Rust; the JavaScript compiler will be in Rust; the browser will eventually be in Rust; and so on.


It won't replace the JVM/CLR. The Rust community is pretty against GC, and even though lifetimes are wayyyy better than C memory management, they're not nearly as convenient as GC.

If Rust ever gets a good GC it might very well replace VM languages. But even if they started now, it would take years.

Rust also has long compile times. Java and C# projects, even huge ones, build in a couple of seconds because the heavy compilation work is done by the JIT at runtime. Go is AOT but sacrifices some performance for fast compilation too. That's important for a general purpose language where performance is not the #1 concern but maybe #2 or #3. Same story as GC: sacrifice a small amount of performance for convenience. That's not what Rust is designed for.

Rust is positioned as a systems language to replace C dialects, and in that role I think it will be wildly successful.


I've been feeling this way as well. I think it all comes down to whether the ecosystem can be built up enough where your average developer can be productive without feeling like they're fighting the language or trying to figure out which smart pointer they need to use in which situation. Obviously those issues will still exist but could likely be mitigated if enough work is put into the ecosystem


So what will be the security flaws we will see in Rust-based software in the future? Seems like there are still many kinds of flaws that no language can prevent.


From a long term perspective I'm not worried but short to mid term it might cause some issues because currently there aren't that many Rust jobs yet. It's kind of a niche still, jobs wise.


Thank you, this is reassuring!


I think that outside the domains of kernel and driver programming, and eventually game development, it doesn't make much sense to switch to Rust for userspace application development.

That domain is already more than covered by languages offering AOT/JIT compilers, managed memory (in whatever form), very nice tooling, and ecosystems with libraries for about anything.


It’s a big pivot from php in terms of the level of the stack it’s targeting. That’s not necessarily a bad thing but keep in mind that there’s a bunch of associated skills, knowledge, and even culture that differ in web dev vs. for example distributed systems.


Probably depends on what you want to be doing

Do you want to work in systems, embedded, etc? Rust is almost certainly (at least a large part of) the future

Web servers (at the application level)/native apps/games? It's gotten some traction, but the jury is still out


Rust is rather painful compared to alternative choices for web servers today. I've had to write 10x the code in some circumstances for the same result, and I find it becomes hard to reason about the code when you have several async types.

However, it has attributes that match it extremely well with function as a service where memory usage, cold start times, consistent execution, and compute costs are major factors.

It’s extremely well suited to that environment. I expect to see it thrive here once some aspects of rust settle down further.


Yeah, the web framework situation is also just fairly early right now. Rocket looks promising but it's not at 1.0 yet, and the current release still requires the nightly compiler. It does a whole lot to improve the ergonomics, but even then it's not as easy as Python or Go. So it depends on the usecase.

The good news is that the microservices trend makes it easy to optimize the "hot services" with a different language than what you use to throw together all your normal services.


I wouldn't call it as "easy" as Python or Go but I've been writing production services off the Rocket master branch (all async and compiles on stable rust but the API itself is rather unstable) for the last few months and it's been easily as productive as Django (sans ORM & django-admin integration). 80-90% of my logic is in request guard implementations and diesel types/helpers which get reused all over the place, making the controller logic trivially simple. Reimplementing most of the bells and whistles took less time than trying to decipher Django's magic anytime it goes wrong.

Rocket's DI and middleware layer with serde, validator, Rust's error handling, etc really take the pain out of the whole affair. That said I wouldn't recommend it until the ecosystem matures and stabilizes unless you really like updating lifetimes all over rocket libraries every few months.


>Rust is almost certainly (at least a large part of) the future

Can you name any design wins for Rust in a product's embedded SW, embedded OS, or device drivers? I've never seen it in the wild. How does Rust deal with hard RT?

I'd be interested to see what people had to say.


I have very little direct experience with embedded but I've heard several people talk about having a good experience using Rust for it. It's not hard for me to imagine memory safety without a runtime being pretty useful in that context, if nothing else. That said, I know embedded moves slower than a lot of other areas so maybe "the future" is farther out in that case.


I haven't written anything in rust, but RTIC seems like it might be roughly equivalent to a project like FreeRTOS https://crates.io/crates/cortex-m-rtic


Also, this post has some useful feedback on experience/lessons from using Rust in production in the embedded space: https://old.reddit.com/r/rust/comments/hkjn6f/rust_in_embedd...


There seem to be a lot of people who are very excited about the potential of Rust as a replacement for C/C++ in the embedded space.

There is "Working Group" dedicated to the domain: https://www.rust-lang.org/what/embedded & https://blog.rust-embedded.org/newsletter-28/

Toward the bottom of the page there's a "Production use" section with some quotes from people @ companies using it.

There's also an Embedded Rust "book": https://docs.rust-embedded.org/book/

In addition to the language itself, the surrounding tooling (e.g. for package management) contributes to the potential wins available from moving to Rust.

My impression is that there are still areas where there isn't yet an idiomatic solution as to how _best_ to represent certain embedded SW concepts in a way that allows Rust's guarantees to be maintained. So I think there's some churn around that as people experiment/learn. (Primarily in relation to some of the physical constraints of the hardware.)

One of the companies prominent in the Rust embedded space is Ferrous Systems: https://ferrous-systems.com/

While it's still early days, relatively speaking, I think a lot of people see a move to Rust as a "once in a generation" opportunity to finally move on from an embedded ecosystem built on substandard vendor C toolchains & also gain from the robustness/safety that Rust offers.

In a lot of ways the embedded domain stands to gain even more from what Rust offers because unlike, say, the desktop/server space there isn't another higher level language that's really an option.

I'm fairly early in my Rust learning journey but have played a bit with it on an STM32F7 dev board. So far, though, I've spent more time on the non-embedded side of things.

(Although one of the appealing aspects of Rust for me is that I can use the language on platforms ranging from embedded to server/desktop to the browser (via WebAssembly). And the knowledge is transferable & even a lot of the library support too--thanks to "no_std" packages/crates that have reduced overhead.)

As an aside, while learning off & on over an extended period of time, I've found this post really helpful for (re-)learning how to read Rust: https://fasterthanli.me/articles/a-half-hour-to-learn-rust

The post aims to go "through as many Rust snippets as I can, and explain what the keywords and symbols they contain mean" which I've found a useful approach.


With niche programming languages, I think the thing is to embrace the niche rather than try to go the traditional way.

We could be an outlier, but if there are enough D jobs going around (often very well compensated), you shouldn't have a problem with Rust.


If you get proficient in a technology, you'll be employable regardless of the market size or niche. There's always demand for specialists.

On the other hand if you're more of a generalist, I'd say stay with popular technologies such as Python, JVM and .NET. You can't go wrong with any of these for the next 20 years.

PHP has been making strides lately with JIT, Laravel and proposals for generics. So staying on it shouldn't prove a bad choice either. But if you're looking for greener pastures while maintaining great employability, my personal bet for the next decades is .NET/C#. And it tends to pay more than PHP on average.


As someone that's forced to write C occasionally Rust is a godsend. I prefer high level languages but when I need the performance it's nice to have something that doesn't feel like it's from the 80's.

C is extremely painful. C++ is a shitshow. Rust is glorious in comparison.

Rust, or something like it, will definitely replace C/C++ someday. And rust has way more traction than anything else


Yes and no.

If Rust gets much larger adoption, then there will be many more people learning it, and Rust skills will hardly be a special thing.

If not, a less-adopted language will never pay you the top market rate, since it is not used by most people.

So what really matters is the size of the total programmer market.


> I'm curious what people think about Rust as a long-term investment from an employment/job perspective. Is it going to pay the bills for the next 10 years?

The answer is almost certainly: no.

From a career standpoint, you would be better off learning C/C++ in much more detail before investing time in Rust.

If you want the "canary in the coal mine", watch what the game developers use for the game engines (they mostly use C++ right now). When Rust starts becoming common among game developers, it's got enough momentum for a job ("career" is probably a bit much).

However, even when Rust hits critical mass, it's unlikely to be a "career choice" programming language--you will have to know Rust AND <something else more important>. The world has changed, and "One Language To Rule Them All" is no longer a thing.

C, Java, Python, C++, Javascript, and C# are going to hold the top spots among programming languages for the foreseeable future. No matter how good Rust is, it is not going to displace C and C++ on any reasonable timescale--inertia is a thing. Swift and Visual Basic have their niches and nothing is going to blast them out of them.

After that, it's pretty much a free-for-all in terms of language popularity, but everything is tiny at that point.


> If you want the "canary in the coal mine", watch what the game developers use for the game engines (they mostly use C++ right now). When Rust starts becoming common among game developers, it's got enough momentum for a job ("career" is probably a bit much).

I don't see why the entertainment industry would be any more important to pay attention to than any other tech-adjacent industry, such as finance, health care, transportation, etc. If anything, the short-term focus on the next title means that the games industry tends to be relatively late adopters of experimental core tech: witness how more quickly tech companies adopted C++11 as compared to game studios, for example. (This is not to criticize entertainment's conservatism with technology choices: it's a tough business and they are understandably risk-averse.)


> I don't see why the entertainment industry would be any more important to pay attention to than any other tech-adjacent industry, such as finance, health care, transportation, etc. If anything, the short-term focus on the next title means that the games industry tends to be relatively late adopters of experimental core tech

Um, precisely?

Game programming is hardware-bound and short-term focused. Consequently, if Rust is generally superior to C/C++, it will spread very quickly. This would be especially true in games that have a much higher networking component (where the memory and security guarantees start to matter more than just the raw engine performance).

Rust is a systems programming language. It is meant to extract efficiency/performance out of hardware. If that isn't your raison d'être, then using a systems programming language of any flavor is counterproductive.

So, the problem is that Rust doesn't really compete in "general programming"--it competes mostly in "systems programming". And it has to displace competitors that are mostly "good enough". And it can be done piecemeal so that you don't have to port entire codebases all at once.

None of that is a recipe for fast change. And THAT'S PERFECTLY OKAY.

Now, all that having been said, the wildcard in all of this is WASM. Rust support for WASM has been stellar while everybody else's has been hot garbage. WASM could be the thing that lets Rust break out of just the "systems programming" niche. That's a big hypothetical, though.


> the wildcard in all of this is WASM.

On a related note, Embark Studios, who I mentioned above, just announced they've joined the Bytecode Alliance, a WebAssembly-focused non-profit: https://twitter.com/repi/status/1387393289847025669

The announcement mentioned the use of WASM to help build "...a future of software & creative game worlds that are not limited walled gardens where only we / the devs can add new capabilities or features, want to build an open ecosystem of interconnected software & game components that anyone can use & create with".

Essentially, Rust+WASM as a future for player-driven modding/customisation.

This is an area I've been exploring in my "WebAssembly Calling Card" project which enables people to create cross-platform personalized dynamic image avatars implemented in any language which can compile to WASM: https://wacc.rancidbacon.com

I've integrated WACC with the Godot game engine, Rust GUI viewers and the browser--they all use the same avatar ".wasm" file (with examples currently written in C & Rust).


Neat!

I'm happy to see all of this, but, again, the question was about career in/for 10 years. And Rust just isn't going to be that.

Rust is, for example, far behind where Python was at a comparable age in terms of market penetration. This is unsurprising. First, Rust is nowhere near as broad-spectrum as Python--so its domain of applicability is much more constrained. Second, Rust is coming into a full ecosystem that Python didn't have to contend with.

Python was basically the only respectable person at the scripting language job interview in the late 1990s--Tcl and Perl both decided to start huffing glue in the middle of the DotBomb era while Sun tried to jam Java everywhere it didn't belong and left a lot of greenfield areas to Python. Rust doesn't have the "advantage" of that level of incompetence in its competitors/alternatives.

And, while I'm fully aware that Python and Rust aren't really in the same space, I simply chose Python because it's one of the only popular languages that is relatively "new" (if you can call something from somewhere in 1989-1994 "new") and didn't come upon its popularity as a mandated parasite attached to a much bigger host (looking at you: Swift/C#/Javascript).


Game programming is performance heavy, while other tech-adjacent industries may be requirements-driven. Also, C++ has existed for what? 30 years?


Finance is performance critical too, just to name one. Games are just one small part of the high performance systems space.


Finance is performance heavy in very, very, very narrow points--and NO programming language cuts it in that spot. Those firms use custom silicon.

The vast majority of "programming" in finance is finding something to model that is relevant. And that's Python, R, Julia, etc.--all the exploratory tools. Nobody in that space has even heard of Rust.


There's a lot of C++ in financial/trading: custom silicon is only used for the most rudimentary/simple/rote part of the system (think like packet parsing or templating your messages you send to the exchange).

Many people are very interested in Rust right now for HFT: C++ is nothing but pain and causes tons of problems.


> If you want the "canary in the coal mine", watch what the game developers use for the game engines

On this front two relevant data points are:

* Embark Studios: https://medium.com/embarkstudios/inside-rust-at-embark-b82c0...

* Ready at Dawn Founder/CTO: https://twitter.com/AndreaPessino/status/1021532074153394176

Also, there is a specific Rust GameDev Working Group: https://www.rust-lang.org/governance/wgs/gamedev


How relevant are these data points though? Embark Studios e.g. appears to be some small company in Sweden (?), doing 3-4 different things (aside from games) to get by.

You can find all kinds of languages in game shops, including custom Lisps and the like, so finding Rust too is a given.

But I think the spirit of the parent was to check what game studios start adopting "en masse", not whether this or that studio has adopted a language.


> How relevant are these data points though?

Depends on who you are I guess. :)

> appears to be some small company in Sweden

According to [0], they're currently around 200 employees and the founding team includes the former CEO of DICE (+ex-EA) & other experienced industry people.

They're funded in the large tens of millions range: https://www.pcgamesinsider.biz/news/69461/nexon-increases-st...

So, basically, they're a bunch of experienced game industry veterans funded with a bunch of money & a clean slate to choose their technologies--and have made Rust as one of their prime choices. (Although I note they're using Unreal for their 1st game, so not sure how much of the Unreal-related dev they'll be doing in Rust.)

> doing 3-4 different things (aside from games) to get by.

Curious what gave you that impression?

[0] https://medium.com/embarkstudios/our-continued-journey-89dad...


Take a look at the Embark jobs.

C++, C++, C++, C++, C++, C++, Rust (for security), (Go, JavaScript, Python and Rust).

The Rust stuff is all confined to the back end where security matters--no engine stuff.

Ready At Dawn got bought, so it would be interesting to know if that still holds as they are owned by Facebook who is Rust-positive.


> As one significant example of that growth, Rust is the leading language in the development of the Diem (formerly Libra) blockchain, which is overseen by the independent Diem Association. Facebook, through its digital wallet Novi, is a member of the Diem Association.

I'm always amused by the contortionist double-speak of Facebook communications. They put so much effort into distancing themselves from Libra when they launched it, but now they have a chance to take some credit for the work behind it, all of a sudden it's a kinda-sorta Facebook project.

Totally off-topic but I can never resist a jab at Facebook.

Anyway – great news for Rust. Despite my distaste for FB, I can't deny the value of their open source contributions.


It's not really a contradiction to take credit for the effort behind something, but also make an effort to allow that thing to be independent once started, is it?


Not mentioned here but FB is rewriting the Relay compiler in Rust as well. Will be interesting to see how well it works.

https://github.com/facebook/relay/issues/3180

Personally I've had trouble with the Relay compiler's watch mode not picking up file changes, but that might be a Watchman issue, not a Relay compiler issue.


Should have mentioned that I only mentioned watch mode issues because that’s the only issue I’ve had with the JS compiler. It’s been fast enough, although the app I’m working on is relatively small.


I personally know this timeline to be incorrect by at least a year. I was part of a startup that got acquired by FB in early 2015. We moved our codebase into the FB monorepo pretty quickly. One of our microservices was written in Rust. It was only a few hundred lines, but it was in the repo and actively deployed on FB infra by April or so of 2015.


Does that mean it was pre-1.0 Rust!?


Yup!


Must be, since the 1.0 release was a month later.


Wit?


Without getting a job in Rust (due to lack of experience) what is the best way to get experience with it?


Find an open source project you care about that uses it then try to fix some simple issues. That's how I learned!


Write something that sounds fun. It doesn't even have to be useful to anyone, just interesting. And you have to be able to talk about it. The last person we hired for our team, I actively courted because he'd written a javascript parser (note: just a parser, it parsed basic JS to an AST and did nothing with it) and talked about it at a local rust meetup I'd been hosting (until COVID).

If you don't have local (or maybe now virtual?) meetups to go to, write a blog post about it and post it to HN, and put a single-sentence "I'm looking for a job that lets me write Rust professionally" note at the bottom; you never know who's going to see it.


The official book is good, and I recommend even experienced programmers read it. Ownership, borrowing, and lifetimes are really central to the language, and are handled quite differently than in other familiar languages. The jump from "implicit and occasionally violated" notions of ownership to an "explicit and 100% enforced" notion is a big one.
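
For anyone who hasn't hit it yet, the "explicit and 100% enforced" part looks roughly like this (toy function names, just for illustration):

    // Taking `Vec<i32>` by value takes ownership; taking `&[i32]` only borrows.
    fn consume(v: Vec<i32>) -> usize {
        v.len()
    } // `v` is dropped here -- the caller no longer owns it

    fn inspect(v: &[i32]) -> usize {
        v.len()
    } // only the borrow ends here; the caller still owns the data

    fn main() {
        let v = vec![1, 2, 3];

        let n = inspect(&v); // borrowing: `v` is still usable afterwards
        println!("{} {:?}", n, v);

        let n = consume(v); // ownership moves into `consume`
        println!("{}", n);

        // println!("{:?}", v); // compile error: borrow of moved value: `v`
    }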


Write little command line tools for yourself in it (it’s the same answer for most languages).


The embedded RustConf talk said a big thing internally at FB was rewriting stuff like existing/well-used Python scripts into Rust...


I started out by doing a raytracer, graphics programming is really rewarding. https://raytracing.github.io/books/RayTracingInOneWeekend.ht...


I learned a bit of Rust by doing Advent of Code with it. That's more algorithms than systems, but the difficulty ramp in AoC means it ends up being a good intro to the core of a language.


I did some experiments with bevy.


related from today: Facebook Joins Rust Foundation

https://news.ycombinator.com/item?id=26982890


Are there enough rust jobs that don't involve shady stuff like crypto currency?


Rust releases only last about 5 months before they're so out of date you can't compile something written $today. I suppose this is a perfect fit for Facebook though since they control their entire stack and rebuilding the world is a-ok.

It's not so okay if you're trying to write something you expect other people to be able to compile and run.


I don't understand your concern. You're upset that new features are not backported to older compiler versions?

If you have some reason to not update your compiler, then don't use those new features, and use dependency versions that also don't use those features. Go back a few releases if you need to.

If you do need those features though... Then just upgrade your compiler?

I'm not aware of any compilers/interpreters that work the way you want here. Maybe Python 2 for that painful decade of overlap...


I have worked with more than a dozen languages and consider Rust the best experience for both "getting new shiny things" and "keeping old things quiet".

But the Rust experience IS cargo. No (other) package managers.

Rust is cargo centric. With it, you are free to pin versions and keep things working for a very long time.

BUT

also, if you come cold to any library, it is to be EXPECTED to be "new". This is a good thing: the community at large moves forward in tandem, as if it were a coordinated army!

But that doesn't mean I can't work with older code; it's that if you look at a crate RIGHT NOW you see the forward momentum at large. If you wanna use something older you can pin to an earlier version.

However, this also points to a fair problem: it's not easy to tell which version of Rust each crate (version) requires.


That's just blatantly false. Rust takes extreme pains to be backwards compatible. When they wanted to make breaking changes to support some new language features, they literally spun up a separate compiler front-end behind a cargo.toml flag to make sure they didn't break any existing code.

Compiler changes are tested against a bunch of libraries in the crates.io repository to ensure nothing breaks. It's just exceedingly rare that a new Rust version would break older code.


I think GP meant forward compatibility. New code will use new features and won't compile on old compilers (like other languages)


If you updated an old compiler with new features, wouldn't that make it a new compiler?


The most charitable reading of GP is not that nobody can ever implement new features in a language, but that the 5 month release cycle is causing new features to come in too fast.


That's really weird. So far Rust has been the most stable thing I have worked with. Working with Cargo has been a pleasant experience, knowing that I can compile any program and the compilation will go smoothly (vs. Python, where I have had to figure out what went wrong multiple times).


Stop using +nightly.

Stable rust hasn't broken in my experience.


I don't. I use rust from debian repos. In Debian 11 (bullseye), a distro which has not even been released yet, it has Rust 1.48 from 2020-11-19. This already is out of date and was unable to compile the two 1.50 targeting applications I wanted to try out.

And don't go saying I should use rustup. That's a symptom of the very problem I'm describing. What other compiler is only usable if you install it from some website instead of your repos? I can't think of any.


So your complaint is that Rust library authors are too excited to use new stable features as soon as they are available, and your preference would be for them to wait e.g. a year before adopting new language features, so that their code keeps working with old compiler versions?

Hard to see this as a problem with the language per se. What is the language supposed to do, ship working features but ask people politely not to use them? Not ship new features even if they solve acute problems?

As a library consumer, if you want you can just stick to >6 month old versions of all the libraries you depend on, and they should work fine with your 6 month old compiler version.


It's a bunch of problems. Firstly, you can't even say, reasonably, that you only want >6 month old versions of libraries, at least not with cargo. Yes, you can express such a wish for your direct dependencies, by using = versions in Cargo.toml, but the transitive dependencies, aka the dependencies of your dependencies and so on, those you can't control. Putting Cargo.lock into git solves some of your problems, but this doesn't help you when you are adding a library. Then, for any novel transitive dependency, Cargo just takes the highest version that works.
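
For reference, pinning a direct dependency looks something like this (crate names and versions purely illustrative), and note how little it says about the transitive graph:

    # Cargo.toml -- a sketch, versions are only illustrative
    [dependencies]
    serde = "=1.0.100"   # `=` pins an exact version; `cargo update` won't move it
    rand  = "0.8"        # any semver-compatible 0.8.x, chosen at resolution time

    # Transitive dependencies (serde's and rand's own deps) are still resolved
    # to the newest compatible versions. Committing Cargo.lock freezes them,
    # but doesn't help the moment you add a brand new dependency.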

The second problem is that even if 90% of your dependencies support older compilers, it doesn't matter, because the 10% that insist on newer compilers will still break. It can be a single dependency, and you usually have hundreds, with vastly different policies.

There are solutions to this problem, like MSRV aware cargo. They just need someone to implement them (as well as someone to compute MSRV numbers).


What proportion of programmers/projects are (a) so risk averse/untrusting that they are unwilling to update their compiler to the latest version, but (b) want to depend on the latest version of hundreds of miscellaneous libraries?


Again, that's how the default development workflow with rust, which uses cargo, works. Including the cargo that's packaged by debian.


I think you'll find that using an old Rust from Debian is not the default development workflow for most Rust users: https://blog.rust-lang.org/images/2020-12-rust-survey-2020/r...


Hmm, what language lets you use new libraries with old compilers? The only ones I can think of are languages that have stopped being developed, because as soon one of your dependencies uses a new feature, you're done.

I'm just not sure how what you're describing would work.


Correct. It's the extremely rapid pace of development in the language, combined with enthusiast devs using the latest and greatest, that makes it unsuitable for general use. Part of this stems from Rust almost requiring installation via rustup instead of from repos. This encourages the bad behavior.

Eventually the Rust dev community will grow enough that devs who aren't language-specific enthusiasts are the majority, and it will slow down and begin to target actually stable releases. But as of now you get about 5 months of functionality per "stable" release.


Just to summarise your position:

- you will only use an old compiler because that’s your preferred source of installing tools

- you will only use the latest and greatest packages that require the newer compilers.

And this is somehow the language's fault. The language needs to fix the problem of library authors using language features? Seems to me like it'd be easier for you to relax one of your two requirements. They're fundamentally incompatible.


In the C++ world there is no such problem. You can usually compile most projects using compilers found in stable distros. Maybe not all projects with LTS distros, but at least the ones which are released once or twice per year. In debian, the rust compiler is updated more often than other programs and it's still not enough.


C++ is 35 years old or something, changes extremely slowly, has a mostly static programmer community, and has a wide range of compilers in current use. When new versions of the C++ spec show up it can take months for compilers to all catch up, but few C++ developers are in a rush to adopt new features.


Every modern language works like this. rbenv, pyenv, nvm, gvm all exist for a reason. People don't want to wait for package managers. I go out of my way to avoid dependencies on them, because all they do is break things. Hard drive space is cheap, things should be self contained.


I see rust as following the evergreen model - along with nodejs, chrome, Firefox and others. In this software model the authors make strong stability and backwards compatibility guarantees.

For example, as I understand it the rust compiler team literally compiles every public crate with each version of the rust compiler to make sure they haven’t broken anything. The platform generally expects users to mostly keep up to date - as a web developer I can now use new features in chrome and FF within a month or so of the feature being released. Gone are the days of leaving behind large swathes of users because they’re still using FF 4.0 or IE 6. Nodejs has its own support table. I don’t care what version of nodejs Debian LTS has; when version 8 of nodejs stopped being officially supported and maintained, I immediately dropped support for it in my npm packages.

The “minimum supported rust version” flag in cargo will help with this going forward. But more generally I think this model of software development makes sense and we’ll see it more and more going forward. There was a kernel bug a few months ago caused by a bug in an old version of gcc. The gcc bug was fixed years ago but the patch was never backported to ancient gcc versions. Some distros were compiling Linux using ancient gcc versions, and then got upset when they ran into problems. The answer here is for gcc to say the same thing nodejs does - “these versions are supported and get bug fixes. These versions do not. Use old versions at your peril - they have known bugs.” Explicit support periods is better than vague, ambiguous expectations that software will work forever. Microsoft has the same policy with windows patches.

I think the era where it makes sense for apt to carry that burden by serving us ancient software packages is slowly coming to an end. As software packages take responsibility for evergreen stability, it also makes sense for users to get evergreen software. It would be nice if we could do that without needing app specific version managers like rustup and nvm.


rustc is already exempt from Debian's policies to not update most software after a stable release. It's still not enough because Debian lags behind 2-3 releases. It has a high standard for supporting various niche targets and there are regressions every now and then, and often there is simply no volunteer time to package all the libraries.


Debian has a policy for a packaging Rust crates:

https://wiki.debian.org/Teams/RustPackaging/Policy

If you restrict yourself to packages in the repo, won’t that resolve the issue? The idea of a distro is that all the packages in the distribution should play nicely together, but if you go outside that then version compatibility won’t be handled for you.


Debian's repackaging is not useful for Rust developers. It's a tiny tiny fraction of Rust packages (Rust has over 60,000 now), and it's not integrated with Rust tooling. It's not useful for anything other than the few binaries written in Rust that Debian ships.

Debian's policies are simply incompatible with the Chrome-like "evergreen" release model that Rust uses. I feel sorry for anyone who tries to use Debian's Rust. It is an awful experience.


Strange how a language according to you is unsuitable for general use. Yet it is used by Google, Facebook, Apple, Amazon, Dropbox, Microsoft, Mozilla, Huawei, etc.

You live by some principles. Most people don't care about the principles you seem to care very much about. Most just have problems that need solving and Rust is the best tool for solving them. If the tool works differently to how they are used to working, they adapt and get on with their lives.


This comparison doesn't make sense. Not everyone has Google or Facebook money and engineering.

Just because they can doesn't mean you can. They solve Google problems.

And they use Rust to solve specialized problems because at their scale it is worth squeezing every advantage.

For mortals like you and me most problems should be faced with a completely different mindset.


What does money have to do with using rustup?


Where did you mention rustup? Perhaps you replied to the wrong message?

I'm addressing your statement that says a language is of general use if used by the likes of Google and Facebook.

They have infinite amounts of engineering and very special problems to solve. What makes sense for them often does not for smaller teams solving trivial problems.

> Strange how a language according to you is unsuitable for general use. Yet it is used by Google, Facebook, Apple, Amazon, Dropbox, Microsoft, Mozilla, Huawei, etc.


I understand your point and kind of agree but you should also keep in mind that Rust is a relatively young language. I would treat these as growing pains.


It's not a problem with the language directly, but with its culture. People being excited to use new features is a nice thing for them, but the trade-off is that the consumers of their work may suffer.


It is actually a problem with the language as well, because it is still missing a lot of important things, so releases actually change your productivity, and you use the shiny new features because they solve real pains. And this is far from over, because there's still a lot of pain in Rust development and a lot of work underway to fix it. The same could have been said about JS/TypeScript, where so many things were annoying that people rushed to use the new features. This is getting a bit calmer now because the new annual features are less life-changing.

Eventually maybe in 5 years Rust will have it's major pains solved and things slow down a bit. Maybe then the release schedule will actually change as well? Who knows


What are the features you expect Rust to gain between now and 5 years from now?

As for major language features, I remember async/await from a year ago and const generics recently, and that's it. If you can remember more features that fundamentally changed how we write code, please let us know.

I don’t think many features are expected in the near future apart from maybe generic associated types. But not any time soon either.

I think you might be seeing releases every six weeks and assume that each of these bring massive changes. They don’t. Each is a small iterative improvement. Take a look at the Rust release notes if you like.


Generators are one.

Custom allocators and anything else required to build bare-metal software.

Compiling Rust itself, especially core/std. I hit that wall when trying to cross-compile with the stable Rust that came with my distro.
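In case it helps, here is a minimal sketch of the generators feature as it currently exists on nightly (the syntax and the Generator trait are unstable and may change before stabilization):

    #![feature(generators, generator_trait)]
    use std::ops::{Generator, GeneratorState};
    use std::pin::Pin;

    fn main() {
        // A generator closure: each `yield` suspends and hands a value back.
        let mut gen = || {
            yield 1;
            yield 2;
            "done"
        };
        // Resume it step by step; each call runs until the next `yield` or the end.
        assert!(matches!(Pin::new(&mut gen).resume(()), GeneratorState::Yielded(1)));
        assert!(matches!(Pin::new(&mut gen).resume(()), GeneratorState::Yielded(2)));
        assert!(matches!(Pin::new(&mut gen).resume(()), GeneratorState::Complete("done")));
    }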


Non-Lexical Lifetimes were a pretty big quality-of-life improvement. The question mark operator was also quite nice.
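For anyone who hasn't used these, a rough illustration of both on stable Rust (the file path is just an example):

    use std::fs;
    use std::io;

    // NLL: the immutable borrow of `v` ends at its last use, so the later
    // mutation compiles. The pre-NLL borrow checker rejected code like this.
    fn nll_example() {
        let mut v = vec![1, 2, 3];
        let first = &v[0];
        println!("first = {}", first);
        v.push(4);
    }

    // `?`: propagate the error to the caller instead of matching on every Result.
    fn read_config(path: &str) -> Result<String, io::Error> {
        let text = fs::read_to_string(path)?;
        Ok(text)
    }

    fn main() {
        nll_example();
        println!("{:?}", read_config("config.txt").map(|s| s.len()));
    }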


NLL landed in 1.31, in 2018, and ? landed in 1.13, in 2016. Two and a half and five years ago :)


It's becoming less sane, with each passing day, to use the system package manager to handle toolchains and libraries.

The recommended way to manage your Rust installation is rustup. It makes it trivial to set up and manage toolchains, globally or otherwise.

> what other compiler is only usable if you install from some website instead of your repos?

What compiler's recommended installation method is a third-party package manager that lags months behind the latest stable release?

Rustup is a useful tool; it's dumb to ignore it.
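To illustrate the "globally or otherwise" part, these are standard rustup commands (the pinned version number is just an example):

    # install or update the latest stable toolchain
    rustup toolchain install stable
    # pin a specific toolchain for the current project directory only
    rustup override set 1.48.0
    # or commit a rust-toolchain.toml file so everyone on the project gets the same pin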


> It's becoming less sane, with each passing day, to use the system package manager to handle toolchains and libraries.

And this is a terrible development that we should all try to resist, as it leads to a combinatorial explosion of dependencies and problems as soon as you introduce another language. So then we end up using Docker to encapsulate things because they are hard to manage on one box.

It just seems crazy, but I don't really have much of a solution, tbh. Perhaps we could just install the build tool (in this case rustup) for supported languages and stop trying.


I don't think it's terrible. Modern package management just works on a different assumption - the dependencies and their artifacts are required at build time, and for libraries they cannot assume the version required by the software exists outside the build environment or in deployment.

If all of software respected those assumptions then we wouldn't need containerization nearly to the degree we have it today.


It's all fine until you start needing multiple language package managers, each with their own C/C++ dependencies, and then it gets a lot trickier.

I mean, I know how to handle it, but it's pretty depressing that I need to do it, when in the past one could rely on relatively recent versions of everything existing with a system package manager.

I always need C/C++/Java dependencies in DS, so maybe I just feel the pain points more.


That's more an indictment of the sorry state of C/C++ dependencies and linkage paradigms (shared libs are an antipattern like 90% of the time, imho). If C/C++ had a standard package manager then it would be much more trivial to mix and match with other languages.

Modern package management in a nutshell is "don't do what we do in C because we learned from those mistakes."


Sure, but I'm a data scientist. I need C/C++ dependencies all the time. I don't have the luxury of ignoring this problem. I really wish I did though :(


Plenty of C++ code can't use modern C++ features because of Linux distributions holding everyone back. I think this is a bad thing and don't want these concerns to seep into Rust as well (outside of "fundamental" libraries.)

Safety-critical systems may need these guarantees, but I object to them externalizing the costs they need to bear on the whole community.

Almost everyone should be on up-to-date compilers, assuming the compilers promise that they won't break old code. The more we move towards that world the better.


Yeah, it is a problem that the distribution-packaged Rust compiler is quickly out of date compared to what most of the ecosystem expects.

However, there is hope for a better future: https://github.com/rust-lang/rust/issues/65262 will allow crates to specify a minimum supported Rust version (MSRV). This way, we can imagine cargo's resolver taking the installed Rust version into account when picking the version of a crate to download, so it would pick the latest version of the crate that still compiles on your system.

Another thing is the Sealed Rust / Ferrocene initiative [https://ferrous-systems.com/ferrocene/]. The idea is to have a version of rust supported for a long time, and it is likely that many core crates from the ecosystem will decide to keep compatibility with that LTS version.
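For concreteness, the rough shape this is expected to take is a field in Cargo.toml; the exact semantics were still being worked out in that issue at the time, and the crate name and versions below are made up:

    [package]
    name = "my-crate"          # hypothetical crate
    version = "0.1.0"
    edition = "2018"
    rust-version = "1.48"      # minimum supported Rust version the resolver could honour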


Replying to myself to mention another thing: https://github.com/rust-lang/rust/issues/64796 is adding support for a #[cfg(version(x.y))] attribute that will allow crates to more easily provide additional features when built with a newer compiler, while still being compatible with older Rust versions.
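A sketch of what that could look like (nightly-only today, behind a feature gate; the version number is arbitrary):

    #![feature(cfg_version)]

    // Compile the richer implementation only on a new enough compiler,
    // and fall back to something that builds on older toolchains otherwise.
    #[cfg(version("1.53"))]
    fn platform_impl() {
        println!("compiled with rustc 1.53 or newer");
    }

    #[cfg(not(version("1.53")))]
    fn platform_impl() {
        println!("compiled with an older rustc");
    }

    fn main() {
        platform_impl();
    }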


I see this complaint a lot. The answer is not for people to stop using modern features of tools with a frankly very reasonable release cadence, but for distributions to adapt to a world where things are no longer distributed on CDs.


I maintain a project which has _a bunch_ of dependencies and tests in CI that it can compile on 1.41. That's a lot of dependencies: https://crates.io/crates/insta/1.7.1/dependencies


I am not a Rust user, but many languages use their own package managers. A package that goes first through the language's tooling and then through the distro has two layers of review to pass, so it will always be slow. I don't hear people complaining that Ruby, Python, JavaScript, Go, R, Perl, ... use their own package managers; Rust is no different here. In the future it may stabilize, and then there will be fewer updates to manage.


> And don't go saying I should use rust-up. That's a symptom of the very problem I'm describing. What other compiler is only usable if you install from some website instead of your repos? I can't think of any.

Let's see... I had this problem with Python all the time; the system Python version was notoriously old and out of date on macOS, Red Hat, Debian, etc. This is how pip and pyenv and a dozen other tools were born.

Similar things with Node. Or when the distro GCC has a bug that your code hits. Go has this problem too - e.g. the pain around moving from nothing -> dep -> modules was real (if not widely publicized).

The language-provided version/package manager solves a very real problem: a mismatch between what a system package should be and what a development package should be. In some sense, containers, nix/guix-style package management, and language-specific dependency managers are different approaches to the same problem. What my computer runs for daily use should be relatively stable, and generally I won't need the latest and greatest package features in software I merely use - upstream developers have generally dealt with the various library deficiencies and bugs in their dependencies. In software I develop, though, I regularly hit API bugs and would sure like to just use version+1 instead of spending time working around them. Particularly when the output of my development is a binary running in a container on some server.


The problem is that you chose a Linux distro with an explicit stability guarantee and a slower release cadence than Rust.

If you had chosen a rolling distro like Arch, you wouldn't have to use Rustup.


TypeScript and Node?


This might be a case of a dependency that doesn't respect SemVer? Did you report the incident?

Edit: Just re-read your comment. Maybe ping the Debian folks to update their repos?


I've just taken a snapshot of a non-trivial project of mine, from exactly 5 months ago: builds fine. Last snapshot with significant changes: 4 months ago, builds fine as well.


I think your comment is good, but also, your comment and the parent comment highlight a situation that makes this discussion hard: you're both talking about two different things.

You're talking about using a current compiler to build old code, aka, backwards compatibility. We put a lot of time and effort into this, because it's important, and we're not perfect, but we're pretty darn good at it, it seems.

The parent is talking about using an older compiler to build newer code, aka, forwards compatibility. Basically no programming language I'm aware of provides this today, but other languages release on a much slower schedule, which means that in practice it is likely to happen less often. Because Rust ships so often, and because it's so easy to upgrade Rust, the community tends to adopt newer features faster than in other language ecosystems, meaning that forward compatibility becomes more of an issue if you do not want to update the compiler.


To build on this, there are parts of the ecosystem that explicitly outline their Minimum Supported Rust Version. Sometimes it's short (e.g. last two stable releases), and sometimes it's longer (6+ most recent stable releases).

What's not well expressed is that certain releases prompt a bunch of maintainers to bump their libraries to require the latest stable release. And these releases are not easily visible or predictable from outside the community. This is driven by Rust releases that stabilize significant features; I'm talking about stuff like async, min const generics, impl trait, proc macro support. So support for older releases tends to wax and wane depending on how recently a big feature was stabilized.

There is one other contributing factor - cargo tries to pick the most recent version of a library (that's SemVer-compliant), not the earliest one, and MSRV changes aren't at this point considered a breaking SemVer change (as far as I know).
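One practical consequence (a common workaround, not something the parent proposes): if a dependency bumps its MSRV in a routine release and your compiler is too old, you can roll that one dependency back yourself. The package name and versions below are made up:

    # pin a single dependency to a release that still builds on your rustc
    cargo update -p some-lib --precise 1.2.3
    # or require it exactly in Cargo.toml:  some-lib = "=1.2.3"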


Yes! As a non-Rust example, the Go 1.15 compiler can't compile some Go 1.16 projects because they use the new embed library.

The idea in the root comment that it's somehow outrageous that a compiler for version N of a language can't compile version N+M is ... strange.

PS I love your book :)


I am still pretty new to Rust (about a year of usage), but I think there are ways to make this "forward compatibility" problem less significant in real-world usage.

Node.js went through a similar rapid development of the runtime as Rust is going through with the compiler. Node's package.json includes an "engines" field to indicate which runtime versions are supported, so npm could immediately tell you if you were trying to use a library that targets a newer Node version than you have installed, and could automatically fall back to an older version of the library that still supported your Node version, if one existed.

That eliminates much of the pain of using libraries by others that are sticking to the latest release of the language.

But it also encouraged library authors to try to push that supported version number down as far as they could, because that would juice adoption of their library in the wild, further making forward compatibility less of an issue. (This isn't exactly apples to apples, since most of the advances in JavaScript over the past decade have been syntactic sugar, not powerful new features, and there is a hard split on async/await versus callbacks, as that was more of a feature than just sugar.)

In Rust, if libraries could broadcast their minimum Rust version in their crates, treated changing that as a breaking change (major version bump), and were open to backporting features onto prior majors (for the bigger, important libraries), there could be a rolling drop-off of older versions of Rust instead of hard, unpredictable drop-offs.


There's an RFC to add the minimum rust version supported. It was merged! The tracking issue is here: https://github.com/rust-lang/rust/issues/65262


I appreciate this post. I have one or two points to make about what you've said here (none of which is wrong)

> because that would juice adoption of their library in the wild

That requires a lot of people to be using older versions, that is, it assumes there's an audience of people who are interested in using older Rusts. But we don't have that in Rust, even if it was that way with Node. Look at this chart from our survey last year: https://blog.rust-lang.org/images/2020-12-rust-survey-2020/r...

There isn't a huge audience to get adoption from if you support a wider array of versions; the vast majority of users seem to be using the latest stable.

> and treated changing that as a breaking change (major version bump),

This is controversial, because in some sense you're not wrong, but this also creates a lot of churn, and churn that doesn't really have to happen. In a model where most folks target the latest stable, you're creating tons of work for most of your users, at the expense of making things slightly better for a very small number of users.


> That requires a lot of people to be using older versions, that is, it assumes there's an audience of people who are interested in using older Rusts. But we don't have that in Rust, even if it was that way with Node. Look at this chart from our survey last year: https://blog.rust-lang.org/images/2020-12-rust-survey-2020/r...

So I would cite the classic statistics example of adding armor to the parts of returning aircraft that weren't shot up: https://worldwarwings.com/the-statistics-that-kept-countless...

That survey (that I wasn't aware of despite using Rust) likely self-selects the most enthusiastic users of Rust. There could certainly be a large number of other users, or potential users, that would answer things differently.

> This is controversial, because in some sense you're not wrong, but this also creates a lot of churn, and churn that doesn't really have to happen. In a model where most folks target the latest stable, you're creating tons of work for most of your users, at the expense of making things slightly better for a very small number of users.

I don't agree that it's that much extra work? As a user, running `cargo update` should be enough, and the versions would increase. If cargo refused to update to a version beyond what the currently installed `rustc` supports, then this should be completely safe on their end.

Then the library authors can either do what they currently do and not backport features/patches for earlier Rust versions, or they can backport if they want to support those users (so no extra work is required). But now it's very clear to them (based on download stats on crates.io) how much demand there is for prior Rust compilers, and they can decide whether they want to provide stronger backwards compatibility for their library, and users can tell from the library's SemVer releases whether it supports older versions of `rustc` or not.

This is still, personally, one of my more minor complaints about the Rust ecosystem. It's still handled far better than in most other major languages, and the current state of Cargo/crates.io was a major factor in our picking Rust over C++ for the project we're using it for. So this is a critique because I do actually care about this and see how it could help both adoption (less worry about "what's the right version of Rust to use?" for newbies) and long-term maintenance (if you can't come back to a project a few months later and install security fixes because library changes require you to upgrade your compiler, and a cascading set of dependencies-of-dependencies makes that riskier to do, then application-level developers may not want to jump on such a "treadmill").


I am not saying that there is no possible sample bias from the survey, but it is the best data that we have. I'm not aware of it being an issue for our users in general. Some folks do say things about this from time to time, but it seems like largely hobbyists and open source folks with strong opinions, rather than large corporate users or even most open source users.

> As a user running `cargo update` should be enough and the versions increase.

They would not increase, because that would be a major version change. To do that, you'd have to go into your Cargo.toml and change the version number there. This would also be true of all of your dependencies. Major versions can still work, since Rust can have two of many of them in tree (but notably not ones that wrap C libraries, for example), but then you have two copies until the long tail of your dependencies updates, and maybe by the time that happens, it's bumped again in your code.

> So this is a critique because I do actually care about this

I totally believe this! It's also possible that I am not right, just my take on what you're saying, that's all :)


I know only the slightest fundamentals of compilers in general and Rust’s compiler in particular, but would it be technically possible to ship library dependencies as compiled MIR code to older compilers to maintain forward compatibility?


MIR is not stable, so no. In some world where it was, and stability meant "unchanging" rather than "adds new features", then yes, in theory, it might be possible. Note that this doesn't even actually solve the parent's problem; they want to use the old compiler to compile the new code, not compile the new code with a new compiler and use that binary as a dependency for their code that's built with the old compiler.


Ah, I see. I don’t know where I got the idea that they were trying to use newer dependencies instead of just compiling newer code themselves. Thanks.


I mean, maybe they are. I guess that if you add "we somehow allow binary packages to be distributed via crates.io", that would remove the need for them to compile those dependencies themselves. Right now, you do both of those things locally, so I was kind of assuming that was still the case.


Backward compatibility doesn't mean that you write code with the latest features, then go back to a version that's a year old and expect it to work. No language or software does that.

The promise is that if you have code you didn't touch for a year, you can update Rust, and it will still compile.

You got it backwards.


Don't blaspheme against the FOTM gods. Today is the year of the Rust. Wait until next year when $language is the next shining beacon of not-having-to-deal-with-legacy and Rust itself goes to the dustheap where Cobol and Fortran and Python 3.6 now live.


Rust was doomed to fail from the start.


Why?



