The thing that's really missing for me in Rust is a better way to modularize enums with associated data and their behaviour.
There are many cases where you have a closed set of known variants for a type, but they have very different behaviour, e.g. nodes in any kind of adaptive tree data structure, such as an ART, HOT, or Judy array.
In those cases you don't want to bring out the big guns with dyn traits. But the current approach, involving huge match forests, macros, helper functions, and code duplication, plainly sucks ^^'.
This could be solved by making enum variants proper types. Something that has already been proposed, but was rejected for lack of capacity: https://github.com/rust-lang/lang-team/issues/122
This does need to be solved, but I came up with a pattern in a recent project I'm pretty happy with:
- Create all the variants as normal structs
- Create enums around different subsets of them that just wrap each struct in a variant of the same name (including From/TryFrom impls)
- Have methods for "casting" one of these enums into another
The second and third points can be made fairly non-painful with a macro where you just say "enum X can be structs Y, Z, A, B..." and it creates the variants and impls all the traits for you
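For illustration, a minimal sketch of the pattern (all names hypothetical):

  // Variants as plain structs
  struct Leaf { value: u32 }
  struct Pair { left: u32, right: u32 }

  // An enum wrapping a subset of them, one variant per struct
  enum Node {
      Leaf(Leaf),
      Pair(Pair),
  }

  impl From<Leaf> for Node {
      fn from(v: Leaf) -> Self { Node::Leaf(v) }
  }

  impl TryFrom<Node> for Leaf {
      type Error = Node;
      fn try_from(n: Node) -> Result<Self, Node> {
          match n {
              Node::Leaf(v) => Ok(v),
              other => Err(other),
          }
      }
  }

The "casting" methods between subset enums have the same shape: match on the source enum and wrap each struct in the target enum's variant, using TryFrom for the fallible direction.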
But yeah. This is one of the bigger pain points when implementing certain stuff in Rust
Edit: just realized you were talking about something slightly different from me, but I think you could extend this pattern to get it:
Have a trait that all variants implement, implement it on the enum too, and have the enum delegate to the variant structs (since those are already separate). The wiring should be automatable with the same macro
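Continuing the hypothetical sketch above, the delegation is mechanical:

  trait Count {
      fn count(&self) -> usize;
  }

  impl Count for Leaf {
      fn count(&self) -> usize { 1 }
  }

  impl Count for Pair {
      fn count(&self) -> usize { 2 }
  }

  // The enum impl just forwards to whichever struct it wraps
  impl Count for Node {
      fn count(&self) -> usize {
          match self {
              Node::Leaf(v) => v.count(),
              Node::Pair(v) => v.count(),
          }
      }
  }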
The issue with the enum of structs approach is that it doesn't really work for low-level stuff with lots of byte-packing.
Compare the following:

  #[repr(u8)]
  enum Dense {
      A {
          foo: u8,
          bar: u16,
      },
  }

  struct AStruct {
      foo: u8,
      bar: u16,
  }

  #[repr(u8)]
  enum Padded {
      A {
          inner: AStruct,
      },
  }
The Dense enum will be 4 bytes, because Rust will use the u8 tag as an implicit first field of a tagged union of all the variants' fields.
However, Padded will be 6 bytes: AStruct is padded to 4 bytes, since the u16 gives it 2-byte alignment.
That 2-byte alignment then forces another padding byte when it is combined with the u8 tag.
You can't mark AStruct as #[repr(packed)] either, because then the compiler can't prove that the struct will never be moved out of its alignment-providing variant (and unaligned field access is a no-no in Rust).
The only other thing I could think of would be to transmute between the structs representing the variants, but I think that would probably cause UB, because it destroys an ungodly amount of provenance information.
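For what it's worth, the claimed sizes are easy to check (on a typical target where u16 has 2-byte alignment):

  use std::mem::size_of;

  fn main() {
      assert_eq!(size_of::<Dense>(), 4);  // tag, foo, bar: no padding needed
      assert_eq!(size_of::<Padded>(), 6); // tag, 1 padding byte, then the 4-byte AStruct
  }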
Yeah, makes sense. I'm fortunate that my use-case doesn't have to worry about that
In general, coming from TypeScript I'd like to see a lot more "duck" types in Rust that describe shapes the compiler already knows how to deal with, just placing additional constraints on what the user can do with them. Individual enum variants, but also subsets of a given enum (where the representation can still just be the same as the full enum)
Along similar lines, I'd really really like "anonymous structs" (basically tuples with named members). That seems especially low-hanging since tuples already exist; it's almost just sugar
I think there's a lot you could do here that would fit really neatly into Rust's existing type system, but like in the RFC above I assume it's just a matter of devs not having time to implement it all
The problem then is what should happen if you define two "anonymous structs" that share some members (name and type). Most users want these to be seamlessly extensible, which amounts to full row types as seen in OCaml.
It would be nice if subset-structs worked like they do in TypeScript, but I would still be totally happy if they had to be an exact match. Subset matching would probably be the 80% complexity for the 20% benefit
Now... a neat compromise could be something like this (hypothetical syntax, not valid Rust today):

  fn foo<T: { bar: i32 }>(my_struct: T) {
  }

  foo({ bar: 12, stuff: "abc" })
That would fit a lot more smoothly into Rust's type system than the automatic subset-matching would, but even then, I'd be content without it. The most basic version of anonymous structs is hardly more than sugar over tuples, it just seems like such an easy win
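To make the "sugar over tuples" point concrete, here is the status quo the feature would replace (the commented-out line is the hypothetical syntax):

  fn main() {
      // Today you either lose the field names...
      let p = (3i32, 4i32);
      let _x = p.0; // which element was x again?

      // ...or declare a throwaway named struct:
      struct Point { x: i32, y: i32 }
      let p = Point { x: 3, y: 4 };
      let _x = p.x;

      // let p = { x: 3i32, y: 4i32 }; // hypothetical anonymous struct
  }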
I kind of wish Rust would eventually focus on a rich standard library. I'm talking everything from a basic but usable HTTP server, which Python and even Go have available from the get-go (I attribute a good part of Go's success to net/http being built-in and working out of the box). I also think Rust should seriously consider building a cross-platform UI stack. It doesn't have to be insanely exhaustive, maybe it can just aim for feature parity with something like Tk, but I think it would lead to far more adoption.
Who doesn't prefer to just build something in a language where you know first-time setup is pretty straightforward, as opposed to wasting hours setting up some convoluted web project by hand?
I think Rust needs to invest in providing a rich standard library, even if it is just its own fully optional library; as long as the core team works on and maintains it, I think it could do plenty of net good for Rust as a whole.
I'm also thinking of one of my favorite things about languages like Racket and Python: a really simple editor that is built in. Again, Rust needs to be able to build UIs to achieve this.
I can speak to the UI question, as I'm very invested in it.
It is way too early to standardize any such thing now. There are, give or take, 5 promising UI toolkits, and a dozen others that are best thought of as experimental and may contain interesting ideas. There's a very wide diversity in approaches, and much of that diversity is not accidental but can be traced to different requirements. If you need to deploy an app both on the web and natively, then that pushes you toward adopting at least some web technologies. If you're working on games or game tooling, then the tradeoffs of immediate mode GUI are much more appealing. Another major axis is how much to rely on platform capabilities for stuff like text layout vs rolling your own.
We need to do a lot more exploration of the design space. I believe there is potential to build really compelling UI in Rust. It's entirely possible we end up with fairly different approaches for different use cases, hopefully converging on common infrastructure for things like accessibility.
But if we standardize anything we have now, it risks "the standard library is where modules go to die."
Rust is particularly strong because the crate system lets the team bring things into the stdlib slowly.
There are many UI libraries for Rust; I’ve tried most of them, and all of them are radically different in their approaches. None of them are anywhere near as good as most UI frameworks in other languages (in my opinion).
Bringing a UI framework into the stdlib at this point in time would be terrible because the community still hasn’t figured out the best way to write a UI framework and it would stifle innovation in this developing section of the language.
I don't understand this website and why it always gets linked. It's just an unsorted list of a bunch of crates, most of them unmaintained or early prototypes.
There isn't a good comparison of all these crates
> Who doesn't prefer to just build something in a language where you know first time setup is pretty straight forward
Is first time setup of Rust projects a regular issue for you? That's not my personal experience, but I am of course biased. 99% of projects are "git clone && cd && cargo build," done. 0.95% are "oh, also install some C library via the package manager," which is a one-time thing per library.
In practice, some folks have tried to get a "richer standard library" going, and they've never seen community adoption, and so end up abandoned. It's been quite a while since this sort of thing was proposed by the project, but back when it was, the community also pretty resoundingly rejected the idea.
Having come from Ruby (where you were also a visible member of the community) I really do prefer Rust's model where the standard library is kept a bit thinner and additional functionality that would have been part of the Ruby (or Golang) standard library is kept separate.
The benefits are I'm sure not new to you. Updates and releases can be completely independent of Rust itself, and by not being in the standard library maintainers are able to deprecate and remove misfeatures when needed as opposed to having to support something that holds back a library for forever.
The one downside though is that there is the unfortunate reality that—by leaving important parts of the ecosystem to random crate authors—important and heavily-used crates which would be part of other langs' stdlib can become unmaintained over time. This hasn't been too large a problem so far, but I do wonder if more could be done to promote community adoption of crates that are perceived to be more core to the ecosystem. The nursery is probably a good approximation to what I'm looking for, but without necessarily coming along with the implication that it should ever end up in `std`.
As someone who was merging PRs into a Ruby gem for the first time in years this past week... this also happens in Ruby, ha! I've been that failed random maintainer over there too.
And I suspect standard libraries get far less 'maintenance' than many people think they do, especially compared to a popular external library.
You're not wrong, but the signaling around `rust-lang-community/http-server-async` having one patch release in two years vs `r0ckst4ar-life/tokio-hat-tip` doing the same—even if both repos host the exact same code—is (in my eyes at least) completely different.
Both could publish the exact same HTTP server crate, but with one of those two I can a) have some confidence that it's probably a reasonable default choice, and b) assume that it's much more likely to be stable and mature than simply abandoned.
I don't mean established codebases. I mean: if I ask you to build a web server in Rust, how quickly do you have it up? What are those steps? I can have a Django or ASP.NET project up insanely quicker, for example. Heck, I can have a Go net/http project up quicker than all of the above just because of muscle memory, since that package has not changed a whole lot in years, which adds to the value of Rust having an extension to their stdlib that is maintained by them.
〉time
The current time is: 12:38:36.95
〉cargo new foo
〉cd foo
〉cargo add axum
〉code . (and then copy-paste the axum docs hello world from docs.rs/axum into src/main.rs)
〉cargo build
〉time
The current time is: 12:43:37.44
Most of that was waiting on my crates.io index to update (which isn't something that happens every time you start a project), plus the fact that, hilariously, the axum hello world also requires a `cargo add tokio --features macros,rt-multi-thread`, which they do not document and I had to figure out myself. Most packages don't have that issue, and I'm filing a bug right now.
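For reference, the hello world in question is roughly this (axum 0.7-era API; the details shift between versions):

  // Cargo.toml needs axum, plus tokio with the "macros" and
  // "rt-multi-thread" features enabled.
  use axum::{routing::get, Router};

  #[tokio::main]
  async fn main() {
      let app = Router::new().route("/", get(|| async { "Hello, World!" }));
      let listener = tokio::net::TcpListener::bind("0.0.0.0:3000").await.unwrap();
      axum::serve(listener, app).await.unwrap();
  }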
> I can have a Go net/http project up quicker than all the above just cause of muscle memory
I suspect that this has a larger impact on your experience than you're giving it credit for. You shouldn't discount familiarity, it's a huge bonus! And a good reason for you to do this stuff in Go and not Rust. But it doesn't mean it's universal.
Am I going to argue that Rails is not wildly productive? Absolutely not. But at least for me, the package existing in the stdlib isn't the differentiating factor here. Rails and Django aren't in the Ruby and Python standard libraries either, you know?
Never used axum, what does it bring out of the box? Interesting, maybe I've been out of touch with Rust for too long. I don't get paid to work with it, so it is easy to fall behind!
I'm really into things like Django and ASP.NET these days, and last I looked (months ago?) Rust doesn't offer anything fully batteries-included yet (relative to those two frameworks); heck, even Phoenix is really good in this regard.
On that note, how does a beginner do what you just did in under a minute? Versus someone going through the Go walkthrough, seeing you can make a website using net/http because it is built in, and just doing it? This is the nuance I'm talking about: built-in, or at least core-maintainer-supported, libraries are usually easier to adopt because you know there is a lot of investment in them.
The first chapter of the Rust book covers how to integrate dependencies in your workflow. The only thing not covered there is choosing which web framework to use. To do that, you either Google it, ask around, or just pick literally any of them and give it a try.
You’ll notice the sibling comment did the same thing but with a different web framework they prefer. That’s great too. But other than the choice, it’s entirely agnostic. Another sibling even posted a step by step tutorial, if that level of specificity is helpful.
You're right that adding dependencies is easy, but you're significantly downplaying the discoverability of quality crates for specific use cases.
When people say "we need a rich standard lib" they are more interested in a "blessed set of crates" than a "it should be pre-downloaded with the compiler".
I mean, I think my answer is the best anyone can do. It's not as if there are never packages competing with the standard library, and finding one you like and then always using it is a one-time cost. There are a ton of ways to find the answer to this question easily. It is of course true that you have to look.
Two different ones were even mentioned in this thread, and a new user would be easily served by either.
I guess the story for doing web in Rust is different today, it's a little easier. But as someone new to the language, how do I know what Actix is, or who maintains it? Having watched that scene a while, there was a weird scandal with Actix a few years back, the lead developer just quit or something? And on that note, how do I know which web framework is best for a beginner in Rust? If I'm coming from Django, what framework gets me the same experience? I've gone through the Actix docs before, and they were completely out of date compared with what cargo installed for me; it was confusing and off-putting, and I didn't bother with Rust web dev for years after that.
That whole setup takes ~a minute. Not sure std would have saved me any time; I would have just searched the stdlib docs instead of crates.io for actix. And I guess I wouldn't need to `cargo add actix-web`, lol, that was not the bottleneck.
I'll suggest that the Rust core team, who are very experienced and capable at compilers, PL design, and data structures, might not know much about implementing the HTTP protocol, parsing XML, or especially UI framework design (which has already driven many engineers to madness). They should focus on the language and let the community build tools for the application layer.
Fun fact, in the pre-1.0 days Rust did try to provide a batteries-included stdlib, and came to the exact same conclusions as you have. :P In the run-up to 1.0 an enormous number of modules were jettisoned from std and into the then-new crates.io. I'm not opposed to having a larger Rust stdlib, but it is better to at least let subject domain experts hack on the problem for a while until, perhaps, the community standardizes on a de facto API.
The reason Rust has a lean std is because it must be backwards compatible, and already there are some deprecated things in std that must nevertheless be supported forever. With community crates, they can have breaking changes and can generally innovate more than those in std.
An alternative I'm not against is having the concept of "crate distros", where a bunch of related crates can be pulled in in one go with a tested set of versions. Something that delivers an experience like Django or Rails. At that point the distinction between "std" and these packages is one of nomenclature (plus the retained ability to evolve independently of the compiler release schedule and to remove things from the latter).
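As a purely hypothetical sketch, such a distro could be little more than a meta-crate that pins a tested set and re-exports it:

  # Cargo.toml of a hypothetical "web-distro" meta-crate
  [package]
  name = "web-distro"
  version = "2024.1.0"
  edition = "2021"

  [dependencies]
  tokio = { version = "1", features = ["full"] }
  axum = "0.7"
  tower = "0.4"
  tracing = "0.1"

Its lib.rs would just `pub use` the members, so applications depend on one version-tested bundle that can still evolve on its own schedule.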
It's not exactly what you are talking about, but tokio is kind of doing this - hyper, tonic, tower, tracing etc. and then all packaged together in axum.
That would be a great thing to have, but I can't imagine how that can be maintained. Managing versions of gdal+gdal-sys+geo+ndarray+ndarray-linalg has been a giant PITA recently so I for one would welcome this feature.
This seems like a self imposed limit rather than something fundamental.
I am not sure why people get so hung up on the stdlib's stability guarantees. We could have a second unstable "nonstandard" library for example, and eliminate this "issue" completely.
Even a crate "distro" like the other user said would be nice. Something to allow us to establish some level of trust over what we are using. Right now we have people mindlessly adding 100s of crates with no concern about the supply chain.
In the space Rust is in, people care about stability to the extreme. So some of it is self-imposed, sure, but that self-imposition reflects an understanding of the constraints the intended audience has.
Heck you can see it in this very thread with a complaint about too many new things even though they’re compatible!
I am not sure that something like a crate distro would actually decrease stability overall. At the end of the day people are still going to be using the same crates, and those crates will be as stable as before, but it will be easier to figure out which ones are trustworthy for a person who isn't super deep into the Rust ecosystem and whatnot!
I am thinking of nom, for example: lots of people are already using nom and are comfortable with its current level of stability. If it were part of a crate distro, people would just determine whether nom was stable enough for them and use it, or otherwise find something else.
I do appreciate that the real standard library and core language are stable though just to be clear. I also appreciate that the stdlib tries to stay somewhat lean rather than shoving the whole world in it.
I am just a random hobby rust user so it's very possible I am talking a bunch of nonsense!
Also thanks for replying, it's always cool to see you active in these threads :)
> I'm also thinking of my favorite things about languages like Racket and Python which is a really simple editor that is built-in.
I've heard this before but I've never really understood it.
Why on earth should language designers spend time and effort building a toy text editor when damn near everyone has VS Code, JetBrains, Sublime, some flavor of vim or emacs, etc. available to them? Why would anyone want to learn a new, completely bespoke editor just to play around with a new language when you can install a language extension for the editor you're already using in seconds?
If I want to build a UI and the language I'm using has a supported (yet limited, but that's okay) UI stack, I can feel confident that it will not go unmaintained like third-party UI packages. Also, a lot of the UI stacks wrap UI libraries written in C++ or C, which doesn't really let the benefits of Rust shine through.
I'm not much invested in Rust, but seeing how the C++ Committee wastes time discussing topics like a 2D graphics library or audio playback I shudder at the thought that Rust could become the same.
Also with Cargo, what's the difference between a 3rd party library and the stdlib? (none really)
Rust doesn't have the same goals as Python or Go. It remains first and foremost a systems language, where Go is first and foremost a web language.
Also: when the package ecosystem is as strong as Rust's, it's less important to supply first-party things just because they're useful, and more important to supply things that establish a common interface that multiple third-party packages can then agree on as a standard. See things like the Future trait
Are you telling me there isn't a single system on this planet without a UI? Because that sounds like a bad claim.
I would also appreciate it if the standard library had a standard for UIs, so you could build something UI-library-agnostic; that would be really interesting to see.
Until we have languages like Rust and Go make headway with GUI stacks, we will be forever cursed to live in a world full of Electron web apps.
The systems language C++ traditionally came with rich frameworks alongside its compilers.
Nowadays that only applies to Visual C++, game consoles, and embedded platforms, because everywhere else it has lost the leading role for full-stack development.
Great post and I really like the focus on sticking to the core mission: make things safe and do so without compromising on critical features like async.
I see the complaints about things like the stdlib and I don't get it: the core team is doing an excellent job of keeping Rust doing what it does best, and the crate system and community fill that gap. If you want a big stdlib, then use Python.
It is a delicate balance: having a smaller, high-quality standard library versus the "everything you could possibly need", kitchen-sink approach found in the Python standard library.
It is great that there is an active Rust community with tons of useful crates, but it can be daunting as a newcomer or when introducing Rust into an organization to discover and adopt the "right" crate(s) for common tasks that the standard library has no answer for.
An example common task is asynchronous programming. For my current project, am I better off using thread::spawn, async/await, async-std, tokio, smol-rs, or something else? I had tentatively selected async-std, but then I came across this thread: https://github.com/async-rs/async-std/issues/992#issuecommen...
If I select a given crate today for a given common task, will the crate still be around and maintained in three years? Will I need to rewrite my project to use the latest crate du jour?
I wonder if it would be possible for the Rust community to design / define a common library of traits for common tasks that could be used as "interfaces" (sorry, Java background) that library crates could be implemented against.
This would provide better stability for crate users and help to avoid analysis paralysis and uncertainty when selecting crates for common tasks that the standard library has no answers for.
There could even be multiple libraries of traits grouped by the type of task. This would allow for more natural groupings based on usage (instead of one huge library of traits) and permit independent changes from other defined libraries of traits.
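For a flavor of what such an interface could look like, a hypothetical sketch (the futures crate's task::Spawn trait is an existing attempt in this direction):

  use std::future::Future;
  use std::pin::Pin;

  // Library crates would code against the trait; applications pick
  // whichever runtime implements it.
  pub trait Executor {
      fn spawn(&self, fut: Pin<Box<dyn Future<Output = ()> + Send>>);
  }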
The Ada programming language has a similar idea called "Standard Library Annexes". See the bottom half of http://www.ada-auth.org/standards/22rm/html/RM-TOC.html where the sections start with alphabet letters instead of numbers. There are standard annexes for: "B. Interface to Other Languages", "C. Systems Programming", "D. Real-Time Systems", "E. Distributed Systems", "F. Information Systems", "G. Numerics", and "H. High-Integrity Systems". Ada toolchains do not have to implement all of these annexes to be a standards-compliant Ada toolchain, but it gives those that do a standard interface so that they are more interchangeable, and code written against them does not have to be rewritten.
I think the big problem is that any such thing you'd design an interface for might change in such a way that it doesn't make any sense to preserve this hypothetical interface over time. Rust's standard library promises to keep maintaining all the pieces long after they're obsolete, and the bigger it gets the harder that will be.
In 1983, if I design a Video interface, I guess we need PAL vs NTSC to handle colour standards, and also we need interlacing, that's a thing. Should we have a fixed list of resolutions? Nah, let's splash out: user-defined, that's future-proof.
Oops, here we are in 2023, and while nobody cares about interlacing, and arbitrary user-defined resolutions let me express a 4K display, there's nowhere to pick 120fps or to specify 12-bit colour.
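In Rust terms, the ossified 1983 interface might look like this (entirely hypothetical, just to make the point):

  enum ColourStandard { Pal, Ntsc }

  struct VideoMode {
      width: u32,           // user-defined: still lets us say 4K in 2023
      height: u32,
      interlaced: bool,     // nobody cares anymore, supported forever
      standard: ColourStandard,
      // no refresh-rate or bit-depth fields: 120fps and 12-bit
      // colour are simply inexpressible
  }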
>> I think the big problem is that any such thing you'd design an interface for might change in such a way that it doesn't make any sense to preserve this hypothetical interface over time. Rust's standard library promises to keep maintaining all the pieces long after they're obsolete, and the bigger it gets the harder that will be.
True. However, without more standardized components the Rust ecosystem could end up like the Common Lisp library problem: lots of libraries in various states of maturity and maintenance with no obvious choice of what to use to solve your problem.
Big standard libraries help to drive corporate and enterprise adoption. It is part of the reason why Java and Python are so widely used in corporate / enterprise environments. It brings a lot of standardized functionality and longer-term support that those kinds of environments want.
We rely on the Rust standard library because it will be maintained for a very long time and has mostly stable APIs. Without a bigger standard library or some other kind of standardization, we would build our own libraries that might re-invent the wheel, but at least we know that our libraries will be maintained.
That is what languages with rich standard libraries allow.
In Rust's case maybe what is required is a kind of curated set of modules, a bit like Java EE defined a set of basic functionality (in multiple JSR specifications) that every application server should provide, including Spring based ones.
It's not that delicate of a balance. Async is the only real problem here, where some very fundamental behavior (I/O) is redefined by packages in the ecosystem due to quirks in how async/await works in Rust without a runtime in std.
Frankly I don't see what "annexes" would buy the Rust ecosystem. You don't pay for what you don't use to begin with, and there's zero advantage to multiple implementations of the standard library. That's an antifeature in my opinion.
> If I select a given crate today for a given common task, will the crate still be around and maintained in three years? Will I need to rewrite my project to use the latest crate du jour?
The same is true in any ecosystem. How likely is your project to exist in three years, and how much would it cost to replace the dependency if it gets deprecated? It's just not that high of a risk to worry about.
>> Frankly I don't see what "annexes" would buy the Rust ecosystem. You don't pay for what you don't use to begin with, and there's zero advantage to multiple implementations of the standard library. That's an antifeature in my opinion.
I am not talking about multiple implementations of the standard library, but rather, optional groupings of traits for common tasks. For example, could the community design / define traits for an asynchronous programming runtime?
>> The same is true in any ecosystem. How likely is your project to exist in three years, and how much would it cost to replace the dependency if it gets deprecated? It's just not that high of a risk to worry about.
The company I work for supports products that have lifetimes measured in decades. I would love for us to use Rust, but the management will not approve Rust because of how unstable the ecosystem is. Updating products because the libraries we rely on are no longer maintained would cost time and money that could be spent in ways that would add value that our customers could appreciate.
I hope Rust can get higher-kinded types through higher-ranked trait bounds, but for types, not just lifetimes, which would be equivalent to HKTs in other languages. I don't believe generic associated types are fully equivalent to HKTs, but type-level HRTBs should fill that gap.
And I'm also looking forward to the keyword generics initiative; hopefully that team can look at how OCaml implemented its algebraic effects system, though I'm not sure it's totally comparable.
> I hope Rust can get higher-kinded types through higher-ranked trait bounds, but for types, not just lifetimes, which would be equivalent to HKTs in other languages. I don't believe generic associated types are fully equivalent to HKTs, but type-level HRTBs should fill that gap.
In service of what though? I love Haskell and HKTs, but Rust is a different language and I don't find myself actually reaching for HKTs, even when writing libraries.
Just my personal opinion, I think adding these features would be to Rust's detriment. I am happy to do my exploration into more complex type level features in another language in order to not add everything and the kitchen sink to Rust.
It's more that many features are almost there but not quite, and then do something differently. GATs are almost HKTs, but not quite, and they're Rust-specific rather than grounded in a common theory like the category theory that many languages build on. Keyword generics are also almost algebraic effects, but not quite. I want Rust features to go all the way rather than doing 80% of something, in a different way than other languages commonly do.
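For context, the GAT encoding of an HKT-style abstraction looks roughly like this (a sketch, and where the "almost but not quite" lives):

  // A Functor-like trait: `Wrapped` is a generic associated type
  // standing in for a type constructor such as Option or Vec.
  trait Functor {
      type Wrapped<T>;
      fn fmap<A, B>(x: Self::Wrapped<A>, f: impl FnOnce(A) -> B) -> Self::Wrapped<B>;
  }

  struct OptionF; // marker standing in for the Option constructor

  impl Functor for OptionF {
      type Wrapped<T> = Option<T>;
      fn fmap<A, B>(x: Option<A>, f: impl FnOnce(A) -> B) -> Option<B> {
          x.map(f)
      }
  }

It works, but you thread a marker type like OptionF around instead of abstracting over Option itself, which is part of the "not quite".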
Rust promises backwards compatibility, but I'm not aware of any language that promises forwards compatibility -- not even C. C compilers from 1999 aren't able to compile code that uses C11 features.
And C23 compilers won't be able to compile code that still uses K&R syntax, trigraphs, or gets(), and they are free to ignore annexes made optional in C99 while still achieving ISO compliance.
What's more interesting to me is that C23 apparently removed support for non-two's-complement signed integers. Which vendors had to change position to get that change made, and what hardware had to officially get removed from serious consideration to make it happen? This site mentions UNISYS:
Correct. But other, more mature languages, like C or Bash or whatever, have devs that don't immediately use bleeding-edge features. Bash gets forwards-incompatible features all the time, but its devs actually write for pretty much any OS from 2000 to now. Because it's mature. C11 wasn't required for a significant number of C projects until close to 2020.
As Rust grows and its demographic changes from bleeding-edge types to a more general-purpose userbase, hopefully the use of the latest forwards-incompatible features will stop and distro rustc compilers will become useful for more than half a year.
And you know this is true. Otherwise all the docs wouldn't be advocating dangerous `curl | sh` installs of rustc (via rustup). You absolutely do not get this kind of community behavior in C, where your system compiler is good for at least a decade.
I've moved away from C partly because I had to fight decades-old compilers.
It's incredibly disheartening to realize I can't hope it will fix even the smallest warts, because the series of multi-year lags in the standards process, implementation, LTS cycles, and laggard upgrades means everything takes decades and C from 1999 is still "new".
Distros need to get their act together. Downloading a tarball every few months is not hard, and Rust doesn't have the whole-OS-breaking UB fragility that a change of a system's C compiler may have.
At work I usually write C++17, but I recently had to work on a project in C89. Going back 28 years is painful: beyond the instant reduction in productivity, it made me seriously hate my job.
On the one hand: I hate the growing trend of "everything bundled separately" and "everything has its own package manager".
On the other...
Why are you compiling things with a toolchain from OS packages? If your job is dev work, then surely you have a bunch of tools that are unversioned by the OS? (Perhaps you've considered that they're not actually unversioned.)
VS Code, which updates in the background; IntelliJ IDEA, which updates into /opt.
There is value in having it packaged, it helps you have a relatively deterministic system especially w.r.t. building other packages.
But what does it actually give you, the developer? Surely it's better to use the language's native tools, which are tailored precisely for this and guarantee that your dependencies will be versioned correctly and available?
It's not like the binaries you create using the compiler require you to have the compiler installed.
I am an end user of programs people write in Rust. Like when I tried to use a fast radio-scanning, software-defined radio program written in Rust: there was no compiled version, and although it was written 3 months after the compiler version in my repos was released, it still used forwards-incompatible features. Or the time a web serial I was reading had a spider and epub generator written in Rust. Approximately the same deal. And it's only become worse as my rustc has grown older than a few months.
I am not a developer. I use my operating system to run software that other people write. Most of it, like the above examples, doesn't come from repos, but the tools to compile it do. Or at least did, until Rust became popular.
Homebrew manages just fine. This also sounds entirely like a problem with the rather old packaging process that evolved from the historical mess of distributing C projects.
Many users of Rust quite like that style, and it has served Rust well.
What I think needs to happen instead is better support for tooling (in Cargo and crates.io) that understands the MSRV, or Minimum Supported Rust Version. Once the tooling is in place, at least you can't accidentally break your builds by updating your dependencies to versions that are too new for your compiler.
Perhaps additionally releasing a Long Term Support version every year or two would also help, as that would provide a common target for many crates to support conservatively.
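Part of that tooling already exists: Cargo has a rust-version field, and a toolchain older than the declared MSRV at least fails with a clear error rather than a confusing build break (MSRV-aware dependency resolution is the part that has been evolving):

  [package]
  name = "my-crate"
  version = "0.1.0"
  edition = "2021"
  rust-version = "1.65"  # declared MSRV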
It's not quite as safe, since with apt or rpm the package can be signed by the distribution on a (presumably) safer system, instead of being signed on the fly by an outward-facing webserver.
And `curl | sh` is signed by a certificate issued by a CA. It's the same amount of safety, if not more, since there is a trusted 3rd party, while package managers are "trust me bro, it's our public key".
Anyway, I'm not using rustup; I'm using my nix flake to handle everything for me.
Those files you get from `curl | sh` are just whatever happens to be on their internet-facing server (or any CDN server) at the time, whereas signing the actual package instead of the tls certificate for the https connection means that package building and signing could have been done entirely on a secure/isolated machine. Using `curl | sh` you're rolling the dice that the web-facing server didn't get pwned. Using signed packages means you don't care if the web-facing server got pwned.
With a conventional package manager, the contents of the package itself is signed, not just the connection to the mirror that hosts the package itself. Having a secure connection to the mirror is not necessary in this case because there is no way to modify the package without it being detected.
With `curl | sh`, only the connection to the server is signed, but not what you are piping into your shell.
Can we even get a C++ compiler from the OS distribution and end up happy?
Even C can be "wrong" (gcc vs llvm for example).
So realistically nobody gets their compiler from the OS.
Distro provided compilers are meant to build the distro's provided packages, nothing more, nothing less. It is an accident of several confounding factors (slow release cadence, bad alternative delivery methods, wide disparity of feature support across deployments requiring developers to be conservative) that allowed some languages to blur the distinction and make their developers not have to care as much.
Rust releases every six weeks. Some crates update aggressively; others go to great lengths to support 1.13. In practice, there are "watershed releases" where most of the ecosystem migrates to relying on the same version, because a specific feature unblocks some use case. Looking at https://lib.rs/stats, the obvious two such cases were 1.31 (the 2018 edition) and 1.56 (the 2021 edition), but at different points in time I've noticed other, smaller breaks for specific features (like const generics in 1.51) that after a while get smoothed out by further jumps.
One thing that keeps getting lost in the online arguments is that Rust is very aggressive about backwards compatibility despite its release schedule, so updating the toolchain has one cost: rebuilding your projects. Staying on older versions indefinitely means not only that you can't use the latest shiny language feature, but that you're not getting the security updates or tooling improvements that materially improve your DX.
> Can we even get a C++ compiler from the OS distribution and end up happy?
No: tying compiler versions to operating systems, as most Linux distros do, is one of the most harmful things to software development in the last few decades, I think.
I think developers have a very skewed view of this topic. I am not a developer; I just compile software that I want to use. People like myself are not replacing our system compilers with random third-party ones. And yes, we expect to be able to compile things, and it usually works out with languages like C, C++, etc.
Realistically nearly everyone gets their compiler from their OS. It's only developers that don't.