Very interesting and timely especially since I see the following so often:
> Why wouldn’t the Linux kernel team choose Ada instead of Rust?
To which I say: Ada doesn't have a large user base beyond DOD projects. Virtually no greenfield projects except for DOD. Whereas there has been a Cambrian explosion of Unix kernels/tools/servers written in Rust.
More generally, these questions of the form "Why are you ignoring X?" where X is Ada or Zig or C++ are really "Why didn't you write/don't you rewrite this in Rust?" by a different name.
And my/that answer goes both ways. Notably, Roc recently stated it is moving away from Rust. Ghostty uses Zig because Mitchell Hashimoto doesn't like Rust. But the answer is always that someone has to show up and do the work, usually because they like doing the work in some language.
So -- it's okay to like the project strictly because of the language chosen. That being said -- I'm super interested in the more technical differences.
The history of computing has many examples of products and languages winning out over technically superior alternatives for non-technical reasons. Like Linux gaining a lot of momentum because of the BSD lawsuit. This[0] is one of my favourite comments on HN. It shows us one of the paths not taken in the evolution of operating systems.
> It's okay to like the project strictly because of the language chosen.
I think that the massive groundswell of interest around Rust will end up being positive overall. At this point I can't see us going back to the way things were. I can't imagine any new systems programming languages gaining traction that aren't at least as safe as Rust.
> At this point I can't see us going back to the way things were. I can't imagine any new systems programming languages gaining traction that aren't at least as safe as Rust.
I see Rust being used in many contexts where its safety features aren't really needed or are even a hindrance, like game dev.
Honestly, the safety is overstated in importance. The amazing tooling, decent package manager, great error messages, inclusive community, and the type system that picked the low-hanging fruit from the academic functional programming world are much more relevant to its success.
The problem with Rust is that it made using languages without automatic garbage collection acceptable again and I am not sure if this is good. The borrow checker is a neat solution and might be a good fit for certain niches but automatic garbage collection is just vastly more ergonomic to use.
We had Lisp Machines; we need to go back to the future.
I'm with you - I like Rust for the overall pros/cons balance, in the context of its alternatives. Safety is one thing. The tooling, error messages, type system etc. you mention are another. Having mutable references as function parameters is another big one, shared by surprisingly few alternatives.
I've written Rust professionally for the past 6 years, and pretty much exclusively for stuff where memory safety is the least of my concerns (even speed!). I've made gRPC APIs, agents running on machines to set up configuration and such. The usual suspects for this work would usually be Python or Go.
You've pretty much summed up why I'm standing behind Rust for non-memory-critical applications: tooling, error messages, and the type system. For business logic, you cannot beat all that, as a lot of business errors can be avoided using those (all praise the Rust enums!). Also, you've summarized my feeling about the functional world embraced by Rust: it provides quite a bit, without requiring a Haskell PhD to write it. Seriously, iterators and the associated trait are a godsend (even though having to always write .iter()...collect() is sometimes clunky).
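To make that concrete, here's a minimal sketch of the kind of business logic I mean (all names are illustrative, not from any real codebase):

    // Business states as an enum: the compiler forces every case to be handled.
    enum Payment {
        Pending,
        Settled { amount_cents: u64 },
        Failed(String),
    }

    fn settled_total(payments: &[Payment]) -> u64 {
        payments
            .iter()
            .filter_map(|p| match p {
                Payment::Settled { amount_cents } => Some(*amount_cents),
                _ => None,
            })
            .sum()
    }

    fn main() {
        let batch = [
            Payment::Pending,
            Payment::Settled { amount_cents: 1250 },
            Payment::Failed("card declined".into()),
            Payment::Settled { amount_cents: 800 },
        ];
        assert_eq!(settled_total(&batch), 2050);
    }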
As for the GC: I've yet to come across situations where the lack of a GC was a real hindrance. It has more to do with futures adding hard-to-grasp lifetime bounds to data, but one quickly learns to cope by moving data when needed. Combined with channels and Arc, there are enough escape hatches that, in my opinion, the lack of a GC is a non-issue.
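A toy sketch of the "move data instead of borrowing it" escape hatch (std only):

    use std::sync::mpsc;
    use std::thread;

    fn main() {
        let (tx, rx) = mpsc::channel();
        thread::spawn(move || {
            let data = vec![1, 2, 3];
            tx.send(data).unwrap(); // ownership moves to the receiver,
            // so no lifetime bound ever ties `data` to this thread
        });
        assert_eq!(rx.recv().unwrap().len(), 3);
    }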
IMHO, making GC-less languages acceptable again is a good thing, because it keeps diversity in the ecosystem and gives options. It also shows that doing without a GC does not relegate your code to C/C++-style manual memory management. In fact, Swift does the same and is pretty good at being a GC-less language. Both Rust and Swift are languages where memory management is opinionated by embedding it in the language, and thus ergonomic. Frankly, I rarely think "oh hey, I wish I had a GC".
GC implementations are widely available in Rust. You don't have to use garbage collection in Rust, but it doesn't stop you either. You just have to import some custom crate and 'implement' tracing for your own data types.
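For illustration, using such a crate looks roughly like this (sketched from memory of the `gc` crate's derive-based API; treat the exact names as approximate):

    use gc::{Finalize, Gc, Trace};

    // Deriving Trace tells the collector how to find Gc pointers inside the type.
    #[derive(Trace, Finalize)]
    struct Node {
        value: i32,
        next: Option<Gc<Node>>,
    }

    fn main() {
        let tail = Gc::new(Node { value: 2, next: None });
        let head = Gc::new(Node { value: 1, next: Some(tail) });
        assert_eq!(head.next.as_ref().unwrap().value, 2);
    }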
> I can't imagine any new systems programming languages gaining traction that aren't at least as safe as Rust.
I don't think safety is a 1 dimensional spectrum (say from 0% to 100%). I view it more as a multidimensional thing, and tradeoffs between these dimensions.
Rust invested a lot on _some dimension_ of safety, which is essentially static borrowing and aliasing. And the tradeoffs for this are numerous: Dynamic safety is either much harder in Rust (due to unsafe being vastly more dangerous and hard to get right vs., say, a C program), or at the expense of performance (i.e. just Arc<Mutex<>> or copy everything). That also comes at the expense of an immense cognitive burden for the programmer.
I see a lot of space for different tradeoffs on different dimensions of safety.
I don't think this is true. Rust has constraints that C/C++ doesn't have. For instance, it's undefined behavior to create more than one exclusive (mutable) reference to the same object or to create one where a shared reference already exists. This is not necessarily easy to ensure.
The aliasing rules in C are much more lax: you can't have several pointers of different types pointing to the same object, except if one of them is a character pointer (and ignoring the restrict keyword, which is opt-in and rarely used).
I don't think this is quite the same comparison. In Rust, multiple mutable pointers to the same object can exist at the same time. So, it's similar to C in this way. It is mutable references that must be exclusive.
It's beside the point whether C pointers are more similar to Rust pointers or references. It's even true that pointers BY THEMSELVES have fewer constraints in Rust than in C. It's in the interaction between pointers and references that it's very easy to trigger undefined behavior in Rust.
Besides the fact I already mentioned about the dangers of casting pointers to references, there's also the problem that pointers are only valid as long as no operations are done with references to the same object (no interleaving). On top of it, the autoborrowing rules make it so it's not always clear when a reference is being taken (and operated upon).
So yes, in my opinion _unsafe_ Rust is significantly more difficult to get right than C.
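A tiny example of the interleaving hazard (Miri, under its Stacked Borrows model, reports the last write as UB; the equivalent pointer-only code in C would be fine):

    fn main() {
        let mut x = 42;
        let p = &mut x as *mut i32; // raw pointer into x
        let r = &mut x;             // a new exclusive reference is created...
        *r += 1;                    // ...and used,
        unsafe { *p += 1; }         // ...so this use of `p` is UB in Rust
        println!("{x}");
    }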
But only Rust uses its equivalent to C "restrict" wrt. its shared and mutable references (unless dealing with UnsafeCell<>) for alias analysis optimization. (In fact, this used to trigger bugs in LLVM because it was hard to test its aliasing optimizations with purely C code.)
It’s not just restrict. Rust indeed has much stricter rules that at first glance seem similar to C's, but the optimizer is allowed to do weird things if you violate rules that are a lot harder to understand. This has been very well documented.
For example, if I recall correctly it’s technically UB to cast to a *mut pointer unless the original variable was declared mut even if you have exclusive ownership over that object.
There was a fantastic blog post describing a lot of these nuances but I can’t find it.
Rust has stricter rules than C++. E.g. global mutable variables are way more dangerous in Rust than in C++ - just creating a reference to it can be immediate undefined behaviour.
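A sketch of that footgun (recent toolchains lint against, and the 2024 edition rejects, taking references to a `static mut` for exactly this reason):

    static mut COUNTER: u32 = 0;

    fn main() {
        unsafe {
            let a = &mut COUNTER; // exclusive reference to the global
            let b = &mut COUNTER; // a second one invalidates the first
            *b += 1;
            *a += 1;              // Miri reports this use of `a` as UB
        }
    }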
> I don't think safety is a 1 dimensional spectrum (say from 0% to 100%). I view it more as a multidimensional thing, and tradeoffs between these dimensions.
Absolutely. This is evident in the different language features of Rust and Ada. At any rate, Rust has advanced the state of the art when it comes to memory safety. The next generation of programming languages are inevitably going to be compared against Rust. In the same way Rust has been compared against Ada. Especially given Rust's popularity. I imagine that if a next-generation 'systems language' can't at least match Rust's features, it won't achieve wider adoption.
That comment was great. It's so hard for me (raised on Windows and now immersed in Linux) to imagine something so foreign to the OS paradigms I'm used to. It just goes to show how much the things you are used to can limit your thinking.
Windows NT was supposed to be a modernized VMS (and legend has it that VMS+1=WNT, each letter incremented individually…) - and given a face similar to Win 3.1 (and then 95/98/Me, and then they were unified in XP).
It has a lot of the VMS features, and most of them are unused or effectively only used by malware (e.g. alternate file streams).
I did not get a chance to use VMS myself, but I did use Unix (and VMS/CMS, and a bunch of other OSes) at a time VAX was still a viable alternative even for greenfield projects.
My impression is that it is sort of like J/K/APL - powerful, compact, coherent, got almost everything right from a technical standpoint - but not friendly, feels foreign, and not liked by anyone who has ever used a Unix, or Windows, or a Mac - which is 99% of the relevant population since about 1985.
(Added:) Macs also had multiple file streams ("forks"). They don't anymore, for many reasons, but the chief one is interoperability. Less is more, worse is better.
Which goes to show why *nix won over more feature-rich alternatives - the human element. Being feature-rich makes the learning curve (usually) much steeper but tends to help in the long term, while shifting the complexity into userland means that if I don't need a feature, it needn't concern me.
You trade ability for reduction of complexity - getting an operating system that you can learn in months instead of years, which - since the tech industry has been in a hyper-growth phase for much of the last 30 years - was more important.
Indeed, had AT&T been able to sell UNIX at VMS prices (or those of whatever other OS from those days) from day one, without access to source code, history would have taken a different path regarding UNIX and C adoption.
Given Cutler hated UN*X, that only makes sense. NT did a lot of things right at the kernel level.
> and most of them are unused or effectively only used by malware (e.g. alternate file streams).
"Most of them unused"?
ADS was added to NTFS to support Mac OS file sharing over AFP ("Services for Macintosh"). Microsoft makes use of it for downloads from Edge, among other things. It's otherwise not a supremely useful feature, and one where you'll silently lose data when moving to another filesystem or the web, further degrading its usefulness.
Re file streams: yes, classic macOS did have them; current macOS does not, but it does not need them, as you can use Bundles instead, which are Unix directories that are treated as a whole by the SDKs.
> At this point I can't see us going back to the way things were. I can't imagine any new systems programming languages gaining traction that aren't at least as safe as Rust.
Zig seems to be gaining traction. Hard to predict whether it will succeed or slowly fizzle out after the initial excitement phase though.
Is Zig gaining traction? It almost seems like the project has gone radio silent in the past year. It's hard for a language to get traction when it's not even 1.0 and there's not much of a roadmap on how and when they'll get there.
> The history of computing has many examples of products and languages winning out over technically superior alternatives for non-technical reasons.
While this is undoubtedly true, I suspect that there are also many cases where people presume the reasons are non-technical but which, when examined closely, turn out to involve hidden technical issues.
> I can't imagine any new systems programming languages gaining traction that aren't at least as safe as Rust.
Am I allowed to sigh here? It's a good thought, but this is humanity we're talking about here.
All it requires is some big-project, some big-corp creating some essential ecosystem with lame-language-A, and now it's got traction.
And frankly, look at Rust and its weird crates. Weird as in "Let's make a super-safe language, then import random stuff from unknowns, unaudited, and just hope it all works out."
If Rust were really serious about security, only signed crates from validated developers, with code review, would be the norm. Oh but wait, that'd be "not easy".
Vetting code is extremely expensive in terms of human time and human energy. Sadly it's not only a problem for Rust, it's a problem for every single programming language out there.
> The history of computing has many examples of products and languages winning out over technically superior alternatives for non-technical reasons.
Unfortunately, history shows that technically inferior languages and products also win out because of outright market manipulation by their corporate backers, tons of money in ads, or attacks on competitors. The elephant in the room is to look at what's generating or paying for the hype and marketing behind recent programming languages like Rust and Zig (for example).
> I think that the massive groundswell of interest around Rust...
A review of the groundswell around Rust or Zig shows it's not attributable to just grassroots or genuine interest, and can significantly be the result of outright astroturfing and artificial brigading. Sadly, many people follow along with the marketing or are bombarded into submission: "must rewrite in X" or "GC all bad and manual all good" foolishness, versus actual use cases and legitimate examination of language tradeoffs.
Ada is also used in European transport infrastructure.
There are enough customers around the globe to keep 7 Ada compiler vendors in business; not many language ecosystems can claim that, in an age when folks mostly don't pay for their tools.
OTOH, DoD (and some public sector orgs in Europe) are extremely important customers of some of the companies that contribute the most direct support to the Linux kernel, in a way that no Rust project is.
I have been attempting to compare Ada and Rust for a while, but never got around to doing it quite so much in depth.
So, my take on this is that:
- out of the box, Rust is distinctly better at memory safety than Ada without SPARK
- Ada + SPARK is extremely powerful (but iirc, SPARK doesn't allow compositional reasoning, which makes it harder to use in a crate-based library)
- for this specific use case, Rust's error model makes much more sense than Ada's (with or without SPARK)
- Ada's tooling to avoid memory allocation remains more powerful in some dimensions than Rust's.
Also, I think that we all (should) agree that the move towards memory safety in system-level programming is a very good thing, whether it comes through Rust, Ada, or any other language.
Directly using Unchecked_Deallocation is frowned upon; either use Controlled types (which the article mentions) or Bounded/Unbounded containers (which the article misses).
Using it directly is similar to the usual discussion of using unsafe in Rust for doubly linked lists.
As of Ada 202x, SPARK covers most of the language, no longer a subset, and also does affine types, aka borrow checking.
> Directly using Unchecked_Deallocation is frowned upon...
This is true. I admit that I didn't really acknowledge this in the article. I personally don't really like the Ada community's consensus on this matter. In many applications 'let memory leak and let the OS sort it out' just won't cut it. It's true that you can use controlled types for everything, but isn't this just moving the Unchecked_Deallocation somewhere out of sight? Also, if I'd have written 'You can't leak memory in Ada if you stick to using collections', wouldn't that just be like saying 'You can't leak memory in C++ if you stick to using the STL'? The point of the article was to show what Ada and Rust do to outrightly prevent you from shooting yourself in the foot, at the core language level.
There's a lot of misconceptions floating around the Ada community about memory being freed automatically when it goes out of scope, which I think have stemmed from the ARM leaving open the possibility for a compiler to implement its own GC. Admittedly even I was misinformed. In my defence, nearly all the Ada I write is for bare metal, and never touches the heap at all.
> ...'let memory leak and let the OS sort it out'..
I wouldn't be surprised to hear this from C or C++ folks, but I've never heard it in Ada circles; there is always that missile meme going around, and in the past I happened to occasionally drop into FOSDEM's Ada rooms.
Speaking of which,
"Memory Management with Ada 2012 from FOSDEM 2016"
> Also, if I'd have written 'You can't leak memory in Ada if you stick to using collections', wouldn't that just be like saying 'You can't leak memory in C++ if you stick to using the STL'? The point of the article was to show what Ada and Rust do to outrightly prevent you from shooting yourself in the foot, at the core language level.
A programming language is seldom used without its standard library; I don't care what a plain grammar + related semantic checks offer if there is nothing in the standard library, IDE tooling, or libraries to choose from.
Otherwise you should also do a comparison using no_std against the Ravenscar profile.
And yes, unless you are using a C++ compiler that offers the option to harden its standard library, you are in for some surprises, starting with having to use at() instead of operator[]() for bounds checking; and even with a hardened one, stuff like use-after-free/move is only caught when using a good IDE like VC++/CLion or static/dynamic analysis tooling.
I tried to track down where I read someone advocate this, but I can't find it now. I probably shouldn't have accused the whole Ada community (of which I'm a member) of this attitude. Regardless, I find it odd that 'Unchecked_Deallocation' is so thoroughly discouraged without what I'd consider an acceptable alternative.
> A programming language is seldom used without its standard library...
Nearly all of the Ada I write is for RISC-V microcontrollers, using the 'light' runtime that comes bundled with GNAT. There's no collection classes, and no heap for them to allocate anything in. The semantics of the language are absolutely still important. I still want a language that makes harmful mistakes difficult.
> how do you handle a logic error in your code? does that even make sense philosophically?
In some contexts sure, there's not really much you can do, but in many others it's perfectly clear. In a request-oriented system like a web server, you stop the failed request-handling logic and send back an error code. Exceptions can make this quite natural to implement. Much better than letting the error go undetected, invoking undefined behaviour, and risking mishandling of future requests, or perhaps even a serious security vulnerability.
> Yet you could also enable bounds checking for debugging on operator [] or use address sanitizer.
I'd flip that around. Why bother with additional platform-specific cleverness when the standard library already gives you what you're after?
Every now and then you get comments like this, that are intentionally offensive, like a burning bag of poop.
The point of .at() is to check the bounds at runtime. Here is your answer.
Regarding the comments: you don't necessarily know if it is a logic error, and you certainly don't know if it is in your code.
Now onto the how. You handle it like any error. If you don't have a specific error handling strategy, you handle the error at a higher level, e.g. your HTTP server returns a 500 HTTP status code and logs the error and the location, so that the developer can fix the bug.
As to whether it makes sense philosophically. The only limitation is your lack of imagination. In C, the program would keep executing and create a security problem. The fact that the exception handler got called in the first place implies that at least the security vulnerability has been averted, meaning that you did indeed recover from the error, even if the catch block is completely empty.
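In Rust terms, the same request-boundary recovery can be sketched with catch_unwind (the handler and status codes here are hypothetical):

    use std::panic::{self, AssertUnwindSafe};

    // Hypothetical request handler with a latent logic error.
    fn process(path: &str) -> u16 {
        let segments: Vec<&str> = path.split('/').collect();
        if segments[3].is_empty() { 400 } else { 200 } // may index out of bounds
    }

    fn handle(path: &str) -> u16 {
        // A logic error panics; we fail this one request with a 500
        // instead of letting corrupted state leak into future requests.
        panic::catch_unwind(AssertUnwindSafe(|| process(path))).unwrap_or(500)
    }

    fn main() {
        assert_eq!(handle("a/b/c/d"), 200);
        assert_eq!(handle("a/b"), 500);
    }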
You can check manually, but we know programmers can't be trusted to do so correctly every time. There's an ocean of security vulnerabilities because of this, as out-of-bounds access of a raw array is undefined behaviour in C and C++.
It makes good sense to use at() in debug builds especially, where you're less concerned about the performance penalty.
In C++ you can define a macro to select between at() and operator[], and while this isn't exactly pretty, it does work:

    #include <vector>

    // Bounds-checked at() by default; define DISABLE_RANGE_CHECKS
    // to fall back to the unchecked operator[] in release builds.
    #ifdef DISABLE_RANGE_CHECKS
    #define AT_METHOD operator[]
    #else
    #define AT_METHOD at
    #endif

    // ...
    i = my_vector.AT_METHOD(my_idx);
> even there it looks questionable imo
Why? Its behaviour is pretty much strictly better than that of raw []. The only downside is a possible performance hit.
I don't program in Rust or Ada; however, it seems like such a comparison is missing important details. After all, if memory safety was all that mattered we'd all be using Python. The thing that gets people excited about Rust is not just memory safety but the tradeoffs taken to make that happen.
What got Rust adoption, if anything, was the human factor: most folks adopting it never programmed in Ada, Object Pascal, Modula-2, Delphi, ...
They weren't even born when we were using them, or if they were their toys were actually toys not systems programming languages.
Rust was their first encounter with systems programming, and with the Standard ML lineage, after lots of Python, Ruby and JavaScript; most likely they had never seen anything else other than C and C++. Most likely they also believe the fairy tale of C being the very first systems programming language.
To complete what you write, the people in the room during the design of Rust were all very much aware of memory-safe languages. Rust was largely designed by putting C++ developers (who were tired of undefined behaviors) and ML developers (who were aware that ML was not great at system programming) in the same room and have them negotiate a language together, in the hope of producing something that would convince C++ developers to migrate to something less risky.
I kinda vaguely remember Erlang being mentioned, but I don't recall Ada being discussed. To be fair, I didn't attend many of these conversations.
The Rust designers might have known that other memory-safe programming languages had existed, but certainly they were not familiar with them.
Otherwise Rust would not have inherited a ton of obsolete and useless features of C.
For example, when I saw a "Hello, world" written in Rust for the first time, I was appalled to see that in a language designed in the 3rd millennium one still has to specify whether function parameters are passed by value or by reference, which is the wrong thing to specify about function parameters, exactly like specifying in the old C language whether a local variable was "register" or "auto" was the wrong thing to specify. (The right thing to specify about function parameters was established in 1977, almost half a century ago, by DoD's "IRONMAN", which has been implemented in only a few languages, the most important being Ada. In my opinion, most difficulties and complexities of C++ have been generated by its failure to comply with "IRONMAN".)
In my opinion, attempting to design a new programming language without first studying carefully the history of programming languages, in order to know well the many good and bad choices that have already been used in various programming languages and to understand which have been their advantages and disadvantages, is an amateurish approach that cannot result in real progress.
Most recent programming languages have been built around some innovative great idea that makes that programming language better than its predecessors in some respect, e.g. Rust with compile-time checking of memory accesses. But because the best previously known features have not been chosen in the other areas of the language, no language is uniformly better than older languages: each is better than older languages at something, but worse at other things, which makes the choice of a programming language very frustrating.
Rust is very much designed as a low-level language. If you want a higher-level language with similar guarantees, OCaml, Haskell, F# or Scala do the job very well already.
The mechanism of argument passing has its uses in terms of both performance control and avoiding unwanted aliasing, something that not many languages do properly (not even Ada, iirc).
It's not the right thing to do for all languages/applications, but I believe that it's the right thing to do for Rust. Do you have a better design in mind?
What they want is in/inout/out, with the compiler determining if the semantic is better implemented by value or by reference.
In Ada, there are also requirements that some types are always passed by value or by reference. And as of Ada 2012 there's a way to force the "always pass by value" types by reference.
The short of it is, Rust doesn't do great because STEELMAN wants subtyping and contracts (though I see contracts may be coming to Rust...) as well as return values instead of exceptions.
Oh, I thought they were advocating for something new. Yeah, the split in/inout/out is nice, but doesn't sound particularly better than what we have in Rust. In particular, in my book, return values instead of `out` clearly win in terms of readability.
Also, yay for contracts in Rust :) Looking forward to seeing them proven by model-checkers, too!
> has to specify whether function parameters are passed by value or by reference
You specify whether arguments are borrowed or moved, and whether the access is shared or exclusive. This is not an implementation detail, it's an API contract that affects semantics of the program. It adds or removes restrictions on the caller's side, and controls memory management and thread safety.
People unfamiliar with Rust very often misunderstand Rust's borrowed/owned distinction as reference/value, but in Rust these two aspects are orthogonal: Rust also has owning reference types, and borrowing types passed by copying. This misunderstanding is the major reason why novices "fight the borrow checker", because they try to avoid copying, but end up avoiding owning.
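A small sketch of that orthogonality (the signatures are illustrative only):

    fn by_value_borrow(x: &u32) -> u32 { *x }      // a borrow; the &u32 itself is passed by copy
    fn owning_reference(b: Box<u32>) -> u32 { *b } // a reference type, yet fully owned
    fn shared(s: &str) -> usize { s.len() }        // shared borrow: caller keeps ownership
    fn exclusive(s: &mut String) { s.push('!') }   // exclusive borrow: still the caller's value
    fn moved(s: String) -> usize { s.len() }       // ownership transferred into the callee

    fn main() {
        let n = 7;
        assert_eq!(by_value_borrow(&n), 7);
        assert_eq!(owning_reference(Box::new(n)), 7);
        let mut s = String::from("hi");
        assert_eq!(shared(&s), 2);
        exclusive(&mut s);
        assert_eq!(moved(s), 3); // `s` is gone after this call
    }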
There are different possible approaches to achieving similar results for argument passing, but Rust prefers to be explicit and give low-level control. For example, Mutable Value Semantics is often cited as an alternative design, but it can't express putting temporary loans in structs. The syntax needs a place to declare lifetimes (loan scopes), as otherwise implicit magic makes working with view types tricky or impossible: https://safecpp.org/draft-lifetimes.html
I agree about your greater point of learning about history being important, but very much disagree about your specific gripe. What Ironman says is:
7.2.F. (7F.) FORMAL PARAMETER CLASSES
There shall be three classes of formal data parameters:
- input parameters, which act as constants that are initialized to the value of corresponding actual parameters at the time of call,
- input-output parameters, which enable access and assignment to the corresponding actual parameters, and
- output parameters, which act as local variables whose values are transferred to the corresponding actual parameter only at the time of normal exit.
In the latter two cases the corresponding actual parameter must be a variable or an assignable component of a composite type.
7I. RESTRICTIONS TO PREVENT ALIASING
Aliasing (i.e., multiple access paths to the same variable from a given scope) shall not be permitted. In particular, a variable may not be used as two output arguments in the same call to a procedure, and a nonlocal variable that is accessed or assigned within a procedure body may not be used as an output argument to that procedure.
These are very reasonable!
7I is important (in the mutable case), and solving the problem that brought forth this rule is the core idea behind Rust.
7F draws useful semantic distinctions: Output parameters are return values. Input-output parameters are mutable references. Input parameters act as a copy or readonly reference – which are equivalent if you can't mutate or observe mutation through readonly references.
Your complaint is that Rust draws a distinction between whether input parameters are implemented via a copy or a reference, but Rust draws many more distinctions here – arbitrarily many, in fact, because it's part of the type system. This makes the system applicable everywhere instead of only in function parameters, and allows expressing more distinct semantics.
For example, I have a hard time seeing how one would, while following the Ironman requirements, distinguish a parameter whose type is "dynamic array of elements of T" from one of type "dynamic array of elements of type mutable reference to T". Likewise you can define a record that holds a mutable reference and an immutable reference, or define a new reference type (such as a reference-counted pointer) without new language support.
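Concretely, both are ordinary types in Rust (a minimal sketch):

    // "dynamic array of T" vs. "dynamic array of mutable references to T":
    fn sum(xs: Vec<i32>) -> i32 { xs.into_iter().sum() }
    fn bump_all(xs: Vec<&mut i32>) { for x in xs { *x += 1; } }

    // A record holding one mutable and one immutable reference,
    // each with its own declared loan scope:
    struct View<'m, 'r> {
        target: &'m mut i32,
        limit: &'r i32,
    }

    fn main() {
        let (mut a, mut b, c) = (1, 2, 10);
        bump_all(vec![&mut a, &mut b]);
        assert_eq!(sum(vec![a, b]), 5);
        let v = View { target: &mut a, limit: &c };
        *v.target += *v.limit;
        assert_eq!(a, 12);
    }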
I said that passing a copy and a readonly reference are equivalent. However, in Rust there are no readonly references. Rather, there are non-aliasing references, which allow mutation, and aliasing references, which by default don't allow mutation – but types can make use of interior mutability to allow mutation through aliased references according to the rules they need. For example you can access data protected by a mutex if and only if you hold the lock, which means that even if other references to the mutex exist, there are no other references to the inner data.
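The mutex case, concretely (std only):

    use std::sync::Mutex;

    fn main() {
        let m = Mutex::new(Vec::new()); // many shared &Mutex handles may exist...
        let (a, b) = (&m, &m);
        a.lock().unwrap().push(1);      // ...but the lock hands out the only path
        b.lock().unwrap().push(2);      // to the inner data, one holder at a time
        assert_eq!(*m.lock().unwrap(), vec![1, 2]);
    }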
Just to be clear, not your parent, and not really a fan of STEELMAN. But.
> Input-output parameters are mutable references.
In/out can also be by value, it would copy any updates back after the call.
> Input parameters act as a copy or readonly reference
This is what your parent is getting at: the idea is that in/out/inout describe intent, not mechanism. The compiler chooses the mechanism for you.
I think in a language that's intended for lower-level tasks, describing mechanism is important. That said, outside of STEELMAN, there's an argument to be made for in/out/inout, and in fact, there's been some discussion over the years, for example, &uninit T as a sort of variant of out.
> For example I have a hard time seing how one would, while following the Ironman requirements,
You're right. STEELMAN updated this section to say
7I. Restrictions to Prevent Aliasing. The language shall attempt to prevent aliasing (i.e., multiple access paths to the same variable or record component) that is not intended, but shall not prohibit all aliasing. Aliasing shall not be permitted between output parameters nor between an input-output parameter and a nonlocal variable. Unintended aliasing shall not be permitted between input-output parameters. A restriction limiting actual input-output parameters to variables that are nowhere referenced as nonlocals within a function or routine, is not prohibited. All aliasing of components of elements of an indirect type shall be considered intentional.
Ada has "access types," which are pointers. You can declare them as aliased or not. Via this mechanism, you can pass both an aliased variable as an in parameter, and an access type that points to it as an in out, and still modify the variable, even though it's in. This will give you surprising behavior, but it's not UB, so you get "oh hey that number is not what it should be" not "this means half your program is optimized away.
Ada does prevent data races, because it has a built-in task system, and that task system only allows multiple tasks to access "protected objects" that are basically RWLock<T> in Rust terms.
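For reference, the Rust analogue mentioned here (std only):

    use std::sync::RwLock;

    fn main() {
        let lock = RwLock::new(5);
        {
            let r1 = lock.read().unwrap();
            let r2 = lock.read().unwrap(); // many concurrent readers are fine
            assert_eq!(*r1 + *r2, 10);
        } // readers released here
        *lock.write().unwrap() += 1;       // exactly one writer at a time
        assert_eq!(*lock.read().unwrap(), 6);
    }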
I'd agree it's largely the human factor, but maybe from a different angle. Rust has got the "there's a crate for that" factor, like JavaScript, which helps onboard people who are afraid of getting stuck writing everything from scratch.
For example, serde (and serde_json, toml, etc.) is a gamechanger, as is tracing, reqwest, and many others.
So, maybe less about the language itself for many people, but more about the human factor of overcoming fear of being stuck on an island
I clearly remember sighing deeply back in 2002 realizing I have to introduce yet another array type so my code could compile in Object Pascal. And this is just one example of many.
Rust is objectively better than what I tried before, in several ways.
From where I was standing several years ago, there was almost zero "human factor" in adopting Rust for many companies (and devs). People simply saw the better features, performance and security and got excited and wanted to work with that language. And then many mistook that for hype. Yourself included apparently?
Sometimes things get popular just because they are good, you know. Not super often historically, mind you, but it does happen. IME Rust is one of these outliers.
Who do you think does most of the work in LLVM for example?
It is full of PhD paper contributions.
For example, research students doing internships at Embecosm (https://embecosm.com), as many others, plenty of material from LLVM Developers Meeting sessions.
I guess this is due to the tag line of the company. I am not familiar with the compiler/LLVM space so unsure how the different branches (compiler maintenance and AI tool infrastructure for example) are covered by the PhD internships, etc.
Sure, you still get use-after-free and manual memory management, which hasn't stopped folks from now racing to rewrite stuff in Zig.
However it gets you covered on:
- proper strings with bounds checking
- proper arrays with bounds checking
- no pointer decays, you need to be explicit about getting pointers to arrays
- fewer cases of implicit conversions; requires more typecasts
- reference parameters reduce the need of pointers
- variant records directly support tags
- enumerations are stronger typed without implicit conversions
- modules with better control what gets exposed
- range types
- set types
- when using COM like interfaces, reference counting
- arenas
Sure, C++ also covers some of that - that is why I went to C++ after Turbo Pascal - but too many folks still use C-like idioms instead of improved C++ ones; it is like adopting TypeScript but still writing JavaScript.
And yet here we stand, still not having it replaced at scale.
To quote Hoare in 1980,
"A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980 language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."
> I note with fear and horror that even in 1980 language designers and users have not learned this lesson.
Referring, naturally, to C. By the way, 1988 is the birth year of the Morris Worm, which took advantage of C's buffer management capabilities.
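Rust, for what it's worth, keeps Hoare's checks on by default; a trivial sketch:

    fn main() {
        let xs = vec![10, 20, 30];
        let i = xs.len() + 4;        // an out-of-range index computed at runtime
        assert_eq!(xs.get(i), None); // checked access that reports the failure
        let _ = xs[i];               // also checked: panics rather than reading OOB
    }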
Yes, a pretty low bar that 40 years after those events is still largely used across the industry.
And the copy-paste compatibility of Objective-C and C++ with C doesn't help, because even though those languages provide better capabilities and safer alternatives than writing raw C, there are always those folks that will write C-style code no matter what.
> And yet here we stand, still not having it replaced at scale.
Network effects and business' risk aversion.
Nothing to do with technical superiority / inferiority really. The MBAs only hear "we must replace everything from scratch" and immediately shout "NO!".
Offensively simple I am afraid. That's all there is to it.
Believe me, I'd jump with joy if we scrap like 75% of all UNIX crap, especially "untyped text rulz" in the terminal -- and introduce one good terminal protocol in the process. But it ain't happening in yours and my lifetime.
I look at the problem the other way around: there are certain domains that significantly constrain the choice of programming language, for reasons like compatibility or performance. If you're already limited to a small set of statically-compiled 'systems languages' like C, Ada, Rust, etc., then memory safety is the secondary factor that will really influence which of those systems programming languages you use.
Rust is the first (yes, first) mainstream language that offered reasonable and composable memory safety, without compromising on speed.
Ada does not offer anything similar. It's designed for programs that are mostly static. It also does not compose well, as you can't dynamically allocate without losing most of Ada's touted advantages.
To be fair, they recognized it. Their plan is to adopt a borrow-checker.
No they don't. Most programs are full of dynamic allocations (String, Vec, etc.) with only rare use of RefCell (I assume that's what you're talking about).
Most non trivial programs I've seen in Rust just end up with Arc<> and copies all over the place, because getting lifetimes right becomes borderline insanity.
It totally depends on the program. Sometimes you need to use Arc, sometimes you don't. Arc doesn't involve any runtime asserts though and it's not a crime to use it.
For example, I just checked ripgrep: Arc is used 29 times, compared to 185 uses of Vec.
> Rust is the first (yes, first) mainstream language that offered reasonable and composable memory safety, without compromising on speed.
Is that true, though? Many memory-safe languages are very fast (fast enough for the vast majority of use-cases). Feels like mostly they compromise on memory, don't they?
Rust seems to be a good replacement for those use-cases where C++ is needed. But given the popularity of Rust, it seems to me that many people use Rust where they don't actually need it.
No, for systems programming it means nobody needs it to be faster.
You're welcome to use Python for whatever slow scripting you personally need, but it's not suitable for writing an OS and libraries because it's going to make every program slower, and while that might be ok for you, it's not going to be ok for a lot of people.
I explicitly said "for the vast majority of use-cases". Of course you can pick one that is not part of it and pretend I included it in that majority, but then... what's the point of even trying to communicate?
The comments you're replying to aren't claiming that Rust is the only language that is fast enough for any use case. They're pointing out that Rust is the only language that is memory safe
> without compromising on speed
I was just trying to explain why that matters to you.
Nobody is saying you have to use Rust for everything.
Well, Java is memory safe, and the JVM is fast enough for tons of use-cases.
Before you get at me for "fast enough", let me say that "without compromising on speed" doesn't mean anything either. I can write super slow code in Rust, it's not like the language prevents me from doing it.
My point is that in many cases, the speed is really not the problem with the JVM.
> "without compromising on speed" absolutely does mean something, and it doesn't mean "fast enough".
I Googled "without compromising on speed" and didn't find a formal definition. Would you mind helping me there?
"Zero-cost abstraction" is a concept I understand. But... are you sure that it would be impossible to write faster code (even if you did it perfectly in assembly) than what Rust generates when you use `RefCell`?
Feels like it is compromising on speed, to some extent.
> but in general it means there aren't any systematic overheads
Or maybe I am wrong and there are absolutely no systematic overheads in any concept similar to `RefCell` in Rust? Is `RefCell` zero-cost? That's not my understanding.
> I Googled "without compromising on speed" and didn't find a formal definition. Would you mind helping me there?
It's not a formal term. The meaning was clear though.
> But... are you sure that it would be impossible to write faster code (even if you did it perfectly in assembly) than what Rust generates when you use `RefCell`?
The second example is what you'd do if you were writing C or assembly. Rust takes the safer option by default.
> Is `RefCell` zero-cost? That's not my understanding.
I think you are misunderstanding what "zero-cost" means. Or maybe what RefCell does. RefCell checks at runtime whether a value is borrowed by some other code. It's kind of like a single-threaded mutex. You can't implement that in assembly any better, so it is zero cost.
You are probably thinking "but in C you don't need to use RefCell at all!", and that is true but unsafe. You can avoid that in Rust, but nobody does because it's 2 instructions (which will be perfectly branch predicted) so it essentially costs nothing.
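What those couple of instructions buy, concretely:

    use std::cell::RefCell;

    fn main() {
        let cell = RefCell::new(0);
        *cell.borrow_mut() += 1;                 // runtime-checked exclusive borrow
        let shared = cell.borrow();              // a shared borrow is now active...
        assert!(cell.try_borrow_mut().is_err()); // ...so an exclusive one is refused
        assert_eq!(*shared, 1);
    }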
It's kind of like how `std::vector::at()` is still a zero-cost abstraction in C++. The thing you are abstracting is "array access with bounds check". You can't perform that function any better with hand-written assembly. Saying "aha, but in C we never bother with bounds checks!" doesn't invalidate that.
Was it? You are telling me that it is actually possible to write faster code if it is unsafe, but for some reason it doesn't count as faster in your book, and therefore it's not compromising on speed ("just ignore the faster solutions").
With that kind of definition, all the code I ever wrote is provably optimal: if you ignore all the solutions that are more efficient than my code, then my code is the fastest.
Yes it was clear. Unsafe Rust is still Rust. I already explained that in some cases you can beat the compiler but there aren't any systematic overheads.
I think you understand at this point and are just nitpicking to avoid admitting it, so goodbye.
Not sure if that's bad faith, but I will assume you did not understand my point.
My point was that it's not enough to say "If you use Rust, your code will run faster", because the language does not do it all. Most code out there is largely inefficient. The tendency is to pile up dependencies and frameworks to be more productive.
No need for Rust to make an ElectronJS app vastly more efficient.
Similarly, I recently wrote a TUI with a popular Rust TUI library, and it takes up to 10% CPU just by refreshing the view when typing a text. It's not my code, it's the (again, popular) library that explicitly doesn't consider that a problem. It's not because it is written in Rust that it is efficient, is it?
As I said, the one example I have is a very popular TUI library and it's a lot less efficient than the popular alternative in at least 2 other memory-safe languages I tried.
"Fast enough" is doing a lot of lifting in your argument. I've met plenty of embedded devs that said they'll only code either C or Rust and that nothing else comes close in terms of tooling / dev UX and speed.
For all my 23 years of work, a language like Elixir (or even Ruby) was "fast enough", but there exists a whole other world out there that needs more.
Because when I said "for the vast majority of use-cases", I expected people to understand that I agree that some use-cases can't run e.g. a garbage collector.
And I did understand you. Point was that "fast enough" is definitely not good enough for a lot of other use cases, nothing else. If anything, that's kind of reinforcing your point. :)
Right. Yeah then we agree :-). As I said, I do like Rust, and I think it makes sense in many cases.
I just feel like I read a lot of "Rust is the only worthwhile language out there", and actually there are many languages that are a better fit for their use-case.
> I just feel like I read a lot of "Rust is the only worthwhile language out there"
Pay no attention to zealots. Every community has them. I love Rust beyond what's rational and I still am sensible enough to reject it for projects where it would only prolong development time for not much gain (the projects in question do not need the super-duper speed or to use much less memory than you would get with my favorite Elixir, for example). And for a lot of script-ish / one-off / I-want-to-finish-it-quickly projects I just opt for Golang.
IMO projects like rewriting the UNIX userland utilities in Rust make perfect sense; the originals are super old and nobody wants to seriously find security bugs in them, and since they are written in C you can bet your neck they absolutely do have buffer over-/under-flow bugs that can likely lead to escalation of privileges. Is anybody fuzzing those tools 24/7 for years? They are the building blocks of at least 90% of all servers out there!
Rust makes sense in embedded as well, in financial trading, or in any system that has to squeeze every last drop of its hardware in general.
But web applications / API endpoints / API gateways and such? Meh, a compiled language in a VM like Elixir is plenty enough and always will be.
"Right tool for the job" should be the first catechism of the future Tech Priests and their cult to the Omnissiah (insert "Warhammer 40,0000" reference here ㋡).
Before Rust, the only way to achieve memory safety was to use a garbage collector. Or static allocations. Experiments with region inference like in Cyclone were impractical, because they often led to accidental memory explosions.
> if memory safety was all that mattered we'd all be using python.
I've seen more segfaults in my Python code than I have in any other language ecosystem. Yes, that's in the C modules that Python libraries so often wrap, but the result is still that Python is far from the best choice for memory safety.
JavaScript actually tends to be better at avoiding dropping into C unless absolutely necessary.
Python does not offer the same guarantees for safety when it comes to concurrency. Afaik Rust makes data races impossible by forcing you to use specific types when you have a concurrent scenario, and since everything has to type-check (in contrast to Python) before you can run it, that limits what you can do, making it safe.
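A minimal illustration: to mutate shared state across threads you are forced into types like Arc<Mutex<_>>; handing a plain mutable reference to several threads would be rejected at compile time:

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        let counter = Arc::new(Mutex::new(0));
        let handles: Vec<_> = (0..4)
            .map(|_| {
                let counter = Arc::clone(&counter);
                thread::spawn(move || *counter.lock().unwrap() += 1)
            })
            .collect();
        for h in handles {
            h.join().unwrap();
        }
        assert_eq!(*counter.lock().unwrap(), 4);
    }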
Maybe for those that have to use Rust. But the borrow checker was the sole reason why I wanted to learn Rust. And at that time (~2018) it was also the reason why people were recommending it.
And people like C in spite of the segfaults. It's the same disliked thing, Rust just moves it to compile time instead of production. Which many people prefer, since it avoids security issues, data loss, and mysterious crashes.
> Ferrous Systems and AdaCore are.. joining forces to develop Ferrocene - a safety-qualified Rust toolchain, which is aimed at supporting the needs of various regulated markets, such as automotive, avionics, space, and railway.. qualifying the Ferrocene Rust compiler according to various safety standards.. [including] development and qualification of the necessary dynamic and static analysis tools.. our long-term commitment to Rust and Ada extends to developers who will be using both languages at the same time. We are looking at interoperability between them - including, in particular, the idea of developing bi-directional binding generators.
Nvidia uses Ada/SPARK for formally-verified security firmware on RISC-V cores on GPUs [1] and SPDM attestation of devices like GPUs and Infiniband NICs [2].
As explained at your link, the example program that is not type-safe is based on a mistake of the 1983 Ada standard regarding the use of "aliased", which has been removed by a later Technical Corrigendum, where the program demonstrated at your link is explicitly classified as erroneous, so any compliant Ada compiler should fail to compile it.
As also explained at your link, the same type-safety breaking technique works in unsafe Rust. Both "unchecked" Ada and "unsafe" Rust do not provide type safety, while the safe subsets of the languages provide it.
_3 is magic, _4 is uncopied, and _5 is b. move here is like ptr::read, which means that uncopied points to a copy of magic, not aliasing magic, and is dangling. Because this is UB, it gets optimized straight into the panic.
After I figured that out, miri started working, I must have made a mistake earlier. It will tell us the same thing:
test test ... error: Undefined Behavior: memory access failed: alloc113986 has been freed, so this pointer is dangling
--> src/lib.rs:24:13
|
24 | assert!((*uncopied).value != std::ptr::null());
| ^^^^^^^^^^^^^^^^^ memory access failed: alloc113986 has been freed, so this pointer is dangling
|
= help: this indicates a bug in the program: it performed an invalid operation, and caused Undefined Behavior
= help: see https://doc.rust-lang.org/nightly/reference/behavior-considered-undefined.html for further information
help: alloc113986 was allocated here:
--> src/lib.rs:18:16
|
18 | Magic::B(b) => &b,
| ^
help: alloc113986 was deallocated here:
--> src/lib.rs:18:23
|
18 | Magic::B(b) => &b,
| ^
= note: BACKTRACE (of the first span) on thread `test`:
= note: inside `magic::<&str, &u8>` at src/lib.rs:24:13: 24:30
note: inside `test`
--> src/lib.rs:36:5
|
36 | magic::<&str, &u8>("magic string");
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
note: inside closure
--> src/lib.rs:35:10
|
34 | #[test]
| ------- in this procedural macro expansion
35 | fn test() {
| ^
= note: this error originates in the attribute macro `test` (in Nightly builds, run with -Z macro-backtrace for more info)
This code started failing in Rust 1.12, when MIR happened, so that's exactly my guess as to what fixed it.
Do you think Ada would be good for game development? My main interest in the new wave of low level languages like Rust, Zig and Odin is to see which one would be good for game dev.
Out of all of them, the best would probably be Rust (great ecosystem) and Odin (the designer behind Odin is literally a game dev iirc)
The SPARK subset of Ada really seems the best here, but doesn't seem anywhere near as fun to write as Rust, or have anywhere near the amount of libraries available. The community is much smaller also, and I think community counts for a lot here. The takeaway is that Rust really is really, really great. It's fantastic there is so much documentation and such a strong community, and that so many people are invested in writing safe code because it's finally easy to do so.
SPARK offers more than Rust, and in my experience, writing Rust isn't fun (either). Rust has more documentation to get you going than Ada, and certainly more than SPARK, as well as a much larger community and more libraries, but the Ada/SPARK community is a nice, helpful bunch.
> [about Ada] You never miss worrying about whether a parameter is passed by value or reference
Uh? I thought that Ada suffered from the "aliasing issue" (1): given that the compiler chooses whether to copy or alias, there is a risk that the semantics of your program change when the compiler/optimisation changes. Am I wrong?
1: if you have a function with two parameters, one "in" and the other "inout" and you pass the same variable to the two parameters.
Even if the aliasing were not forbidden by default by the compiler, aliasing in Ada would cause problems less often than in a programming language that allows the programmer to specify whether parameters are passed by reference.
The compiler may choose to pass by value the "in out" parameters in which case their aliasing may have no effect ("in out" parameters that are passed by value, typically in order to be passed in registers, are loaded upon function entry and stored back upon function exit).
In Rust, C or any other language with parameters passed by reference, you can always alias 2 parameters passed by reference, unless the language defines that as an error that must be caught at compile-time or at run-time. Usually aliasing will lead to errors if at least one of the 2 parameters is modified inside the function.
Because of this, aliased parameters which may be modified inside a function and which are not allowed explicitly shall always be flagged as an error.
The SPARK Ada tools detect this as an error, but not the free Ada compiler.
This should have also been detected as an error by the normal Ada compiler, but I assume that this is avoided in order to enforce market segmentation.
Those who are not happy with the reduced error detection abilities of the free Ada GNAT compiler are expected to pay for SPARK.
I think that the main reason why the use of Ada has remained restricted is that even today having access to complete Ada development tools is expensive, even if the free GNAT tools are enough when you do not need the better error detection provided by the paid tools.
So in this case C or C++ is safer than Ada (if you use GNAT) but at the price of reduced performance, funny no?
Note that at some point Zig had the same semantic as Ada, but then they changed to use C's (less efficient) semantic: this aliasing detection must be difficult to do..
Is there an online Ada compiler which detect the issue?
My Ada is somewhat weak, but IIRC Value2 doesn't alias Value1. The parameters don't work like C. It's more like an assignment once the function returns, although the compiler is free to deviate from that.
Try it: change the size of the array. With a small array, it's passed by value; with a big array, it's passed by reference and the value of "result2" is changed.
> Nothing prevents you from writing Ada code that dereferences an access type after it's been freed; However any access dereference triggers a runtime check to ensure it's non-null. Unlike in C, freeing an access type in Ada automatically sets its value to null, and any subsequent attempt to dereference it will raise a Constraint_Error exception, which can be caught and handled.
> Ada has its own nomenclature for multithreading: Tasks, and provides its own built-in mechanism for preventing data races between them. Ada's Protected Objects encapsulate data inside an implicit mutex, allowing only one thread access at a time via a public interface.
I believe Ada doesn't have a guarantee that you are accessing data thread-safely. And that lack of guarantee is where use-after-free and double-free can hide. Heck, you don't even need any threading.
1. Create an alias
2. Free the original value
3. Access the alias
If I read it correctly, the variable holding the original value is reset to null after a free but the alias can still access freed memory.
TLDR: Rust has almost no competitors when it comes to the very robust safety guarantees it makes and none when you combine with the performance niche it services.
Programming languages where you are not allowed to directly access any shared variables, but only data inside protected structures, so that complete thread safety is enforced, existed already half a century ago, e.g. Concurrent Pascal.
I have not noticed yet anything innovative in this aspect of Rust.
As explained in the article, SPARK detects such double-free or use-after-free errors, even if the standard Ada compiler does not.
The separation between Ada and SPARK in the power of detecting unsafe memory uses is just a market segmentation mechanism, not something intrinsic to the Ada language.
The SPARK development tools must be bought, while the free Ada development tools have less comprehensive error checking.
This is the main disadvantage of Ada vs. Rust. It is a question of money, not of technical qualities.
I think you’ve skipped important words that I’ve said:
> Rust has *almost* no competitors when it comes to the very robust safety guarantees it makes and *none when you combine with the performance niche it services*.
The almost is doing important work in that statement, and the none is referring to systems programming. As another commenter noted, SPARK is more than just a cost thing - it fails to scale to non-trivially-sized projects.
I’m not saying there’s nothing to learn from Ada. All I’m saying is that Rust offers significantly stronger protection and memory safety out of the box than Ada and no language has managed to deliver such memory safety with the performance profile of C and able to target all the same variety of use cases in terms of systems programming (but then also scaling to web services which C/C++ can’t really reach unless you like trivial memory exploits)
> I have not noticed yet anything innovative in this aspect of Rust.
Innovation can oftentimes be subtle, like packaging up existing ideas in a friendlier form that can get more mass adoption. If you fail to see the innovation Rust has done with the borrow checker (which afaik is truly novel), then making the entire language scale from embedded to web services in a cohesive way, and also making it a legitimate replacement for C++ in ways that other languages have tried and failed, I think you've been too dismissive.
AIUI, SPARK does not use a Rust-like borrow checker: it's based on formal verification which is a lot more difficult to program for, i.e. not really feasible except for smaller programs. This seems to be in sharp contrast to Rust - the safe subset of Rust enforces a set of rules that can actually scale up to larger programs while still preserving safety.