Rust is hard to get started with, but once you reach optimal development speed, I can't see how you can go back to any other language.
I have a production application that runs on Rust. It has never crashed (yet), it's rock solid, and it doesn't require frequent restarts of the container due to memory leaks or whatever. Every time I need to fix something, I can jump into the code after weeks of not touching it and be confident that my changes won't break anything, as long as the code compiles (and is free of logical bugs).
I can't say the same about any other language, and I use a few of them in production: NodeJS, Ruby, Python, Java. Each of them has its own quirks, and I'm never 100% confident that changes in one place in the code won't cause harm in another place, or that the code is free of stupid bugs like null-pointer exceptions.
I like writing C and still write C from time to time, but I've migrated to Rust for my hobbyist embedded projects and I just cannot imagine going back. It took me some time to work out how to get Embassy working smoothly, but now I just have a project template that's (for my purposes) absolutely rock solid. Rust and Cargo made embedded fun again for me.
Basically all of my struggling now is entirely based around hardware specifics rather than ironing out kinks in my code and I'm into that.
100% agreed. After writing Rust as my day job and using it in production for the last 2 years, my only criticism is the lack of a good high level standard library (like Go) and the insanity and fragmentation of Future type signatures.
Though at this point, I wish I could write everything in Rust - it's fantastic
Also, I can't wait to (practically) use it as a replacement for JavaScript on the web
It's great for web servers and build tools but has work to do on the client side. It's difficult or impossible to write GUI applications in Rust (despite that being a great use case via macros and fearless concurrency)
Re: noarch, emscripten-32, and/or emscripten-wasm32 WASM packages of Rust on emscripten-forge, from a couple of days ago. [1][2]
emscripten-forge is a package repo of conda packages for `linux-64 emscripten-32 emscripten-wasm32 osx-arm64 noarch` built with rattler-build and hosted with quetz:
https://repo.mamba.pm/emscripten-forge
Evcxr is a Rust kernel for Jupyter; but jupyter-xeus is the new (C++) way to write JupyterLite kernels like xeus-, xeus-sqlite, xeus-lua, xeus-javascript
First of all, even if C++ is not 100% safe, it is the go-to language when I want to delve into OpenJDK, CLR, V8, GCC and LLVM internals, and it is the language everyone speaks when talking about graphics programming and compute.
Adding something else to the mix doesn't help; it only hinders the reception from other folks, making you the strange dude who wants to spoil their party.
Secondly, if I want to use an ML-style type system, I have plenty of options in managed languages.
Naturally, as a computer nerd, I play with all programming languages, including Rust.
Which doesn't mean I have any business case to push for it.
I do wonder how much of this is "rust" vs "people who pick rust".
Like nothing stops you from littering your codebase with `.expect("TODO: Fix Later")`. And this isn't just your own code; it applies to your dependencies too, with Hyper returning a Result instead of throwing the equivalent of a RuntimeException.
There's a common theme in posts like this where the writer knows _too much_ about programming and tries to do things that are awkward in Rust, but a lot of these problems don't actually come up in practice. Sometimes I feel like they take it as a personal challenge to never ever use .clone() and, like -- it's okay -- use .clone(), nobody else is going to know.
I'm maintaining 4 Kubernetes operators written in Rust with a team of devs who are _not_ brilliant developers, and it's all async, and once I set up the patterns, they were able to jump in and start contributing business logic immediately.
Is it the most performant possible implementation? Definitely not! But once I told them how to stop using unwrap() everywhere, they have literally _never_ crashed or panicked in over a year. When we were working in Go, nil pointer exceptions were the bane of our existence.
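The pattern being described can be sketched in a few lines (a hedged illustration; `parse_port` is a made-up helper, not from any of the projects mentioned): instead of unwrap(), you propagate errors with `?` and decide what to do with them once, at the boundary.

```rust
use std::num::ParseIntError;

// Propagate the error with `?` instead of panicking with unwrap().
fn parse_port(s: &str) -> Result<u16, ParseIntError> {
    let port: u16 = s.parse()?; // returns Err to the caller on bad input
    Ok(port)
}

fn main() {
    // An unwrap() here would crash the whole process on bad input:
    // let port: u16 = "not-a-port".parse().unwrap(); // panics!
    match parse_port("8080") {
        Ok(p) => println!("listening on {p}"),
        Err(e) => eprintln!("bad port: {e}"),
    }
}
```

Because the Result type is part of the signature, the compiler forces every caller to acknowledge the failure case, which is how a team can go a year without an unhandled panic.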
At the risk of making this another Go vs Rust comment, I hesitate to mention this, but Go developers copy structs all the time. We do it mostly as a way to make values immutable (and to avoid nil checking all the time). It is mostly not a problem. If something is a hot path, we'll use a pointer.
Whenever we pass by value as opposed to by address, Go makes a copy. Use .clone() in Rust; it's fine most of the time. Optimize it out if you find it's a performance bottleneck.
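A minimal sketch of that advice (the `Config` type here is hypothetical): clone to satisfy the borrow checker first, and only reach for references if profiling says the copy matters.

```rust
// A small owned type; deriving Clone makes explicit copies cheap to write.
#[derive(Clone, Debug, PartialEq)]
struct Config {
    name: String,
}

// Takes ownership of its argument, like a Go function taking a struct by value.
fn consume(c: Config) -> String {
    c.name
}

fn main() {
    let cfg = Config { name: "prod".to_string() };
    // Clone instead of fighting lifetimes; optimize later if it's a hot path.
    let copied = cfg.clone();
    println!("{}", consume(copied));
    println!("{:?}", cfg); // the original is still usable afterwards
}
```

The difference from Go is only that the copy is spelled out: `cfg.clone()` is visible in the source, where Go's pass-by-value copy is implicit.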
I always thought that using Rust for Kubernetes controllers / operators would be going against the grain a little (since almost everything in that ecosystem is golang). I'm interested in why your team ended up switching to Rust and how it is working out for you.
This is a good rundown of the pain of rust development. I had imagined that I'd reach a nirvana of rust dev where I could just write the code and feel like I knew what I was doing.
This ultimately never happened after N learning attempts and M side projects. A nascent startup idea died due to the time spent on rustisms.
I don't doubt that some people experience this zen, and I'd take rust over c/cpp every day for a new app if that was my choice… but that's not really my only choice.
The article also nails the performance topic: while Rust gives you all of the insight and control over what your code is doing, it's exceptionally rare that all of the libraries and modules in your program agree enough on these interfaces to not end up with memory copies everywhere.
It's little appreciated how revolutionary the garbage collector is, in that it probably contributes more to software reuse than every other innovation put together.
If you were trying to please everybody in C, for instance, a library should be able to accept a buffer from the application, or allocate a buffer using a malloc/free passed in from the application, as well as many other scenarios.
For very simple libraries it is not complicated but as libraries and applications get more complex the library can never be sure if the application is done with a buffer and the application can never be sure if the library is done with it. For a framework, which calls the application, it gets worse.
However, the garbage collector always knows. Memory allocation is a global problem, but Rust tries to treat it as a local one. That can be solved for any particular program, but it gets harder and harder as programs get larger, and if you want to compose a program out of independent parts it is intractable: sure, you can Rc all the things (no cycles!) and copy a lot, but then you lose many of the reasons why you gave up on the garbage collector.
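The "Rc all the things" fallback looks something like this sketch (the library/application split is illustrative, not real API): shared ownership does solve the "who frees the buffer" question locally, but Rc is reference counting, not a tracing GC, so cycles would leak.

```rust
use std::rc::Rc;

fn main() {
    // A buffer shared between an "application" and a "library",
    // with neither side needing to know who frees it.
    let buf: Rc<Vec<u8>> = Rc::new(vec![0u8; 1024]);
    let library_view = Rc::clone(&buf); // the library keeps a handle
    let app_view = Rc::clone(&buf);     // the application keeps another

    // Three strong references are alive: buf, library_view, app_view.
    assert_eq!(Rc::strong_count(&buf), 3);

    drop(library_view);
    drop(app_view);
    // The count falls back to one; the buffer is freed when buf drops.
    assert_eq!(Rc::strong_count(&buf), 1);
}
```

This is exactly the trade-off described above: the count answers the ownership question, but you pay per-handle bookkeeping and still can't express cyclic structures without Weak references.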
FORTRAN lumbers on because its memory model is simpler than C and many other languages so it runs faster.
If you write Java programs that allocate a few huge arrays and don't allocate anything after that they run pretty fast too.
I wrote a Python chess program in two weekends that was good enough to beat a serious beginner, but that I didn't feel I could take to the chess club because it couldn't respect time control the way it was written. I started rewriting it in Java and realized that if I wanted decent performance I had to not allocate anything in the inner loop. Transposition tables using Java hashtables are a net loser at the depths it is working at; I really ought to write something that works at the byte-array level off-heap, but it's not that hard of a project because, like typical HPC code, it's not really that complex/complicated.
This is pretty much it. When you are at HPC performance requirements, the Rust pain doesn't matter too much. But... you can also get decent perf in any language by being careful about allocations. The lack of GC provides at most a 2x benefit over a reasonably well-optimized Java program in this setting.
Some people thought Java was the new COBOL as early as the early 2000s, but it was one of the first computer languages designed and specified by adults. (I'd add Common Lisp to that list: if you look at the CL spec and the Java spec, you realize they are both nicely structured, with none of the strange circularity that plagues The C Programming Language.)
Threads in Java are just great; you really can write programs that get a 14x speedup on a 16-core machine. Web servers can use threads for managing concurrency and for parallelism in handling individual requests. I look at how much pain people go through with async in Rust (another global-local problem where they'll be pushing that bubble around under the rug until they give up) and contrast that with the ease of threads in Java, so long as you understand all the primitives Java offers for dealing with common problems and don't get seduced by the false promises of Fork-Join, Scala actors and such.
Even in HPC, unless there is some surprising wind of change, Chapel has many more adoption opportunities than Rust will ever have.
Just as GC/RC languages are now considering how to introduce linear/affine types and effects into their type systems in ways that aren't too alien, Chapel is being designed from the start with a similar approach.
The productivity of automatic resource management most of the time, with the knobs of an advanced type system for when it actually matters.
To me, Rust is a suboptimal choice for a lot of other use cases.
Automatic memory management should be embraced 90% of the time, but it is not, due to cargo-culting (no pun intended; Rust's cargo is awesome) and superstition. I mean, people successfully did "systems programming" in GCed languages decades ago. They were lucky nobody told them that it is not achievable.
Besides Python, which I have to use professionally for its numpys and matplotlibs, I personally settled on Go for my own purposes, with all its warts and caveman-like design, even though I prefer a lot of Rust's stuff (error handling, type system, FP-like constructs). D and C# came pretty close; as languages alone they are much better than Go, but the tooling and ecosystem in D's case, and the still somewhat limited AOT capabilities in C#'s case, made Go the winner for me.
While I am not a big fan of Go's design decisions, I am a cheerleader for the achievements of TinyGo and TamaGo, and for having bootstrapped the whole compiler toolchain and runtime.
Likewise, even though I dislike the outcome from Android Java, I celebrate that Android has proven what Longhorn could have been if DevDiv and Windows division actually collaborated.
I also share the same opinion regarding Rust: its sweet spot is places where any kind of automatic resource management isn't wanted, or where, even if it were possible, it would be a quixotic battle trying to change the minds of the target audience.
I'm currently using Rust for my saas product (very much a b2b web platform + api). It has been great so far. The only pain point is the change > rebuild > refresh flow is a bit slow (5-8 seconds to rebuild).
It isn't ideal, but it comes with a number of positives. I am able to change code across the whole codebase with a high degree of confidence: when it compiles, it's working. Even in my html templates.
Previously, my product was written entirely in Go. The change > rebuild > refresh flow was better in Go, but I had less confidence with code changes (especially around go's templates and runtime errors).
IMO, it is just as good as Java and C, but Rust people will say Rust is better than both.
But the standard is COBOL: for business applications you need predictable fixed-point math. By that I mean, if you divide 1/3 you get .33 instead of .33333333333333333, depending on how the numeric item is formatted. Or, maybe a better example, for 1/6 you would get .17 instead of .166666666.
Does Rust have that ability? I do not know. Some bolt-on libraries exist for C, but they are a bit harder to use, IIRC.
I have never seen any COBOL code but I'm surprised this works in a general sense even for currency related calculations due to the loss of accuracy.
In any case, it seems that it would be relatively easy to implement COBOL compatible numeric processing in any language in a library (and maybe one case where operator overloading is a feature instead of a bug).
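A minimal sketch of that idea in plain Rust (the `Fixed` type is entirely hypothetical; in practice you would likely reach for a crate such as rust_decimal rather than roll your own): values are stored as scaled integers, and division rounds half-up at a fixed number of decimal digits, COBOL-style. Positive values only, for brevity.

```rust
use std::ops::Div;

// COBOL-style fixed-point value: `raw` holds value * 10^scale,
// so scale = 2 keeps exactly two decimal digits (hundredths).
#[derive(Clone, Copy, Debug, PartialEq)]
struct Fixed {
    raw: i64,
    scale: u32,
}

impl Fixed {
    fn new(units: i64, scale: u32) -> Self {
        Fixed { raw: units * 10i64.pow(scale), scale }
    }
}

// Operator overloading: `fixed / integer` rounds the way a formatted
// numeric item would, instead of keeping an endless fraction.
impl Div<i64> for Fixed {
    type Output = Fixed;
    fn div(self, divisor: i64) -> Fixed {
        let guarded = self.raw * 10 / divisor; // keep one extra guard digit
        Fixed { raw: (guarded + 5) / 10, scale: self.scale } // round half-up
    }
}

fn main() {
    let one = Fixed::new(1, 2);
    println!("{:?}", one / 3); // raw = 33, i.e. 0.33
    println!("{:?}", one / 6); // raw = 17, i.e. 0.17
}
```

This matches the 1/3 → .33 and 1/6 → .17 examples above, and it is one case where overloading `Div` arguably clarifies rather than obscures: the rounding rule lives in one place instead of at every call site.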
I've been waiting for the Rust ecosystem to stabilize, but reading through this I thought "OK, they still haven't had their Django moment".
Go is weird and arcane (and oddly cathartic if you were exposed to Plan9), but it has so much less cognitive overhead that the most likely direction (for me, at least) if I need something else is back to C++ and not Rust.
I like Rust, however for me when not doing C#, Java, Typescript, it is also C++.
Mainly because when working with those ecosystems, runtimes and native libraries, when playing around with LLVM, or doing GPU coding, it is all about C++, unless I would like to build an ecosystem instead of actually implementing what I wanted to do in the first place.
Also because for better or worse, I have the scars already.
Perhaps an interesting question is, can a programming language be created that works well from low level operating system use cases to business apps? Or, is it fundamentally necessary to have multiple languages to cover different domains?
I think Swift gets very close to this ideal, but there is still an Apple-only stigma associated with it despite it having support for Linux, Windows, and embedded systems.
We mostly use react-router (or remix for older things that haven't been upgraded yet).
So the way that it works is, you define your api in dropshot. You ask it to generate an openapi document. You run this on that document to get a typescript client. And then you use that client in your react-router application.
I'm personally using it in the "framework" mode, so I have a "backend for frontend" going on, but the main application is using it purely on the client, served from the same server as that API. Both have pros and cons.