
Rust 2019 and beyond: limits to some growth - janvdberg
https://graydon2.dreamwidth.org/263429.html
======
jacquesm
That is a super mature post that should be read by the creators and
maintainers of any language and/or ecosystem component (libraries, frameworks
and so on).

I love the intentionally limited scope and the amount of thought that went
into writing this, and I wish I had the clarity required to be able to do
this. Especially the bit about 'negative space' got my attention; that's one
tool I'm going to put in my toolbox to re-use. And there already is one
example from a company whose founder I knew that made serious bank just on
knowing what they were not, so this advice has applicability far outside of
computer programming language design.

Thanks for posting this.

~~~
munificent
_> Especially the bit about 'negative space' got my attention; that's one tool
I'm going to put in my toolbox to re-use. _

One of the most profoundly useful things I've ever realized in my life is that
the most important choices are often about deciding what _not_ to do.
Creativity is deeply wedded to the idea of constraints and pragmatically,
there are only so many hours in the day. Deliberately deciding not to invest
in one thing is the clearest way I know to free up mental capacity for the
things I do want to put care into.

~~~
mehrdadn
I see the value of deciding what _not_ to do _for version(s) X of something_ ,
but ruling something out for all eternity seems a little... short-sighted?
Who's to say that what you're ruling out won't become the next big concern for
everyone and your technology won't be the odd one out without it?

~~~
usrusr
There are two possible ways to deal with that situation: create something new
that is the old thing plus the feature that had been ruled out for it, or
simply revoke that decision. Eternity can be surprisingly short when everybody
agrees that it is time to move on.

Both ways are perfectly possible, and both would in many cases be much
preferable to having an implicit "is it time yet for X?" on the agenda every
time the version number is incremented. "Not in the foreseeable future" is
effectively understood as a far out roadmap item, no matter how much you don't
mean it that way.

I'd rather see a humble "we thought that we would never X, but we were wrong"
repeated for many different X than a single misleading "we absolutely commit
to not adding GOTO before 2025".

~~~
lmm
> Eternity can be surprisingly short when everybody agrees that it is time to
> move on.

That relies on having a decision-making process and culture that accommodates
that. In my current job I'm stuck dealing with inadequate tools because of
tool choices made in 2002 that no-one has the political capital to revisit.

------
CJefferson
I can't agree more with this. I am a long-time C++ dev and have been to C++
committee meetings. The language is too complicated and inconsistent, and I
now believe that is unfixable. I believe no one understands it. New features
keep arriving, but you still have to learn everything that came before, for
older codebases.

For example, {} style initialisers were added to simplify and "unify" things.
Except to make a vector of length 3 you still need to use the old style (3)
notation. So now there is just one more thing to learn.

It is much easier to add "one more thing" to a language than it is to later
take it out.

~~~
aaaaaaaaaab
Let’s revisit this when Rust has...

\- the number of users

\- the amount of mission-critical legacy code that _has to_ compile

\- the number of compiler vendors

\- age

... comparable to C++’s.

~~~
paulsutter
According to that we should all be using COBOL

Did you hear they’re finally making it object oriented? The name will be ADD
ONE TO COBOL

~~~
pjmlp
Cobol has already been OOP for several years now.

~~~
adwn
> _Cobol has already been OOP for several years now._

I think that was supposed to be a joke, mocking Cobol's verboseness (compare
"C++", which adds 1 to C and which extends C with OOP).

------
sudeepj
I have written non-trivial code in Go and Rust. My situation is C++ dev -> Go
-> Rust -> Go (current day job). The shifts have been due to various
circumstances on my day job.

The thing is, when I moved back from Rust to Go, I had forgotten how productive
Go is, and I felt something like this:

[https://youtu.be/1i_Eqj7wu88?t=50](https://youtu.be/1i_Eqj7wu88?t=50) (20-30
sec video)

As someone who tried to "sell" Rust in my org, I see the biggest issues as:

1\. Steep learning curve (it is true)

2\. When a lot of people talk about #1, it gets amplified in the minds of the
people who make the calls on which tech to use

A great deal of effort has been spent on #1, and I agree the payoff is worth
it. But #2 is a perception/marketing issue.

PS: I am a huge fan of both Rust & Go.

~~~
the_clarence
Big fan of Rust as well, but this is why Go and Rust are going to coexist.
They are both amazing languages.

~~~
akavel
Personally, in my heart, I'm kinda keeping my fingers crossed for a future
where, if one needs to/has to, one uses Rust (mostly where one would use C/C++
today, including for "squeezing the last drop of performance", and especially
for high-risk, ever-present libraries like libpng; plus as an extra, probably
for highly ambitious parallelism like in Servo); otherwise, one just goes with
Go (for the amazing general productivity and readability).

Though then there's also Luna-lang, which I'm hoping even more strongly will
become, long-term, a new breakthrough paradigm/tech in programming & scripting...

On a mostly unrelated note, but continuing with off-the-sofa predictions, I'm
curious about a few more developments:

\- whether WebAsm will become the new de-facto universal architecture — at
least for VMs, but I wonder if we won't eventually end up living on purely
WebAsm CPUs at some point in the future? (Unless WebAsm has some inherently
hardware-hostile design decisions; I'm not a CPU/ISA guy, so I wouldn't know.)

\- then RISC-V: will it become the ultimate personal computer/device CPU in
the meantime? Also, what may happen if both RISC-V and WebAsm start naturally
competing in this domain eventually?

\- Fuchsia off-handedly dethroning the Linux kernel & ecosystem (and maybe
even Windows) as the sudden standard mainstream OS (and thus also, amusingly,
suddenly swinging Tanenbaum's argument back up through a semi-random caprice
of Google's deep pockets)?

Especially per the last point, I wonder if we'll eventually end up in a world
with much more... "secure"... personal devices. And if yes, will it end up
being net good or bad (or just neutral/hard to say/nuanced, a.k.a. both, as
usual) for humanity. I've recently been realizing that "secure" unfortunately
seems to be a very close sibling of, or maybe even just another face of,
"closed" — think DRM, walled-garden ecosystems, no more rooted Android or
XBOX... or even no more control at all over your PC, a.k.a. no general
computing for the masses, potentially. Though I personally stay mostly
hopeful on this front; in part because I believe people are, as a sum, too
chaotic for this to be able to happen completely.

~~~
Longhanks
“Secure” personal devices with a kernel developed entirely by Google? Sounds
like a nightmare.

------
skybrian
As a newcomer to Rust and perhaps having been spoiled by Go's very readable
specification, I was surprised that Rust doesn't have a definitive language
spec. How do the people working on Rust ensure everyone has a common
understanding of the language without one?

~~~
steveklabnik
Some stuff is quite nailed down, some stuff is not. In the end, not breaking
existing code is the most important thing.

These things are always on a spectrum. Just because a spec exists doesn’t mean
that it has no holes; Go’s spec isn’t formally proven, for example, so you
could make a similar claim: how can people know that it all works without a
proof?

The answer is that it’s all a spectrum. Many languages don’t have anything
resembling a spec at all!

(We are interested and actively working on such a thing for Rust, but we’re
shooting high. It’s gonna take a while.)

~~~
f2f
> Go’s spec isn’t formally proven, for example, so you could make a similar
> claim: how can people know that it all works without a proof?

that's moving the goalposts. let's see Rust's language spec finalized first,
then we can talk about it being proven. without either of the two you
shouldn't throw stones.

~~~
skybrian
I actually think a mathematical-flavored spec where a proof would be
meaningful would be a bad idea to the extent that it makes the spec less
readable by ordinary users. For example, Dart has a more mathematically
flavored spec and it's not readable by most people.

Of course, formalisms can still be useful, but as a matter of writing style,
maybe it's better to put that sort of thing in an appendix? (As sometimes done
for grammars.)

~~~
electrograv
One can generate an informal spec from a formal spec.

One can NOT generate a formal spec from an informal spec (or else that
“informal” spec would actually be formal, after all).

So, a formal spec is strictly better than an informal one — it enables all the
benefits of an informal spec, via the ability to generate any number of
informal specs from it in many human languages, cultures, levels of detail,
etc., and of course enables things like compiler reproducibility (which you
cannot do at all without a formal spec).

That being said, any spec is probably better than no spec.

~~~
skybrian
Since many programmers are not mathematicians, a formal spec will always have
a smaller audience than a well-written spec written in English. You cannot
automatically derive good writing from pure mathematics.

As a result, neither is strictly better than the other. They have different
audiences and serve different purposes. The audience for pure mathematics is
quite small.

------
cryptonector
> Reputation for overcomplexity, loss of users. Becoming the next C++ or
> Haskell.

I think Rust is already there. It's a concern, but I'm not sure you can or
should try too hard to avoid this.

What you can do is look to make simplifications as you add complexity.

And sometimes innovations can have simplifying effects. For example, the
various "effects" libraries for Haskell simplify I/O code compared to not
having them.

~~~
gpderetta
There are only two types of languages, those that are too complex and those
that aren't yet.

~~~
augustk
There is at least one language that evolves in the opposite direction. The
philosophy of Oberon is that "Perfection is achieved, not when there is
nothing more to add, but when there is nothing left to take away."

~~~
pjmlp
And sadly it lost its opportunity to Java in the late 90's.

So whatever Wirth did was quite interesting in terms of language research for
the language geeks among us, but hardly of any market relevance.

~~~
augustk
If you are interested, I can recommend the Oberon compiler OBNC:
[https://miasap.se/obnc/](https://miasap.se/obnc/)

~~~
pjmlp
Thanks for the heads up, I am quite aware of Oberon's ecosystem though.

You can check Oberon's history on my website.

------
Animats
It's interesting to see this critique of the Rust language's development
process. There's a lot of process. Most of it aimed at adding new features.
That may be part of the problem. The Go crowd seemed to know when to stop.

I've criticized Rust's growth here before. The big breakthrough in Rust was
the borrow checker. Finally, one could have memory safety without garbage
collection or reference counting. Huge improvement.

"Unsafe" opened too big a hole. I was looking forward to seeing more work on
eliminating almost all need for "unsafe". Backpointer support. A way to talk
about partially initialized objects. More static analysis. But that didn't
happen. Instead, feature after feature was piled on, like C++. Individually,
the features aren't bad, but cumulatively, Rust is now a very big language.

Meanwhile, C++ has been trying to retrofit Rust ownership semantics using
templates. There's too much "be very careful" associated with that.

~~~
dbaupp
All of those unsafe "hole fixes" are features, meaning your desires are for
adding even more features.

Additionally, useful back pointer support (i.e. for anything more than a toy
doubly linked list) is probably a bigger feature than anything else added
since Rust 1.0, and probably harder to work with.

~~~
steveklabnik
... and basically an open research question, whereas the features we're
talking about adding are generally less novel.

------
rossdavidh
I think that, in not only languages but anything related to code (frameworks,
packages, etc.), there is a tendency for things to overshoot because that is
likely to make those who know the most, more powerful (they know how to use
all the features), and its negative impact is on new learners, i.e. those who
have little or no voice in these decisions (for obvious and often good
reasons). Thus, it becomes a form of pulling up the ladder after you. Not that
this phenomenon is in any way limited to coding, of course.

~~~
dccoolgai
As a longtime JS dev, I sometimes cringe for this reason when I see new
features in ES. Even though I'm excited about them, when I think about
experiencing them as a learner I feel intimidated. One good point the article
makes that I wish more language designers accounted for is the fact that in a
shared language you can't just pick and choose - to reach a level of expertise
you basically have to know all of it.

~~~
chubot
Yes, I think you could know "all of" JavaScript in 2009, in the ES5 era.
Does anyone even remember that anymore? I'm shocked at how much the language
has changed.

I was experimenting with server-side JS in 2009, before node.js was released,
and I hacked on Brendan Eich's Narcissus interpreter, which gave me a good
sense of the language.

Now when I look at example JavaScript code on the web, I invariably find
something foreign. I'm not saying it's bad -- just foreign for somebody who
uses the language only occasionally.

I don't believe JS should be a language only for "full-time JS programmers". I
use 3 or 4 other languages regularly, so it's not nice when one language wants
to hog my mental bandwidth (unfortunately C++ is one of those languages, so I
know this problem well).

It seems like everybody forgot what Crockford was saying back then? He talked
a lot about the failed ES4 project (which ironically Graydon Hoare was a part
of (?)). And ES5 was very restrained as a result, and I think mostly
successful.

It seemed this idea of restraint went completely out the door soon afterward,
and nobody even remembers what was discussed 9 years ago. Crockford seems to
have moved on (as well as Ryan Dahl). I haven't followed the JS language
changes that closely, so I could be wrong, but I have the impression that it's
completely changed.

\-----

I don't think I'm alone, as I remember that Bryan Cantrill recently said that
"Brendan Eich never wrote a book about JS". In other words, the language was
left without a founding philosophy. It's been an accretion of convenient
features, some of which may be regretted in a couple years' time.

There was a Strange Loop talk about this too regarding the lack of
history/culture in JS.

It seems like the same thing may have happened with Rust, since Graydon isn't
an active contributor by his own admission.

~~~
dccoolgai
Yeah, it definitely feels like ES6/7/Next is packed with features that
theoretically make it a little faster to write code while making it
monstrously slower and harder to read for at least 80% of practitioners...
(destructuring comes to mind, with a syntax that is bafflingly similar to
literal declarations, which used to be the "good part") Seems like they made
that choice every time.

I was reserving my thoughts about ES6 until I worked with it for a while, but
now that I have, I have to say for those reasons you cited I don't like it as
much as ES5.

~~~
feanaro
Can you give an example of the aspect of destructuring you consider confusing?

~~~
dccoolgai
"Confusing" is perhaps too harsh (although destructuring over several levels
can be terribly confounding). By "hard to read" I mean primarily the fact that
when you are "scanning" code, destructuring statements are syntactically way
too similar to literal declarations. It severely degrades, IMHO, the speed
with which you can effectively "scan" code in JS. It's not really that you
_can't_ understand it when you slow down and analyze - it's that you _have to_
slow down to see the difference between "let a = [0,1]" and "let [,a] =
[0,1]". Whereas in ES5, whenever you saw the brackets you could know without
reading into the code there was an object or array being declared.

That's not to say that I think destructuring is all bad - it certainly saves
lines of code - but after working with it for a few years, I would gladly
trade it back for more scannable code, easier syntax for beginners, etc.

------
the_clarence
I ran into the Failure crate today. It was hard for me to figure out if it was
something official that was going to make its way into the core language or
just a third-party crate. If the former, should I avoid learning failure
patterns that are not using this crate? It sometimes feels like the language
is moving too fast for me to learn it.

~~~
epage
`failure` captures common error patterns in Rust and provides a test bed for
experimenting on them while working to improve `trait Error`.

There is an RFC for pulling some of the trait improvements into the language.
After that, I believe they plan to continue to iterate on the design of the
failure crate.

The general recommendation I make and see from others is that `failure` is far
from stable. Feel free to use it in applications but avoid it for libraries.

~~~
zanny
Anything that would break failure in libraries written now would break all
libraries, because it would mean the fundamental Error trait has changed.
Since failure is Error-compatible, you don't lose anything by using it. It's
only ~2500 lines of Rust.

Coinciding with the 2018 edition release I ported one of my libraries to it
that was using vanilla std::Error impls for two years and dropped about ~400
LOCs out of 600 lines of error handling. It still does all the exact same
stuff, and if you use its generic Error impl it behaves the exact same way as
the std trait.

If Failure stops getting updated... fine? It doesn't need changing if it works
right now, on the 0.1.3 release, for you. If Rust's error trait changes, it
breaks all libraries anyway and would only happen in the next edition. That's
probably three years out!

~~~
epage
> Anything that would break failure in libraries written now would break all
> libraries, because it would mean the fundamental Error trait has changed.
> Since failure is Error-compatible, you don't lose anything by using it. It's
> only ~2500 lines of Rust.

If you expose failure in your public API and failure makes a breaking change,
it is a breaking change for your API because your clients need to be on a
compatible version of failure.

~~~
zanny
Your clients don't though; you can have multiple versions of a crate in your
build tree. Your failures aren't then composable with newer failures as
failures, but they do impl Error and can be used as standard errors.

The alternative is to just use... standard errors anyway? With the
aforementioned sizable boilerplate? There's no downside to failure unless
there's a less volatile error replacement you'd use instead.

~~~
burntsushi
That you can have multiple versions of a crate is exactly what enables the
issues caused by public dependencies. It's not about multiple versions of
_your_ crate, but rather, multiple versions of the failure crate. If failure
0.2 is released (and doesn't use the semver trick), then you'll have two
versions of the Fail trait, and thus, two versions of the failure::Error type.
Madness then ensues for anyone using failure::Error (or dyn Fail in general).

failure is a victim of its own success. I think most everyone believes it's
pretty close to what we want Rust's error handling story to look like. But
it's a public dependency and everyone started using it. Now it's more
difficult to evolve.

------
fhrow4484
The Rust documentation is a bit outdated:

[https://www.google.com/search?q=rust+tour](https://www.google.com/search?q=rust+tour)

the 1st result is "A 30-minute Introduction to Rust". Going through it
requires 4 navigations from the user to get to a valid page; everything in
between is deprecated.

~~~
steveklabnik
Interesting! That page has been deprecated for something like four years...
thanks for pointing this out. I've seen people mention other pages before, but
never this one.

~~~
sondr3
I'd also like to mention that getting to the documentation for anything with
Rust is fairly annoying, mostly because a ton of the stuff out there links to
the first edition of the book, and getting from the first edition to the
current one is really cumbersome. It'd be great if you were redirected from
the old book to the latest version, instead of reaching it and only getting a
warning that it isn't the latest version. Especially since there's no easy way
to go from a first-edition chapter to the corresponding one in the latest,
it's at least three or four clicks and annoys me every single time.

~~~
steveklabnik
Can you please file bugs when you see this? I'd be happy to update them to
point in the right place.

We have some constraints that make plain redirects really hard; we cannot run
a web server or use a ton of JavaScript, for example. Maybe this situation is
painful enough for community consensus to change...

~~~
mwcampbell
> we cannot run a web server or use a ton of JavaScript

Just curious, why are those constraints there?

~~~
steveklabnik
The documentation is distributed locally, as static files, as well as being
available on doc.rust-lang.org. This is considered a big feature.

I personally would love to tell people "sorry, you have to run a basic web
server, even locally" for a few reasons, but it doesn't feel socially viable
at the moment, sadly :/

~~~
mwcampbell
Here's one way you might be able to make the case: If you require a web server
even locally, then the docs can be installed as a single file (zip archive,
SQLite database, etc.). This would eliminate the overhead of lots of small
files on at least one popular desktop OS, speeding up installation and
updates.

~~~
steveklabnik
This is already an active discussion, yep. We'll see...

------
ahartmetz
Excellent. These are things designers of other languages should have
considered but didn't. Looking at you, C++. I write mostly C++...

~~~
maxxxxx
To be fair, C++ did a lot of research that the Rust designers then could learn
from. I hope that C++ will get replaced by something simpler but it's a
phenomenal success story.

------
Ericson2314
The comparison between C++ and Haskell is laughable, and shows that Graydon is
missing the most important facet of this:

GHC Haskell and Rust both have an intermediate language (Core and MIR + Chalk,
respectively), which keeps the rest of the language in tight check. No other
mainstream language has this (Java bytecode doesn't really speak to high-level
safety properties). This ensures the vast majority of each language is
forgettable sugar, and makes consistency of various sorts tractable.

I would argue that even outside of languages as we conventionally think of
them, the lack of separating functionality into core languages/interfaces and
sugar is the central failure in letting complexity spiral out of control. And
indeed, just about everything has.

~~~
DannyBee
That is not an important facet at all. It's not a thing your users see. The
"syntactic sugar", as you call it, is the interface between you and your
users.

No amount of pretending otherwise will change that. No amount of separation
will change this. It does not give you any more ability to change the syntax
over time, or get it righter.

Your users care _only_ about this syntactic interface. In a good world, they
do not care how the rest happens in practice.

Worse than this, the idea that a mid level IR should be the "thing that keeps
the syntax in check" seems beyond broken. The lower levels do not drive the
higher level in roughly any case. Instead, they exist to serve the higher
levels. First you understand what users want at a high level, then you try to
understand how to make it fast/well. If you need to change the mid level IR to
do so, you do.

There are nothing but tradeoffs in mid level IRs, and those tradeoffs change
over time based on the needs of languages, not the other way around.

Driving a language based on what you can accomplish in a mid level IR would be
incredibly silly - "Welp, we better use this form of parallelism at a high
level because we chose coroutines in the mid level IR" should never occur.
Instead, the answer is "we change the mid level IR to best support the form of
high level parallelism we want in the language"

~~~
qznc
While I generally agree with your point, D/Walter Bright slightly disagrees
with the sentiment. One constraint for D is "must not require data flow
analysis", because that incurs a cost on compilation time. Afaik this was
mostly inspired by Java's definite assignment rule [0]. If fast compilation is
a goal, then a language is constrained by implementation and IR aspects.

[0]
[https://docs.oracle.com/javase/specs/jls/se6/html/defAssign....](https://docs.oracle.com/javase/specs/jls/se6/html/defAssign.html)

~~~
zbentley
This is a perhaps over-fine semantic distinction, but I would argue that
there's a qualitative difference between IR/language implementation and
compilation speed. The former should not drive the language's features; the
latter _is_ a language feature.

~~~
Ericson2314
IMO most discussion of compilation speed is incredibly narrow-minded. We need
_incremental_ compilation more than anything else. Programs barely change
between rebuilds, so that should solve all the performance problems without
having to maim ourselves.

------
zanny
Since when did you need to read, comprehend, and digest every language feature
in Rust to ever use the language?

The absence of HKT / GATs / overloading / default arguments from the language
doesn't make it easier for a newbie to learn. These aren't topics someone new,
or even intermediate, should ever be touching until they _need_ them. And in
the absence of having this functionality (example - we JUST got const fn this
month in stable) you have to work _around_ the weaknesses of the language in
often obtuse and cumbersome ways... such as hacking the macro system with
lazy_static.
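To illustrate the const fn point (a minimal sketch; `bit_mask` is a made-up
example, not from any particular crate): a constant that previously needed
`lazy_static` or a macro hack can now be computed by the compiler directly:

```rust
// A const fn is evaluated at compile time when used in a const context,
// so this constant needs no lazy_static or runtime initialization.
const fn bit_mask(n: u32) -> u32 {
    (1u32 << n) - 1
}

const LOW_NIBBLE: u32 = bit_mask(4); // computed by the compiler

fn main() {
    assert_eq!(LOW_NIBBLE, 0b1111);
    assert_eq!(bit_mask(8), 255); // also callable at runtime
    println!("{}", LOW_NIBBLE);
}
```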

I think a lot of this sentiment comes back to what makes a language _hard_. I
recently started learning CSS properly for once, now that Grid is in and I can
just do all my styling by hand again in the same way I can use ES6 instead of
JS frameworks. I needed Bootstrap and JQuery back in the day because despite
the simplicity of the fundamental languages there were things I _wanted to do_
that I _could not do_ in an intuitive way. So other people wrote thousands of
lines of code to provide what should have been simple and accessible, because
the languages weren't expressive enough.

On the inverse, you need some degree of intuitiveness to your design. If I
didn't read about the syntax of [<name>] col [<name>] col in CSS Grid I would
have never intuitively thought you could name Grid regions. The syntax is
completely arcane and arbitrary, isn't based on anything else common to CSS,
and just happens to be. _That_ is the source of real complexity in a
programming language, not how long the book is or how many pages long the
standard library is. Nothing worth learning or doing is so simple as to be
digested in a weekend and trying to make a programming language like that,
especially a systems one - one that has to expose all the jank complexities of
how computers actually work - would have you make a really pretty paper weight
nobody could use to get real work done.

The complexity in C++ is not in how big the template library is, or how many
glyphs the syntax uses, but in how much mental stress it is to see void* = 0
or new Foo and have to juggle all the edge cases and eccentricities of the
grammar and model on every single line. It's a massive weight on one's
shoulders, and _that_ is what will drive a newbie away, at least a newbie with
the mindset to take advantage of what they learn. Not the length of your book,
so long as all those chapters are about things people want, need, and that are
beneficial to the user, rather than warning them about all the ways they can
break everything with a single character, or about how arbitrary decisions
were made, incongruent with the rest of the language, because that's just the
way it is.

C++ and C are not complex for their expressiveness, they are complex for their
arbitrariness and lack of consistency. Same way Javascript and CSS are
complex. Same way an actual, legitimate newbie to Rust is going to have much
more difficulty processing why :: is the namespace delimiter or why you need
curly braces to deliniate scope than how having the option to declare your
function const would. Until they need const functions they don't need to touch
them and can write all the non-const Rust they want. But now they _have_ const
when they need it and can often never touch something crazy like lazy_static
again.

~~~
Ericson2314
Yes, this whole thread makes me think we have a bunch of people who think
complexity = initial learning curve. No wonder most of our industry is an
inscrutable, inconsistent mess.

~~~
leoc
Allow me to summarise the pitfalls of any sweeping 'keeping features out of
the language spec will prevent users from having to deal with the complexity
burden of those features' assumption with a story:

Once, long ago, there was a new programming language. It was lean and mean.
Partly to keep codebases understandable to all, and to make the language easy
to learn and straightforward, it eschewed both having too many features, and
having any feature that was too abstruse. People marvelled that the whole
language, standard library and all, could fit in a nutshell. The name of the
language was Java. The end.

~~~
Ericson2314
Yeah understanding the forest is more important than understanding the trees!

------
Santosh83
Rust can force a programmer to be disciplined, and that's good as far as it
goes, but sadly no current language can force (or even encourage) a programmer
to think critically at an architectural level. Undisciplined coding leads to
security issues (if not immediately), but a poor architecture can mean a
complete failure of the project to meet its goals, never mind exceeding them.

~~~
steveklabnik
One claim about Rust that's increasingly common is that its language
constraints _do_ require you to reconsider architecture. See
[https://kyren.github.io/2018/09/14/rustconf-
talk.html](https://kyren.github.io/2018/09/14/rustconf-talk.html) for example.

------
dmos62
I'd read the other two latest posts on this blog for context:

The Planning System:
[https://graydon2.dreamwidth.org/262512.html](https://graydon2.dreamwidth.org/262512.html)

More Galbraith:
[https://graydon2.dreamwidth.org/262669.html](https://graydon2.dreamwidth.org/262669.html)

------
ben0x539
Sure, we can limit the language growth, as soon as we get all the critical
features in!

~~~
cesarb
Every feature is a "critical feature" to someone. You have to know when to
stop.

~~~
galangalalgol
Async/await isn't stable yet, and that seems to be making many commenters here
wait to adopt, so at least that should make it in before they lock things
down. I'd also like platform-independent SIMD, but I can admit that it is
niche and could live in nightly for a long time without harming adoption. They
have polls to get a feel for the difference between those two categories and
which features fall into each.

~~~
burfog
Async/await appears to require runtime support, possibly even a modern OS.
Wasn't Rust supposed to be suitable for writing bare-metal code?

SIMD at least has hope of running on bare metal. I suppose the ideal is to
have it adjustable: compiled to non-SIMD, using typical SIMD hardware, using
implicit threads (and thus needing OS support), or using SIMD hardware with
threads.

I fear the high-level web developers are in the driver's seat, which may doom
Rust as a systems language. I'm hesitating on Rust because I would miss
bitfields and various unsafe performance features. I like the opt-out safety
of the "unsafe" keyword, but I want more things that I can opt out of. I want
goto, computed goto, no-default switches, something like Microsoft's __assume
keyword, and all the other low-level stuff you'd want for making boot loaders
and high-performance kernels.

~~~
kam
> Async/await appears to require runtime support, possibly even a modern OS.
> Wasn't rust supposed to be suitable for writing bare-metal code?

You're probably thinking of Tokio, the library for async network IO, which
obviously depends on an OS. The async/await language feature compiles down to
basically a state machine and has minimal runtime requirements. I'm looking
forward to using it on bare-metal microcontrollers.
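As a rough illustration of that point (a hand-rolled sketch, not the
compiler's actual desugaring; all names here are invented), an async fn boils
down to an enum of suspension states that gets polled forward, which needs no
OS support:

```rust
// A hand-written version of the kind of state machine an async fn
// compiles to. No threads, OS, or allocator required: just a pollable enum.
enum AddTwoSteps {
    Start(u32),
    AddedOne(u32),
    Done,
}

enum Poll<T> {
    Ready(T),
    Pending,
}

impl AddTwoSteps {
    // Each call advances one step, like an executor polling a future.
    fn poll(&mut self) -> Poll<u32> {
        match *self {
            AddTwoSteps::Start(n) => {
                *self = AddTwoSteps::AddedOne(n + 1);
                Poll::Pending
            }
            AddTwoSteps::AddedOne(n) => {
                *self = AddTwoSteps::Done;
                Poll::Ready(n + 1)
            }
            AddTwoSteps::Done => panic!("polled after completion"),
        }
    }
}

fn main() {
    let mut task = AddTwoSteps::Start(40);
    loop {
        if let Poll::Ready(v) = task.poll() {
            assert_eq!(v, 42);
            break;
        }
    }
    println!("done");
}
```

In a real embedded setting, the "executor" polling this can be as simple as a
bare loop or an interrupt handler.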

------
fpoling
Do other languages or libraries have anything like the proposed anti-feature
RFC that bans adding features in a particular direction?

Go with its ban on generics that they hold for 10 years comes to mind.
Anything else?

~~~
dullgiulio
Go never banned generics; it just set a very high bar of integration with the
rest of the language.

The idea is that Go is one language with a very specific personality: if it
changes that much, it just becomes another language and might as well be
called something else.

------
MichaelMoser123
Any shop that is considering Rust would need to consider the question of how
to hire or train a sufficient number of programmers. With any mainstream
language you have a big pool of programmers up for hire; my guess is that this
is more of a problem with a new language like Rust.

Maybe that's the reason why they hyped Java to such an extent when it was a
new platform; I think Sun understood that it is quite difficult to gain
acceptance as a mainstream player in this game.

------
Zelmor
>Magnifying inequality: only the most-privileged, available, energetic, well-
paid or otherwise well-situated participants can keep up.

>privilege this privilege that

Shit like this is why I will never touch this language.

