
Rust Moving Towards an IDE-Friendly Compiler with Rust Analyzer - sethev
https://www.infoq.com/news/2020/01/rust-analyser-ide-support/
======
_bxg1
> Another thing is the difference in handling invalid code. A traditional
> compiler front-end is usually organized as a progression of phases, where
> each phase takes an unstructured input, checks the input for validity, and,
> if it is indeed valid, adds more structure on top. Specifically, an error in
> an early phase (like parsing) usually means that a later phase (like type
> checking) is not run for this bit of code at all. In other words, "correct
> code" is the happy case, and everything else can be treated as an error
> condition. In contrast, in an IDE the code is always broken, because the
> user constantly modifies it. As soon as the code is valid, the job of the
> IDE ends and the job of the batch compiler begins. So, an IDE-oriented
> compiler should accommodate incomplete and broken code, and provide IDE
> features, like completion, for such code.

This has been one of the most frustrating things about Rust's development
experience. A piece of code looks error-free, then you fix an error somewhere
else, reach the next compiler phase, and an error pops up where you just were.
It makes it really hard to develop something piece by piece.

I'm glad they're aware of these problems and are making a concerted effort to
address them.

~~~
Buttons840
> A piece of code looks error-free, then you fix an error somewhere else,
> reach the next compiler phase, and an error pops up where you just were.

Isn't this the case with all compilers? The most basic example being (for
example) I have a syntax error on line 10, and that's the only reported
problem, but then I fix the syntax error and now I have an error on line 200.

I agree it is a problem, and the world would be better if all languages could
avoid it, but I don't expect it to ever go away. Do you believe this problem
is better or worse in Rust compared to other languages?

~~~
the8472
Eclipse's ECJ will keep compiling a class even when a specific method is
invalid.

And even inside a method it'll try to massage the AST into a somewhat valid
state while you're typing, so it can keep analyzing the code and showing
errors/making suggestions. A missing bracket on line 15 won't prevent it from
soldiering on to spot a misspelled method call on line 40, even though the
nesting is totally messed up and, taken at face value, line 40 would be
syntactically invalid since it's outside the class body.

~~~
usrusr
And it doesn't just march on to identify errors elsewhere, it can often
continue all the way to runnable bytecode. The invalid parts will just be
replaced by exceptions that parrot the compiler error. Meanwhile other code
paths can already be exercised which can be very handy at times.
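For illustration, here's a sketch of how a class compiled that way behaves. This is an illustrative equivalent, not ECJ's actual generated bytecode, and the error message text is a guess at the general shape ECJ uses:

```java
// Sketch: what ECJ's output behaves like when one method fails to compile.
// The broken body acts as if replaced by a throw that repeats the compiler
// error, while the rest of the class stays runnable and can be exercised.
class PartialCompile {
    static int working() {
        return 42; // this path compiled fine and runs normally
    }

    static int broken() {
        // Stands in for a method whose real body had a compile error.
        // Message text is illustrative; real ECJ output differs in detail.
        throw new Error("Unresolved compilation problem: foo cannot be resolved");
    }

    public static void main(String[] args) {
        System.out.println(working());
        try {
            broken();
        } catch (Error e) {
            System.out.println("broken() threw: " + e.getMessage());
        }
    }
}
```

So you can run and debug `working()` long before `broken()` ever compiles; you only hit the error if execution actually reaches the bad method.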

~~~
gmueckl
On the flipside, I have managed to actually run non-compiling Java classes
because of that a couple of times, only to stare at the debugger with wide
eyes when I realized what was going on.

------
dunkelheit
Aleksey is true to his "avoid success at all costs" strategy, constantly
emphasizing that rust-analyzer is experimental and alpha quality. Even the
installation process is, I think, deliberately clunky. In fact, in my opinion
it already provides a better IDE experience than RLS and is improving daily,
whereas RLS is stuck in maintenance mode.

~~~
matklad
Being alpha quality and providing better experience than RLS are not
incompatible.

rust-analyzer is definitely alpha-quality at the moment.

~~~
dunkelheit
Sure, it is a good idea to be upfront about the current state of the project.
Still, I think I am not mistaken in my impression that you don't wish for
rust-analyzer to become popular too early, as that would put some unwanted
constraints on experimentation.

BTW, thanks for the project and good luck! I use it daily and it is awesome.

~~~
matklad
> Still I think I am not mistaken in my impression that you don't wish for
> rust-analyzer to become popular too early as that would put some unwanted
> constraints on experimentation.

You are not mistaken indeed. More specifically, I want to keep the freedom to
push completely broken code. It's not necessarily about popularity, it's about
user expectations. And I also don't allocate as much time as I could to things
that directly increase popularity. For example, the "install the extension
from the marketplace" experience could have been done almost a month ago, but
I still haven't gotten to it. It's not that I am deliberately pushing back
against such improvements, I just don't push them forward too actively.

------
rictic
What would a compiler for a language like Rust or C++ look like if it were
optimized for development iteration speed first? Could you produce binaries
that were good enough to be useful for testing and developing?

You'd end up sharing a lot of the same infrastructure that you'd want for an
LSP server, basically to do as much preprocessing as you could on the initial
compilation and then work incrementally on top of that.

You'd also of course still want your globally-optimizing batch mode compiler
for production, but for many projects it seems like such a thing could produce
good enough binaries for development, and greatly improve engineer
productivity.

~~~
badsectoracula
It'd look like Delphi, I'd guess, at least the earlier versions. Delphi 2 is
stupidly fast: almost-instant compilation on a contemporary PC running at
around 200MHz with 16MB of RAM, and practically instant (in that it is
impossible to notice) on any modern PC (a synthetic benchmark I wrote some
weeks ago had it compile a bit above 10MLOC in 5.56 seconds on my 3700X, an
average-priced consumer CPU, on a single thread).

While it doesn't have the niceties added in later versions and in Free Pascal
(like dynamic arrays, generics, anonymous functions and a few other things) it
still has more than enough features to create big complex applications (e.g.
object oriented language, rich RTTI that allows automatic serialization,
properties, native string support, optional reference counting, real module
support).

In terms of generated code... well, it isn't exactly great. It came near the
bottom of this benchmark I wrote some time ago
([http://runtimeterror.com/tools/raybench/](http://runtimeterror.com/tools/raybench/))
that compares some C and Pascal compilers; I mainly wrote it for retrocoding,
but threw in some modern compilers as well. (EDIT: I misremembered, it didn't
come last; Borland C++ 5.0 did. In fact it came slightly above Free Pascal
3.0.4 with its default settings, though by enabling 64-bit and modern
instructions FPC was able to generate much faster code.)

But then again, it is written in itself and, as I wrote, it is stupidly fast.
So it does the job perfectly fine for most tasks, as long as you don't need
brute-force number crunching.

Of course I'm not advocating the use of Delphi 2 (and I do not know how it
compares with modern Delphi; the last time I tried out the free version they
offer, the entire IDE felt very weird and sluggish). However, the fact that it
exists is proof that you can have a very fast compiler and IDE for a rich
language that produces very good code: maybe not the best possible code, but
good enough for a large majority of tasks.

~~~
Certhas
I started out with Pascal. Compilation was such a non-event... Subjectively,
entire programs compiled faster than a single line of Julia takes today. Then
again, if I remember correctly, Pascal the language was designed with ease of
compilation in mind. My compiler today has to do a lot more work so I can do a
lot less...

~~~
badsectoracula
Yeah, although Delphi/Object Pascal, even in its Delphi 2 incarnation, is a
much more complicated beast than Wirth's Pascal. But I think most of the
extensions that were introduced over the years stayed in the same spirit of
easy compilation and parsing.

------
xiphias2
It would be great to have a timeline as well: when does the project plan to
supersede the current RLS? What's the goal for the end of 2020?

At the same time, I know how hard it is to estimate projects.

~~~
nindalf
From what I've read, the maintainers don't yet want to discuss when rust-
analyzer will be the recommended IDE experience.

As a user, I can say it's pretty good and you should use it today if you're
writing Rust code. So to me, it doesn't matter whether the docs recommend it,
or if it's distributed by default with the compiler. I just use it and tell
everyone I know to use it.

------
unnouinceput
Quote: "The main thing is that the command line (or batch) compiler is
primarily optimized for throughput (compiling N thousands lines of code per
second), while an IDE compiler is optimized for latency (showing correct
completion variants in M milliseconds after user typed new fragment of code)"

And then there's Delphi, doing both.

~~~
int_19h
Delphi could pull this off by being a language designed to be easy to parse
(and often unnecessarily restrictive or verbose as a result). This goes all
the way back to Turbo Pascal - while Borland's DOS C compilers were also very
fast, their Pascal was unbeatable because of very straightforward parsing, and
true separate compilation of units.

~~~
unnouinceput
*can pull this off. To this day Delphi is stupidly fast when compiling,
despite lately being bloated with stuff it didn't really need. But hey, the
C/C++ standard is all the rage, so why not.

------
Dowwie
I am so grateful for the effort that people are putting into the rust-analyzer
project. It keeps getting better. I wonder whether an entire class of
challenges the project faces can be attributed to Electron?

~~~
nindalf
I use rust-analyzer and I don't think it has any issues related to Electron.

------
CameronNemo
I have been using racer and neomake with a lot of success, fwiw.

