
Rust required to build Gecko - steveklabnik
https://groups.google.com/forum/m/#!msg/mozilla.dev.platform/Gzwh1IbxvHE/7RlO21I6DwAJ
======
sandyarmstrong
> Rust language in Gecko is an important part of Project Quantum.

For anybody who missed this, Project Quantum is a Mozilla project to
dramatically improve Gecko. Part of this project is to bring in Servo
components like its CSS engine and WebRender, hence the Rust dependency.

More awesome info:

[https://wiki.mozilla.org/Quantum](https://wiki.mozilla.org/Quantum)

[https://medium.com/mozilla-tech/a-quantum-leap-for-the-web-a...](https://medium.com/mozilla-tech/a-quantum-leap-for-the-web-a3b7174b3c12#.9tuvysk2r)

[https://billmccloskey.wordpress.com/2016/10/27/mozillas-quan...](https://billmccloskey.wordpress.com/2016/10/27/mozillas-quantum-project/)

~~~
harisamin
Awesome!

------
userbinator
Gecko already depends on a lot of things, so adding Rust might not be that
big of a change, but IMHO a long list of dependencies and a complex build
process are what really put off those wanting to contribute to browsers
(among other software) by fixing bugs or anything else. I have personally
experienced a few times where it was easier and faster to just patch the
binary than to figure out how to compile everything from source and go
through the whole configure/install/etc. process again.

This is especially true of browsers: not everyone agrees on how they should
behave, so people want to customise them, only to find that the option to do
so has been removed and a source change is required. They're initially
delighted to learn that the browser is open source, so they should be able to
make the change easily, but then they get overwhelmed and give up once they
realise the effort needed just to build an _unmodified_ version of the
software themselves. They fall back to merely complaining on the Internet and
reluctantly accepting their "fate"... somehow, I feel like some of the visions
of open source didn't quite turn out as well as hoped.

~~~
necessity
Uh, yes, it is a big change; Rust is huge. I for one am dropping this
bloatware for a WebKit browser once the changes hit Gentoo.

~~~
shakna
V8 seems a bit more bloated to me than what comes bundled with Firefox.

~~~
Matthias247
Maybe yes, maybe no. But Webkit doesn't use V8 anyway :)

~~~
shakna
True, I may have assumed they meant Blink, and by extension Chromium, which
isn't particularly easy to separate from V8, even though one is the rendering
engine and the other the JS engine. Sorry for leaping to conclusions.

But they may have meant Midori or similar, which actually uses WebKit through
WebKitGTK+... which means linking against GTK+, probably GTK+3 to get the main
benefits, like a crash taking down a single page rather than the whole
application.

Which, depending on your custom build, may or may not be an absolutely huge
overhead.

It could be a tiny addition to an OS. Or a huge one.

------
echelon
Congrats Mozilla! It's fantastic to see Rust becoming a cornerstone of
Firefox. Rust has such a bright future ahead, and it's going to lead to great
productivity and safety gains.

~~~
newscracker
> Rust has such a bright future ahead

I believe it'd be mutual - Firefox would have a bright future because of
Rust...and vice versa.

------
shmerl
Does Mozilla plan a new Firefox based on Servo, or simply to replace parts of
Gecko with Servo parts? And if the latter, why not make a new browser from
Servo as a parallel project that will eventually match Firefox in
functionality?

~~~
Vinnl
Matching Firefox in functionality is quite an undertaking, so for the short to
mid term, the latter is far more likely to be successful.

That said, I think Servo has a bright future ahead as well, even before the
long term. For example, matching the functionality of embedded rendering
engines for hybrid apps is far more likely, and Servo has its speed as a major
advantage over the competition. And who knows, it might make it into Firefox
for Android :)

~~~
cesarb
I tried Servo a few days ago, and it matches my experiences with the single-
digit milestone releases of Mozilla
([https://en.wikipedia.org/wiki/History_of_Mozilla_Application...](https://en.wikipedia.org/wiki/History_of_Mozilla_Application_Suite#Release_history)):
I could easily make it crash in just a few minutes.

Back then, it didn't take long until Mozilla was stable and usable enough that
I could use it as my main daily browser, and I expect the same to also happen
with Servo. Of course, Firefox is far ahead in functionality, but Servo
doesn't need all of Firefox's features to be successful.

~~~
Manishearth
> Servo doesn't need all of Firefox's features to be successful.

The problem is that the modern web is complicated. There are a lot of
features, like SVG, which aren't used pervasively but are used enough that
missing them impacts the experience.

As for Servo crashing, we don't really prioritize crash fixes since it's not a
product at the moment. But please do file bugs for it if you think it's not a
known crash.

~~~
cesarb
Already filed,
[https://github.com/servo/servo/issues/14575](https://github.com/servo/servo/issues/14575)

(I was going to the Wikipedia article on "Animated GIF" because I wanted to
see how well Servo worked with animated images, and I knew I'd find one there.
Sadly, it panics every time.)

~~~
Manishearth
Doesn't seem like it's a bug with animated gifs, actually (though it could
be).

But you're right, we don't support animated gifs.

~~~
cesarb
It's worse: from the error message, it seems to be a bug with floats. From my
experience editing Wikipedia, floats tend to be a pain. Do you know why the
[edit] link on section headings is on the left? It used to be on the right, as
a float, and its interaction with the right-aligned images (which are also
floats) often caused problems. I've blanked out several of the templates we
had to use to try to make floats behave themselves.

And there was also that Mozilla bug where IIRC some kind of rounding error
caused 1-pixel misalignments with floats. It took a long time until that one
was fixed.

~~~
Manishearth
Yeah, floats are in flux right now in Servo. I'm not sure if pcwalton's new
stuff is completely finished. I'd suspect the edit issue is also a float bug
or just a CSS property not being implemented.

------
Animats
Looking forward to the day when builds no longer require a C/C++ compiler.

~~~
Null-Set
Rust itself depends on Clang to build the LLVM backend it runs on top of, so
you can't get away from C++ that easily.

~~~
lambda
If Gecko is ever able to get rid of all of its C++, it won't happen for quite
a while; that would be an enormous undertaking. I would expect that if Rust is
ever successful enough for that to be feasible, by then there would be
multiple different Rust implementations, and some of them may be written in
pure Rust.

There's no fundamental reason that the Rust compiler has to be dependent on
LLVM. It's just a good strategic decision as it allows the Rust developers to
focus on the parts of the compiler that are unique to Rust, and use a well
tested backend with lots of existing optimizations and targets for the parts
that aren't particularly unique to Rust. There is actually some discussion
already of using a faster, pure-Rust backend for debug builds, and maybe far
in the future for release builds as well ([https://internals.rust-lang.org/t/possible-alternative-compi...](https://internals.rust-lang.org/t/possible-alternative-compiler-backend-cretonne/4275)).

------
sayrer
Great! Now use a build system that downloads the compiler too. For example,
Bazel will download a copy of the Go compiler if you are using it to build Go
programs.

~~~
dozzie
Actually, no, please don't. Build system _should not_ download shit from the
internets. Build system should only _build_.

~~~
berdario
I've heard this refrain before: it seems to make sense, but I cannot really
justify why "build systems should only build" should be the case.

What's the advantage over a build system that downloads the dependencies, but
gives you the ability to prefetch them all (so you can reliably work without
connectivity after prefetching)?

~~~
dozzie
To clarify: I'm not against managing (downloading) dependencies. I just want
two separate steps: [download things] and [compile things].

This separation allows building the project offline, which is not an uncommon
scenario. There are environments where direct internet access is prohibited,
e.g. by company policy. There are environments where it is difficult (e.g.
behind a proxy). And there are distribution package builders (RPM, DEB), which
have a policy of only working from local sources.

And then there is _build reproducibility_. If the external network is
involved, the whole reproducibility idea goes out of the window. Remember the
left-pad farce? Part of the cause was the idiotic split into
microdependencies, but part was that everybody used the external network in
their build process.
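
For what it's worth, Cargo (Rust's own build tool) exposes exactly this split
today; a minimal sketch, assuming an ordinary Cargo project:

    $ cargo fetch            # [download things]: resolve and cache all dependencies
    $ cargo build --offline  # [compile things]: build without touching the network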

~~~
berdario
> To clarify: I'm not against managing (downloading) dependencies. I just want
> two separate steps: [download things] and [compile things].

Fair enough, but if you have outdated dependency artifacts, it doesn't make
sense to compile without fetching them, so I think it's reasonable to make the
second step depend on the first (and thus have it execute automatically).

I understand that keeping the two too close might inadvertently conflate the
two concerns (the build step becomes impossible to run without the download
step) even if the design goals explicitly intended them to be runnable
independently, so this might be an argument for "builds should only build",
but I'm not sold on it yet.

> And then there is build reproducibility. If external network is involved,
> the whole reproducibility idea goes out of the window. Remember the left-pad
> farce? Part of the cause was idiotic split to microdependencies, but part
> was that everybody used external network for their build process.

Well, the build systems that I have in mind are Stack
([http://haskellstack.org/](http://haskellstack.org/)) and Nix
([http://nixos.org/nix/](http://nixos.org/nix/)).

Both of them have a heavy emphasis on reproducibility, and yet both of them
automatically download dependencies.

(The Hydra continuous-integration system for Nix, OTOH, runs build and test
steps with limited/no network connectivity, to enforce that separation of
concerns.)
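
The way Nix squares automatic downloads with reproducibility is hash pinning;
a minimal sketch (the URL here is hypothetical):

    # Every fetched source must be pinned by hash up front;
    # nix-prefetch-url downloads it once and prints the sha256 to pin:
    $ nix-prefetch-url https://example.org/foo-1.2.tar.gz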

(The microdependencies, btw, weren't what caused the build
non-reproducibility; they only made things worse when the shit hit the fan.)

Also, there are two different kinds of reproducible-build failure: it doesn't
build (bad), or it builds but the artifact is not identical (worse)...
downloading dependencies from the internet can only cause problems of the
first type, if you have a sane way of fetching them.

Sane means: either you have the hashes of the dependencies to check, or you
can trust the archive to always give you the same files when you ask it for a
snapshot... and for security reasons you'd still check the crypto signature on
the hashes to verify that nothing has been tampered with in the mirror you
downloaded from.
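
In concrete terms that pattern looks something like the following (the file
names are hypothetical; the point is: verify the signature on the hash list,
then the artifacts against the hashes):

    $ gpg --verify SHA256SUMS.asc SHA256SUMS   # signature over the hash list
    $ sha256sum -c SHA256SUMS                  # artifacts against the hashes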

(Unfortunately, not only npm but AFAIK also Hackage have the anti-feature of
letting you reupload a "bugfix" under the same version number... so we cannot
always trust that the systems we're using are sane. And total sanity is
impossible anyway due to DMCA or other law/court shenanigans: if your
dependency violates someone's copyright, it might end up disappearing from the
snapshots.)

~~~
datalist
> if you have outdated dependency artifacts, it doesn't make sense to compile
> without fetching them

I am sorry, I'd contest that. There are often cases where one might not want
to use the most current version of some library (incompatibility, undesired
features, etc.).

This entire forced-update approach is already unpleasant enough in the context
of applications (most notably on Android and iOS), but it becomes unbearable
for software development. One might argue it still makes some sense for the
former, since average users might otherwise stay forever on old versions with
potential security issues, but that argument should not hold in a
"professional" context, where the "users" know about the implications, as
should be the case in software engineering.

~~~
berdario
> I am sorry, I'd contest that. There are often cases where one might not want
> to use the most current version of some library (incompatibility, undesired
> features, etc.)

I think we're misunderstanding each other.

What I meant is that if you have foo-1.2 in your dependencies.conf and you
have previously downloaded only foo-1.1, it doesn't make sense to compile,
because the result of the compilation will most likely not be what you want.

~~~
aninhumer
Sure, but that doesn't mean you need to combine downloading and building.

You can just have the build say "foo-1.2 not found. Latest version is
foo-1.1", and the developer can decide how to respond.

~~~
berdario
Well, there are 3 cases:

- the "usual" case (if you agree): dependencies will be downloaded from the
Internet; a single download+build command is useful

- a bank/SC environment where the developer has configured a proxy/local
mirror of the package repositories, so dependencies will still be downloaded
automatically; a single download+build command is useful

- a bank/SC environment with a manual process for obtaining and adding
dependencies locally, where the build/dev system has limited networking: the
automatic download fails, so a single download+build command is not useful,
but not harmful either

Having "prefetch" and "offline-build" commands is perfectly fine, but I don't
see why the default shouldn't be a "build" command that does
prefetch+offline-build.

