
Can Servo be a clean and modern reference platform for developers? - slasaus
https://github.com/servo/servo/issues/7409
======
acqq
"Personally I would prefer to have a less complex (and hopefully more secure)
engine, over a monolithic blob of code that is compatible with a lot of
websites but will end up to be utterly complex (as a NoScript/RequestPolicy
user I don't mind sites to break a bit)."

It seems to me that the author of the question never looked at the details of
what is actually implemented and how, and is asking "just so, in principle."
That can be more of a distraction than it brings value to the project.

Even if we stay at that level of "not knowing what we actually and
specifically want and don't want," what's the expected use of an engine that
effectively doesn't work? The "reference"? Would enough web developers really
consider it worth the effort? What would web developers gain by producing code
for something that somebody calls the "reference" but is otherwise unusable?

~~~
slasaus
The benefit would be that compatibility code (which can obfuscate the code
base a lot and make up a large proportion of it) would live in a different
branch or be added automatically. Web developers could use tools like cssnext
to add all the goop automatically without ever being distracted by it
themselves. Less code would need to be reviewed, tested and maintained.

As for the engine side, the benefits would be the same as with OpenSSH etc.,
as mentioned in the GH issue. Unfortunately (or fortunately, if he is right) a
Servo developer thinks "this separation (if it’s even possible) sounds like a
bunch of additional work with no benefit". I'm afraid this nice and shiny new
engine will end up as complex and unmaintainable as all the others. Again,
this is speculation not based on facts, but on experience with software
development in general.

~~~
pcwalton
I think you're under the impression that new Web sites tend to avoid all the
"goop". Presumably for OpenSSH, if you control the server and it's properly
configured you can drop the ugly, old code. Not so on the Web. Old and new Web
sites alike rely on all sorts of corner cases.

If I had to name the parts of Servo I'd love to drop, it wouldn't be what you
think: it'd probably be inline hypothetical box/containing block calculation,
followed by CSS 2.1 Appendix E-compliant content layerization. Probably margin
collapsing too. These areas are where most of the complexity and bugs lie, but
they're also the areas that Web developers won't give up. Drop correct
rendering of inline hypothetical blocks and you break google.com.

The fact is that simple Web browsers already exist (ELinks, for example). You
aren't using them because they don't support enough of the "ugly" features.
Browsers have to be complex because the Web is complex. The Web is complex
because people rely on the complex features.

~~~
slasaus
/updated

> If I had to name the parts of Servo I'd love to drop, it wouldn't be what
> you think ...

Interesting.

> I think you're under the impression that new Web sites tend to avoid all the
> "goop".

Well, as a web developer myself I do realize I'm adding a lot of workarounds;
that's part of my frustration. It seems counterproductive for you as engine
developers and for us web developers to use such an ever-increasing amount of
code just to work around the stuff we throw at each other.

> Presumably for OpenSSH, if you control the server and it's properly
> configured you can drop the ugly, old code. Not so on the Web. Old and new
> Web sites alike rely on all sorts of corner cases.

Maybe for OpenSSH, but I think OpenSSL suffers from the same legacy
compatibility issues as a browser. Still, LibreSSL took an OpenSSH-like
approach by separating out all the system-specific exceptions from the base
and putting them in a different portability package.

> ... (ELinks, for example). You aren't using them because they don't support
> enough of the "ugly" features.

I'm not using ELinks or other text-based browsers, not because they miss a lot
of features, but because they're not as secure as I'd expect from a text-based
browser[0]. They are still full of all sorts of features[1] that add to their
complexity without having enough value, imho. I think it's a feature not to be
full-featured.

[0] [http://marc.info/?l=openbsd-tech&m=140516601718662&w=2](http://marc.info/?l=openbsd-tech&m=140516601718662&w=2) about gopher support in lynx

[1] Elinks: Full-Featured Text WWW Browser - Lots of protocols (local files,
finger, http, https, ftp, smb, ipv4, ipv6)
[http://elinks.or.cz/about.html](http://elinks.or.cz/about.html)

------
pcwalton
Given that my comment was called out here, I should probably clarify. The vast
majority of the bugs I've been fixing in Servo have been related to standards
compliance, not inserting hacks to make existing Web sites work.

There will unavoidably be some of the latter, but the fact is that the Web
standards are pretty decent nowadays.

------
slasaus
Another issue I have with browsing in general is that it's such a complex
ecosystem; there is simply no really secure web browser around that meets
security standards comparable to other secure software like RedPhone[1],
Dovecot, Postfix, OpenBSD etc. All JavaScript engines are huge complex beasts
(yes, V8 too), the new EcmaScript spec is 566 pages long[2], WebRTC and all
its required dependencies, XML, WebGL, etc. etc.

Even in order to be "only" compliant with modern specs, there is no escape
from creating an extremely complex piece of software. This hurts real-world
security for everybody, to the point where it might be better not to browse at
all on machines from which you can access private servers, your private mail
or conversations. It would be nice if there were some smaller browsers that
didn't focus so much on backwards compatibility, but on the most important
features, so that most of the web could be browsed without sacrificing
end-user security by implementing each and every feature and performance
tweak. In the end, the OS and the browser should be trustworthy to the user.

[1] "Overall code quality: After reading Moxie's RedPhone code the first time,
I literally discovered a line of drool running down my face. It's really
nice." [http://blog.cryptographyengineering.com/2013/03/here-come-encryption-apps.html](http://blog.cryptographyengineering.com/2013/03/here-come-encryption-apps.html)

[2] [http://www.ecma-international.org/publications/files/ECMA-ST/Ecma-262.pdf](http://www.ecma-international.org/publications/files/ECMA-ST/Ecma-262.pdf)

~~~
pcwalton
> It would be nice if there were some smaller browsers that didn't focus so
> much on backwards compatibility

There are lots of those.

> but on leveraging the most important features so that most of the web could
> be browsed, still without sacrificing end-user security by implementing each
> and every feature and performance tweak.

That's simply not possible. Even if you restrict yourself to the Alexa Top
100, browsing is impossible without implementing most of HTML5, JS, CSS3
(media queries + gradients + borders + flexbox + ...), SVG, etc. Even if you
restrict yourself to just _Wikipedia_, most of the Web platform gets used
(including some uncommon stuff like multicol).

If you want this to be viable, you need to convince Web authors to stop using
the features you don't like.

~~~
slasaus
> If you want this to be viable, you need to convince Web authors to stop
> using the features you don't like.

Interesting point :) I guess what I'm talking about would be a niche browser.
The one that currently comes closest is Xombrero[1], which still ships a lot
of highly complex code because that's all we've got nowadays.

[1]
[https://opensource.conformal.com/wiki/xombrero](https://opensource.conformal.com/wiki/xombrero)

------
github-cat
The reality is that we want everyone to be happy, so we are often forced to
work around the workarounds. The end result is that everything gradually gets
slowed down and messed up.

This happens for many reasons, and it will happen again and again in the
future. This is part of software development.

The best thing we can do to reduce this kind of cost is to think before we
act. Good design and discipline often save a lot.

------
taeric
I would rather have a realistic and functional reference for developers. Being
a clean and modern charade is almost certainly part of the problem with most
reference material.

------
mtgx
Mozilla really should be as aggressive as it possibly can about Servo and
about rewriting other core components in Rust. This is what will give Firefox
the "fresh look" it is in dire need of. If they do this, and do it quickly, I
guarantee both developers and users will start actually being _excited_ about
Firefox again, as opposed to content at best (mainly because it's what they
were already using, or because they have a radical anti-Google stance, despite
all the security benefits of Chrome compared to Firefox right now).

In my opinion, Mozilla shouldn't even have tried to implement Electrolysis in
Firefox, and shouldn't have announced a switch to WebExtensions in it either.
Instead, it should've done what Microsoft did and created a new browser in
Rust from scratch, with a much cleaner, easier-to-maintain codebase that would
also be much more secure in the long term.

I don't know if Mozilla's funding would even allow for that, but I think it
would've been the _better_ strategy. Putting lipstick on a pig usually doesn't
work, and it usually pisses off those who _preferred_ the way the pig looked
before, too, causing said platform to lose both old and potentially new users.

~~~
Ygg2
> despite all the security benefits of Chrome compared to Firefox right now

I'm wondering, what kind of security benefits did you have in mind?

~~~
slasaus
I guess mostly sandboxing? e10s is coming to Firefox, but per-tab isolation
might take a while (the first step will be one process for the UI and one
process for all tabs).
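The split being described can be sketched in miniature: a parent ("UI") process spawns a separate OS process per unit of work, so a crash in one child can't corrupt the others' memory. This is a hedged toy sketch in Rust assuming a Unix-like `sh`; real e10s adds sandboxing and a proper IPC layer, none of which is shown, and the child here is just `echo` standing in for a content process.

```rust
use std::io::Read;
use std::process::{Command, Stdio};

// Toy sketch of process isolation: each "tab" is a separate OS process.
// A crash in one tab's process leaves the parent and other tabs running.
fn spawn_tab(url: &str) -> std::io::Result<String> {
    let mut child = Command::new("sh")
        .arg("-c")
        .arg(format!("echo rendering {}", url)) // stand-in content process
        .stdout(Stdio::piped())
        .spawn()?;
    let mut out = String::new();
    child.stdout.take().unwrap().read_to_string(&mut out)?;
    child.wait()?; // reap the child process
    Ok(out.trim().to_string())
}

fn main() -> std::io::Result<()> {
    for url in &["example.com", "news.ycombinator.com"] {
        println!("{}", spawn_tab(url)?);
    }
    Ok(())
}
```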

~~~
bbatha
The isolation delays are mostly for political reasons rather than technical
ones. Per-tab isolation will necessarily kill off most of the legacy add-on
APIs (XPCOM and XUL) and make Firefox add-ons more restricted, like Chrome's.

------
hacker_9
Aren't the latest standards full of legacy ideas, though? Really we need to
throw away HTML/CSS/JS and create something better from everything that has
been learned.

What about replacing clientside javascript with Rust?

~~~
steveklabnik
> What about replacing clientside javascript with Rust?

This is, in theory, possible with emscripten or, someday, WebAssembly. See
[http://myth.aaronlindsay.com/test/](http://myth.aaronlindsay.com/test/) as an
example of Hematite, a Minecraft renderer, compiled with Emscripten.

Doing this isn't exactly easy at the moment, as Rust and Emscripten haven't
been on the same LLVM version for a long time, but it will eventually be
pretty easy to do.
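The route described boils down to exposing Rust functions with C-compatible symbols that the Emscripten-compiled module can then call from JavaScript. A minimal hedged sketch of the Rust side (the function is a made-up example, and the exact build flags depend on the Rust/Emscripten versions in play):

```rust
// A Rust function exported with a stable, unmangled C ABI — the shape
// you'd expose when compiling for the browser via Emscripten (asm.js)
// or, someday, WebAssembly. `fib` here is purely illustrative.
#[no_mangle]
pub extern "C" fn fib(n: u32) -> u64 {
    let (mut a, mut b) = (0u64, 1u64);
    for _ in 0..n {
        let next = a + b;
        a = b;
        b = next;
    }
    a
}

fn main() {
    // Natively this just runs; under Emscripten the exported symbol
    // would instead be invoked from the JavaScript side.
    println!("fib(10) = {}", fib(10));
}
```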

I still plan on just writing JavaScript, personally.

