Can Servo be a clean and modern reference platform for developers? (github.com/servo)
62 points by slasaus on Aug 27, 2015 | 27 comments



"Personally I would prefer to have a less complex (and hopefully more secure) engine, over a monolithic blob of code that is compatible with a lot of websites but will end up to be utterly complex (as a NoScript/RequestPolicy user I don't mind sites to break a bit)."

It seems to me that the author of the question never looked at the details of what is actually implemented and how, and asks "just so, in principle." That can be more distracting than it is valuable to the project.

Even if we stay at that level of "not knowing what we actually and specifically want and don't want," what's the expected use of an engine that effectively doesn't work? The "reference." Would enough web developers really consider it worth the effort? What would web developers gain by producing code for something that somebody calls the "reference" but that is otherwise unusable?


The benefit would be that compatibility code (which can obscure the core and make up a large proportion of the code base) would live in a different branch or be added automatically. Web developers could use tools like cssnext to automatically add all the goop without ever being distracted by it themselves. Less code needs to be reviewed, tested and maintained.

As for the engine side, the benefits would be the same as with OpenSSH etc., as mentioned in the GH issue. Unfortunately (or fortunately, if he is right) a Servo developer thinks "this separation (if it’s even possible) sounds like a bunch of additional work with no benefit". I'm afraid this nice and shiny new engine will end up being complex and unmaintainable like all the others. Again this is speculation not based on facts, but experience with software development in general.


I think you're under the impression that new Web sites tend to avoid all the "goop". Presumably for OpenSSH, if you control the server and it's properly configured you can drop the ugly, old code. Not so on the Web. Old and new Web sites alike rely on all sorts of corner cases.

If I had to name the parts of Servo I'd love to drop, it wouldn't be what you think: it'd probably be inline hypothetical box/containing block calculation, followed by CSS 2.1 Appendix E-compliant content layerization. Probably margin collapsing too. These areas are where most of the complexity and bugs lie, but they're also the areas that Web developers won't give up. Drop correct rendering of inline hypothetical blocks and you break google.com.
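
To illustrate how even the "boring" parts hide rules that sites depend on, here's a rough Rust sketch of just the core CSS 2.1 rule for collapsing adjoining vertical margins. (This is only an illustration, not Servo's actual layout code, which also has to handle clearance, empty blocks, and whole chains of collapsible margins.)

    // Toy sketch: the collapsed margin is the largest positive adjoining
    // margin plus the most negative one (each taken as zero if absent).
    fn collapse_margins(margins: &[f32]) -> f32 {
        let max_positive = margins.iter().cloned().filter(|&m| m > 0.0).fold(0.0, f32::max);
        let min_negative = margins.iter().cloned().filter(|&m| m < 0.0).fold(0.0, f32::min);
        max_positive + min_negative
    }

    fn main() {
        // Two positive margins collapse to the larger one...
        assert_eq!(collapse_margins(&[20.0, 10.0]), 20.0);
        // ...and a negative margin pulls the collapsed margin back down.
        assert_eq!(collapse_margins(&[20.0, -15.0]), 5.0);
    }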

The fact is that simple Web browsers already exist (ELinks, for example). You aren't using them because they don't support enough of the "ugly" features. Browsers have to be complex because the Web is complex. The Web is complex because people rely on the complex features.



> If I had to name the parts of Servo I'd love to drop, it wouldn't be what you think ...

Interesting.

> I think you're under the impression that new Web sites tend to avoid all the "goop".

Well, as a web developer myself I do realize I'm adding a lot of workarounds; that's part of my frustration. It seems counterproductive for you as engine developers and for us as web developers to keep using an ever-increasing amount of code just to work around the stuff we throw at each other.

> Presumably for OpenSSH, if you control the server and it's properly configured you can drop the ugly, old code. Not so on the Web. Old and new Web sites alike rely on all sorts of corner cases.

Maybe for OpenSSH, but I think OpenSSL suffers from the same legacy-compatibility issues as a browser. Still, LibreSSL took an OpenSSH-like approach by separating all the system-specific exceptions out of the base and putting them in a separate portability package.

> ... (ELinks, for example). You aren't using them because they don't support enough of the "ugly" features.

The reason I'm not using ELinks or other text-based browsers isn't that they lack features; it's that they're not as secure as I expected a text-based browser to be[0]. They're still full of all sorts of features[1] that add to their complexity without adding enough value, imho. I think not being full-featured is a feature.

[0] http://marc.info/?l=openbsd-tech&m=140516601718662&w=2 about gopher support in lynx

[1] Elinks: Full-Featured Text WWW Browser - Lots of protocols (local files, finger, http, https, ftp, smb, ipv4, ipv6) http://elinks.or.cz/about.html


> Web developers could use tools like cssnext to automatically add all the goop without ever being distracted by it

Can't they do that now? What exactly would that proposal bring that's new?

> Less code needs to be reviewed, tested and maintained.

Now there are four different browsers; then there would be five. Why would this reduce anybody's work?

> this is speculation not based on facts, but experience with software development in general.

I'd say that experience you refer to doesn't include the one described in:

https://xkcd.com/927/


> I'd say that experience you refer to doesn't include the one described in:

> https://xkcd.com/927/

In any case, that's an innovation-unfriendly reference, and an all-too-common one.


Independently of the generality of your remark, here we have the very specific case of somebody who doesn't even know how Servo is implemented or developed telling the Servo developers to do it his way, not theirs, without giving any real proof of benefit to anybody.


> here we have the very specific case of somebody who doesn't even know how Servo is implemented or developed telling the Servo developers to do it his way, not theirs, without giving any real proof of benefit to anybody.

OP here.

Totally right; because I didn't know how Servo is implemented, I asked. Sorry for mixing in some emotion and hope about which answer I would get. I respect any developer doing the actual work, and I realize it's all too easy to spread opinions without hacking on a solution.

Maybe as a non-contributor I should not have aired my opinion.


Given that my comment was called out here, I should probably clarify. The vast majority of the bugs I've been fixing in Servo have been related to standards compliance, not inserting hacks to make existing Web sites work.

There will unavoidably be some of the latter, but the fact is that the Web standards are pretty decent nowadays.


Another issue I have with browsing in general is that it's such a complex ecosystem that there is simply no really secure web browser around, nothing that meets standards comparable to other secure software like RedPhone [1], Dovecot, Postfix, OpenBSD, etc. All JavaScript engines are huge, complex beasts (yes, V8 too), the new ECMAScript spec is 566 pages long[2], and then there are WebRTC and all its required dependencies, XML, WebGL, etc.

Even in order to be "only" compliant with modern specs, there is no escape from creating an extremely complex piece of software. This hurts real-world security for everybody, to the point where it might be better not to browse at all on machines from which you can access private servers, or your private mail or conversations. It would be nice if there were some smaller browsers that didn't focus so much on backwards compatibility, but on leveraging the most important features so that most of the web could be browsed, still without sacrificing end-user security by implementing each and every feature and performance tweak. In the end, the OS and the browser should be trustworthy to the user.

[1] "Overall code quality: After reading Moxie's RedPhone code the first time, I literally discovered a line of drool running down my face. It's really nice." http://blog.cryptographyengineering.com/2013/03/here-come-en...

[2] http://www.ecma-international.org/publications/files/ECMA-ST...


> It would be nice if there were some smaller browsers that didn't focus so much on backwards compatibility

There are lots of those.

> but on leveraging the most important features so that most of the web could be browsed, still without sacrificing end-user security by implementing each and every feature and performance tweak.

That's simply not possible. Even if you restrict yourself to the Alexa Top 100, browsing is impossible without implementing most of HTML5, JS, CSS3 (media queries + gradients + borders + flexbox + ...), SVG, etc. Even if you restrict yourself to just Wikipedia, most of the Web platform gets used (including some uncommon stuff like multicol).

If you want this to be viable, you need to convince Web authors to stop using the features you don't like.


> If you want this to be viable, you need to convince Web authors to stop using the features you don't like.

Interesting point :) I guess what I'm talking about would be a niche browser. The one that currently comes closest is Xombrero[1], though it still ships a lot of highly complex code, because that's all we've got nowadays.

[1] https://opensource.conformal.com/wiki/xombrero


The reality is that we want everyone to be happy, so we are often forced to work around the workarounds. The end result is that everything gradually gets slowed down and messed up.

This happens for many reasons, and it will keep happening again and again. It's part of software development.

The best thing we can do to reduce this kind of cost is to think before we act. Good design and discipline often save a lot.


I would rather have a realistic and functional reference for developers. Being a clean and modern charade is almost certainly part of the problem with most reference material.


Mozilla really should be as aggressive as it possibly can be about Servo and rewriting other core components in Rust. This is what will give Firefox the "fresh look" it's in dire need of. If they do this, and do it quickly, I guarantee both developers and users will actually start being excited about Firefox again, as opposed to merely content at best (mainly because it's what they were already using, or because they have a radical anti-Google stance, despite all the security benefits of Chrome compared to Firefox right now).

In my opinion, Mozilla shouldn't even have tried to implement Electrolysis in Firefox, and shouldn't have announced a switch to WebExtensions either. Instead, it should've done what Microsoft did and created a new browser in Rust from scratch, with a much cleaner, easier-to-maintain codebase that would also be much more secure in the long term.

I don't know if Mozilla's funding would even allow for that, but I think it would've been the better strategy. Putting lipstick on a pig usually doesn't work, and it usually pisses off those who preferred the way the pig looked before, causing the platform to lose both old and potentially new users.


This is a false dichotomy. Multi-process and sandboxing are sorely needed, so is a way to make the surface area available to extensions more stable/safe/secure.

Waiting for a whole-cloth Servo-based browser to be totally competitive would mean not learning from the not-so-distant past, when Netscape 5 was canceled in favor of the "correct" option of shipping the totally rewritten Mozilla Suite. Netscape 4 withered away in the market, and it took additional years for the Mozilla browser to be ready and for Firefox to find its footing.

Making these changes to Firefox benefits users now, and deprecating XPCOM-based extensions in favor of WebExtensions helps pave the way to supporting other rendering engines (like Servo), which will never support XPCOM.

In the meantime Rust is finding its way into Firefox, so code and lessons being learned in Servo (and elsewhere) can find their way to users sooner.


> despite all the security benefits of Chrome compared to Firefox right now

I'm wondering: what kind of security benefits did you have in mind?


I guess mostly sandboxing? e10s is coming to Firefox, but per-tab isolation might take a while (the first step will be one process for the UI and one process for all tabs).


The isolation delays are mostly for political reasons rather than technical ones. Per-tab isolation will necessarily kill off most of the legacy add-on APIs (XPCOM and XUL) and make Firefox add-ons more restricted, like Chrome's.


This is more than a bit premature. Mozilla hasn't announced anything about how Servo and Firefox may or may not be integrated.


The blog post about the new extension API explicitly mentioned merging in Servo in the future as if it were already planned... What gives?


Exactly what's been announced. The add-ons change is designed to pave the way for new technologies (Servo being just one of many), but there have been no definite plans announced.


Actually, as far as I know, Servo is a research project targeted at the mobile market, not the desktop. So it's not going to replace Firefox's engine on the desktop anytime soon.

(someone correct me if I'm wrong)


Are the latest standards not full of legacy ideas, though? Really, we need to throw away HTML/CSS/JS and create something better from everything that has been learned.

What about replacing client-side JavaScript with Rust?


> What about replacing client-side JavaScript with Rust?

This is, in theory, possible with Emscripten or, someday, WebAssembly. See http://myth.aaronlindsay.com/test/ for an example: Hematite, a Minecraft renderer, compiled with Emscripten.

Doing this isn't exactly easy at the moment, since Rust and Emscripten haven't been on the same LLVM version for a long time, but it will eventually be pretty easy to do.
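
If you're curious what the Rust side of such an experiment looks like, here's a hypothetical minimal sketch (the function name and setup are made up, not taken from Hematite or any real build config): just an exported extern "C" function that the Emscripten/WebAssembly glue can call from JavaScript.

    // Hypothetical example: #[no_mangle] keeps the symbol name stable so the
    // Emscripten/wasm toolchain can export it for JavaScript to call.
    #[no_mangle]
    pub extern "C" fn add(a: i32, b: i32) -> i32 {
        a + b
    }

    fn main() {
        // A native build can still run it directly; on the Web, the exported
        // symbol would be invoked from the JavaScript side instead.
        println!("2 + 3 = {}", add(2, 3));
    }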

I still plan on just writing JavaScript, personally.


This is simply never going to get widely deployed.


Why not replace it with something roughly similar, but better? Such as Lua.



