Hacker News
Servo Continues Pushing Forward (servo.org)
354 points by robin_reala on May 1, 2015 | 100 comments

If someone is interested in contributing and not sure where to start, here's a list of issues tagged as easy: https://github.com/servo/servo/labels/E-easy

Not sure if it's still true, but a few months back (six, maybe?) there was a post on HN about contributing to Servo in which the OP said he found it easy to contribute despite having no prior experience with Rust. As this post mentions, the codebase is relatively small and a lot of simple features were still missing, at least back then.

Bugs Ahoy is a handy tool for finding easy issues in Mozilla projects, but it seems to be down at the moment: http://www.joshmatthews.net/bugsahoy/?servo=1

  > OP said that he found it easy to contribute despite the 
  > fact that he didn't have prior experience in rust
Just yesterday I was talking to the person who contributed WebGL support to Servo (a relatively recent and substantial patch), and they reiterated this sentiment. It was their first time ever using Rust, and although they'd reluctantly done some C++ in the past, their primary programming language is JavaScript. They were pleasantly surprised at how easy it was to contribute such a large patch, and had bountiful praise for how well-organized the codebase is. The patch actually bounced a few times, and they were also very pleased at how the core contributors continued to work with them over the course of days to track down the breakage and keep things moving.

This person's primary interest is in virtual reality, specifically WebVR (http://webvr.info/), and given Servo's incredible benchmarks so far they're very optimistic about its ability to advance the state of the art of virtual reality on the web.

Their biggest complaint was build times. Yeah... we're working on that. :)

How long does a Servo build take?

It takes around 15 minutes on my six-year-old laptop.

Nope; in the blog post I was referring to, the author went to some event (in France, I believe) where some of the core contributors helped him, if memory serves.

Still this is a good post, thanks for linking it!

Whoops! I merged a PR earlier today that broke searching Github issues; thanks for pointing that out!

Thank you for this! I'm just now getting acquainted with Servo and starting to get interested. This is a great place to start.

I guess I'll have to try contributing an easy patch over the weekend.

Hopefully it's as easy as you make it sound!

Last week I tried my hand at putting vim functionality into servo, a la pentadactyl.

It's not even close to being ready. Even rendering a simple page like Hacker News is broken.

That's not to say I am not excited for this. I think Servo will be huge and I think it will see widespread adoption. It's just that with all the news we hear about it, you might be led to think that today it's competing with WebKit. It's not. Soon, though.

Every time I hear about Servo, it mostly can be boiled down to:

A: We did a ton of work

B: We're not even remotely close to something really useful yet

I think that says more about how websites work, to be honest.

The stack of web technologies is staggeringly enormous. It may not feel like it just scanning over a simple HTML page, but it is, and there's a lot of invisible-yet-critical stuff like proper same-origin policy enforcement. The miracle is not that it takes Servo a while to be created, it's that this stack works at all, and that they're even trying!

Couldn't they just do something similar to Microsoft's Edge/IE11 approach? Servo for HTML5+/CSS3+/Javascript ES6+ and then keep Gecko as well inside Firefox for "legacy websites"?

HTML5/CSS3/ES6 define how to process legacy content as well as current — the only actual branching for versions is quirks mode and that has relatively minor effects which aren't that much code to implement.
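To make the "minor branching" point concrete, quirks-mode selection is essentially doctype sniffing. The following is an illustrative sketch only, not Servo's actual code; the real rules in the HTML Standard enumerate many more legacy public identifiers than the handful of patterns assumed here:

```rust
// Simplified sketch of quirks-mode selection by doctype sniffing.
// Illustrative only: the HTML Standard's real rules list dozens of
// legacy public identifiers; these patterns are stand-ins.

#[derive(Debug, PartialEq)]
enum QuirksMode {
    NoQuirks,
    LimitedQuirks,
    Quirks,
}

fn quirks_mode_for_doctype(doctype: Option<&str>) -> QuirksMode {
    match doctype {
        // No doctype at all: full quirks mode.
        None => QuirksMode::Quirks,
        Some(d) => {
            let d = d.to_ascii_lowercase();
            if d == "html" {
                // <!DOCTYPE html>: standards mode.
                QuirksMode::NoQuirks
            } else if d.contains("html 4.01 transitional") {
                // A few legacy doctypes trigger limited-quirks mode.
                QuirksMode::LimitedQuirks
            } else if d.contains("html 3.2") {
                QuirksMode::Quirks
            } else {
                QuirksMode::NoQuirks
            }
        }
    }
}

fn main() {
    assert_eq!(quirks_mode_for_doctype(Some("html")), QuirksMode::NoQuirks);
    assert_eq!(quirks_mode_for_doctype(None), QuirksMode::Quirks);
    println!("quirks mode selection works");
}
```

The point is that the branch is decided once per document and feeds a small set of layout tweaks; there is no separate "legacy engine" to maintain.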

Edge should, in principle, support everything, including legacy sites (at least as well as other browsers do today). Trident is only being kept around for sites that do extensive UA sniffing and rely on obscure behaviour — and pretty much only for the intranet.

If the point is to make a more secure browser, allowing a webpage to opt-in to use the old, less secure engine instead is probably a non-starter.

(In any case, I think the complexity is fundamental to many of the specs involved.)

HTML5 is no smaller than its predecessors. It's considerably larger.

Gecko is not exactly separable from Firefox (unlike Blink/WebKit, and I'm guessing Trident/Spartan). The Firefox UI depends heavily on things like XUL and XPCOM, which are nonstandard features of Gecko and would be a pain to support in Servo.

I guess you could run two browser engines -- one for the UI, one for the content, but I'm not sure how well that would work.

Firefox using some libraries written for Servo is a nice deviation from that.

The sheer amount of work involved in getting Servo to a usable state should be less surprising when you consider that all of the browser engines in current use date back to the 90s in some form (1997 for Gecko and Trident, 1998 for KHTML/WebKit/Blink).

Not only that, but Servo is experimenting with new techniques for just about everything as they go. Progress may seem slow if you measure how "done" it is, but it's much faster than other projects if you measure how much research is being produced.

Research? I don't think they've published a single paper have they?

The parallel algorithms that Servo uses to implement advanced features of CSS are, as far as I know, novel (block formatting context size speculation, float impaction detection, absolute positioning traversals, parallel automatic table layout, parallel border-collapse).
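The common shape behind several of those traversals is a bottom-up pass where independent sibling subtrees are laid out in parallel and the parent combines the results. Here's a toy sketch of that shape (not Servo's actual code, which uses work stealing and handles floats, formatting contexts, and much more):

```rust
use std::thread;

// Toy box-tree node: a node's height is its own intrinsic height
// plus the sum of its children's heights.
struct BoxNode {
    intrinsic_height: f32,
    children: Vec<BoxNode>,
}

// Sketch of a parallel bottom-up layout pass: sibling subtrees are
// independent, so each is laid out on its own thread, and the parent
// combines the results once all children have finished.
fn layout_height(node: &BoxNode) -> f32 {
    let child_total: f32 = thread::scope(|s| {
        let handles: Vec<_> = node
            .children
            .iter()
            .map(|child| s.spawn(move || layout_height(child)))
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    });
    node.intrinsic_height + child_total
}

fn main() {
    let tree = BoxNode {
        intrinsic_height: 10.0,
        children: vec![
            BoxNode { intrinsic_height: 20.0, children: vec![] },
            BoxNode { intrinsic_height: 30.0, children: vec![] },
        ],
    };
    assert_eq!(layout_height(&tree), 60.0);
    println!("total height: {}", layout_height(&tree));
}
```

The hard research problems listed above come from CSS features that break this independence (floats leaking across siblings, BFC sizes depending on descendants), which is exactly why Servo's speculation and impaction techniques are novel.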

I vaguely recall them discovering a number of bugs in the spec along the way as well.

Yes, we've discovered several spec bugs, as well as bugs in other browser engines.

Breakthroughs don't always result in research papers. They sometimes end up in products instead. :)

Practical research, not academic research.

There have been academic papers on parallel CSS layout, actually, and my Servo work extends that state-of-the-art. But I've got my hands full with hacking :)

You're the applied physics to CompSci's theoretical physics!

The B is wrong :> They are getting closer and closer to Gecko's rendering level. Gecko vs. Servo (rendering a couple of sites): https://www.youtube.com/watch?v=pZGhnqtXVdc

Hacker News is surprisingly un-simple, due to extreme reliance on <center> (the major remaining broken thing, with annoying interactions with CSS) and automatic table layout (very complex).

To add to this, Hacker News does not display well in either Chrome for Android or Firefox for Android[0], so it's not like it's only Servo (or even only Firefox) that has issues.

[0] Notice the inconsistent sizes of comments (which actually does not always even correlate with nested depth): https://www.dropbox.com/s/b47lhe3px83wq9j/2015-05-01%2019.29...

That's a feature ("font boosting"). It's a total mystery to me how something working so badly could ship, especially since the Android Webkit fork had a similar feature working perfectly. But apparently it's of "acceptable quality" in Chrome https://bugs.webkit.org/show_bug.cgi?id=84186#c24

The basic problem is that it's hard to tell the difference between nested tables used for nested tabular content and nested tables being (ab)used for layout, as HN does.

Oh, that was happening on IE mobile and I just assumed it was their way of showing the post score on a mobile phone. Didn't realize it was a bug :)

Mobile browsers attempt to make content more prominent than navigation. This matters a lot on e.g. mobile news sites.

Unfortunately the algorithm they use breaks horribly on comment threads, because it decides that long comments are content, while short comments are not important. (At least, this is my deduction from seeing it in action.)

You'll see it on Reddit, forum threads, and here on HN -- so this is not exactly a layout bug, but a problem with how Firefox/Chrome try to apply some simple semantics to web content.
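The deduced heuristic can be caricatured like this. This is emphatically NOT the actual Chrome or Firefox algorithm; the threshold and scale factor are made up purely to show why comment threads come out inconsistently sized:

```rust
// Caricature of a mobile "font boosting" / "font inflation"
// heuristic, as deduced from its visible behaviour; not the real
// Chrome or Firefox algorithm. Blocks with more text are treated
// as "content" and inflated, which is exactly what goes wrong on
// comment threads where comment length varies wildly.
fn boosted_font_size(base_px: f32, text_len: usize) -> f32 {
    const CONTENT_THRESHOLD: usize = 200; // assumed, for illustration
    if text_len >= CONTENT_THRESHOLD {
        base_px * 1.5 // "important content": inflate
    } else {
        base_px // "navigation / minor text": leave alone
    }
}

fn main() {
    // A long comment and a short reply get different sizes, which
    // reproduces the inconsistent sizing in the screenshot above.
    let long_comment = boosted_font_size(16.0, 450);
    let short_reply = boosted_font_size(16.0, 40);
    assert!(long_comment > short_reply);
    println!("long: {long_comment}px, short: {short_reply}px");
}
```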

Sadly true. That's why I'm not using Firefox on Android despite using it on the desktop. They never got the right algorithm for font inflation (that's what they call it).

HN displays OK in Opera for Android, but mostly because Opera can reflow text on zoom. That fixes almost everything.

The Servo team plans to ship an alpha-quality release later this year, and my entirely amateur opinion is that Servo will be ready to compete with Webkit by 2017. Place your bets...

Is there any place where I could follow progress of Pentadactyl-like plugin for Servo?

Thank you! I'm glad someone cares about vim functionality enough to work on it.

I'm really curious how Servo (and Electrolysis, which I know is unrelated) are planning to handle the GPU. Most GL drivers, if not all of them, are not multi-thread happy.

Are they implementing an internal command buffer like Chrome or is there some other solution that allows multiple threads and/or multiple processes to all output content to the same window?

Separate per-thread GL contexts, with texture sharing via EGLImageKHR on EGL, pixmaps on X11, and IOSurface on Mac.

WebGL and content display lists are proxied to a separate process for security (in the multiprocess branch).
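The ownership pattern behind "separate per-thread GL contexts" can be sketched with mock types standing in for real GL objects. Everything here (`MockContext`, `SharedTextureHandle`) is hypothetical; the point is only that each thread owns its own context and only lightweight shareable handles cross thread boundaries, mirroring texture sharing via EGLImage/pixmaps/IOSurface:

```rust
use std::sync::mpsc;
use std::thread;

// Mock stand-in for a thread-local GL context; never shared.
struct MockContext {
    id: u32,
}

// A shareable handle to a texture produced on another context,
// analogous to an EGLImageKHR, an X11 pixmap, or an IOSurface ID.
#[derive(Debug, Clone, Copy, PartialEq)]
struct SharedTextureHandle(u32);

impl MockContext {
    fn render_to_texture(&self) -> SharedTextureHandle {
        // A real engine would issue GL commands here; we just
        // derive a handle from the context id for illustration.
        SharedTextureHandle(self.id * 100)
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();

    // A "content" thread creates its own context, renders, and
    // sends only the shareable handle across the channel.
    let worker = thread::spawn(move || {
        let ctx = MockContext { id: 7 }; // owned by this thread only
        tx.send(ctx.render_to_texture()).unwrap();
    });

    // The "compositor" side receives the handle and would bind it
    // into its own context for final composition.
    let texture = rx.recv().unwrap();
    worker.join().unwrap();
    assert_eq!(texture, SharedTextureHandle(700));
    println!("composited texture {:?}", texture);
}
```

Because the context itself never moves between threads, the driver only ever sees single-threaded use of each context, which sidesteps the "GL drivers are not multi-thread happy" problem.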

I feel like an idiot after reading that, and I'm studying EE hoping for a career in firmware/low-level programming. Good luck to me :)

You shouldn't. Those are just the names of data structures and functions belonging to various specific compositors, display servers, and graphics libraries (some platform-specific). You memorize them as you go; it's the ideas that are more important.

Wayland might be taking over from X11 on a widely used Linux desktop as early as 6 months from now (Fedora 23):


Obviously X11 will not just disappear, but is targeting Servo at X11 at this early stage in its development like writing a Python library targeting Python 2?

Well, when I wrote this stuff for the first time a couple of years ago, NVIDIA's official position was "we have no plans to support Wayland". Even now I suspect X will be around for years.

I will be the first to cheer X's demise, as we've had so many problems with multithreading and X, but we have no choice right now.

Here's an overview from last year showing NVIDIA's Wayland support under way:


X11 won't be going away any time soon.

Also, the EGL backend should work just fine in Wayland.

And it's working? All my experience with multiple threads on different contexts is crash, crash, bug bug bug. I'm surprised it's stable at all on any platform.

I think we will need to wait for Vulkan before we get a true multithreading GL.

Vulkan will not be GL anything.

What do you mean?

It's 100% an entirely different API.

It's not "OpenGL" or "GL" or "GLnext", it's "Vulkan".

Vulkan's codename before reveal was "GLnext".


My uninformed guess is that they will rely on the new multithread-friendly systems such as DX12 and the next generation of OpenGL.

Maybe Vulkan will help with that.

What is the current goal with Servo? The last time I felt well informed, I think it was described as just a research project, with zero intention to ever supplant Firefox. Is that still the case? There seems to be an awful lot of promise and effort if there are no plans to make it a production-ready browser.

Our official roadmap/goals for Servo are here: https://github.com/servo/servo/wiki/Roadmap

So, it's definitely no longer "just a research project," but the challenges of finding initial product targets and building out the rest of the Web Platform are definitely large ones!

At worst, Firefox will use some of the libraries that come out of Servo. Firefox is supposed to use the URL parser from Servo by 2016.

Here's the patch that's tracking adding Servo's Rust-based URL parser to Firefox: https://bugzilla.mozilla.org/show_bug.cgi?id=1151899

If Servo becomes the default layout engine for Firefox, I really hope that having one more engine in the mix forces everyone towards standards compliance, or at least uniformity. Obviously, they're still working out basic functionality, so they've got a way to go before that becomes a concern. But if the project is successful, it's going to be one more browser type for web designers to worry about. Either that or Servo's multi-threaded speed and safety are so awesome that all the major browser developers decide to ditch their own engines and everybody settles on Servo.

One fun thing about reimplementing the Web Platform is that you run across all of the spec holes (particularly in much older features, like tables, borders, etc.). Where possible, we're trying to at least determine what current browsers do with the intent of codifying those behaviors in the specifications and W3C cross-browser test suite.

That said, if Trident/Edge, WebKit/Blink, and Gecko don't all agree on the implementation of an older feature, we know it will be a challenge to find middle ground and get the spec updated to anything more than softer statements using "MAY."

I have a feeling Servo will finally make me give up Chrome. From what I've seen so far the performance is very impressive [1]. I just wonder if Electrolysis will ever arrive. It seems like I've been waiting for it forever. Chrome's sandboxing is great for security and performance purposes and I can't give that up (I never seem to have a problem switching to another tab even when I have 40 of them open in Chrome, while in Firefox the whole browser "hangs" sometimes when I load a heavy tab).

[1] - https://www.phoronix.com/scan.php?page=news_item&px=MTgzNDA

> I just wonder if Electrolysis will ever arrive. It seems like I've been waiting for it forever.

It's here already, in Nightly: https://nightly.mozilla.org/ (more info: https://wiki.mozilla.org/Electrolysis).

I haven't really had the problem you're talking about in Firefox as it is, but you can try it out and see if Nightly is any better. Keep in mind that Nightly is slower overall than Firefox, because it has a number of debugging flags enabled that impact performance.

Obligatory disclaimer: Nightly has lots of bugs and quirks compared to Firefox, so it should not be considered indicative of the normal Firefox experience[0]. Also, many extensions have not yet been ported (which is why it's available in Nightly for developers before hitting Firefox).

[0] That said, it's pretty damn stable. I'd be using it for my daily browsing except that I can't live without Vimperator, which hasn't yet been ported to e10s: https://github.com/vimperator/vimperator-labs/issues/211

I know it's in Nightly, but wasn't it in Nightly 3-5 months ago, too? Shouldn't it have moved to the "next" development version by now, like Aurora or whichever is the next "more stable" one, if Firefox follows a 6-weeks development cycle?

Not every feature can be completed in a single six-week train, especially one as complex as rewriting huge chunks of a browser that was not initially designed with process separation in mind. :)

That said, we should have a go/no-go on Monday for uplift with Firefox 40. We're getting Very Close Now. :)

If we do the uplift, I believe the plan is also to hold there until we clear out a bunch of less-critical, but shipping-blocking bugs. So it might be on Aurora for a few releases.

Electrolysis isn't a Servo thing but a Firefox thing, and something many of us have been waiting for, too. From what I've seen of the way Servo works, it should work basically the same way from the beginning, though I'm not as confident of that as I'd like to be. Since it renders and works on each iframe in another thread, and each tab is implemented as an iframe with mozbrowser extensions, I would think it should fix that problem.

I think he was talking about Firefox.

FWIW in Servo when talking of multiprocess stuff we colloquially call it "Electrolysis". I think Patrick has some work in progress on this. It's not as much work as the Firefox e10s (we already use threads and channels for everything), but I think there is some (relatively) minor work to be done.

How do you handle component sandboxing in Servo compared to Firefox e10s? (Firefox being a child-process sandbox with IPC to the parent, and the separation being a little imperfect.)

The plan is to use gaol [1] for sandboxing. For IPC, we want to send Rust objects serialized via serde [2] over native OS IPC pipes (AF_UNIX sockets on Unix).

[1]: https://github.com/pcwalton/gaol

[2]: https://github.com/serde-rs/serde
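The transport half of this plan can be sketched with the standard library alone. Servo's intent is serde-based serialization; the hand-rolled byte encoding and the `Message` type below are made up for illustration, and a thread pair stands in for the parent/child processes:

```rust
use std::io::{Read, Write};
use std::os::unix::net::UnixStream;
use std::thread;

// Std-only sketch of the IPC transport described above: a Rust
// value serialized to bytes and sent over an AF_UNIX socket pair.
// Servo's plan is serde-based serialization; this trivial manual
// encoding just shows the pipe mechanics. The Message type is a
// made-up example, not a real Servo message.
#[derive(Debug, PartialEq)]
struct Message {
    tab_id: u32,
    width: u32,
}

fn encode(m: &Message) -> [u8; 8] {
    let mut buf = [0u8; 8];
    buf[..4].copy_from_slice(&m.tab_id.to_le_bytes());
    buf[4..].copy_from_slice(&m.width.to_le_bytes());
    buf
}

fn decode(buf: &[u8; 8]) -> Message {
    Message {
        tab_id: u32::from_le_bytes(buf[..4].try_into().unwrap()),
        width: u32::from_le_bytes(buf[4..].try_into().unwrap()),
    }
}

fn main() {
    let (mut parent, mut child) = UnixStream::pair().unwrap();

    // "Child process" side (a thread here) sends a message.
    let sender = thread::spawn(move || {
        child
            .write_all(&encode(&Message { tab_id: 1, width: 1024 }))
            .unwrap();
    });

    // "Parent" side reads and decodes it.
    let mut buf = [0u8; 8];
    parent.read_exact(&mut buf).unwrap();
    sender.join().unwrap();
    assert_eq!(decode(&buf), Message { tab_id: 1, width: 1024 });
    println!("received {:?}", decode(&buf));
}
```

Sandboxing (via gaol) then restricts what each child process can do once it holds nothing but such a socket.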

That looks nice. I hope the separation between the processes themselves will be nicer than Firefox's (or even Chrome's), since it's "from scratch" :)

(i.e. that it won't be remoting everything and only protecting the system from a child escaping; that makes sense in Firefox/Chrome, less so with Rust as the compiler, I guess)

I suspect that if Servo doesn't try to be an OS like Firefox/Chrome do, a whole-process sandbox may even be enough. But if it does... :)

It does look like amazing performance, but it may be too early to draw conclusions. The gain may only be because fewer features are implemented in Servo, leading to less to compute and therefore less computation time.

I never experience that in Firefox and have much more than 40 tabs open. Granted if the tab is not actually loaded, then it will go off and load it. How does Chrome's model help here?

I'd love the Servo team to show more demos.


Could servo-shell be used to implement a keyboard-driven browser (a la dwb)? Looking at https://developer.mozilla.org/en-US/docs/Web/API/Using_the_B... it looks like the API only offers a few interaction points with the underlying iframe; or shouldn't that be a problem? Can the browser HTML intercept all key presses and change the iframe content to display follow-links, for example? :) That might be a great way to finally have a better keyboard-driven solution (since I think browser addons like Pentadactyl suck very much).

Are they planning on implementing a JavaScript engine in Servo as well?

No concrete plans for now. JS is complicated, and we don't have enough people to do that. Relying on a JS engine that's currently being used (which we know will get security updates) is much better than rolling our own, for now.

The js engine is separate from the rendering engine as far as I know. There might be some exciting new js engines written in Rust, but I don't think they would be part of Servo per se.

No, they plan on embedding SpiderMonkey like Firefox.

It uses SpiderMonkey.

Would it be very complicated to use V8, or is it just impossible?

SpiderMonkey compares very well with V8: http://arewefastyet.com/

(It beats it in SunSpider and is roughly equal in the others.)

And it maintains the lead in ECMAScript6 support: https://kangax.github.io/compat-table/es6/

It's not impossible, but I don't know why you'd want to use V8.

I would think a JavaScript engine would be better done in a domain-specific language other than Rust. It seems that a virtual machine has a different set of concerns to deal with than a rendering engine.

Rust isn't a domain-specific language. I don't see why it wouldn't be a good language to implement a JS engine in, from my experience.

Language VMs have to be very low-level and do a lot of "unsafe" things that Rust and such wouldn't like.

More low-level than a memory allocator?

I compiled and tried servo+servo-shell just now:

- it compiles very fast, that's cool

- it doesn't render my static pages properly, though it's not too bad

- it's slow as hell (I did use --release for build and run)


  > it compiles very fast
You must have lower standards than the Servo team, because pcwalton is rather irate at the current build time. :) Rust will be prioritizing massive advances in compilation speed for 1.1 and beyond.

It's definitely not slow at rendering. Network performance isn't great, but try resizing the window.

It's slower than Chrome on my 2011 Macbook Air. Which is disappointing, considering it doesn't yet implement the same features.

Did you build with optimizations enabled (./mach build --release)?

I did. I'll definitely have to check it out again, though, because it looks like pcwalton already fixed it! [1]

[1]: https://twitter.com/pcwalton/status/595287602153926656

The fix, for anyone who is curious:


Do you have a test case? What in particular were you doing?

Just some very unscientific tests: loaded a few sites on both Servo and Chrome and compared the effects of resizing their windows. For instance, the frame rate looked about the same on Wikipedia, but Servo struggled on Reddit.

Is Servo running on Windows? The comparison could be more precise on my 120 Hz Windows box.

No, it's not.

I'll check that test case again; we might have regressed something recently.

I've tried it a few times and it's been super slow for me as well. Maybe it's just slow on Linux.

Yeah, there are some Linux issues (forgot what).

Also, default build is a debug build. Very slow.

Also, we fetch resources serially. It can be done in parallel, but we haven't gotten to that part yet.
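The win from parallel fetching is easy to see with simulated latencies: serial fetches cost roughly the sum of the individual latencies, parallel fetches roughly the maximum. A toy sketch (sleeps stand in for network waits; no real networking):

```rust
use std::thread;
use std::time::{Duration, Instant};

// Each "fetch" just sleeps for its simulated latency.
// Serially, total time is roughly the sum of the latencies.
fn fetch_serial(latencies_ms: &[u64]) -> Duration {
    let start = Instant::now();
    for &ms in latencies_ms {
        thread::sleep(Duration::from_millis(ms)); // one resource at a time
    }
    start.elapsed()
}

// In parallel, total time is roughly the maximum latency.
fn fetch_parallel(latencies_ms: &[u64]) -> Duration {
    let start = Instant::now();
    thread::scope(|s| {
        for &ms in latencies_ms {
            s.spawn(move || thread::sleep(Duration::from_millis(ms)));
        }
    }); // scope waits for all "fetches" to finish
    start.elapsed()
}

fn main() {
    let latencies = [50, 50, 50];
    let serial = fetch_serial(&latencies);
    let parallel = fetch_parallel(&latencies);
    assert!(parallel < serial);
    println!("serial: {serial:?}, parallel: {parallel:?}");
}
```

Real fetching adds connection limits, prioritization, and bandwidth contention, which is part of why it hasn't been done yet.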

What in particular were you doing?

I really like how this sounds. I just really hope CSS is implemented with great care and detail. Right now, Firefox and IE have a lot of quirks, and I find only WebKit and Chrome have it working well. I've been doing CSS intensively for over 5 years; happy to give my feedback along the way.
