Not sure if it's still true, but a few months back (six, maybe?) there was a post on HN about contributing to Servo where the OP said he found it easy to contribute despite having no prior experience in Rust. As this post mentions, the code base is relatively small, and a lot of simple features were still missing, at least back then.
Bugs Ahoy is a handy tool for finding easy issues in Mozilla projects, but it seems to be down at the moment: http://www.joshmatthews.net/bugsahoy/?servo=1
> OP said that he found it easy to contribute despite the
> fact that he didn't have prior experience in rust
This person's primary interest is in virtual reality, specifically WebVR (http://webvr.info/), and given Servo's incredible benchmarks so far they're very optimistic about its ability to advance the state of the art of virtual reality on the web.
Their biggest complaint was build times. Yeah... we're working on that. :)
Still, this is a good post; thanks for linking it!
Hopefully it's as easy as you make it sound!
It's not even close to being ready. Even rendering a simple page like Hacker News is broken.
That's not to say I'm not excited for this. I think Servo will be huge, and I think it will see widespread adoption. It's just that, with all the news we hear about it, you might be led to think it's competing with WebKit today. It's not. Soon, though.
A: We did a ton of work
B: We're not even remotely close to something really useful yet
I think that says more about how websites work, to be honest.
Edge should, in principle, support everything, including legacy sites (at least as well as other browsers do today). Trident is only being kept around for sites that do extensive UA sniffing and rely on obscure behaviour — and pretty much only for the intranet.
(In any case, I think the complexity is fundamental to many of the specs involved.)
I guess you could run two browser engines -- one for the UI, one for the content, but I'm not sure how well that would work.
The sheer amount of work involved in getting Servo to a usable state should be less surprising when you consider that all of the browser engines in current use date back to the 90s in some form (1997 for Gecko and Trident, 1998 for KHTML/WebKit/Blink).
Notice the inconsistent sizes of comments (which don't always even correlate with nesting depth): https://www.dropbox.com/s/b47lhe3px83wq9j/2015-05-01%2019.29...
Unfortunately the algorithm they use breaks horribly on comment threads, because it decides that long comments are content, while short comments are not important. (At least, this is my deduction from seeing it in action.)
You'll see it on Reddit, in forum threads, and here on HN -- so this is not exactly a layout bug, but a problem with how Firefox/Chrome try to apply some simple semantics to web content.
HN displays OK in Opera for Android, but mostly because Opera can reflow text on zoom. That fixes almost everything.
Are they implementing an internal command buffer like Chrome or is there some other solution that allows multiple threads and/or multiple processes to all output content to the same window?
WebGL and content display lists are proxied to a separate process for security (in the multiprocess branch).
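To illustrate the proxying idea, here's a minimal, hypothetical Rust sketch: a "content" side serializes drawing commands into a display list and ships it over a channel to a "compositor" that alone owns the output surface. The `DisplayCommand` type and `replay` function are invented for this example; Servo's real display items and IPC machinery are far richer, and here a thread stands in for the separate process.

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical, heavily simplified display-list commands;
// Servo's real display items are much richer.
#[derive(Debug, PartialEq)]
enum DisplayCommand {
    FillRect { x: f32, y: f32, w: f32, h: f32 },
    DrawText { x: f32, y: f32, text: String },
}

// Stand-in for the privileged side replaying a received display
// list; a real compositor would issue GL calls here.
fn replay(list: &[DisplayCommand]) -> usize {
    for cmd in list {
        println!("replaying {:?}", cmd);
    }
    list.len()
}

fn main() {
    let (tx, rx) = mpsc::channel::<Vec<DisplayCommand>>();

    // The compositor thread stands in for a separate process: it
    // alone touches the window, and only sees serialized commands.
    let compositor = thread::spawn(move || {
        let mut replayed = 0;
        while let Ok(list) = rx.recv() {
            replayed += replay(&list);
        }
        replayed
    });

    // A sandboxed content task never touches the window directly;
    // it only ships command lists across the channel boundary.
    tx.send(vec![
        DisplayCommand::FillRect { x: 0.0, y: 0.0, w: 800.0, h: 600.0 },
        DisplayCommand::DrawText { x: 10.0, y: 20.0, text: "Hacker News".into() },
    ]).unwrap();
    drop(tx); // close the channel so the compositor can exit

    assert_eq!(compositor.join().unwrap(), 2);
}
```

The security win is that the content side can be sandboxed aggressively, since everything it emits is data to be validated, not direct access to the GPU or window.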
Obviously X11 will not just disappear, but is targeting Servo at X11 at this early stage in its development like writing a Python library targeting Python 2?
I will be the first to cheer X's demise, as we've had so many problems with multithreading and X, but we have no choice right now.
Also, the EGL backend should work just fine in Wayland.
It's not "OpenGL" or "GL" or "GLnext", it's "Vulkan".
So, it's definitely no longer "just a research project," but the challenges of finding initial product targets and building out the rest of the Web Platform are definitely large ones!
That said, if Trident/Edge, Webkit/Blink, and Gecko don't all agree on the implementation of an older feature, we know that it will be a challenge to find middle ground and get things changed and the spec updated to anything more than softer statements using, "MAY."
 - https://www.phoronix.com/scan.php?page=news_item&px=MTgzNDA
It's here already, in Nightly: https://nightly.mozilla.org/
(more info: https://wiki.mozilla.org/Electrolysis).
I haven't really had the problem you're talking about in Firefox as it is, but you can try it out and see if Nightly is any better. Keep in mind that Nightly is slower overall than Firefox, because it has a number of debugging flags enabled that impact performance.
Obligatory disclaimer: Nightly has lots of bugs and quirks compared to Firefox, so it should not be considered indicative of the normal Firefox experience. Also, many extensions have not yet been ported (which is why it's available in Nightly for developers before hitting Firefox).
 That said, it's pretty damn stable. I'd be using it for my daily browsing except that I can't live without Vimperator, which hasn't yet been ported to e10s: https://github.com/vimperator/vimperator-labs/issues/211
That said, we should have a go/no-go on Monday for uplift with Firefox 40. We're getting Very Close Now. :)
FWIW, in Servo we colloquially call the multiprocess work "Electrolysis" too. I think Patrick has some work in progress on this. It's not as much work as Firefox's e10s (we already use threads and channels for everything), but there is still some (relatively) minor work to be done.
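The "threads and channels for everything" style is easy to sketch. Below is a hypothetical, simplified example (the `LayoutMsg` type and `spawn_layout` function are invented, not Servo's actual API): a layout task owns its own state and is driven entirely by messages, with the sender including a reply channel when it wants an answer. Because both endpoints only see channel handles, swapping a thread boundary for a process boundary is mostly a serialization problem.

```rust
use std::sync::mpsc::{channel, Sender};
use std::thread;

// Hypothetical message type: a content task asks the layout task
// a question and includes a channel for the reply.
enum LayoutMsg {
    Reflow { width: u32, reply: Sender<u32> },
    Exit,
}

// Spawn a layout task and return the handle used to talk to it.
fn spawn_layout() -> Sender<LayoutMsg> {
    let (tx, rx) = channel();
    thread::spawn(move || {
        for msg in rx {
            match msg {
                // Pretend layout: just report a computed height.
                LayoutMsg::Reflow { width, reply } => {
                    reply.send(width / 2).unwrap();
                }
                LayoutMsg::Exit => break,
            }
        }
    });
    tx
}

fn main() {
    let layout = spawn_layout();
    let (reply_tx, reply_rx) = channel();
    layout
        .send(LayoutMsg::Reflow { width: 800, reply: reply_tx })
        .unwrap();
    assert_eq!(reply_rx.recv().unwrap(), 400);
    layout.send(LayoutMsg::Exit).unwrap();
}
```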
(i.e., that it won't be remoting everything, only protecting the system from a child process escaping - that makes sense in Firefox/Chrome, less so with Rust as the compiler, I guess)
I suspect that if Servo doesn't try to be an OS the way Firefox/Chrome do, a whole-process sandbox may even be good enough. But if it does... :)
(beats it in Sunspider, is roughly equal in others)
And it maintains the lead in ECMAScript6 support: https://kangax.github.io/compat-table/es6/
- it compiles very fast, that's cool
- it doesn't render my static pages properly, although it's not too bad
- it's slow as hell (I did use --release for both build and run)
> it compiles very fast
Is Servo running on Windows? The comparison could be more precise on my 120 Hz Windows box.
I'll check that test case again; we might have regressed something recently.
Also, the default build is a debug build. Very slow.
Also, we fetch resources serially. It can be done in parallel, but we haven't gotten to that part yet.