Parallel page rendering with Mozilla Servo (lwn.net)
101 points by vezzy-fnord on June 25, 2015 | 16 comments



I have to admit, I'm surprised they have no plans to migrate Firefox over to Servo. It seems like a waste, considering that desktops and laptops could both benefit from the increase in parallelism.


Baby steps! We're first trying to figure out how we can share components written in Rust - ideas people have had include URL parsing, media container parsing, image decoding, etc.
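For the curious, rust-url (the `url` crate, which Servo itself uses for URL parsing) already builds as an ordinary library, so sharing it is mostly a matter of giving Gecko a C-compatible surface. A minimal sketch of what that could look like; the wrapper function and its name are hypothetical, not anything shipping today:

    // Hypothetical C-ABI wrapper around the real `url` crate (rust-url),
    // so C++ code in Gecko could call into a shared Rust component.
    // Cargo.toml would declare `url` as a dependency and set the crate
    // type to `cdylib` or `staticlib`.
    use std::ffi::CStr;
    use std::os::raw::c_char;

    /// Returns 1 if `input` parses as a valid URL, 0 otherwise.
    #[no_mangle]
    pub extern "C" fn servo_url_is_valid(input: *const c_char) -> i32 {
        if input.is_null() {
            return 0;
        }
        // SAFETY: the caller guarantees `input` is a NUL-terminated C string.
        let s = unsafe { CStr::from_ptr(input) };
        match s.to_str() {
            Ok(text) => url::Url::parse(text).is_ok() as i32,
            Err(_) => 0, // not valid UTF-8, so not a URL we accept
        }
    }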

If (when!) Servo continues to execute well, plans for integrating riskier pieces will require more top-down planning than the bottom-up, grassroots work going on today. As much as I love Servo, even I wouldn't argue we should stop the world and move 200 developers off Firefox to work on it full-time for the next year. Such initiatives rarely go as planned.


The way I read it, they have no plans in 2015/2016.

> Longer-term, we plan to incrementally replace components in Gecko with ones written in Rust and shared with Servo. We are still evaluating plans to ship Servo as a standalone product, and are focusing on the mobile and embedded spaces rather than a full desktop browser experience in the next two years.

https://github.com/servo/servo/wiki/Roadmap


Multi-threaded rendering is less of a win when a single desktop core is already pretty powerful. Servo is going to make a big difference on mobile platforms with multiple underpowered cores.


A 3 GHz P4 from 2005 may well blitz a Core-M from 2015 on certain single-threaded benchmarks. But if you're concerned about energy bills, emissions, fanless computing, or all-day battery life, then performance per watt matters even on x86-64 desktop environments.

From the article:

> parallelism results in power savings. Multiple threads working in parallel on a page-rendering job allow the CPU to complete the entire page in the same amount of time while running at a lower frequency.
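To make the quoted claim concrete: if independent parts of a page can be laid out at the same time, several slow cores can finish in the wall-clock time one fast core would need, letting the chip stay at a lower voltage and frequency. A toy sketch of that split in modern Rust (std::thread::scope); `Subtree` and `layout_subtree` are stand-ins, nothing like Servo's real layout code:

    use std::thread;

    // Stand-in for an independent chunk of the page (a DOM subtree).
    struct Subtree {
        name: &'static str,
    }

    fn layout_subtree(t: &Subtree) {
        // Placeholder for real work: style resolution, box construction, etc.
        println!("laying out {}", t.name);
    }

    fn main() {
        let subtrees = [
            Subtree { name: "header" },
            Subtree { name: "article" },
            Subtree { name: "sidebar" },
            Subtree { name: "footer" },
        ];

        // Each independent subtree gets its own worker; with four cores the
        // whole page finishes in roughly the time of the slowest subtree.
        thread::scope(|s| {
            for t in &subtrees {
                s.spawn(move || layout_subtree(t));
            }
        }); // scope joins all workers before returning
    }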


That's not true. Layout is still often slow (multi-hundred ms) on desktop. You can notice that.

Also powerful x86 CPUs are so good at shared-memory multithreading (cache coherency, large caches) that x86 is actually the best case for us (which is not to say we're bad on ARM, of course).


multi-hundred ms is slow... But what is strange to me is that we have to go through a multitude of hurdles and parallelize a browser engine to get to sub-100ms layout. Could we not further improve the current engine?


That would kind of waste the opportunity to create a new product brand that's not tied to Firefox's current trajectory.

Since Mozilla doesn't have the ability to fund advertising campaigns the way Google and Microsoft can, they have to be more careful about momentum and not waste growth opportunities by spending them on an established product that is consistently shrinking in market share.


This, along with the recent missteps Mozilla has been making (proprietary software included in the package, Cisco's H.264 decoder, etc.), reminds me a lot of the Mozilla SeaMonkey days, when people were tired of bloat and decided to give Gecko a newer, lighter body to live in.

I'm hoping someone steps up to write a new lightweight Servo frontend in Rust.


> the team hopes to have an alpha release before the end of 2015. "I wouldn't recommend logging in to your bank with it," he said, "but it should be usable as a basic browser."

What? Was this meant as a joke in relation to something else? -- Why shouldn't you log into banking sites with it?


The comment was meant to be lighthearted, but on the serious side:

1) We probably won't have done the full inspection of our SSL cert checking to really ensure you aren't being MITM'd.

2) We won't have a chemspill / 0-day security-fix infrastructure in place, so we won't have a way to push out fixes for 0-day exploits in OpenSSL, the JS engine, etc.

I'd love to be at a better point on those features by the end of this year, but it's not likely with our current roadmap & resourcing plan.


An alpha release of a new browser engine is likely to have quite a few security bugs. Don't risk your bank account details for the sake of testing a new browser engine.


Fair point. Not trying to minimize what you're saying -- but I am genuinely curious as to how you'd test such a vast code base written by so many different people for security issues.

e.g. how will 1.0 be any more secure than the alpha, aside from developers having had more time to "run into" the security issues by sheer luck/accident?

Maybe someone will vet the code by hand -- or maybe they will use a security testing suite to automate it? If so, wouldn't it make sense to do that before accepting code that could potentially introduce security bugs?


> how you'd test such a vast code base written by so many different people for security issues

I think their point is that correct, well-thought-out and well-tested code (i.e. doing exactly what it needs to do, without side effects and errors) will inherently be more secure than any hacked-together it-runs-ship-it MVP/alpha/demo release (which is what Servo is, at the moment).

Security-specific tests would be done in the same way as they're done for any mainstream engine out there.
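For a concrete example of "automated" here: coverage-guided fuzzing is a standard way to shake parser bugs out of an engine without waiting for someone to stumble into them. A minimal sketch using the cargo-fuzz / libfuzzer-sys tooling against a URL parser; the target itself is illustrative, not part of Servo's actual test suite:

    // fuzz/fuzz_targets/url_parse.rs -- run with `cargo fuzz run url_parse`
    #![no_main]
    use libfuzzer_sys::fuzz_target;

    fuzz_target!(|data: &[u8]| {
        // Throw arbitrary bytes at the parser; any crash or panic the fuzzer
        // finds is saved as a reproducible test case.
        if let Ok(text) = std::str::from_utf8(data) {
            let _ = url::Url::parse(text);
        }
    });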


Committing to the same API as the CEF project is a very nice feature. It will let application developers choose between CEF and Servo. I think many will give Servo a try if it works well.
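The appeal is that application code only ever talks to the shared embedding surface, so swapping engines costs nothing. A purely illustrative sketch of that shape in Rust; none of these names are the real CEF or Servo API:

    // Illustrative only: two engines behind one embedding interface.
    trait BrowserEngine {
        fn create_browser(&self, initial_url: &str);
    }

    struct CefBackend;
    struct ServoBackend;

    impl BrowserEngine for CefBackend {
        fn create_browser(&self, initial_url: &str) {
            println!("CEF: opening {initial_url}");
        }
    }

    impl BrowserEngine for ServoBackend {
        fn create_browser(&self, initial_url: &str) {
            println!("Servo: opening {initial_url}");
        }
    }

    fn run_app(engine: &dyn BrowserEngine) {
        // The application never names a concrete engine.
        engine.create_browser("https://example.com/");
    }

    fn main() {
        run_app(&CefBackend);
        run_app(&ServoBackend); // trying Servo needs no app changes
    }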


Off topic, but about LWN.net in general: these simple site designs do not work when you use third-party ad services. You have the content in a plain, 1999-style format without CSS, then on top and to the side you have modern, animated ads. It's a very jarring contrast. I understand people like simple sites (especially on HN), but the reasons for having one are invalidated when you integrate third-party ads.



