>> At some point in mid-2017, all new CSS will be built with Quantum plane parts, not Gecko. Going forward, it will be much easier and more enjoyable to implement new CSS properties. Which makes it easier for folks in the open source community to contribute.
If you're not careful, this can drag on for years with half the stuff done one way and half the new way - especially once you reach the point where "all new stuff is built the new way". There can come a point where you need to just purge the old stuff. We've seen this in places like GIMP moving to GEGL, several projects moving to GTK3 (GIMP, Inkscape, and others), the purge of Java from LibreOffice, and now the mushy migration to Wayland. Firefox is not the only project migrating to Rust, either.
In all these cases I'm actually in favor of the migrations, and many of those have been completed over years. It's just that once a path has been defined, I feel there needs to be a more concerted effort to actually complete the transition sooner rather than later. Let new features take a back seat and get your internals switched over. I know it's a balance, but working on a project that's straddling the fence is almost always slower in the long run.
Having said that, Firefox is huge, so subsystems will need to switch to Rust one at a time. I'm just saying that once CSS can be done the new way, they need to require it all to be done that way, not just new stuff. Perhaps that is the case and it wasn't spelled out clearly in TFA.
Mozilla is pretty good at the latter, cf. https://arewefastyet.com
>Safari defaults to 64-bit machines and doesn't need to worry about 32-bit anymore. Big pieces of their engine are 64-bit only. As a result, showing Safari on 32-bit machines would give incorrect results.
Maybe this has something to do with it?
Now I try to ensure we agree up front which issues will be blocking, and how much extra memory / CPU is acceptable, which features have to be kept, etc.
Exactly. I would suggest focusing on Servo and an HTML-based UI, and taking some ideas from the Vivaldi browser (which has an HTML5 UI). Firefox needs a "reboot", like Firefox was a leaner reboot of the Mozilla Suite - but like the Mozilla Suite back then, Firefox has gotten too bloated.
I had never really thought of Servo in these terms before, but it makes a lot of sense.
Mozilla, being a nonprofit, can't compete with Google / Apple / Microsoft doing things their way. They inherently have a greater need for the free-time contributions.
Given that, if you can make contributions easier and "fun", by choosing a more modern language and toolchain, it could theoretically be a competitive advantage.
Thinking about it like this, the recent Rust marketing articles make a lot of sense. Rust marketing matters because the more Rust developers there are, the more potential Servo (and thus Firefox) developers there are.
Whether it pans out or not remains to be seen, but as a way to attempt to hack a problem, I really like it.
In this way, even folks who have no intention of hacking on Servo can effectively end up contributing to the project. And, by the same token, we also in effect contribute to lots of projects in the Rust community.
This specific article is from Mozilla's Developer Relations team, so you might call it marketing, but there hasn't been an article like it from Mozilla for quite a while now.
But yeah, I agree, Rust will make it easier to contribute to Firefox IMO. We have a lot of contributors on Servo.
It depends on how one defines "just another software company", but I don't agree with that characterization:
Our mission is to ensure the Internet is a global public resource, open and accessible to all. An Internet that truly puts people first, where individuals can shape their own experience and are empowered, safe and independent.
Much more of the same here: https://www.mozilla.org/en-US/mission/
I don't see Google, Facebook, Microsoft, or other software companies aggressively, creatively pursuing those goals (and achieving them). If Mozilla was just another software company, would there even be an open web today?
Gecko may be the oldest, but not by a wide margin: KHTML (the origin of WebKit and Blink) was started in 1998, only a year after Gecko. Of course, probably not much (if any?) of the original KHTML source is still present in WebKit, and because of the nature of open-source development, there is a blurry line between projects.
I actually ran KDE 1.0 as my main DE and used KHTML as my primary browser (falling back to Netscape when a particular site was incompatible). I immediately appreciated the coherence & elegance of KDE. That lasted for several years, until OS X came along and finally won me over. Oh, those were the times.
Have any resources documenting this? I'd be curious to find the building blocks of an old empire in some modern skyscrapers.
There are various sites with articles about early IE history. I'd also suggest installing the Mosaic browser on 32-bit Win7 and trying it out yourself - you will see how familiar almost everything is if you know IE from everyday use. Not just the broad strokes: little things, traits, and glitches from back then are still in today's IE and Edge.
At least this part is still there: http://www.hanshq.net/roman-numerals.html#khtml
Definitely worth listening to, tons of info there.
From the show notes, an interesting presentation: https://docs.google.com/presentation/d/1-FSfNO-oT9Wqo2swvm6U...
Working at a company where I occasionally create presentational sites, I often have to create and optimise animations that some designer thought would be so very radical and the two browsers that are extremely painful to support are usually Firefox and Internet Edge-splorer (pardon the pun).
I have animations that effortlessly run at a smooth 60 FPS in Chrome and Safari while quite often producing 1-2 FPS in Firefox. Their updates are quite erratic too: over the last three updates, I've seen the project I was working on go from running badly enough that I had to fall back to the mobile-lite version on FF, to running acceptably (30+ FPS), to running like crap again.
And maybe we'll finally see 16-year-old style rendering bugs get fixed, too? Please, pretty please?
After ripping on it a bit, though, I have to say: I'm looking forward to running Firefox again in the future. As much as I like Safari at the moment, it's closed source, and I don't like being exposed to Apple's whims (especially lately), where if they decide to make it unusable I'm left out to dry. And quite frankly, Google Chrome just creeps me out now.
Can you give an example? I'm sure Mozilla's animation people would love to hear about this.
But it sounds like now the official plan is to get Servo into Firefox. Great!
I think, however, it is better to read this as Mozilla getting more serious about replacing many parts of Firefox with Rust. Mozilla already started doing that last year.
Apple's work on WebKit for several years after forking KHTML was essentially just reverse-engineering IE and fixing site-compatibility bugs; Opera had maybe 10% of all people working on Presto up until the end largely reverse-engineering the competition and fixing site-compatibility bugs; etc., etc., etc.
That's not even the only interesting question: nobody has tried to parallelise layout before, and how does that interact with things like the Grid Module? How about Houdini?
That's to Mozilla's advantage, given that no company in the world has more historical experience reverse-engineering browsers. :P
> how does that interact with things like the Grid Module?
Servo devs have definitely identified places in the HTML spec which, entirely by accident, require serial behavior. Having a pervasively-parallel browser will hopefully go a long way towards preventing such accidents in the future.
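To make the parallel-layout idea concrete, here's a greatly simplified sketch using the rayon crate. The Node type and the assumption that sibling subtrees are independent are mine for illustration only; Servo's real work-stealing layout has to cope with exactly the cross-sibling dependencies and accidentally-serial spec behaviors mentioned above.

    use rayon::prelude::*;

    // Hypothetical layout-tree node; a real one carries far more state.
    struct Node {
        children: Vec<Node>,
        width: f32,
    }

    // Lay out sibling subtrees in parallel via work stealing, assuming
    // (for this sketch only) that siblings don't depend on each other.
    fn layout(node: &mut Node) {
        node.children.par_iter_mut().for_each(layout);
        // Toy rule: a node is as wide as its widest child; leaves get 100.
        node.width = node.children.iter().map(|c| c.width).fold(100.0, f32::max);
    }

    fn main() {
        let mut root = Node {
            children: (0..4).map(|_| Node { children: vec![], width: 0.0 }).collect(),
            width: 0.0,
        };
        layout(&mut root);
        assert_eq!(root.width, 100.0);
    }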
I'd claim Opera (or rather the consortium that now owns the Opera browser) has the most experience reverse-engineering browsers, given that Firefox's market share solved a lot of the site-compat issues long before Opera gained market share (oh, wait, it never did significantly after the rise of IE). On the other hand, almost nobody from the old Presto team is still at Opera, so… ;P
I have been worried about Firefox and seriously questioned some of the decisions along the road but I think and hope I always respected the devs who volunteer for this.
Creating a renderer is 90% of the work; the other 90% is documenting its bugs and quirks.
I think the biggest lesson from Servo is that the entry barrier to the web market is already much higher than we would like it to be.
Having a great new web engine supporting 100% of web standards does not get you even close to being ready to serve the web.
And the willingness of web developer community to take the "one browser only feature" bait is a major part of the problem.
Of course, that doesn't mean that we'll get rid of all bugs, but it does mean that more bugs should be found before shipping.
(As a disclaimer: this is essentially the majority of what I've got paid to do over the past year-and-a-bit.)
1) A lot of flight tests can't be done without a large fraction of passengers on the plane.
2) It may make getting to the final end state faster (though maybe not), but it means you don't get any benefits until you make the switch. Doing things incrementally means you start seeing benefits much earlier. Classic throughput/latency tradeoff.
1) One big switch. Can probably be done faster, but carries more risk that big problems won't be discovered until late in the game, and has no user-visible benefits until the switch happens.
2) Incremental changes. Take longer to complete, but allow for better mid-course correction and can show user-visible benefits much earlier in the process.
Which of those will lose more customers? It really depends on the market reality and at how successfully the incremental changes can be made.
One part of Quantum, which does not have anything to do with Rust, is to figure out how we can spend less time running unimportant JS (like maybe in a background tab), which will hopefully reduce CPU usage.
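To illustrate at the toy level (this is my own sketch, not Gecko's actual Quantum DOM design): tag each runnable with the state of the tab it belongs to, and always drain foreground work first.

    use std::collections::BinaryHeap;

    // Toy priority classes; a real engine distinguishes many more.
    #[derive(PartialEq, Eq, PartialOrd, Ord)]
    enum Priority {
        BackgroundTabJs, // e.g. a timer firing in an unfocused tab
        ForegroundTabJs, // work for the tab the user is looking at
    }

    #[derive(PartialEq, Eq, PartialOrd, Ord)]
    struct Task {
        priority: Priority,
        name: &'static str,
    }

    fn main() {
        // BinaryHeap is a max-heap, so foreground tasks pop first. A
        // smarter scheduler could also throttle or skip background work
        // entirely, which is where the CPU savings would come from.
        let mut queue = BinaryHeap::new();
        queue.push(Task { priority: Priority::BackgroundTabJs, name: "bg setTimeout" });
        queue.push(Task { priority: Priority::ForegroundTabJs, name: "click handler" });
        while let Some(task) = queue.pop() {
            println!("running: {}", task.name);
        }
    }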
Relatedly, a user's Medium post on compiling Rust to WASM was also on the HN front page recently: https://medium.com/@chicoxyzzy/compiling-rust-to-webassembly...
Sites using JS to run simple scripts or to fire API calls etc. will continue to use JS; there's no reason to use WASM.
Also, I believe WASM is truly compiled code and not run through a JIT? I may be wrong... but I'm pretty sure that was the whole point of creating it: native-app performance in the browser. That will take a big bite out of the app stores, which take a good cut of revenue and control what can and cannot be in the store.
And sure, WASM can also have memory-safety problems, especially with C/C++ being the primary languages targeting WASM. But that's also where Rust would shine, making things much safer when compiled to WASM.
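The Rust side of that is already quite small. A minimal sketch (the exact target name and flags were still in flux at the time, so treat the build line as an assumption):

    // lib.rs - exposes a function callable from JS once compiled to WASM,
    // e.g. with something like:
    //   rustc --target wasm32-unknown-unknown --crate-type cdylib -O lib.rs
    #[no_mangle]
    pub extern "C" fn add(a: i32, b: i32) -> i32 {
        a + b
    }

On the JS side you'd load the resulting .wasm with WebAssembly.instantiate and call exports.add directly; because the source is Rust rather than C/C++, whole classes of memory bugs are ruled out before the module is ever compiled.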
The existing WASM implementations do a mixture of JITing code at load time and JITing code at first call, depending on the size of the WASM blob, AFAIK. The performance gain over asm.js is primarily in parse time, as far as I'm aware.
The WASM VM itself can perfectly well be memory safe: WASM code cannot read outside of the memory allocated to it (and malloc/free are implemented on top of that memory allocated to the VM, hence there's memory safety at the VM level).
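In other words, the VM mediates every guest access against one linear buffer. A toy model of that invariant (real VMs avoid per-access checks with guard pages and other tricks; all names here are made up):

    // Toy model of WASM linear memory: the guest can address only bytes
    // inside `mem`, so a buggy module can corrupt its own heap but never
    // the host's memory.
    struct LinearMemory {
        mem: Vec<u8>,
    }

    impl LinearMemory {
        fn load_u8(&self, addr: usize) -> Result<u8, &'static str> {
            self.mem.get(addr).cloned().ok_or("trap: out-of-bounds load")
        }

        fn store_u8(&mut self, addr: usize, val: u8) -> Result<(), &'static str> {
            match self.mem.get_mut(addr) {
                Some(slot) => { *slot = val; Ok(()) }
                None => Err("trap: out-of-bounds store"),
            }
        }
    }

    fn main() {
        let mut m = LinearMemory { mem: vec![0; 64 * 1024] }; // one 64 KiB page
        m.store_u8(0, 42).unwrap();
        assert_eq!(m.load_u8(0).unwrap(), 42);
        assert!(m.load_u8(1 << 20).is_err()); // outside linear memory: traps
    }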
The big problems come when you need to guarantee your JIT code doesn't violate memory safety, and that's something Rust's type system cannot (currently) solve. You need guarantees that the generated code will never have any memory access errors, and will never race for memory reads/writes, because it's running with the privilege of the VM, not the limited powers of the WASM code running within it.
Note: parallelism is not the same as concurrency.
Similarly, I'd argue that C does not have parallelism (just system calls...), but then C doesn't even have concurrency. As such, I do not consider the lack of parallelism a feature of C.
The combination of language and library provides affordances that make some idioms easier to express, and others harder. This influences the design of code written in the language. Clumsy idioms generally don't get far, even if they would be more performant or provide some other benefit, like promoting easier concurrency. And sometimes there's a tension: e.g. mutable structures tend to be more performant, but they are harder to add parallelism to and harder to reason about (leading to more copying than strictly necessary). Rust in particular has different ideas here.
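As a concrete example of those affordances: in Rust, handing a mutable structure to several threads without synchronization is a compile error rather than a latent race, so the path of least resistance is explicit locking, message passing, or partitioning. A minimal sketch:

    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // Sharing a plain `Vec<u32>` by mutable reference across threads
        // would be rejected by the borrow checker; this wrapper is the
        // idiom the language nudges you toward instead.
        let counts = Arc::new(Mutex::new(vec![0u32; 8]));

        let handles: Vec<_> = (0..8)
            .map(|i| {
                let counts = Arc::clone(&counts);
                thread::spawn(move || {
                    counts.lock().unwrap()[i] += 1;
                })
            })
            .collect();

        for h in handles {
            h.join().unwrap();
        }
        println!("{:?}", counts.lock().unwrap());
    }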
I think the jury is still out on whether Rust will in practice tend to promote code that is faster or slower than C++. Of course any given project in Rust could be rewritten in C++ to be just as fast, or perhaps faster. But that's not what happens in the real world. Human factors matter, path dependence matters, affordances matter.
So it doesn't produce more efficient machine code, but it does open up new opportunities for optimization at a higher level.
I would even say that the room for improvement is potentially greater for Rust than it is for C/C++, thanks to the type system. It hasn't been taken advantage of much yet, but MIR is one of the first steps toward doing so. It's an intermediate representation, like LLVM's IR, but with all of the type information still intact. There are plenty of known and even formally proven code optimizations that can only be applied if certain guarantees hold (such as the guarantees made by a strong type system).
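The classic case is aliasing: a &mut reference is guaranteed unique, which licenses optimizations that C/C++ can only get via restrict, on the honor system. A tiny illustration:

    // `a` is `&mut`, so the compiler knows `a` and `b` cannot alias and
    // may keep `*b` cached in a register across the stores to `*a`. The
    // C equivalent must reload `*b` after every store unless the
    // programmer adds `restrict` - an unchecked promise.
    pub fn add_twice(a: &mut i32, b: &i32) -> i32 {
        *a += *b;
        *a += *b;
        *a
    }

(Whether rustc actually emits the corresponding noalias metadata has gone back and forth because of LLVM bugs, but the guarantee is sitting there in the type system waiting to be exploited.)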
(though debug builds in Rust are probably slow - that will be interesting to see)
I've been looking at the HTML parser they are writing (servo/html5ever), which is incredibly impressive and, in my tests, parses the hideous Daily Mail homepage in < 30ms. It uses the tendril library, which contains lots and lots of unsafe code, but that library sets out to do one very specific thing: make string parsing much, much more efficient. That's a desirable design tradeoff; as long as tendril can be shown to be robust, it's a good decision over a slower parser.
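If you want to poke at it yourself, the driver API is pleasantly small; roughly this, based on html5ever's own examples (module paths have moved around between versions, so adjust to whatever the current docs say):

    extern crate html5ever;

    use html5ever::parse_document;
    use html5ever::rcdom::RcDom; // a separate crate in newer versions
    use html5ever::tendril::TendrilSink;

    fn main() {
        let html = "<p class=welcome>Hello, <b>world</b>";
        // The parser is a TendrilSink: feed it bytes, get a DOM back.
        // It error-corrects tag soup exactly as the HTML spec prescribes.
        let dom = parse_document(RcDom::default(), Default::default())
            .from_utf8()
            .read_from(&mut html.as_bytes())
            .expect("reading from an in-memory slice can't fail");
        let _root = dom.document; // traverse the tree from here
    }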
The quantum projects don't involve moving Servo's DOM into Firefox (they do involve implementing lessons learned from Servo's DOM, but not the DOM itself), so this isn't an issue.
People have been asking for open-sourcing of it ever since the switch happened, but I guess we'll never see that happen.
I see a number of engineers use broken phrasing like that in posts and presentations, and it comes off really badly to others not in on the "joke". Does anyone really find that sort of thing actually funny or additive, or is it the equivalent of trite office humor gone online?
Do I personally like it? I sure would not say that I love it, but I do enjoy reading texts that are less impersonal than a press release, and patterns like this can help identify them, so I tend to be happy when I see them. What I hate, and I feel tempted to claim that all of humanity is with me on this, is when these patterns are used to engineer press-release-style communication into appearing like something different.
The one thing I would have appreciated from the post is for the author to have written about the "Mor Better Ideas" that make Servo better than Gecko.
To me that particular phrase in this particular post is a little out of place and lowers the tone slightly. I expect that to others it will be seen as fun/human and indicating that the post isn't written by a corporate drone.
It's an informal blog, I read the same reference, thought, "must be some joke I've missed", and got on with reading the rest of the blog.
That blog is informal in style but not in content; it's a professional blog by a Mozilla employee. I don't feel like it respects its audience's time, which is sadly the norm for software-related technical writing. Things won't get better by saying nothing, and asking people to improve is never a waste of time (if done nicely).
Related: Normalization of deviance in software: how broken practices become standard
Basically, I do find it "additive", but it's obviously only feasible in very informal contexts and the percentage of the audience for whom it is not additive (and possibly quite jarring) might be quite high.
Is this the tone of the whole project? I'd love to contribute, but I don't want to wade through a reddit-comment-thread-level of forced references and memes to do that.
I get the reason for doing it, and I like that the internet isn't full of drab corporate-speak all the time, but this piece just missed the mark I think. The title doesn't really communicate the intent well and instead feels like its trying to be a little clever for the sake of being clever. It's really a two or three sentence press release that got fluffed up with some non-corporate language.
I'm going to have to call you out on this. You have no wish to contribute: asking this question reveals you haven't even bothered to look at the repositories, join the IRC channels, or subscribe to the mailing lists.
You're here to complain, and you're using hypothetical constructive motivations to mask that fact. State your opinion without the fussy pretense; there was really nothing wrong with it.
The term for the grandparent comment is concern trolling.
The world in general, and personal blogs in particular, do not exist to serve people who are "busy".
And being flippant on one's blog has nothing to do with being "unprofessional".
In fact, the blog where TFA is from is not a business or professional endeavor to begin with. So "professionalism" (however some provincial people conceive it) is not a requirement for it.
Not to mention that billions are made in fully professional endeavors that actually include such references and humor (from Google's easter eggs to funny messages in update notes in various apps, etc).
Writing clear, expressive prose is hard! Plus, it's not taught as part of software engineering & it's not required to be a great engineer.
Some of the other comments pertain to the "Jet Engine" title. For anybody who might be confused, like I was :)
Then eventually you'll be able to replace enough components that it's all Servo, and no Gecko. At that point Servo will be functionally complete, as its individual components will be on par with the Gecko components.
But I wouldn't be surprised if Servo is usable on mobile in 2017.
Much of Mozilla's "quantum" work is actually about improving C++ code, e.g. the Quantum DOM project, which is a drive to improve scheduling so that a single background tab, even in the same process, can't jank the tab you are currently using. Importing Servo components wholesale simply wouldn't help here, because Servo doesn't have a sufficiently complete DOM implementation, and I'm not sure the implementation it has contains all the desired properties.
I have occasionally taken to referring to this project as "Presto: the good parts".
The original title wasn't particularly descriptive. That doesn't mean it was misleading, and actually, I thought it was very apt: they're talking about a technology to swap pieces of Servo in over time, and the jet engine analogy was appropriate (particularly with respect to the complexity). I quite liked it.
That said in this case I agree with you.
I also think we are mostly fine for now but I still hesitate to upvote funny comments.
> If your account is less than a year old, please don't submit comments saying that HN is turning into Reddit. It's a common semi-noob illusion, as old as the hills.
No one can understand what the link is about when "engine" and "jet" are the only nouns in the title.
But there's definitely room for speed gains from multithreading.
I don't think this will be a problem. One example...
Maybe I'm wrong and I'm not a good example, but the only add-ons I have installed that I really care about are 1Password (which is easy to live without, as I can use the stand-alone application) and HTTPS Everywhere (which just means that I have to be extra careful when browsing).
I'm pretty sure that there are a lot of internet users who don't even have add-ons in their browsers, but I doubt that there's any user who would not care about browser speed/battery life.
Then there is the fact that every other browser has dropped, or is in the process of dropping, any notion of direct native add-on APIs, and an increasing majority of actively developed add-ons for all browsers are now modern "Chrome-style" JS+HTML components.
Firefox doesn't allow addons that contain binary XPCOM components anymore. Everything has to be JS, whether the addon is XPCOM based, addon SDK (JetPack) based, or WebExtensions-based.
NPAPI is deprecated and is to the point where it is only used for Flash.
This story IMO should be exempt from the "original title" rule - the original title is annoyingly misleading.
If we stick with the airplane analogy, maybe this is better: it's like incrementally converting an older model jet engine to a newer model jet engine in between flights. Every incremental version must work correctly, which isn't easy, but the upgrades are definitely performed on the ground.
Curious as to why use a CPU language for the display engine rewrite? It seems to be wasting the GPU that all modern systems now contain.
Also, is it fully replacing the display engine, including the font renderer and image codecs?
Parsing, styling, the DOM, running JS, handling networking, computing layout are all things that a browser engine does that don't involve rendering. Many of these can't be done on the GPU, and for others the GPU brings no additional benefit because it's not the kind of load the GPU can magically parallelize.
Servo's rendering stack does make extensive use of the GPU. https://github.com/servo/webrender/ (talk by patrick in https://air.mozilla.org/bay-area-rust-meetup-february-2016/) There's also work on glyph rasterization on the GPU going on right now.
(I don't think they've swapped out the font renderer and image codecs yet, but there are quite a few bits and pieces they have swapped out as the Rust community delivers more pure Rust libraries.)
Too good, in fact: it won't run on any laptops that I own because their GPUs don't support a sufficient level of OpenGL.
One great aspect of Firefox-with-Gecko is that you can throw it onto just about any machine with more than 512MB of RAM and it will provide bootstrap web access (slow or otherwise). That's going to be lost when everything is Servoised. I guess it's back to links2 at that point.
For ease of development, webrender2 is configured to compile with the latest OpenGL 4.x features. I believe there is a way to build it to target a lower OpenGL version.
Edit: Correction, it's currently targeted at OpenGL 3.2. There's an open issue for bringing it down to OpenGL 3.1, which is the version supported by integrated Sandy Bridge GPUs. Considering 40% of people running Intel are using Sandy Bridge or older, that's probably why it won't run on your laptops.
Intel Corporation Mobile GM965/GL960 Integrated Graphics Controller
OpenGL renderer string: Mesa DRI Intel(R) 965GM
The maximum OpenGL version for that chipset is apparently 2.0, though I can't find a way to push it beyond 1.4 on Linux.
I appreciate that six years old is ancient by SV standards but I would be interested to learn what proportion of Firefox users are on equally old hardware. I did try launching Servo with CPU rendering but it hung indefinitely.
I don't think the software fallback has gotten much love, but AFAIK it'll work fine once llvmpipe support is added to webrender2. Once that is done, rendering will work on the vast majority of laptops and desktops running Linux/MacOS/Windows. The OpenGL version supported by that chipset is over a decade old, so devoting time to compatibility this early in the project would be a waste of time (and would defeat the purpose, since the standard predates the explosion of mobile devices).
EDIT: to be clear, with the Servo nightlies you need to do this yourself but Firefox could bundle it eventually.
1. On the CPU, architecting the drawing commands and resource management for the GPU
2. GPU programs (in the form of API calls, shaders, output buffers, etc.)
Generally speaking, part 1 is not the resource-intensive part unless you also have to run an intensive algorithm on the CPU. If you see the renderer as a synthesizer unit, the CPU side is the set of user parameters you have - the number of knobs and switches and sliders and patch points. You would typically only have a few knobs to turn if you were writing a custom renderer for your own application, but a complex, general-purpose renderer like one for the HTML DOM is more like a wall of patch cables. Within that, the Servo renderer found opportunities to make "risky" optimizations that parallelize, make tighter use of memory, etc., many of which are only reasonable to do from within Rust, because of the additional checks it can perform.
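To make the two-part split concrete, here's a toy version (all names invented; this is not WebRender's actual API):

    // Part 1 (CPU): walk the scene and batch it into a compact command
    // list, deciding which shader and resources each batch needs.
    enum DrawCmd {
        Clear([f32; 4]),
        Quads { shader: &'static str, count: usize },
    }

    fn build_frame(num_items: usize) -> Vec<DrawCmd> {
        vec![
            DrawCmd::Clear([1.0, 1.0, 1.0, 1.0]),
            // One batched draw instead of one call per element; keeping
            // this command/state traffic small is most of the CPU-side work.
            DrawCmd::Quads { shader: "rect_shader", count: num_items },
        ]
    }

    // Part 2 (GPU): each command becomes API calls that launch shader
    // programs over vertex/output buffers. Stubbed with prints here.
    fn submit(cmds: &[DrawCmd]) {
        for cmd in cmds {
            match cmd {
                DrawCmd::Clear(c) => println!("clear to {:?}", c),
                DrawCmd::Quads { shader, count } => {
                    println!("draw {} quads with `{}`", count, shader)
                }
            }
        }
    }

    fn main() {
        submit(&build_frame(1024));
    }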
The term "rendering engine" is misleading; in the context of browsers the rendering engine does basically all the core browser tasks. Basically, all the stuff needed to make a web page _work_ is part of the rendering engine. Stuff like history, bookmarks, URL bar, is part of the rest of the browser. See https://news.ycombinator.com/item?id=13331505
Servo's usage of Rust is all for the rest of the stuff. We do have a component, http://github.com/servo/webrender/, which handles rendering in particular. That indeed does the architecting of the draw commands and resource management on the CPU, and has a bunch of shaders handling everything else.
That's not the aim of Rust. Rust, according to its own website is:
> Rust is a systems programming language that runs blazingly fast, prevents segfaults, and guarantees thread safety.
Mozilla spending time on this suggests Mozilla doesn't know what to spend time on.
They are going for small incremental changes, instead of rewriting code from scratch, which is what Joel warned against. They aren't saying, fuck Firefox, replace it with Servo. Instead the idea is use Servo as a way to discover better layout, rendering components and transplant it into Firefox.
See the Strangler Application pattern
To me that sounds like the end-state is a Chimera and not a Lion, without any incremental end-user benefits along the way.
This is an aesthetic judgment. There is no such thing as «a Lion» in the real world. Engineering is the work of building chimeras that solve your problems.
> without any incremental end-user benefits along the way.
The expected benefits are:
- speed (thanks to the massive parallelization permitted by Rust's «fearless concurrency»)
- reliability (thanks to Rust's memory safety, which avoids crashes, and a better type system that helps reduce the number of bugs)
- security (thanks to Rust's memory safety, which avoids exploits)
Components from Servo will be included in Gecko one by one; the user will gain on these three fronts each time a new component is included.
Creating a new language in order to incrementally change (and eventually clean-slate rewrite) a codebase is an orders-of-magnitude bigger investment than either what Joel's criticizing or what he's advocating. I don't think that article is very relevant here one way or the other.
The article specifically mentions that Netscape shouldn't try and re-write their browser. But if they hadn't done that, we wouldn't have Firefox. We'd still have an incremental improvement over Netscape 4.7.
For those who can't remember Netscape 4.7, let me remind you: it wasn't very good at the end. If they hadn't rewritten it, prompting Joel to write that article to tell them they were wrong, there would be no relevant browser called Firefox today.
I attended a talk by a Mozilla developer a decade or more ago, shortly after they had released the rewrite. His comment on starting a project like that from scratch: "don't". What they intend to do now (slowly and gradually transition to Servo) seems consistent with an organization that has learned from that experience.
> We could keep flying the current plane, while starting from scratch and building an entirely new plane on the ground — special ordering every nut, every bolt, designing a new cockpit, and waiting many years before we get to fly that plane. But we don't want to do that either.
Mozilla is exactly following Joel's advice.
The only reason they still exist today and have relevance is their willingness to rewrite.
I don't disagree with Joel, but his main point seems to be: be prepared to spend longer than you expect on a rewrite. That doesn't mean you shouldn't do it.
Netscape had purchased another company called Digital Style. Their layout tech was called Raptor, then NGLayout, before being renamed Gecko. Mariner/Netscape 5 was canceled and Mozilla migrated to Gecko, which was used in Netscape 6.
SpiderMonkey survived the transition from Mariner to Gecko and is used in Servo, too.
Mozilla browser was Netscape 6 with XUL 'hotness' and massive UI lag.
Firefox was stripped down and fast (though XUL is apparently still around).
Eventually a skunkworks project was started to drop all non-browser components and focus on the browser UI, and that became Firefox.
Mozilla was a refactored version of Netscape (to remove closed-source dependencies)