So are there actually any reasons for having competing open-source full browser stacks? I believe licensing (the split behind FreeBSD and Linux) is not very important in the browser world (Gecko and WebKit have very similar licensing terms). And I don't see many ideological reasons that couldn't be addressed by forking.
What else? Developers usually hate abandoning their work and switching to improving a competitor's solution, especially in the open-source world. Developers like money and the feeling of doing important work. Firefox's market share is still quite high (~25%), and Mozilla still receives money (the organization is a non-profit, but the developers are paid, I believe). Probably for these reasons the struggle will continue for some time.
However, it is hard to see what the web platform gains from this struggle. Every new web standard feature requires independent implementations from two different open-source projects, and platform-wide adoption moves only as fast as the slowest team.
Then again, this is why we develop web standards and markup languages (e.g. HTML, XHTML, XML) in the first place ;)
"The Ideology of Competition"
The general pattern seems to be that, if you're interested in building a better X (browser, compiler, operating system), then a monoculture is bad. If you're interested in building on top of X (websites, code, applications), then a monoculture is great, so long as the dominant entity is good enough.
In lots of cases, I think the people building on the platform tend to get their way, both because they're more numerous and because the technology in X eventually stabilises, so fewer people want 'a better X'. If a truly better X does later emerge, it then needs a Herculean effort to break the monoculture (e.g. Firefox in the bad old days, clang, the Linux desktop). Then there's an interesting transition period before, perhaps, a new dominant force emerges.
Maybe Mozilla can maintain a mixture of rendering engines by being the determined underdog. It would probably make it easier for a new rendering engine to emerge, but the open web may seem less inviting than more uniform technologies. For all its faults, one of the reasons Flash did so well for so long was that developers didn't have to deal with multiple competing implementations.
"For several years (2001-2005) we've already had the situation when the single browser engine (Trident, MSIE) had the dominant share of 90% and more. Indeed, this was bad for the Web as a platform, pretty much in ways the article describes.
However, it is hard to see what the web platform gains from this struggle. Every new web standard feature requires independent implementations from two different open-source projects, and platform-wide adoption moves only as fast as the slowest team."
I can't think of a case where the WebKit "bosses" have rejected a change from another party, but that could happen. We need competition between implementations. What we don't need is two competing standards for the web.
Microsoft proposed and standardized CSS Grid, which is awesome, but Google and Apple for some reason don't implement it in WebKit, and Microsoft won't contribute an implementation either. Both sides are enjoying the situation. Pulse.com works great in IE10 because of CSS Grid, so Microsoft can put its "works best in IE" logo on its website again. The WebKit folks don't seem to care much about CSS Grid because they think flexbox is the solution.
This is the problem: we have companies creating and implementing new standards for their own use without caring about the rest of the web.
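To make the fragmentation concrete, here is a minimal sketch of the branching this forces on page authors (the class names are made up for illustration; only IE10 understands the prefixed "-ms-grid" display value, while contemporary WebKit builds understand "-webkit-flex"):

    // Feature-detect by assigning a display value and reading it back;
    // engines silently drop values they don't understand.
    var probe = document.createElement("div");

    probe.style.display = "-ms-grid";
    var hasMsGrid = probe.style.display === "-ms-grid";

    probe.style.display = "-webkit-flex";
    var hasWebkitFlex = probe.style.display === "-webkit-flex";

    // The page then ships two (or three) separate layout code paths.
    var root = document.documentElement;
    if (hasMsGrid) {
      root.className += " use-ms-grid";   // IE10-style grid layout
    } else if (hasWebkitFlex) {
      root.className += " use-flexbox";   // WebKit-style flexbox layout
    } else {
      root.className += " use-floats";    // float-based fallback
    }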
Such things have already happened. For example, Google wanted to add changes to WebKit to support another VM in the browser (for Dart). Apple devs blocked the attempt for technical reasons, though some speculate political ones were relevant as well.
[unrelated to topic]
I've always wondered: if someone is hellbanned, how come anyone can see their posts? In other words, how did you see the comment, and how did you know the user was shadowbanned?
A lot of people find this distasteful for obvious reasons, but it's fairly effective. The occasional good posts from hellbanned users (a small minority; most are terrible, spam, or at best noise) are simply a result of moderators not being perfect.
So, what we learn from history is that people don't learn from history: they just repeat the same mistakes over and over.
These literal kids just want to build cool stuff and they have more power/freedom/resources than ever before. Give them time and the good ones will learn from their mistakes.
When the browser war was fought (and won):
- Netscape Navigator was actually a pretty terrible piece of software. I dare say they deserved to lose. Towards the end, most Mac users were running IE5 -- it was a better browser (http://en.wikipedia.org/wiki/Internet_Explorer_for_Mac).
- There was diminishing market interest in supporting alternatives to Windows because non-Windows market share (Mac OS X, UNIX workstations) was plummeting. This allowed Microsoft's embrace-and-extend strategy to succeed.
- The web development field was nascent at best. Similar to how many web developers are moving into the mobile space today (and bringing their ideas of how to write apps with them), you had Windows-centric enterprises migrating towards writing web sites (not apps!) and bringing their ideas of how to do so with them.
I very much doubt that the web browser war would happen again in quite the same way. Between the availability of open-source browser stacks, the genuine viability of multiple platforms and vendors (iOS, Android, Mac OS X, Windows, and even Linux), and the established web development community (which would have to be co-opted), I don't think it would be quite so easy for someone like Google to 'win' a browser war and then stagnate indefinitely.
There are other options too. My main ask for web devs would be to test on mobile Firefox and/or the Windows Phone browser in addition to whatever WebKit browsers you normally would. Actually, for now, testing on Opera would probably be better than either of those; the general consensus seems to be that they're the best wrt standards.
It won't prevent the monoculture, but it will mitigate its effects and help keep the field open for competition.
The entire article reads as though it's a parody, yet we still have to live with their inferior browser and their clumsy ignorance of what people want.
As it stands, your comment is just... boring.
It was because IE had a HORRENDOUSLY SLOW development cycle with minimal resources committed. It was impossible to get all of those issues fixed in a timely manner. If they can be fixed within a few months, it won't be 5+ years of entrenching people in egregious bugs before a new version comes out that breaks all of those if/elses in everyone's code.
We develop ONLY for Chrome. It (and WebKit especially) has bugs, too, and we report them regularly. But the development cycle is rapid enough that we can put a TODO in the code, file a bug against it, and a few months later we actually get to fix the code because, oh my gosh, it's fixed on Chrome stable.
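As a sketch of that workflow (the bug number, flag name, and fallback helper here are all made up for illustration): guard the workaround, point a TODO at the tracking bug, and delete the branch once the fix reaches stable.

    // Feature-detect the broken/missing capability rather than the browser.
    var DATALIST_BROKEN = !("list" in document.createElement("input"));

    function attachSuggestions(input, values) {
      if (DATALIST_BROKEN) {
        // TODO(crbug.com/NNNNN): drop this fallback once <datalist>
        // support lands on Chrome stable.
        renderCustomDropdown(input, values); // hypothetical in-house fallback
        return;
      }
      var list = document.createElement("datalist");
      list.id = input.name + "-suggestions";
      values.forEach(function (v) {
        var option = document.createElement("option");
        option.value = v;
        list.appendChild(option);
      });
      input.setAttribute("list", list.id);
      input.parentNode.appendChild(list);
    }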
I doubt that Chrome/WebKit will stagnate if it becomes the de facto standard, but I'd rather it didn't have the chance, and I'm constantly amazed that so many comments on HN seem desperate for one browser to rule them all.
It was also closed source, so nobody who found a bug could just go fix it. You had to ask Microsoft, nay beg Microsoft, to fix it for you. And they didn't.
These effects compound one another. The result is that MAJOR bugs become features.
How do you think it got from IE1.0 in August 1995 to IE6 in October 2001? (That included the IE5.5 release as well.)
That rate of development, given the resources Microsoft had to hire the best developers available, is really unacceptable.
The IE team produced 7 versions in six years and there were some very substantial advances, including a new layout engine (Trident). They also did Mac and Unix versions, a mobile version, and a tabbed version for MSN (before IE had tabs).
IE certainly developed a lot faster than anything else on the market in the 1990s, bearing in mind that Netscape took three years to get from 4.7 to 4.8.
Safari entered the market late (2003) and still took the best part of seven years to make it to version 4.
Nobody shipped major versions "yearly".
So we are forced to hack our web sites to make them work in the majority of WebKit-based browsers out in the field.
I think a lot of people project some war between browser vendors that doesn't reflect reality. Speaking only for myself (as a Mozilla employee): I want what I believe to be the best for the web, not for Gecko / Mozilla (luckily Mozilla's entire goal is to do what's best for the web, so we have joint interests).
Since "Mozilla is a proudly non-profit organization dedicated to keeping the power of the Web in people’s hands" then it is able to do things differently.
However, users are largely self-interested and short-sighted (basically "how fast does a tab load?"), so just being morally superior isn't a win ;-)
Of course, it does come down to individuals in the end, and open source has its pointless turf wars, so nothing is guaranteed...
If just Opera switched to it, that would be bad - loss of one implementation is almost always bad - but it would at least maintain some non-WebKit market share in mobile. So less bad but still quite bad.
So yes, I am guessing that Mozilla folks would say it would be negative (I would).
Notwithstanding the other points made, how is rapid adoption of new features, and a competitor's ostensible inability to keep up, preventative of innovation?
EDIT: Another great example is the stunt Intel pulled with AVX to intentionally sabotage AMD's ability to compete in the market, as documented here:
Essentially, Intel published a proposed new instruction format, and AMD said 'that looks great, we'll be compatible with it'. After AMD announced this and had started preparing to ship their new chips, Intel suddenly announced that they had changed their instruction format from what they published - after it was too late for AMD to adapt.
The end result was AMD shipping chips that were incompatible with Intel's despite AMD's best effort. Intel knew that as the majority market share holder, developers would prioritize Intel compatibility over AMD compatibility, and AMD would lose.
But the premise of the article doesn't even require innovation in features. It just requires changes that change behavior that sites then depend on and that you have to reverse-engineer.
And reverse-engineering is very time-consuming and slow. If all possible competitors have to reverse-engineer to become viable, that puts in place a huge barrier to competition.
Note that WebKit already behaves this way in various cases: their transitions draft proposal was very vague (as in, what they described could have been figured out in a few afternoons by someone playing with the functionality and their developer docs) and then the editors (Apple employees, note) did nothing to improve it for a few years, forcing everyone else to reverse-engineer WebKit to implement this "standard"...
In the case of having only one engine, obviously the standard suffers but in the case of multiple engines the developers suffer. Which is worse?
I feel like we as developers have done a good job of mitigating the pain of multi-engine compatibility with frameworks, to the point where it's still better to keep multiple engines and code against the implementations rather than the spec, simply for the sake of accountability.
At the same time, it would be nice to fork everything off the best candidate and unify things, but then we wouldn't have anything to compare it against to know it's STILL the best candidate.
Though this post claims to be talking about WebKit, I see something like this:
> There’s a bug — background SVG images with a prime-numbered width disable transparency. A year later, 7328 web sites have popped up that inadvertently depend on the bug. Somebody fixes it. The websites break with dev builds.
And I have to wonder if they're trying to project IE's issues onto WebKit. I used to have a lot of respect for Mozilla. But now that MDN is stagnating, Firefox is a much less inviting development environment than Chrome (oh, how things have changed), and Mozilla is talking shit at every turn, I think I have to revoke that respect. Good luck, guys.
> Backwards bug compatibility
This is obviously pointed at what IE did after years of maintaining the same bugs that people relied on. The article actually figured out WHY this happened: taking a year to fix a MAJOR issue results in people relying on it. IE was always updated very, very slowly. Nobody else really had this problem, and IE is the one who instilled major flaws as "features" and refused to correct them later on. This very much describes exactly what Microsoft did with IE.
But then, with major handwaving, they try to present that as a problem that will arise whenever everyone uses the same code base. This comes on the heels of Opera deciding to use WebKit, so it's very clear they're claiming WebKit will cause the same problems IE did. This is called talking shit.
The problem is that these problems were caused by a well known phenomenon: taking years to fix major issues. Google has maintained a PHENOMENAL rate of development on their own browser, and they aren't even competing with Mozilla anymore, they're way out in front. Pushes to stable are slower but Canary is updated almost daily. I've seen major bugs and regressions get fixed in HOURS. And I've seen independent players issue patches to fix problems.
This is a picture that is antithetical to IE. It is the POLAR OPPOSITE of the scenario under which this problem initially became so egregious. There is no basis for this claim. And as for the claim that we "NEED" multiple implementations of a standard to find where things are ambiguous: please feel free to peruse the discussion groups at your leisure. Not only are these ambiguities discussed without referencing Firefox, IE, or anyone else, but they're often resolved with changes in the implementation or, rarely, in the spec. This would be an impossibility if we all worked on one code base, clearly.
Sure there is. Consider https://bugs.webkit.org/show_bug.cgi?id=36084 which is unfixed for many years now because of backwards compat issues with non-Web stuff on Mac that uses WebKit.
I can find you more examples if you'd like.
Sometimes WebKit is willing to break compat to make progress, but very often they are not. And if they were not competing with others, I fully expect them to be less willing to break compat: right now they mostly do it when the standards and other UAs force them to.
Look, I am not dissing webkit devs whatsoever. They're smart people doing a great job, and they're getting strong support from their employers because doing a great job really matters. But if we end up in a webkit monoculture, I imagine much of both the intrinsic and extrinsic motivation will disappear. You're running a for profit corporation. Should you continue pouring resources into a game you've already won, or should you shift them towards something else that's going to impact your bottom line? That, and only that, is the connection I would make with IE6.
If that's talking shit, then fine I'm talking shit. But I don't think it's unfair to expect webkit's caretakers to behave like rational humans.
As for needing multiple implementations to find problems, I won't challenge the assertion that people are uncovering and resolving ambiguities on discussion forums, without needing multiple implementations. But how many are found this way? Surely you would agree that some problems are found by trying something out, having it not work, testing it in a different browser, and seeing it behave differently? I assert that many, many problems are found this way. I further assert that ambiguities don't really matter to people if all browsers behave the same way - until you need to do something different (eg make something faster or add a new feature), at which time those ambiguities suddenly become critically important. Enumeration order of JS properties comes to mind here. The Web came to depend on creation order despite it not being specced. But what about indexes? What if you have some of both?
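For instance (a sketch; the exact ordering varied by engine and era), mixing integer-like and string keys makes the ambiguity visible:

    var obj = {};
    obj.b = "first";   // string key, created first
    obj[2] = "second"; // integer-like key
    obj.a = "third";
    obj[1] = "fourth";

    var seen = [];
    for (var key in obj) {
      seen.push(key);
    }
    // Many engines yield ["1", "2", "b", "a"]: integer-like keys in
    // ascending numeric order first, then string keys in creation order.
    // Nothing in the specs of the time guaranteed either half of that.
    console.log(seen.join(", "));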
Right, this is Google now, not Google of 2018 or Google in 2030. Nothing prevents Google from saying, oh well this sucks... Let's kill Chrome. Or shift manpower to something else, slowing down rate of development.
Also, I don't understand your point; this argument was about WebKit, not Chrome.
Usability (configuration, ease of migration of configuration, etc): win
Developer tools: win+++++
Speed: win (JS + speed of rendering, though this is a lot closer than it used to be)
System footprint: win
Availability of experimental features in beta: win (though from time to time Firefox does some crazy awesome shit in beta builds, Chrome is more consistent)
I've long said the only browser remotely close to Chrome is Firefox, but for me Chrome is the clear leader. It's unfortunate to see Mozilla behaving this way in public. I take Opera's choice of Chromium as validation of my assertion, and I really believe that if they had chosen Mozilla's software instead of Google's, Mozilla would be fairly silent right now.
Sometimes Chrome is easier to configure (with Firefox you need to enable click-to-play in about:config, while Chrome has a checkbox somewhere in the "advanced" menu), and sometimes it's harder (try configuring a proxy in Chrome).
>Ease of migration
I haven't tested it myself, but I know that Firefox can load your history/bookmarks from Internet Explorer and Chrome.
If you mean migration between installations of the same browser, Firefox Sync is on par with Chrome's sync for me.
Chrome has better tools out of the box, but you can install Firebug in Firefox, which is mostly on par.
You're talking about add-ons? Firefox's add-ons can be way more powerful than Chrome's.
Firefox usually uses less memory than Chrome (but it's more prone to memory leaks).
As for CPU, I don't know about Chrome, but my Firefox installation is currently using about 1% of it.
So in reality you don't develop against standards (alas); you develop against their implementations, and having multiple engines doesn't help with that at all.
But the result is that it's very rare to find cases that are really developed "to standard".
Even worse, though, on mobile right now people aren't even trying to develop to standards. UA sniffing, locking out non-WebKit UAs, and using -webkit-prefixed features even when standard alternatives are available are rampant and purposeful. And when you ask these people to develop to standards, they just laugh at you.
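The pattern in question looks roughly like this (a sketch of the anti-pattern, not a recommendation):

    // Lock out anything that doesn't identify itself as WebKit...
    if (!/WebKit/.test(navigator.userAgent)) {
      document.body.innerHTML = "Sorry, this site requires a WebKit browser.";
    } else {
      // ...then use only the prefixed property, even where a standard
      // equivalent exists; other engines would ignore it anyway.
      document.body.style.webkitTransform = "translateX(100px)";
      // The standards-friendly line that rarely gets written:
      // document.body.style.transform = "translateX(100px)";
    }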
A quick example: look at document.cookie. The docs are still referring to DOM Level 2 information, even though it's undergone some (slight) changes in HTML5. Nothing newer is referenced, even though that section of HTML5 is relatively stable and marked as safe for implementation. That's a 2000 spec versus a 201x spec, and nobody's even gone in and pointed out that this changed in HTML5.
For example, I had no idea document.cookie got changed in HTML5, and I bet neither did anyone else involved.
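For anyone who hasn't looked at it lately, document.cookie is exactly the kind of oddball API whose corner cases a spec needs to pin down. A quick sketch of its asymmetric read/write behaviour:

    document.cookie = "theme=dark; path=/"; // assigning sets ONE cookie...
    document.cookie = "lang=en; path=/";    // ...and does not clobber "theme"
    console.log(document.cookie);           // "theme=dark; lang=en"

    // There is no built-in lookup by name, so everyone writes one of these:
    function getCookie(name) {
      var pairs = document.cookie.split("; ");
      for (var i = 0; i < pairs.length; i++) {
        var eq = pairs[i].indexOf("=");
        if (pairs[i].slice(0, eq) === name) {
          return decodeURIComponent(pairs[i].slice(eq + 1));
        }
      }
      return null;
    }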
For some weird reason this was removed from the front page in short order. I've seen submissions with fewer votes and less discussion stay on the FP much, much longer.
In fact, I think that a single OSS project is far more efficient than attempts at standardization across competing products when it comes to user-beneficial innovation.
Look at the open-source UNIXes. Nearly all the value-add has come from cross-pollination of "proprietary" and not-yet-standardized enhancements, which are consumed by users and application vendors targeting those platforms.
Being open source doesn't change the fact that it's the same codebase.
Why should there be only one browser engine? I'm a web developer and I hate cross-browser testing/compatibility, and I prefer to use Chrome for its devtools, but I would hate for there to be only one browser in the world.
Duplication is not equivalent to standardization.
In the mobile space, users are at the mercy of whatever WebKit version gets integrated into a specific OS release.