This is attacking a straw man. I'm not worried that WebKit isn't going to continue to improve. I'm more worried that WebKit is going to dominate so much that WebKit's implementation quirks result in de facto standards. Take the HTML Hard Disk Filler from a few days ago: the reason that WebKit can change to fix that bug is that the Web doesn't depend on the semantics that WebKit implemented. If Web sites relied on subdomains' quota not counting toward the parent domain's quota, as WebKit implemented (contrary to the recommendations of the spec), then that security issue would be much harder to fix without breaking sites.
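For illustration, here is a minimal sketch of the quota accounting involved. The numbers and function names are hypothetical, not WebKit's actual implementation; the point is only that per-subdomain quotas let total usage grow without bound, while the spec's recommendation of counting subdomains against the parent domain caps it:

```python
PER_ORIGIN_QUOTA_MB = 5  # illustrative per-origin localStorage allowance

def total_per_subdomain_quota(n_subdomains):
    # What WebKit implemented: every generated subdomain
    # (a.example.com, b.example.com, ...) gets its own allowance,
    # so an attacker minting subdomains fills the disk linearly.
    return n_subdomains * PER_ORIGIN_QUOTA_MB

def total_with_parent_cap(n_subdomains, parent_cap_mb=5):
    # What the spec recommends: subdomain usage counts toward the
    # parent domain's quota, so the attack is bounded.
    return min(n_subdomains * PER_ORIGIN_QUOTA_MB, parent_cap_mb)

print(total_per_subdomain_quota(1000))  # 5000 MB: unbounded growth
print(total_with_parent_cap(1000))      # 5 MB: bounded
```

The fix is easy precisely because no site depends on the unbounded behavior; if sites did, changing the accounting would break them.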
"The proliferation of WebKit will be a rising tide that lifts all boats."
Assuming that no better engine comes along. If the Web starts to depend on WebKit's implementation, then the Web will basically be defined by this large pile of C++ code, presumably in perpetuity. That might be better in the short term, but it doesn't seem like a good long-term bet.
WebKit is also constantly changing, so even the current (small) set of "implementation quirks" that really are 100% identical in all products that use WebKit will not stay the same very long.
Microsoft provided no compatibility library or ease-of-use tooling for IE6/7, while Android's support library makes it really easy to backport, along with third-party tools like ActionBarSherlock and HoloEverywhere. The only thing really missing is going back to 2.2 (with the download API), and that's now < 10% of the market share.
In short, people who claim Gingerbread is the new IE6 are either ignorant of the Android development process or are mostly spreading FUD, and I'm not the only developer out there who agrees. The biggest hassle is really the various DPIs and resolutions, and having to provide resources for 3-4 density buckets (depending on what one supports). Though one has to do that for iOS as well, to a point. Nothing under 480x800 really matters if one is doing 2.2+.
I haven't used iOS enough to say for sure, but isn't Mobile Safari also tied to the OS version in a similar way and can't get updates without the OS being updated?
But you are correct--it is far from the only example. There are major problems with iOS's model, too--once your device has received its last update, you're stuck with its rendering engine forever! Browsers using alternate rendering engines aren't allowed in the store. With Gingerbread, at least people can install a browser with a newer rendering engine.
But in general, I think the "Gingerbread is the new IE6" claims are pretty hyperbolic. IE6 remained such a scourge because so many companies wouldn't upgrade it to maintain consistency and compatibility with their custom, non-standards-compliant internal web applications. In contrast, mobile devices currently aren't kept very long, and there's quick iteration. After a few years, those Gingerbread users still using their devices can get a new browser if they're insistent on using the device, and idevice users will have to get a new idevice. But those users won't be tied to using those devices by corporate policy. At worst, I think developers will have to worry about the Gingerbread browser and older versions of mobile Safari for a few years, not the better part of a decade like with IE6.
This seems very arbitrary. So WebKit isn't a "browser". Is the author saying that a monoculture at another level is ok, but the "browser" level is somehow special and we want to keep diversity there? No, I think we need diversity at all levels.
Speaking of Linux, which is the main example in the article: yes, if Linux were to become completely dominant, that would be a bad thing, even if the author calls it an "enabling technology" and is somehow ok with a monoculture there. I am a huge Linux supporter - I am running Linux right now, my desktop has been Linux for many, many years, and I encourage people to switch to it and abandon proprietary OSes like Windows and OS X - but we still don't want Linux to dominate the OS kernel space.
Thankfully Linux is not doing that. It might dominate the open source kernel space, but there is still Windows Server and OS X Server. And applications written portably can often run on all of those.
Linux is a great kernel, but it has downsides like any software. If everything ran Linux, it would be very, very hard to invent something better than Linux and get adoption for that new thing. The same is true of WebKit.
None of this precludes a changing of the guard in the future. Just ask the gcc guys about egcs and llvm…
Besides, by your argument, LLVM should never have been started, because they should have contributed to GCC. Yet I'm very glad they did, because LLVM is much more hackable and this flexibility has enabled many new projects, like Emscripten and llvmpipe.
See Chandler Carruth's talk "Clang: Defending C++ from Murphy's Million Monkeys". At the beginning between 2:20 and 4:00, he quotes Richard Stallman's response to their proposed changes and demonstrates that using gcc is a non-starter.
| LLVM is still fighting GCCisms
But Intel did have a presence at WWDC when the Intel switch announcement was made. Intel was trying to sell developers licenses to the Intel compiler (as they should).
A huge corporation like Apple will typically be able to overcome the additional effort.
By all means, ask the LLVM people about all the work they had to do to overcome the single-implementation status of gcc. clang must support gcc's arguments and behavior very carefully, and still cannot build all open source projects, simply because so many open source projects - including the Linux kernel, by the way! - have been designed with only gcc in mind.
LLVM managed to overcome that through a lot of effort. LLVM is funded by Apple, a massive multinational and one of the largest tech companies in existence. Not all new projects have that luxury. In an ideal world, you wouldn't need those kinds of resources to challenge an existing implementation.
Both of your examples clearly show that it takes huge resources to overcome a single implementation in a field. That is far from optimal, it means the barrier is so high that innovation is being stifled.
As another example, look at the single-implementation status of Microsoft Office. Despite huge investments and efforts by multiple parties in the industry, it remains essentially unassailable.
The best way to avoid that is to not have a single implementation, but rather to have standards, and to have good open source implementations of those standards.
I think it's more optimal than the alternatives tried so far. You're ignoring the "period of peace" between upheavals during which (almost) the whole world is working together to make something better for everyone. That more than makes up for the difficulty of dethroning (or forking) the king when needed.
Office is a closed-source product controlled by a single company, not analogous at all to WebKit.
But the cost is quite high.
10 implementations might be a lot of overhead. But a monoculture of 1 is too little. 2 or 3 might be an optimal number.
"Monoculture" is a loaded word. The differing priorities that might manifest in completely separate web rendering engines still have plenty of room to manifest when multiple big players are working on WebKit, with nothing stopping any of them from forking if the differences get too large.
(And anyway, Gecko does still exist, after all…)
But we already see problems today from WebKit's dominance on mobile. Non-WebKit browsers have trouble rendering the mobile web, which was designed with only WebKit in mind. It got so bad that Opera just gave up and adopted Chromium (not even just WebKit).
The remaining non-WebKit browsers, IE and Firefox, are left with an even bigger problem and it is even harder for them to disrupt the WebKit mobile web. And it would be even harder for a completely new engine.
So general arguments about cycles and all that might sound good, but we already see the damaging effects of WebKit monoculture (you argue it's a loaded word, but it fits).
Of course, as I already agreed before. There are benefits to centralization.
It's a question of degree, not absolutes. As I said, 10 or 100 might be too many rendering engines, while 1 is too few. 2 or 3 seems, to me, to be optimal, but again this is a matter of degree so others may prefer a little more or less.
Agreed, this is not just a WebKit monoculture issue - plenty of other problems in that area as well, as you say.
But I don't think that you can do the same in browser space. If you want to create a new rendering engine, it absolutely, positively has to render 95+% of most-visited websites from early stages of development (before you "ship" a browser). Nobody would use a half-baked browser that's unable to render most websites. So, you have to also support WebKit's bugs-turned-into-standards.
In other words, you don't compile 500 different programs in a single day - if LLVM can compile the one program that your company is developing faster and better, it's a good fit for you. But you visit hundreds of websites a day. If a new engine renders even 10% of them incorrectly, that's a show-stopper.
So, your choices are to either fork WebKit, or create a new engine that "simulates" most mainstream WebKit engines. Both result in WebKit becoming more and more of a standard.
As SQLite3 is.
Bring back WebDB!
So if someone invented a new kernel that is better than Linux, it would have two problems: the usual problem of getting adoption and interest in a new project, but also the problem of all existing code being designed with Linux in mind.
Whereas today, people generally try to write code that runs well not just on Linux but also on other kernels. Not because they have lofty ideals necessarily, but because there are other kernels.
If we had only Linux, that wouldn't be the case.
This is the basic question of standards. Open source is great - as I said above, I have been a huge supporter for a very long time - but standards are an orthogonal issue to open source, and just as important. Writing to standards instead of the bugs/idiosyncrasies of a single implementation is the only thing that makes it easy for new implementations to show up. And standards are dead when there is a single implementation.
Think of all of Tanenbaum's design decisions that Linus ended up changing. One big reason we know, in detail, what would have been wrong with a Minix monoculture is that it never happened.
I wonder how many war refugees from that era feel the same way. I won an xbox and gave it away to charity because I didn't want it in my house either.
The market has handed Microsoft several monopolies. Many times they've squandered those opportunities, leaving a bunch of pissed off customers in their wake.
I don't really see any evidence that anything has changed recently at Redmond.
Somehow I feel secure knowing someone like Linus is in charge. Maybe because the Linux project isn't maintained by people whose ultimate goal is profit?
And yet I acknowledge that I am closed-minded in my religious support of Mozilla. I have had my bouts of doubt, and most recently wrote about my awe over Microsoft's IE 10 benchmarks. Obviously I want to rationalize the benchmarks as tilted toward IE, but to be honest with myself, I have to admit that IE 10's performance--rendering performance in particular--is quite shocking.
Observing the hardware acceleration of IE 10 on my i7 3770K with a discrete nVidia GPU fills me with regret that I cannot stomach the use of Internet Explorer. I know I am squandering CPU and GPU cycles using a browser that is decidedly less efficient. And simply because I am familiar with my favorite browser's UI and because I like its particular quirks more than the other guy's quirks.
Here is how I rationalize my behavior, though: I love that Mozilla has two competitors. I love that they are being motivated to continuously improve their hardware acceleration (among other things) by attacks on two fronts. I'd like even more competition, but two major competitors will suffice. I feel that the good-natured rivalry between the three major teams is a very good thing.
My fear is that without a sufficiently wide field of competitors, certain areas of innovation will shrivel away. As evidenced by the IE 10 benchmarks, especially those related to hardware acceleration, neither Mozilla nor Apple/Google has to date made hardware acceleration a priority. At least not on the desktop, which is where I do most of my web consumption.
I am hopeful that IE 10's kick in the rear will give them a little incentive to snap out of their complacency. I would love a Firefox build with the hardware accelerated rendering performance of IE 10.
My worry about a monoculture isn't so much that the particular rendering quirks of WebKit will be deemed the Holy Standard of the Web. To a degree, that's already the case, at least on mobile. As regrettable as that is, it's not my particular worry. Rather, I am worried about a monoculture because it inevitably reduces innovation, oftentimes in subtle ways that aren't immediately obvious and that we may not be able to perceive, because the alternate possible course of history is closed off.
If Microsoft were not pushing the hardware acceleration envelope, evidently no one would be. (Actually, to be clear, we'd simply accept the degree to which Google, Apple, and Mozilla are focused on hardware acceleration as a reasonable degree of focus, because there would be no counter-example available.) And we would probably all consider the rendering performance of Chrome and Firefox to be good enough. "Good enough" sucks, as I have ranted at length about elsewhere. Good enough is one of the worst sentiments in technology.
No, it's absolutely not good enough that the background animation of my blog causes lesser computers to bog down to a crawl (go ahead, take a look and post your complaints). It should not be so computationally intensive to do relatively trivial SVG/SMIL animation in a browser. (Irony: IE 10 doesn't support SMIL, so I can't vouch for its ability to animate my background; what I do know is that it makes the section navigation animation look absolutely effortless compared to Chrome and Firefox.)
I fear the loss of competition because of what that means for innovation. It entrenches "good enough," and I hate that.
| I fear the loss of competition because of what
| that means for innovation. It entrenches "good
| enough," and I hate that.
Servo, I am looking at you.
Couldn't help but pause and reread that.
Uhm... not exactly.
"A man who has no fear has lost a friend."
I like that quote, where's it from? My google-fu can't find anything like it.