I'm not sure the engineers realized that, despite their secrecy, it would be noticed by the press immediately after deploy.
But the best part is how Google engineers, immediately on seeing it, figured "oh yeah, we should do that too" (and although they apparently got the necessary approvals, however that was done at Google, it was easier to do because they figured "well, YouTube must have done due diligence before doing it").
I don't know how they didn't all get fired. Like, ALL of em, including everyone who set up the special "OldTuber" priv long before.
But... it worked! This is a hacker story for the history books. It sounds like the kind of thing programmers did 20+ years ago for nothing except the reward of doing it right (against their own career interests), which I feel doesn't happen so much in a more professionalized industry.
I can think of 2 reasons:
1. The gamble paid off. It turned out the time was ripe. Thus, there was really no negative impact on the organization that people had to be punished for.
2. The OldTubers were the experienced backbone of the group. Firing everyone with that privilege would likely have destroyed the group's technical expertise. YouTube as it was would have died.
That latter point is a bit of a strong negotiation position that many engineers don't realize they have. For example in my team we're having a lot of difficulty filling job openings (we're picky and frankly our interviewing pipeline sucks). As such, there's a significant cost/benefit calculation to firing someone. Maybe the person did something egregious, but is it likely to happen again? I know some people in my team are... not great, but are they so bad that we would be better off without them? (I must reluctantly admit, no)
When I started reading the story, I was afraid it was going to go in the direction of sneaky sabotage, like intentionally causing performance problems or crashes. There have been accusations of that happening within the last year. Earlier, there was a long history in the computer industry of sneaky sabotage of competitors, suggested by the (perhaps only a joke, in that particular case) mantra "DOS ain't done till Lotus won't run".
You mean like they seem to be doing to Firefox right now?
Ever heard the story of Ken McElroy? He was a rapist, burglar and child molester murdered in 1981 in Missouri.
In front of 46 witnesses. Nobody saw anything.
"He had come in on an otherwise normal day to find email from every major tech news publication asking why the second largest website on the planet was threatening to cut off access to nearly a fifth of its user base. Fortunately for us, the publications had already settled on a narrative that this was a major benefit to the Internet. By their call, YouTube was leading the charge towards making the web a faster, safer experience for all of its users."
Go with the Zeitgeist, and you'll have a lot of wind in your sails.
That seems extreme.
I've worked in some organizations where the assumption is that engineers aren't capable of making product decisions. They are merely walking, talking machines that are there to implement the vision of the PM and designers.
In other orgs engineers are very much in tune with how product decisions are made, what the business is optimizing for, and what constraints they need to be aware of. They will generally work with the product owners to shape the user experience.
I'm guessing that YouTube was much more in the latter category.
>They will generally work with the product owners
is exactly what the engineers didn't do in this story. The company was completely blindsided.
More importantly, the EU's decision on bundling IE with Windows came down in 2007: https://en.wikipedia.org/wiki/Microsoft_Corp._v._Commission
The EU has continued pursuing antitrust cases, and the FTC has now set up a task force to investigate antitrust in tech. The main problem is that the wheels of the legal system turn very slowly, much slower than the startups rise (and fall).
If you choose Firefox or Chrome it never suggests that you should switch to Edge.
Microsoft’s lawyers are not stupid.
Even the UI for changing the default browser in Settings heavily encourages you to set Edge.
Try something like EdgeDeflector (https://github.com/da2x/EdgeDeflector) for redirecting those links to your default browser.
With Win10 Enterprise I’ve never encountered pop-up nags for Edge when changing the browser. Links are never redirected into Edge, even from other MSFT products.
This is on like three different Win10 machines.
I’ll have to ask the help desk guys what, if anything, they do to our corporate images. To me it looks like plain-vanilla Win10 Enterprise when deploying from our WDS server.
A lot of that input will be garbage, some of it seriously laughable in a good way, and a few gems.
Why not get them?
Imagine working at a medical or financial company and having secretive 'old timer' permissions that basically backdoored the company's engineering processes. Even if the engineers saw a good reason to do so.
I mean, who are we to say that this was doing it right? It turned out to be beneficial but there are thousands of ways that this could explode in your face if you tried it yourself.
No medical or financial company has the vibrant engineering culture that the web startups of 20 years ago had.
Everything was new and exciting, and things moved forward exactly because of the flexibility of all involved, the uncertainty, the flat hierarchies, and the fact that many rules either didn't exist or only existed on paper.
This is what happens every time a new market emerges, where structures need some time to settle.
Once they have settled down, procedures become standardized, but they also become so boring that all innovation is lost.
What happened at YouTube 12 years ago is exactly what happens right now at some cryptocurrency startups. The startups that are successful are exactly those where people like this YouTube engineer do not get fired. If YT had had such an authoritarian attitude in 2009, they would never have become successful.
It's easy for the blog author to say "I have no idea why we weren't fired" but I bet back then it was all fun and games. (Getting fired isn't even that bad, what really harms people is when everyone around them starts to tell them things like "You could have ruined everything!")
And something similar to the excitement and fluidity within the YouTube of 12 years ago happened 80 years ago in pharmaceutical companies, or 200 years ago within banks, when such structures were still freshly formed.
> I mean, who are we to say that this was doing it right? It turned out to be beneficial but there are thousands of ways that this could explode in your face if you tried it yourself.
How would this explode in your face, exactly? What was the risk here?
The risk could be that a competitor could step into the market and use "legacy support" as a selling point for their platform. YouTube is weird, especially at that time, since it didn't really have any movers and shakers in terms of either consumers or producers. But if they were selling software that corporations legitimately use (for instance, Vimeo offers good streaming video embeds that do "things" to try to restrict the video to a limited audience), then a company using the service to host its streaming videos might be motivated to shift to a competitor that won't artificially limit which users can actually use the third-party plugin.
Again, it's a bit hard to say here with YouTube in particular, because it wasn't a place where work happened at that point, so none of the consumers wielded significant power. Honestly, YouTube may have been fine just saying "Welp, 18% of our user base is forcing us into supporting janky old browsers - guess we're losing 18% of our user base and saving a bunch on code maintenance."
Now there are streamers that could legitimately hurt the company by threatening to jump ship; it's economically weird but beside the point... just, back then there was nothing.
All that said... I had to work with ie6, it is terrible, and these people are heroes.
The most awful thing that could happen to 2006-era YouTube: they could stop delivering ad impressions for an hour or two. Oh wait, no; 2006-era YouTube didn't even have ads yet.
That said, the people with this access probably knew WTF they were doing.
And it sounds like it wasn't just DOS; plain "it doesn't work in IE6" seems the most likely outcome, and the crashes and DOS the article describes just sound like the most egregious cases.
They got lucky, IE6 in that sense was ready to die, and they basically declared mutiny with good cause.
Can you imagine any other scenario where insisting your idea is correct above every alternative, to the extent you can bypass the organisation and enforce your idea, could be utterly catastrophic? These developers in particular set up a system to subvert their own company.
They even said it themselves: Legal shit their pants because they tried it and saw Chrome as the first suggestion. They were lucky that the media didn't capitalise on that, assuming they saw the same ordering.
This story with Youtube and IE6 is romantic. It's not realistic or aspirational. Besides which, we would not hear of it if the attempt failed.
Ok, and if IE6 had not been ready to die, what kind of catastrophe would have ensued?
> Can you imagine any other scenario where insisting your idea is correct above every alternative, to the extent you can bypass the organisation and enforce your idea, could be utterly catastrophic? These developers in particular set up a system to subvert their own company.
I can imagine catastrophes. But they're mostly the sorts of catastrophes that are bad for an organization, but great for a civilization.
> This story with Youtube and IE6 is romantic. It's not realistic or aspirational. Besides which, we would not hear of it if the attempt failed.
Something we don't hear about if it fails, but produces a great social benefit if it succeeds seems like exactly the sort of thing we should be doing.
I think the difficulty in seeing a catastrophe here is that YouTube was not valuable to businesses, and when your income source is more distributed (no large contracts) there are fewer chances to really shoot yourself in the foot. You have to piss off all your customers in a way that, really, you should probably see coming.
I don't mean to deride the service, but it didn't have any commercial tie-ins worth mentioning at that point; stuff like professional streamers and hosted partnerships with IP holders came later. At that point it was interesting to pop culture but not a tool for making money.
Given that IE6 was one of the worst pieces of software in all of human history, there was never likely to be a significant downside.
The problem with IE6 was Microsoft declaring victory in the browser war with Netscape, after which IE6 development stopped for years, until Firefox and Chrome threatened Microsoft's browser dominance.
For one thing that actually happened because they did this (and got away with it): YouTube engineers feel like they can leverage their site to dictate what they want in the browser space, by deliberately making it 5x slower on Firefox and Edge.
As a rule of thumb, if something looks fishy and shady, then even if it doesn't directly explode in your face, treat it as a "but I don't have anything to hide" situation, and be against it. Because eventually, it will be your face.
There's no evidence that that was deliberate. These two things are also pretty completely unrelated.
Have you heard about Kerviel? A trader who cost his bank a few billion euros? He was able to do this because he could bypass reviews by his boss and the internal compliance team.
Edit: to be fair, it looks like he took the blame for all the failings of the company.
Can't we presume that all of this setup happened because everyone involved knew perfectly well that it was just a moderately popular video hosting site? And that they would all have behaved differently if they knew that people's life savings and medical histories were at risk? Did it go out of style somehow to presume that people are mostly sane most of the time?
Although... the number of people involved in the "OldTubers" backdoor seems to have been pretty substantial, I don't know if they could really have fired all of them...
But who is anyone to say if doing anything is "right"? We're all equally qualified to have a moral compass.
(Which is to say, with more generality: people who get acquihired are basically temporarily invulnerable to office politics.)
- 'OldTubers' vs. the new Googlers is a clear us v. them thing. Bad for culture, especially when you get privileges on top.
- Their plan was actually successful so there must have been a thing or two to learn on the product side, even if it's as simple as accepting engineering input and understanding the difficulty of maintaining stuff (as opposed to relying on the analytics).
- There's some good thinking and ingenuity in there, despite the methodology.
I'd want to keep hold of that talent while dealing with burgeoning elitism. Up to them to quit if they don't like the changes. Fire them if their contribution begins to suffer.
Formal warning, perhaps, if it actually contravenes a written policy, but in this case there is unlikely to have been one. There was no actual financial or reputational loss to the company.
A reality where we would still be supporting IE6 today, because so many businesses still use it, seems realistic to me. Maybe we would have developed a facade for it...
These brave actions of the OldTubers saved us a ton of effort.
Hopefully somebody from Oracle sneaks Clojure into the JVM!
That's really blowing it out of proportion. It's a minor button addition to the UI, and they took care not to be sued by the European Commission for anti-competitive behavior. It's a job well done, with nothing to criticize.
It's a BAU example of how work gets done in a large bureaucratic organization.
>The next morning, Poirot discovers Ratchett was murdered during the night, having been stabbed a dozen times. [...]
>With the train back on track, Poirot concludes that justice is impossible in this case, as Cassetti deserved death; for the first time, Poirot will have to live with a lie and imbalance.
Because this is how everything on the web used to be done. Google won the internet partly because they moved away from this first and best, but product decision making by engineering and overriding approval processes in the name of a good outcome were pretty much standard back then.
Compare with ex-Firefox VP Johnathan Nightingale's recent thread about Google "amateur hour" and "oopses" that only affected Firefox:
It's sort of the reverse of "nobody ever got fired for buying IBM": nobody ever got fired for refusing to implement an accessibility measure that not even IBM (or Google, in this case) bothers to implement. Unless you're in a very specific sector (in which case you know what your obligations are), your legal duties will always be a strict subset of those of $BIGCORP; so you can use $BIGCORP's shirking of a particular duty as a heuristic to determine whether you can safely shirk that same duty.
Are you suggesting that a new browser that isn't out of beta should be less likely to weirdly break on some websites?
Now, why they're still sniffing user agents instead of doing feature detection is a good question for Google, who themselves, to the best of my knowledge, push feature detection rather than user-agent sniffing as best practice. Do as we say, not as we do?
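For contrast, here is a minimal sketch of the two approaches the parent comment contrasts. The function names and the `indexedDB` check are illustrative, not anyone's actual production code:

```javascript
// UA sniffing: match a pattern in the user-agent string. Fragile,
// because new browsers ship, spoof, or lie, and the regex goes stale.
function isOldIE(userAgent) {
  return /MSIE [1-6]\./.test(userAgent);
}

// Feature detection: ask the environment whether the capability
// exists, regardless of which browser claims to be running.
function supportsIndexedDB(env) {
  return typeof env.indexedDB !== 'undefined';
}
```

The second form keeps working when a brand-new browser (or a spoofed UA) shows up, which is exactly the kind of breakage the sniffing approach invites.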
Note: I think Safari has since fixed IndexedDB?
It's one of the reasons I stopped accepting lawyers and law firms as clients. Every other client would pay on time. Those in the law field would consistently slow-pay, I believe because they knew they wouldn't get sued over it.
"That overdue invoice? Oh, we never received it. Send it again." "Oh, Jenna in accounting must have it, but she's on vacation for two weeks." "Oh, that's in process." "Oh, we still haven't received it." "Oh, you have a signature on a certified mail delivery? It must be upstairs for approval." "Oh, we've already run all the checks for the month, it'll be in next month's batch."
I eventually went to a pre-paid-hours, payable-by-credit-card-only model. It was the only way to stay afloat. Plus it was delicious when someone would use up all their hours and give some excuse.
"Oh, we really need this on the site today." "No problem. I can do that as soon as you buy more hours."
"Oh, we'll get that paid next week." "No problem, your website went offline an hour ago and will be back when you buy more hours next week." Poof! payment comes through three minutes later.
Maybe my population is skewed: two of the three doctors I see are Eagle Scouts. But not everyone is a cheapskate.
Wow only _30_ visits to the doctor and you get a $10 gift card? I have never been to an orthodontist but how often does one typically go? 30 times a year? Are those visits free?
On top of that, you have to do free advertising (t-shirt) for him, plus acquire some useless trivia knowledge... I hope the questions are simple enough for people to always get right? This would just be pure frustration for me; I have learned over the years to literally dread trivia "opportunities".
And the best part... you can otherwise donate your paltry gift card to help him pay to go on vacation. Albeit one spent helping people, but you said vacation, so I'm guessing he works a couple of days.
> Are those visits free?
> On top of that, you have to do free advertising (t-shirt) for him,
> plus having some useless trivia knowledge...
The trivia questions were pretty easy: "what is the outer layer of your teeth called?", "what is the capital of $state?", etc. It's not that big of a deal.
something something vacation
The best you can do is test the nightly or beta track all the time (if they have one). But that may have false positives due to bugs on their end. And once you notice the bug, you still have to fix it. This isn't something to drop everything for like a security bug, so it may have to wait on other things.
Meanwhile, people outside the company will assume it's intentional sabotage, because that's how the Internet thinks these days.
Google did an "oopsie" and blocked Firefox users from using Gmail
I.e. something which Hanlon's razor should remove, but actually still exists.
It doesn't matter if they were intentional. The author mentioned there may have been hundreds of these "oops" events. Assuming it wasn't intentional, Google knew there was a problem and chose not to fix it.
"Fool me once, shame on you. Fool me twice, shame on me."
Or in Google's case, "fool Mozilla a hundred times..."
IE6 was a browser of the same era as Netscape 4, when it was common and unsurprising for a CSS error or a malformed bit of HTML to crash the browser.
The idea that YT or any other Google product team would justify not running CI against IE6 or Firefox is absurd to me.
Can you elaborate? I kind of agree on the Kubernetes part in that it's really biased towards Google-scale, but I fail to see such biases in Go, despite having used it as my primary language for 4 years now.
Web developers the world over hated dealing with IE6's lack of compliance with established standards. Microsoft even had newer versions of IE that were better. Everyone wanted this; it's just that nobody was willing to do anything about it.
This is the world we live in. Even with a near worldwide consensus on what needs to be done, nobody is willing to say it for fear of repercussions. People would rather jump through hoops to support the status quo than risk backlash.
Another problem is that the people who would normally have to approve a change like this are not the people who had to jump through the hoops to support IE6.
The ‘rest of the web’ had these banners for a long while before Google ever got around to adding it. My websites (and anything I worked on) certainly did.
No OMTP, no OMTC, no hardware video acceleration, no smooth scrolling, a buggy dark theme, etc., so Firefox on Linux is not as polished and/or performant as on other platforms. The excuse for that was that Linux has a small market share, so it is not worth putting more resources into solving these problems (some of them have had bug reports open for over a decade!).
Well, Firefox itself also has a small market share; it is in the single digits now. The irony is that they actively used this as a rationalization for what they do, but do not like it when others use exactly that reason on them.
I'm not talking about the validity of the rationale itself; just about the irony of finding them at the bottom of the hole they helped to dig.
If you try doing much hardware graphics (let alone game development) for Linux end-users you'll understand why features like OMTP or OMTC are late to ship on that platform, if ever.
Even on Windows some new stuff like WebRender has to be manually opted in on specific driver versions and architectures one at a time, because it turns out video drivers are really bad and if you ship early your app will crash.
No, they are not. A massive subset of Firefox devs are Apple users. For Linux, the occasional Redhat or SuSE dev does the most-needed fixing.
> Those features are missing largely because they are hard to implement reliably on Linux and customers (who are largely not Linux users) won't feel the impact.
Neither of these is true. It does make rationalizing Linux mistreatment easier, though.
If there were no impact, the equivalent functionality would not need to be implemented under other OSes either. Bugzilla is full of feedback showing that it does have an impact.
It also makes Linux itself look worse, compared to other systems.
> There are often flags you can set to try and use them on Linux, YMMV.
I know that you can force OMTC, for example. I've been doing that for years, with zero bugs. Why can't Mozilla do that by default? They haven't reviewed the situation for years; that's how much they care.
Also, for example, the EGL situation (needed for Wayland; obviously Wayland apps cannot use GLX without X11) was purely a Firefox problem, not a driver-stack one. (Worked on by a Red Hat dev.)
> If you try doing much hardware graphics (let alone game development) for Linux end-users you'll understand why features like OMTP or OMTC are late to ship on that platform, if ever.
Mesa is one of the best-of-breed driver stacks, especially compared to some drivers you can find on other platforms. See here: https://code.blender.org/2019/04/supported-gpus-in-blender-2... for what Blender devs think about the driver situation.
> Even on Windows some new stuff like WebRender has to be manually opted in on specific driver versions and architectures one at a time, because it turns out video drivers are really bad and if you ship early your app will crash.
I'm fine with incremental enablement. I'm not fine with the enablement never arriving.
And again: the original post was about the irony of the situation Firefox finds itself in; exactly the same excuses are used against them.
On the other hand, the above-mentioned features could work if Firefox implemented them, as it did for other platforms. Other applications on Linux are able to use the needed APIs; why can't Firefox?
Web apps and Chrome at Google are made by entirely different teams. Unless there is a mandate from the top to the web teams, it is irrelevant. They might as well be different companies.
> Linux just gets less priority because it's quite frankly irrelevant as a Desktop OS.
That's exactly the point I was making: Firefox is also getting less and less priority, because it is losing its relevance as a browser.
Except now Mozilla doesn't like exactly this same argument.
There's no other explanation for Mozilla's attitude towards Linux than negligence; objectively, it is neglected. The degree of willfulness is up for debate.
We too got fed up with all the IE6-specific hacks we had to maintain. One day, on the login page, we added an "IE6 might be a HIPAA violation, please upgrade your system" banner. It was technically true... the browser was well past its end-of-life support and was accumulating a running list of unpatched security holes.
Our analytics showed the remaining holdouts upgraded their systems over the next few months.
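A sketch of the kind of login-page check the parent describes; the banner wording, the element id, and the crude UA test are all made up for illustration (a real deployment would likely do this server-side):

```javascript
// Return warning-banner markup for an end-of-life browser, or an
// empty string for everything else. Matching "MSIE 6." in the UA
// was the blunt check of the era.
function ie6WarningBanner(userAgent) {
  if (!/MSIE 6\./.test(userAgent)) return '';
  return '<div id="eol-browser-warning">' +
    'Your browser is past its end-of-life support and no longer ' +
    'receives security patches; continuing to use it may be a ' +
    'compliance risk. Please upgrade your system.</div>';
}
```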
Unfortunately, that attitude in healthcare leaks to things which _are_ connected to the internet, and you get disgraceful incidents like the hacking of Britain's NHS in 2017.
Years ago, we even tried turning it into an intrusive pop-up for a percentage of users. They just clicked through the pop-up, presumably without reading it.
I wonder if it worked in this case because it started a movement?
> Between YouTube, Google Docs, and several other Google properties posting IE6 banners, Google had given permission to every other site on the web to add their own. IE6 banners suddenly started appearing everywhere. Within one month, our YouTube IE6 user base was cut in half and over 10% of global IE6 traffic had dropped off while all other browsers increased in corresponding amounts. The results were better than our web development team had ever intended.
I think this is the answer. Most people at the time weren't on IE6 by choice. The fact that YouTube (or more likely Google Docs) had this banner gave disgruntled employees a better excuse to force IT to upgrade than "I don't like this browser".
I guess they can use this as an excuse not to upgrade their shitshow Flash app that brings a multi-core processor to its knees just navigating between entry fields.
ADP has always been awful, I only wish they would get taken out by a competitor.
Thank god, because we just switched to ADP, and then there’s VMware...
Chrome's definition of "always" is very different from everyone else's. And that's on purpose. But I have no say in what corporate app I can use, and I have no way of getting it migrated quickly. The Chrome devs seem actively hostile to the poor slob who must use the corporate app and has no way of migrating away from it.
It was continuing a movement and giving it visibility beyond web developers. The YouTube banner happened mid-2009; web developers around the world had soured on IE6 and had been actively trying to kill it off since 2005~2006.
What the youtube banner provided was a way to finally get the word out, and an argument to give PHBs: it became much easier to sell browser upgrades (or not supporting IE6 anymore) when you could point to Youtube / Google and go "these folks have put their foot down".
As a web developer at the time, I could put that in my proposals, saying “it’s the standard Google browser support policy”. At which point clients signed off on not needing legacy browser support.
The perspective on outdated (insecure) browsers is becoming more broadly accepted.
Ahh yes, the era when far too many web developers gleefully gave the finger to anyone not running Windows. I'm so glad those days are over.
When it was at >90% market share, a lot of people did that, though.
For some reason they think that IE with very expensive tools to lock it down is more secure than a modern browser.
Our clients are businesses (fewer than 10,000 employees), and we still have 30% of our user base on Internet Explorer. We will soon be dropping IE9 and IE10 support.
I suspect that we'll be supporting IE11 for another two years or so (even considering Windows 7 expiry!).
We installed Firefox 4 and it ran at a perfectly fine speed...
Over the last week we hit the jackpot at work, in that we had errors come into Rollbar caused by every version of Internet Explorer we don't support all the way back to 6. Yes, there was a little celebration in our dev team when we scored our first IE6 exception. :)
Ultimately, I think top-down change at enterprises is much faster than bottom up, and a national news story is just the sort of thing to spur a lot of top-down changes.
Don't underestimate the power of employees to influence employers.
iPhones didn't end up all over the enterprise space because Blackberry walked away from the market.
My guess from my time in the trenches is about 1 in 5.
That's what you train users to do when you show them intrusive popups that aren't actually important.
For example, tomorrow Google can implement a DRM that would require a plugin that works on Windows, Android (with Google Play Services) or Mac, but not on Linux. After all, Linux is not a DRM-friendly system (allowing the user to hack anything is not what copyright holders want), and almost nobody uses it on desktop, so why bother supporting it? Or Google can use it against new, not yet very popular browser, to slow its adoption.
I was reading your comment and wasn't sure if it was well-hidden sarcasm or you seriously don't know this already happened:
Basically, you already cannot watch high-resolution streams on Linux, even in Google Chrome, because of the DRM that Google, among others, helped push into the HTML specification.
The problem with what Google, among others, did is that they were actually in a position to push against making something like that part of the web specification. Yet they went all in and, what's worse, became the gatekeeper for new web browsers wanting to implement Widevine DRM.
In the same way, Google might push for a more open Android ecosystem, but it's not in their interest, so they did the opposite. And with all the Chrome-exclusive features and AMP, it's pretty much the same thing happening.
My Firefox is up to date (60.6.1esr), and Skype Web tells me it's unsupported and refuses to work.
AFAIK Web Skype no longer works in any version of Firefox.
I smell an untold story... maybe one of the other teams' banners was accidentally visible to IE7 users as well? Or did IE7 sometimes spoof IE6?
IE8 was the included browser with Windows 7, which was generally well received compared to Vista which included IE7.
The timing lines up within the release window of IE 8 (March 2009) and Windows 7 (July 2009), so potentially a lot of upgrade push from other things in the air at the time.
Edit to add: people on IE7 because it came with Vista probably had a more capable/pushy update system than those on XP with IE6; but I don't remember the details of how Microsoft system updates worked at the time.
The people on Vista had the pushiest update system of all: Vista was terrible, and most users (especially organizational users) switched to Windows 7 ASAP.
It was a terrible piece of work and I was glad when it never achieved any significant market share.
IE6 didn't have tabs; IE7 did.
IE8 did have a really bad bug around the JSON.parse implementation though.
I then used parse2 in everything at that time. In the end, it sucked, and I'm glad I don't have to do it anymore. IE8 was what I would consider the last bad version of IE... since then it's been mostly okay on release, but it ages rapidly; there was enough turnover that you weren't on a version too long. Until the current IE11, as it's tied to Windows releases, and some people/orgs didn't upgrade.
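The json2-era defensive pattern alluded to here can be sketched like this; `makeSafeParse` and the probe string are my own illustrative names, not the actual library code:

```javascript
// Probe the native parser once with a known document; if it throws or
// returns the wrong shape (as buggy implementations did), hand back a
// shim parser supplied by the caller instead.
function makeSafeParse(nativeParse, fallbackParse) {
  try {
    var probe = nativeParse('{"a":[1,2]}');
    if (probe && probe.a && probe.a.length === 2) return nativeParse;
  } catch (e) {
    // native parser missing or broken; fall through to the shim
  }
  return fallbackParse;
}
```

Something like `var parse = makeSafeParse(window.JSON && JSON.parse, json2Parse);` (where `json2Parse` is a hypothetical shim) would pick the native parser only when it demonstrably works.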
Corporations don't have single agendas, they don't think with one mind, they can't be simplified to a single narrative.
Rather, they're collections of thousands of individuals, each doing their own thing, and the CEO is trying (and often failing) to herd the cats in a single logical direction.
Plenty of good things (like this) can come out of it. But also plenty of bad things, like security breaches, anticompetitive behavior, and invasions of privacy.
Whenever anyone says "because Google always does <x>" or "Google is always like <x>", a story like this is a great antidote.
(And now we’re back where we started with Chrome as the new IE.)
Except for the part where you get BSoD and infinite recursion or any of the other stated IE6 nightmares...
Really, the only thing Chrome and IE have in common are market share. Chrome is magnitudes better than IE in every other way as far as being a web dev goes.
Easy to forget that now.
And Firebug and Chrome DevTools copied it almost verbatim. I've never seen any sort of acknowledgement of just how much the community owes to that early tool. The announcement blogs about it were still accessible a few years ago.
I realise that's not something a lot of people in a commercial job would have had the luxury of doing.
I don't know if I'd ever want to target IE at all (I'd prefer to pretend it doesn't exist), but I do like to test stuff on niche browsers (NetSurf, Dillo, lynx/links/w3m) and see how usable the site is.
It was actually blazing fast, far, far faster than the JS implementation I tried (instant vs. 20 seconds). While XSLT was hard to develop in and perhaps deserves the death it's dying, it really was a quite amazing bit of tech.
The Developer Toolbar had a DOM explorer and all that jazz, way more than just a console. It really was amazing when I first used it.
Here are the release blogs:
For web apps, IE6 and anything that came before was always terrible. It used to be hard to build a decent web app, even if most of your code ran on the server. There is a huge qualitative difference between building for an ad-hoc platform and one that is rigorously built and tested against some externally defined standard.
Sure, Chrome may be lagging in some areas, but--like all modern browsers--the platform it provides conforms rigorously to the standards it supports. And that makes all the difference.
For all the pain of document.all, the way IE worked had a few advantages... A particular issue was that layers in NN didn't let you break a form apart into child components you could hide/show. So you had to mirror values in and out of the child-layer portions of the form. That was about the most common and painful difference between the two, imho.
Another was when IE 5.0.0 broke the older API for changing the children of a <select> node. That was fixed in 5.0.1, but 5.0.0 shipped on every Office 2000 and Windows 2000 disc, which is what a LOT of people had. That was painful to deal with too.
My point is that in the IE6 era (and previously), doing so was categorically hard -- I didn't say impossible. Also, that changed with FF 3.5, IE8, Chrome, etc.
Admittedly, IE6 and FF 1.5 were the first browsers I had to develop apps for from scratch. (And FF was already far more robust.)
As far as divergence from the standards, again, that was generally not the case when IE6 shipped - if anything, it was at the bleeding edge, and past it in some ways. The real problem was that it wasn't updated fast enough to keep up with the rapid pace of web standards after its initial release; and when IE7 finally came, it brought only minor improvements (IIRC the biggest one was PNG transparency?). But that was a different era already.
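For context on why PNG transparency was such a big deal: IE6 ignored PNG alpha channels, so pages of the era swapped in Microsoft's proprietary AlphaImageLoader filter per element. A rough sketch of that workaround follows; the helper name is made up, the filter string is the real IE syntax, and a stand-in object replaces the DOM node so the sketch runs outside a browser:

```javascript
// IE6-era PNG fix: hide the broken native rendering and hand the
// image to the proprietary DXImageTransform filter instead.
// (applyPngFix is a hypothetical helper name, not a real API.)
function applyPngFix(element, src) {
  element.style.background = 'none';  // suppress the opaque-grey PNG
  element.style.filter =
    "progid:DXImageTransform.Microsoft.AlphaImageLoader(src='" +
    src + "', sizingMethod='crop')";  // let DirectX composite the alpha
  return element.style.filter;
}

// Stand-in for a DOM node so the sketch is runnable anywhere:
var fakeEl = { style: {} };
applyPngFix(fakeEl, 'logo.png');
```

In a real page this had to be applied to every element showing an alpha PNG, which is why IE7 doing it natively felt like such a relief.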
MS IE 1 was developed for Windows 95 Plus!, MS IE 2 for Windows NT 4.0, MS IE 3 for Windows 95 OSR2 (essentially a new OS; it even included its own brand-new filesystem!), MS IE 4 for Windows 98 (but was rushed out with Windows 95 OSR 2.5), MS IE 5.0 for Windows 2000 (but was included as part of Windows 98 SE), MS IE 5.5 for Windows ME (which wasn't initially planned - that's why it got the 5.5 number, not 6), MS IE 6 for Windows XP, MS IE 7 for Windows Vista (and since Vista was horribly delayed, so was MS IE 7), and MS IE 8 for Windows 7!
The first version of MS IE which wasn't developed as part of Windows development was actually MS IE 9.
IE2 was the first version available for Windows 3.1, Mac OS, etc. Sure, it was released at the same time as NT 4.0, but I don't think it was developed with it.
The big exception is IE 4.0, which was deeply integrated with the Windows 98 shell (hello, Active Desktop).
Considering the context and era, IE being updated through Windows Updates was actually a huge improvement over the state of the art at the time.
The longer Chrome stays popular, the more websites will optimize for Chrome, bringing back the days of "Best viewed in Internet Explorer". And as more and more sites become accessible only in Chrome, user share tilts further in Chrome's favor, and so on in the vicious cycle we experienced in the browser dark ages.
That's the fear, at least. I don't think anyone is concerned that Chrome is going to start BSoDing Windows anytime soon.
The omens of monopoly are certainly here, though. All of Google's websites, which constitute a large percentage of web traffic, have been a "best viewed in Chrome" affair for a few years now. Microsoft recently dropped support for Firefox on the Skype web client. And I've seen my fair share of niche/corporate/etc websites that are Chrome-only.
(FYI, I'm not personally worried that we'll see the same browser dark ages as before, but it's something we should be vigilant against.)
I'm sure we'll find that a Pareto distribution is an unavoidable market phenomenon, but I still believe it's healthy for technologists to remain skeptical when one client massively outperforms the others.
Not quite the case. Chrome + Extensions create a very OS-like environment, hence not even needing Chrome OS.
Soo, like... you mean: built-in proprietary app stores, random DRM-enforcing blobs with camera/microphone access, phone-home tracking behavior, and gobs of memory use?
Side note: I'm a Firefox user, not an IE user.
Google Meet does block me, though...