
The Era of the Trident Engine - ttepasse
https://schepp.dev/posts/today-the-trident-era-ends/
======
arexxbifs
> One part of why Microsoft's ideas didn't really catched on was that we
> developers just didn't get it. Most of us were amateurs and had no computer
> degree.

Personally, I'd say it was mainly because we were tech geeks who, back then,
actually believed in a web that should be accessible to as many different
platforms as possible using as many different browsers as possible, not just
Windows and IE users.

In a sense, that notion prevailed: the web is ubiquitous, browser lock-in is
considered a douche move and you can still access some really worthwhile sites
even with rather simple means.

In another sense, we all failed, lending our hands to help create the current
privacy-invading, dark-patterned UX nightmare we're still trying to pass off
as a reasonable way of making the world a better place.

I never thought I'd lament the day Microsoft stopped making their own browser
core. As a web developer, one less rendering engine to GAF about is nice. As a
netizen, the Google dominance is going from unsettling to scary.

~~~
jackcosgrove
If the web's reliance on the Chromium code proves to be a hindrance, it will
take a monumental effort to correct. Brendan Eich, speaking of Brave, said
something to the effect that it would take hundreds of engineers years to
build a new browser equivalent to Chromium in capability. For a free piece of
software.

I think it's more likely that a new communication paradigm will supplant the
web than that a Chromium competitor will replace Chromium, just as the new
smartphone/cloud computing platforms finally broke the Windows stranglehold.
PCs are still dominated by Windows.

~~~
stilisstuk
I don't get it. Firefox works great, maybe better

~~~
1123581321
Firefox is built on a similar level of effort. Brendan Eich is talking about
the effort required for a company to develop a new, full-featured browser
engine to compete with Firefox and Chromium.

~~~
MaxBarraclough
It may be true that starting from scratch would be impractical, but it might
be practical to take Firefox/Chromium and make deep changes, to the point that
it's effectively another browser.

Mozilla themselves have been doing this with Firefox. They've been replacing
C++ with Rust (via the Servo rendering-engine project). They've replaced the
JavaScript JIT more than once, if I recall correctly.

We've already seen WebKit give rise to two divergent major browsers: Chrome
and Safari.

~~~
acdha
The problem is keeping pace with the upstream: it's not that it's impossible
but that it's very expensive — when Google forked WebKit they dedicated a
large and very talented (i.e. expensive) engineering team to the project. The
same could be done again but you're looking at companies like Microsoft, not
startups.

~~~
MaxBarraclough
The resources needed to maintain a Gecko-based or Blink-based browser will
depend on the amount of customisation. Vivaldi/Opera/Brave are doing fine, but
they make relatively shallow changes over Chromium.

I just discovered Goanna on Wikipedia, a fork of the Gecko engine, presumably
with relatively thin resources. Don't know how well it compares to mainstream
engines though. [0]

I suppose the short version is that the workload is a function of the goals.

[0]
[https://en.wikipedia.org/wiki/Goanna_(software)](https://en.wikipedia.org/wiki/Goanna_\(software\))

~~~
acdha
That’s the point: if you’re not customizing Blink, you’re not changing the
huge influence which Google has over de facto web standards. If you want to
make more than simple customizations you need a significant commitment just to
keep pace with the upstream – Microsoft can afford that, Samsung can, etc. but
it’s not clear that Brave or Opera can.

------
leeoniya
the webkit monoculture is saddening.

just yesterday i ran into chrome's 2016 img/flex-basis bug which works
properly in firefox but requires an extra wrapping div as a work-around in
chrome.

[https://bugs.chromium.org/p/chromium/issues/detail?id=625560](https://bugs.chromium.org/p/chromium/issues/detail?id=625560)
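the workaround looks roughly like this (a sketch of the usual shape of the fix, not taken from the bug report - the wrapper div, not the img itself, carries the flex-basis):

```html
<!-- hypothetical markup: wrapping the img in a div that acts as the flex
     item sidesteps chrome's img/flex-basis sizing bug -->
<div style="display: flex;">
  <div style="flex: 0 0 200px;"> <!-- wrapper takes the flex-basis -->
    <img src="photo.jpg" style="width: 100%;" alt="">
  </div>
</div>
```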

what possible motivation is there to fix it when you're not competing with
anyone?

hopefully microsoft can help fix it now?

also yesterday, i was writing some ui tests that use getBoundingClientRect()
at different media query breakpoints. not only does chrome intermittently fail
to deliver consistent results between runs (even with judicious timeouts), at
different screen pixel densities its rounding errors are _several_ pixels off
and accumulate to bork all tests in a major way. on the other hand, firefox
behaves deterministically across test runs and there's a _single_ pixel (non-
accumulating) error in one of several hundred tests.
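one way to keep such tests from borking (a hedged sketch, not my actual test code - the helper name and tolerance are made up) is to compare rects with an explicit pixel tolerance instead of exact equality:

```javascript
// compare two DOMRect-like objects within a pixel tolerance, to absorb
// the sub-pixel rounding differences described above
function rectsRoughlyEqual(a, b, tolerancePx = 1) {
  return ["x", "y", "width", "height"].every(
    (key) => Math.abs(a[key] - b[key]) <= tolerancePx
  );
}

// plain objects standing in for getBoundingClientRect() results:
const expected = { x: 0, y: 0, width: 300, height: 150 };
const actual   = { x: 0.5, y: 0, width: 299.5, height: 150 };
console.log(rectsRoughlyEqual(expected, actual));      // true (within 1px)
console.log(rectsRoughlyEqual(expected, actual, 0.1)); // false
```

this masks single-pixel noise but still catches chrome's accumulating multi-pixel drift.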

somehow, i made it through the dark ages of IE6 without permanent hair loss,
but i dont have fond memories of those years in my career.

now manifest v3 is starting to roll out in Chrome 80. once uBlock Origin stops
working, i will use chrome even less (i try only to use it for its devtools
currently)

~~~
MR4D
Why does nobody say “the TCP/IP monoculture is maddening,” or “the HTML
monoculture is maddening” ?

Seriously, I need somebody to explain it to me because I don’t get it.

Having one core base of code is not a bad thing to me.

What am I missing here?

~~~
rst
There is no single TCP monoculture codebase -- even among Unix-alikes, the
Linux networking stack is a reimplementation, with no common ancestry with the
BSD stack. (To say nothing of routers with their own implementations, often
including hardware assist at the high end.)

------
shdon
Although I can see the advantages for Microsoft, I don't think it's a good
thing that the browser engine landscape is going to get even more homogeneous.
Firefox is now the only web browser of note that is not based on a
WebKit/Blink/Chromium derivative.

Sure, it's one fewer target to test against, but it saddens me to think that
this is going to make it even more likely that web developers target Chrome
and its ilk only and that the layout bugs in it are becoming the de-facto
standard, just as happened with IE6 for ages. This actually hurts Firefox
even where it adheres to the standards and other browsers don't.

~~~
CydeWeys
It seems to me that having a single pervasive open source web rendering engine
is actually the ideal state of affairs. How is fragmentation helpful in this
area? It's duplicated effort and it makes Web development harder (and less
efficient) for the millions of Web developers out there. I personally don't
see people taking this approach to, say, Linux; generally people are happy
that it's dominant in the server realm and that they can learn one thing and
use it everywhere.

IE used a proprietary rendering engine. It's now being replaced with a free
and open source one. This seems like a strict improvement. It's the opposite
situation -- a single proprietary engine being dominant -- that is the
doomsday scenario, and that's what we saw a decade and a half ago with IE. The
farther we get from that being a possibility, the better.

For related reasons, I'm not happy that DRM has become part of the standard
Web feature set.

~~~
mintplant
> It seems to me that having a single pervasive open source web rendering
> engine is actually the ideal state of affairs. How is fragmentation helpful
> in this area?

Because the Chromium monoculture has allowed Google to dominate the web
standards process. They can veto any feature or force one through by shipping
it and pushing sites to depend on it (including their own).

There is an army of Googlers whose job it is to keep tacking on new web
standards. And Google will implement the features before proposing the specs,
so their competitors—well, now it's just Mozilla and Apple, I guess—are kept
playing constant catch-up. Meanwhile, anything that comes from outside of
Google will have to brave the same army trying to smother it in committee.

Just ask anyone who's dealt with web standards politics from outside of
Google. It isn't fun anymore.

(Oh, yeah, and because there's essentially no accountability now, plenty of
these new features rushed through the door are buggy and introduce security
holes. It's like IE all over again.)

~~~
DaiPlusPlus
Apple’s WebKit is kept in sync with Chromium - whenever Google adds something,
Apple gets it for free within a couple of months - though Apple tends to
disable or vendor-prefix new features it doesn’t like.

~~~
pcwalton
The opposite is true. WebKit is now quite far diverged from Blink. Apple very
much does not keep it "in sync" with Chromium.

~~~
DaiPlusPlus
Ah - my mistake. I was operating on the assumption the Blink and WebKit teams
were exchanging patches regularly.

That's a dang shame then :/

Apple's a big company - but we saw how they mishandled their own first-party
Maps service after divorcing from Google - I can see Apple's Safari
potentially falling behind badly if they can't keep up with Google's work on
Blink.

~~~
emn13
They already are; there are some pretty glaring bugs/missing features in
webkit nowadays.

In fact, I can't think of _any_ webkit developments that positively surprised
me the past few years; development seems glacial, at best. A list of somewhat
notable stuff chromium and gecko have that webkit is still missing:

Missing codec support, because it's better to make your devs pay licenses for no good reason:

\- [https://caniuse.com/#feat=webp](https://caniuse.com/#feat=webp)

\- [https://caniuse.com/#feat=av1](https://caniuse.com/#feat=av1)

\- [https://caniuse.com/#feat=opus](https://caniuse.com/#feat=opus)

\- [https://caniuse.com/#feat=ogg-vorbis](https://caniuse.com/#feat=ogg-
vorbis)

\- In "fairness":
[https://caniuse.com/#feat=hevc](https://caniuse.com/#feat=hevc)

There's a whole bunch of stuff that would make it easier for webapps to
replace app store apps or otherwise appear native; can't have that!

\- [https://caniuse.com/#feat=vibration](https://caniuse.com/#feat=vibration)
(trying to push people to the apple app store?)

\- [https://caniuse.com/#feat=webgl2](https://caniuse.com/#feat=webgl2)

\-
[https://caniuse.com/#feat=fullscreen](https://caniuse.com/#feat=fullscreen)

\-
[https://caniuse.com/#feat=registerprotocolhandler](https://caniuse.com/#feat=registerprotocolhandler)

\- [https://caniuse.com/#feat=css-containment](https://caniuse.com/#feat=css-
containment)

Weird stuff:

\- [https://caniuse.com/#feat=flow-root](https://caniuse.com/#feat=flow-root)
(supported on osx, not ios?)

\- [https://caniuse.com/#feat=input-datetime](https://caniuse.com/#feat=input-
datetime) (mostly supported on ios, but not osx?)

Then there are the missing features that just seem to be there to bug users and devs:

\- [https://caniuse.com/#feat=link-icon-png](https://caniuse.com/#feat=link-
icon-png) (I mean, seriously?)

Then there's useful stuff they don't seem to be willing to work with:

\- [https://caniuse.com/#feat=css-text-align-
last](https://caniuse.com/#feat=css-text-align-last)

\-
[https://caniuse.com/#feat=requestidlecallback](https://caniuse.com/#feat=requestidlecallback)

\-
[https://caniuse.com/#feat=shadowdomv1](https://caniuse.com/#feat=shadowdomv1)

\- [https://caniuse.com/#feat=custom-
elementsv1](https://caniuse.com/#feat=custom-elementsv1)

Obviously, there are features that webkit has that others do not, but by and
large they're not as interesting or plausibly useful.

Webkit is _definitely_ not blink; not anymore.

------
FreakyT
Really enjoyed the overview of all the ahead-of-their-time features introduced
by IE!

I feel like many people forget that, back when it competed against Netscape,
IE really _was_ the best browser on the market. The problem was that once they
had “won” the browser war, Microsoft just completely abandoned development,
allowing the product to languish and become the terrible abomination many of
us remember having to write ridiculous workarounds to support.

~~~
zerotolerance
Having more features does not make a browser superior. IE had a universe of
memory leaks and CSS rendering quirks. Its rendering was so overly permissive
that "anyone" could put a mess together and it'd render. That permissive
client almost single-handedly slowed web development progress by a decade,
because nobody wanted to produce a browser that failed to render some web
pages that could be rendered in other browsers. Nobody cared that the pages in
question were gibberish. If anything, all these IE-only features did more to
harm web dev than help it. They're why we're in the situation where all this
enterprise software still requires old versions of IE to operate.

~~~
yoz
The idea that _"if only web browsers didn't render bad HTML, the web would be
so much better"_ is one of the oldest myths in web development, and I'm kind
of amazed that it's still showing up.

What you call the permissiveness of web browsers - in other words, their
insistence on attempting to render invalid or badly-formed HTML - is what has
made the web succeed at all.

Firstly, it was fundamental from the start: NCSA Mosaic was implemented that
way, as was Netscape, so there's no point blaming Microsoft.

Secondly, and far more importantly, the robustness of web browsers is the
reason why you can read 99.9% of web pages at all, including the one you're
reading right now. (Yes, it's invalid:
[https://validator.w3.org/nu/?doc=https%3A%2F%2Fnews.ycombina...](https://validator.w3.org/nu/?doc=https%3A%2F%2Fnews.ycombinator.com%2Fitem%3Fid%3D22146629)
)

I know it's tempting to believe that draconian error handling would have
forced people to code web pages "properly". Unfortunately, when draconian
error handling was added to the web (as XHTML), it failed to take off. Check
the history:
[https://www.w3.org/html/wg/wiki/DraconianErrorHandling](https://www.w3.org/html/wg/wiki/DraconianErrorHandling)

Mark Pilgrim wrote several excellent pieces about why non-draconian error
handling is better, and as someone who wrote XML feed parsers and validators
that were among the most robust and thorough in existence, he is deeply
qualified to know. My favourite of those pieces is the "Thought Experiment"[1]
but I also recommend [2], which includes:

 _There are no exceptions to Postel’s Law. Anyone who tries to tell you
differently is probably a client-side developer who wants the entire world to
change so that their life might be 0.00001% easier. The world doesn’t work
that way._

[1]
[http://web.archive.org/web/20080609005748/http://diveintomar...](http://web.archive.org/web/20080609005748/http://diveintomark.org/archives/2004/01/14/thought_experiment)

[2]
[http://web.archive.org/web/20090306160434/http://diveintomar...](http://web.archive.org/web/20090306160434/http://diveintomark.org/archives/2004/01/08/postels-
law)

~~~
s_gourichon
@yoz is right. If web browsers hadn't rendered bad HTML, the web would not be
so much better; it would not have worked at all.

There is a historical precedent to show it.

I remember when XHTML was the future, around 2002-2005. Pages were loaded in
Firefox by an XML parser. If the page was invalid XML for any reason, Firefox
would render a parser error message - "error X in line Y, column Z" - with a
copy of the offending line and a nice caret under the error position, thanks
to a monospace font.

Wrong percent encoding? No page rendered. Invalid entity? No page rendered.
Messy comment separator (two minus signs)? No page rendered. Inserting an
element where not allowed? I guess no page rendered.

This is nice from a rigorous developer's perspective; I appreciated it. But (I
used to hate this, but a wise person sees the world as it is) it is a
catastrophe for real-world adoption.

Fixing one static page on your dev machine, thanks to the error message, is
one thing. Making a dynamic website work becomes practically impossible unless
all your engineers are extremely rigorous and well-organized, and/or use a
framework that generates guaranteed-valid XHTML every time.

But all frameworks (except a few obscure ones) had (have?) no notion of a
document tree or proper escaping; they just concatenate text snippets.

From a business perspective, it means your website is much more difficult to
get displayed at all (let alone correctly displayed). And even if it works
today, it can blow up at any time because of a minor fix anywhere. Worse, the
pages your team tests are okay, but real-world visitors will hit some corner
case and get an error message intended for a developer.

One may have hoped that some cleaner framework would appear and serve
guaranteed-valid XHTML every time. I would have liked that option: developers
would create tree hierarchies in memory and serialize them into XHTML.
Commenters, please name some frameworks that do this, and how popular they
are. Did any of them save XHTML?
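For illustration only (hypothetical code, not any real framework): the tree-then-serialize approach guarantees well-formed output because text can only enter the document through the escaping function, never by raw concatenation:

```javascript
// escape the XML-significant characters; all text passes through here
function escapeXml(text) {
  return String(text)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

// build an in-memory element node
function el(tag, attrs = {}, children = []) {
  return { tag, attrs, children };
}

// serialize a node tree to markup; strings are escaped, never concatenated raw
function serialize(node) {
  if (typeof node === "string") return escapeXml(node);
  const attrs = Object.entries(node.attrs)
    .map(([k, v]) => ` ${k}="${escapeXml(v)}"`)
    .join("");
  return `<${node.tag}${attrs}>${node.children.map(serialize).join("")}</${node.tag}>`;
}

console.log(serialize(el("p", { class: "note" }, ["Tom & Jerry <3"])));
// <p class="note">Tom &amp; Jerry &lt;3</p>
```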

That aspect may be the number-one reason why XHTML was ditched in favor of
HTML5: the web worked because it made a best effort to render invalid pages.
Any solution that strays from this principle will not be adopted at large.

Meta bonus: we're discussing the HTML level, but we could have had this kind
of discussion at any other level of the stack, a few levels higher (script) or
lower (HTTP, TCP). It's funny how HTTP and TCP look like they just work, but
they have their own corner cases and spec holes. The ecosystem just happened
to mostly converge on a few implementations that mostly work okay. (No, let's
not talk about IPv4, NAT, and the like. ;-)

~~~
zerotolerance
XHTML was late to the party by at least four years. And the problem was not
specifically about doctype validation; it was about how pages looked.
Contracts with web users are visual first.

~~~
s_gourichon
"How page looked"? Any reference on this? When I ask Google why XHTML fails it
replies [https://www.quora.com/Why-did-the-XHTML-specification-
fail](https://www.quora.com/Why-did-the-XHTML-specification-fail) which
mentions first invalid pages not rendering then more subtle interoperability
problems, not "how page look".

------
adev_
I find the article to be pretty lenient, or amusingly silent, on "why" IE
technologies never got adopted or standardized.

I am old enough to remember the time when VML and the others were invented.
At that time, Microsoft was still calling open source a "cancer", and Linux
users were being frightened by MS with patent-violation threats every couple
of months.

It would have been insane to re-implement or standardize anything coming from
Microsoft at that time; it would surely have landed you in front of a judge
for patent infringement...

Anything coming from Microsoft was radioactive due to stupid political
decisions and an aggressive patent & IP attitude.

This is sad, and it cost us 10 years of web evolution, reinventing the wheel
many, many times.

And that's without even considering the millions of hours of engineering
wasted fighting broken HTML compatibility, lock-in technologies (Flash,
Silverlight, ActiveX, VBScript) and continuously deprecated proprietary APIs.

------
cstross
I was _really_ disappointed that the OP wasn't talking about the UGM-133
Trident:

[https://en.wikipedia.org/wiki/UGM-133_Trident_II](https://en.wikipedia.org/wiki/UGM-133_Trident_II)

(Alas, I think they'll be around for a few decades more, unless someone is
wicked enough to use them.)

~~~
ribs
Ditto. As I understand it, that program largely put me and my sister through
college...

~~~
cstross
So every cloud has a silver lining!

(Even the fiery mushroom cloud surrounding the fireball burning your face off
a few years later ...)

------
reaperducer
I don't like IE. I never have. But this part rings true:

"Internet Explorer already had many of the things that we came to reinvent
later and that we now celebrate as innovations."

It happens all the time in tech, and IE probably reinvented some things from
Hypercard or whatever came before it.

------
teilo
Every one of these "innovations" was invented by Microsoft alone, without any
dialog with or input from the standards committees. Their intent was clear: to
subvert web standards in any way they possibly could in order to force people
onto Windows. And in large part they succeeded.

Treating them as advances that were, sadly, not adopted by the rest of the web
takes a lot of chutzpah.

~~~
Rexxar
That was the standard way of doing things at the beginning of the web: let
browsers experiment, then standardize what works. Everybody was doing this.

The problem with Microsoft was aggressive pricing and forced default
installation; it was not the technical side of the browser.

~~~
teilo
This is not true. It was only Microsoft's innovations that were tied to a
specific operating system. Other browsers were cross platform. Even
Microsoft's short-lived IE for Mac was not really IE.

ActiveX nearly destroyed the web, and as recently as a few years ago there
were still enterprises digging themselves out of the proprietary mess they
developed themselves into.

~~~
kevingadd
It's unreasonable to blame ActiveX for this. NPAPI was not somehow implicitly
portable; it was still just native code being embedded in the browser, with no
particular guarantee of quality and no guarantee that a Win32 NPAPI plugin had
Mac and Linux ports.

The ActiveX API was also well-specified and debuggable in a way NPAPI was not,
and it was possible to embed it in other runtime environments like Office
documents and Visual Basic applications relatively easily, because COM was
truly wonderful technology (even if using it was, at times, very painful).
It's not a coincidence that Firefox made heavy use of COM for a long time
(though they've rightly been removing it).

Having used COM and ActiveX extensively, despite their flaws they were vastly
superior technologies compared to NPAPI and they were a pleasure to work with.
The security model was bad but again none of the competitor technologies were
any better. I shipped large-scale native apps that successfully embedded
ActiveX controls (like the flash player) and this was reasonable specifically
because of how good the APIs were.

Even after NPAPI and ActiveX made their exit, the web was still infected by
swf files and Unity games and what have you. Those things are all either dead
now or on life support, because it turns out browser vendors don't want to
maintain them and they're not portable.

------
ksec
>MHTML was proposed as an open standard to the IETF but somehow it never took
off.

I had always wished MHT had replaced PDF. But due to the rivalry at the time,
Firefox refused to support MHT (even to this day). WebKit has WebArchive,
which as far as I know isn't supported outside of Apple's ecosystem.

I don't actually buy the argument that it was Vista that slowed down IE
development. IE 7 wasn't that different from IE 6. It shows Microsoft had very
little incentive to improve the Web. I don't know how many people actually
hate them for not complying with the ACID "standards". I certainly don't. But
at the time the Web had so much low-hanging fruit that a lot of people (or
just me) were pissed that Microsoft didn't even bother improving while holding
web standards hostage with IE's dominance. Along with the crap called Windows
Vista.

Luckily we got the first iPhone, 2 (?) years later. And the rest is history.

~~~
wbl
MHT does not do what PDF does. Web text rendering remains abysmal in
comparison to what 1980s computer technology could achieve.

~~~
felixfbecker
But it would do a lot of the things people use PDF for much better.

------
uncle_j
This is a shame in some ways. Internet Explorer was always very strict about
how it worked.

Anything before 8 was a challenge due to some atrocious bugs.

This had its problems, but it really taught you not to write sloppy CSS and
JS, as they usually just wouldn't work.

In versions after 7, basically anything that wasn't in the spec wasn't
implemented, so you had to write code pretty much bang on the spec.

Just this Friday I solved a rendering problem with IE where SVG TEXT elements
weren't being rendered correctly. I was calling _element.innerHTML_ to set the
text, which was incorrect; I should have been using _element.textContent_.
Using _element.innerHTML_ is incorrect as SVG elements shouldn't have an
innerHTML property (they are not HTML). IE11 was actually working correctly,
whereas the latest Chrome behaviour was incorrect.

So spending time making it work in IE has improved my code.

~~~
kyle-rb
>Using element.innerHTML is incorrect as SVG elements shouldn't have an
innerHTML property

Is that definitely the case? Chrome, Firefox, and Safari all return a value
for the innerHTML property of an element in an SVG document.

This W3C spec [0] specifically mentions XML documents in addition to HTML
documents. And as I understand it, it seems like embedded SVG elements also
inherit from the Element interface which includes InnerHTML.

IE11 might also be correct, following an older spec, but I don't think you can
jump to the conclusion that Chrome is wrong just because the property is
called innerHTML.

[0] [https://w3c.github.io/DOM-Parsing/#the-innerhtml-
mixin](https://w3c.github.io/DOM-Parsing/#the-innerhtml-mixin)

~~~
uncle_j
That is interesting.

I assumed that innerHTML must have been wrong because textContent works in all
the browsers I have tried it on, whereas innerHTML doesn't. A cursory search
for textContent vs innerHTML seemed to suggest textContent was the correct
way.

It looks like it isn't a simple case of IE11 (I haven't had a chance to test
on 9 & 10 yet) being correct and the others being incorrect. Thanks for the
info.

------
userbinator
I wish they'd open-source it and put it on GitHub... one thing I've noticed
with IE is that for non-script/app HTML it tends to use far less memory and is
faster at rendering than Firefox or the Webkit browsers. I suppose that's
because it originally was written to work in Win95 with very limited memory,
and so there's been a lot of optimisations around that.

------
est
> One part of why Microsoft's ideas didn't really catched on was that we
> developers just didn't get it. Most of us were amateurs and had no computer
> degree. Instead we were so busy learning about semantic markup and CSS that
> we totally missed the rest

Should we address the elephant in the room? For those of us without a CS
degree, Flash was easily the first choice. IE had lag problems whenever there
were more than three layers of <DIV>s around. Yes, it had lots of cool
capabilities, but they were rendered largely impractical. Even Adobe Flex was
about to take over the "business app" world.

On the Microsoft side, .NET happened and Silverlight happened.

But ultimately, the iPhone happened. Charge-once-a-day battery phones happened.

BTW the article didn't mention <IMG DYNSRC> and background MIDI music support.

------
AgentME
I'm a bit irked at the way the article keeps bringing up unstandardized and
clunky-as-hell-looking features (I never knew about IE's behavior attribute;
that looks scary) that were only ever implemented in IE, as if IE were being
unfairly judged. It's neat to see IE did those things, but the
article glosses over that web features are good when they're cross-platform,
standardized, and mesh well with other/future features. It's easy to just add
features if you're IE and not worried about those things. It's telling that
the article links to a demo for one of the old features and mentions you need
a VM with some specific old IE for the demo to work. One way standardized
features are better is that they tend to stay working in future browsers.

------
jameslk
I was wondering what building is shown in the first photo of this article.
It's this:

[https://en.m.wikipedia.org/wiki/Buzludzha_monument](https://en.m.wikipedia.org/wiki/Buzludzha_monument)

------
kirstenbirgit
Skimming through this article, while all these IE features are cool, they seem
to have been created with no common strategy or goal other than to make
something that another MS team thought useful, like MSXML for the Outlook Web
Access team.

Their implementations seem totally inconsistent (sometimes weird nonstandard
CSS syntax, weird meta tags, ".htc" files, etc.), and very IE-specific, so
it's almost impossible for other browsers to implement them.

This is the real reason why they cranked out all these weird features: to
vendor-lock people into IE.

------
Keverw
I hope they bring the Chromium-based version of Edge to the Xbox too. I was
playing some Babylon.js demos and it kept freezing up; once I had to force-
restart the whole thing. But WebGL plus the controller API means you could
ship to the console directly.

Not sure if the PlayStation browser could do this either, but it would be
nice, since consoles are more locked down. It seems they are opening up,
though: Fortnite, I believe, is the first cross-platform game where your
PlayStation and Xbox friends can play together. Then again, if you created
something like a virtual world where dynamic content is allowed, I think the
console makers might not be too thrilled about that, which is why I really
like the idea of being able to publish console games as just a web app
directly.

I also think Microsoft is more open when it comes to consoles. For example,
you can go to Walmart, buy an Xbox, and turn it into a DevKit, while I believe
the others make you buy expensive DevKit hardware that isn't the same as the
console already shipped - maybe this is because of Microsoft's PC background.
So from my understanding it's easier to publish to the Xbox if you're making a
native game, compared to the other consoles: you can get started faster,
though you still need approval to ship. With the PlayStation, I think you have
to spend a lot of money just to license the tools before you even write that
first line of code.

------
OliverJones
Yes, the Redmond Middle School science project called Internet Explorer
prototyped some excellent concepts. For sure. You can get a LOT of cool
concepts for a hundred megabucks a year from people of the intellectual and
creative firepower hired by Microsoft.

But, why did it fail?

Of course, as the article says, one reason was the Ballmer-era tsunami of
bureaucratic confusion that inundated Microsoft and stymied the release of the
Windows versions that carried IE.

Another was security. Cybercreeps love IE. Drive-by malware? IE. "Internet
Exploder."

A third was the cost of compatibility. It was necessary for web developers,
and later web app developers, to develop and test once on all the browsers and
then again on each version of IE. It didn't help that it came bundled with
Windows: large-org IT managers often forced their users to use an atrocity
like IE6 years after it had been superseded. This bogus standardization
shackled a ball and chain to third-party developers.

A fourth was, paradoxically, the whole ActiveX Control subsystem. Apartment
threading, anyone? Monikers, anyone? It was just barely good enough that DICOM
and other high-end imaging systems could use it. That took away incentives to
get <canvas>-like stuff working well.

Other companies have done similar things. DECNet, GM's MAP/TOP. Apollo Token
Ring. SysV vs. BSD. But none of those things hobbled an industry quite like
IE.

Trebuchets are cool tech too. But imagine if every UPS truck had to carry one
to place packages on peoples' doorsteps.

------
dejawu
Every now and then I daydream about what it would take to totally re-invent
the frontend stack with today's knowledge that the web is a place for
applications, not just documents. There are decades' worth of legacy and
backwards-compatibility baggage in HTML+CSS+JS, and it's basically impossible
for a small entity with little funding to build its own browser engine now.
What
would a new spec for a platform for delivering applications look like?

------
ndesaulniers
An incredibly well-cited and well-researched article! So many of these I had
never heard of. (If only M$ had documented these as well as MDN does, maybe
folks would have used them more and demanded their implementation in the
competing browsers. Ah - getting to the end of the article, this is
mentioned.)

>The other reason could have been a lack of platforms to spread knowledge to
the masses. The internet was still in its infancy, so there was no MDN ...

Really incredible demos, too. You can see the URL in some of the demos; looks
like the author stood up a VM and wrote many of the demos. (Recent Star Wars
trailers in Windows XP?) Ah, later there's an Internet Archive link to a M$
published VM image!

> You think Internet Explorer could not animate stuff? Not entirely true.
> Because, back in the days there was already SMIL, the Synchronized
> Multimedia Integration Language. SMIL is a markup language to describe
> multimedia presentations, defining markup for timing, layout, animations,
> visual transitions, and media embedding. While Microsoft was heavily
> involved in the creation of this new W3C standard, they ultimately decided
> against implementing it in Internet Explorer.

This brings back bad memories; I recall being taught SMIL briefly in a web-dev
class in college. I think Mozilla implemented it. IIRC, you could
declaratively animate SVG via XML-like tags, rather than JS or CSS. I didn't
know it could access DOM/HTML, or play audio/video!
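For anyone who never saw it, declarative SVG animation via SMIL looks roughly
like this (a minimal sketch; the `<animate>` child moves the circle with no JS
or CSS involved):

```html
<svg xmlns="http://www.w3.org/2000/svg" width="120" height="60">
  <circle cx="15" cy="30" r="10" fill="teal">
    <!-- SMIL: the animation is pure markup, driven by timing
         attributes rather than script or stylesheets -->
    <animate attributeName="cx" from="15" to="105"
             dur="2s" repeatCount="indefinite"/>
  </circle>
</svg>
```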

The implementation of the currentScript() example uses `i` without declaring
it, which made me panic for a moment. (Thank god JS doesn't allow that to
work; I had to double-check in a console quickly, though: "surely
`undefined`++ won't convert anything to a number".)

~~~
arh68
> _surely `undefined`++ won 't convert anything to a number_

You're right, it converts it to Not-A-Number. undefined++ becomes NaN.
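A quick console sketch of both cases (declared-but-undefined versus never
declared at all):

```javascript
let i;          // declared but never initialised, so i is undefined
i++;            // the increment coerces it: Number(undefined) is NaN
console.log(Number.isNaN(i)); // true

// A completely undeclared name behaves differently: the increment
// reads the variable before writing it, so it throws.
try {
  j++;
} catch (e) {
  console.log(e instanceof ReferenceError); // true
}
```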

------
stmw
This talk by Adam Bosworth from 2007 is on the same topic and is quite an
interesting take on the early history of IE.
[https://www.eweek.com/networking/googles-bosworth-why-
ajax-f...](https://www.eweek.com/networking/googles-bosworth-why-ajax-failed-
then-succeeded)

------
Theodores
We missed out on using these features in part because we were too concerned
with making pixel-perfect copies of designs that had to look exactly the same
in every browser, because that was what our clients signed off on. This was in
the days before responsive design, and all our time was spent fiddling with
margins and padding to get things pixel perfect. There was no scope to take a
step back and play with some of this Microsoft technology.

Invariably the cast in stone designs were PDF drawings from Photoshop where
the art was in second guessing what the designer was thinking of and where
they stole their influences from.

You could not implement a table in a cool way in IE and in a more boring way
in the other browsers; the knowledge just wasn't there, nor was the space to
experiment.

------
dwb
I remember using page transitions as a teenager to emulate PowerPoint in a
programmable info-screen thing for the school library. Baby's first PHP. Lots
of copy-and-paste cos I didn't really understand the fuss about writing
functions. Good times, RIP.

~~~
Keverw
Oh yeah, PowerPoint Jeopardy! I remember we did that in high school science
class once. I'd totally forgotten about PowerPoint being able to link buttons
to different slides, but in that case I do think it was a lot of copying and
pasting.

------
amelius
Why did Microsoft choose Chromium, and not Firefox?

~~~
Macha
Gecko is apparently a pain to embed. Other browsers built on Gecko such as
Epiphany and Flock later moved to Webkit.

~~~
samantohermes
Blink is also not embeddable.

------
aikah
I never use Edge on my computer because I found it to be slower than Firefox.

But I still use IE11, ironically, because I like to develop quick HTA tools
for the enterprise in HTML and TypeScript, powered by Excel documents or
Access databases via COM, instead of having to download Electron and whatnot,
which I don't need since I'm only developing for Windows.

It's unfortunate that everybody lost nearly 10 years because Microsoft stopped
taking Webtechs seriously in order to focus on Silverlight and whatnot, which
they later abandoned anyway.

I need to find an alternative to HTA that still supports COM, though, since
eventually Windows will stop supporting HTA apps.

~~~
uallo
> it's unfortunate that everybody lost nearly 10 years because Microsoft
> stopped taking Webtechs seriously

Yes it is. It is also unfortunate that some people refuse to upgrade their
browsers and engineers have to jump through hoops in order to still support
them... ;)

------
svnpenn
>
> [https://sec.ch9.ms/ch9/5736/312f4d46-e479-4087-b562-2aece012...](https://sec.ch9.ms/ch9/5736/312f4d46-e479-4087-b562-2aece0125736/3-114R.wmv)

Who still uses WMV?

------
Ericson2314
Let's not celebrate embrace-extend-extinguish, but let's also not celebrate MS
handing the keys to Google so they can do the same with Chrome. Why help your
competitor? They should have gone with Firefox.

Around 2012, when all three major browsers had similar market share [1], is
what we want: everyone can try extensions, but no one can ram them down our
throats.

[1]:
[https://en.wikipedia.org/wiki/Usage_share_of_web_browsers](https://en.wikipedia.org/wiki/Usage_share_of_web_browsers)

~~~
jfoster
In adopting Chromium they're not just helping their competitor; their
competitor is now also helping them.

At the end of the day, the default option for most users would naturally be to
stick to Edge, but they instead have been installing Chrome. Why? Because
Chrome was better. Now Edge is like Chrome.

There's much less of a reason for users to install Chrome as a result.
Microsoft are likely to regain some marketshare with this approach. If Bing is
good enough, that may also mean billions in revenue.

~~~
stOneskull
That's it, and everything syncs across easily. Installing Chrome is now pretty
much redundant.

------
monoideism
Warning: article is excellent, but contains rapidly flashing colors that could
trigger a photosensitive epileptic reaction.

------
tarsinge
> One part of why Microsoft's ideas didn't really catched on was that we
> developers just didn't get it. Most of us were amateurs and had no computer
> degree.

Well if you only look at the web and exclude the bulk of enterprise software
in the 00’s.

------
acdha
This is generally good, but the conclusion is mostly wrong.

> One part of why Microsoft's ideas didn't really catched on was that we
> developers just didn't get it. Most of us were amateurs and had no computer
> degree. Instead we were so busy learning about semantic markup and CSS that
> we totally missed the rest. And finally, I think too few people back then
> were fluent enough in JavaScript, let alone in architecting complex
> applications with JavaScript to appreciate things like HTML Components, Data
> Bindings or Default Behaviors. Not to speak of those weird XML sprinkles and
> VML.
>
> The other reason could have been a lack of platforms to spread knowledge to
> the masses. The internet was still in its infancy, so there was no MDN, no
> Smashing Magazine, no Codepen, no Hackernoon, no Dev.to and almost no
> personal blogs with articles on these things. Except Webmonkey.

People were building complex JavaScript apps back then (the term DHTML was
coined in 1997), and there were plenty of experienced software developers and
people interested in really learning how to code well. Similarly, there were
many sites where you could learn techniques — not just A List Apart but many
personal blogs and sites like Philip Greenspun's
[https://philip.greenspun.com/panda/](https://philip.greenspun.com/panda/)
(1997). If you look at the first A List Apart entry in the Wayback Machine,
they list many well-known resources:

[http://web.archive.org/web/19991005040451fw_/http://www.alis...](http://web.archive.org/web/19991005040451fw_/http://www.alistapart.com/news.html#res)

(There is, however, a quite legitimate argument that back then English fluency
was a very significant barrier.)

The main problem was that 1990s Microsoft was all about “cutting off their air
supply”. They threw huge amounts of money into building out tons of features,
exclusive bundling agreements and promotions with various other companies,
etc., but they were not nearly as lavish in spending on QA or on developing
web standards, even before the collapse of Netscape led them to pull most of
the IE team away. If you tried to use most of the features listed, they often
had performance issues, odd quirks and limitations, or even crashing bugs,
which made them hard to use in a production project, and many of those bugs
took years to be fixed or never were.

In many cases (XHR being perhaps the best example) going through the process
of cleaning the spec up to the point where another browser would implement it
might have led to the modern era dawning a decade earlier, with
better-than-weekend-hackathon-grade implementations in IE. I look at that era
of Microsoft as a tragedy of management: they had some very smart people but
no strategy other than "prevent competition".

------
mianos
Having the width include the border because that is how a physical box works
has to be one of the stupidest design decisions of all time. What sort of
management structure would lead to something like that? It suggests there was
some supreme dictator who made lots of these decisions.

While committees often make crazy decisions, this is the other end of the
spectrum. Similarly, the orders from the Office team for extensions.

The stories behind these things would make even better reading.

~~~
untog
> Having the width include the border because that is how a physical box works
> has to be one of the stupidest design decisions of all time.

...no? If I want two side by side divs, I could give them width: 50%. But with
regular box-sizing, if I apply a border to those boxes they'll stack
vertically. That's pretty dumb.
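To make the scenario concrete, a minimal CSS sketch (the `.half` class name is
made up): under the default `content-box` model the border is added on top of
the declared width, so two bordered 50% boxes overflow the row and wrap, while
the old IE behavior survives as the opt-in `border-box` value:

```css
/* Default model (content-box): rendered width = 50% + 2px of border,
   so two of these exceed 100% of the row and the second one wraps. */
.half {
  float: left;
  width: 50%;
  border: 1px solid black;
  /* IE's old model, now opt-in: border and padding are counted inside
     the declared 50%, so both boxes fit side by side. */
  box-sizing: border-box;
}
```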

------
013a
While having a very long, complete answer is interesting, there's a very
simple tl;dr that answers the question just as well.

You only have N bandwidth available to the house. Cable companies have a
tradition of providing content; content can be very complex, and very rarely
did content ever come back from the house. So, instead of doing N/2 upload and
N/2 download, they did, let's say, N/4 upload and 3N/4 download. Why haven't
they fixed it? Legacy systems. We've all been there.

------
sayhello
Disclosure: I work at Google on Chrome. Opinions are my own.

Using Chromium as a base, browsers will have to differentiate by offering a
better product.

"Better" is less likely to mean better performance, compatibility, or
accessibility, since those now come largely from the shared engine.

By using much of the same code as another browser implementer, any browser
vendor [hint hint] that still makes their own engine could reduce the
resources they put on the foundations and web platform and put more of it on
the product itself.

Perhaps we’ll have more groundbreaking innovations to move browsers forward.
The last few big ones: multi-process architecture, tabs.

In effect, by using the same foundations, the browser wars could in fact be
reignited and the users could be the winners.

On the web platform side, i.e. the stuff you see on MDN and W3C specs, using
the same “base” doesn’t mean the browsers won’t have different implementations
of future APIs if the vendors’ opinions diverge strongly. Case in point:
Chromium used to use WebKit as its renderer and now uses blink, a fork of
WebKit.

~~~
pcwalton
> By using much of the same code as another browser implementer, any browser
> vendor [hint hint] that still makes their own engine could reduce the
> resources they put on the foundations and web platform and put more of it on
> the product itself.

In other words, throw out all the advantages that come from using Rust in the
browser in favor of a codebase that has a policy forbidding any use of that
language. No thanks.

Furthermore, I have to note the irony that you say nobody else should
implement their own engine when your team is the one that forked Blink from
WebKit in the first place.

