"In conclusion, people are finally beginning to understand the hard way:
Web Browsers really make poor client application runtimes for all but the most basic of things.
ActiveX is a security nightmare and counter to making cross-platform applications.
The web is a constantly changing and evolving standard separate from Microsoft or any specific operating system, as such implementing an application for just one specific browser and version with no intent to keep it updated is just asking for disaster.
Tying local help files or other local content to a "live" web browser that may change in functionality is a bad idea.
But, nobody ever listened to anything I have to say."
Narrator: And nobody listened to anything he said.
In fairness, "DHTML" / programmatic manipulation of the DOM was just invented in 1996 and eventually ended up being the envisioned successful client application runtime implemented across browser versions and runtimes. Now, of course with the Chromium hegemony we have an old-is-new-again set of problems on the web, but I do think that for a time, people _did_ listen to this advice and stopped writing applications for _only_ IE4/5/6.
As an aside, I do still think that if a cross-platform application framework built from the ground up for secure Internet-delivered application use had ever materialized in a robust way, we would probably be in a better place as an industry. But that didn't happen, and it certainly wasn't for lack of trying - see Java Applets/Web Start/JavaFX, Silverlight, and of course the elephant in the room, Flash/Flex.
Instead, HTML evolved into dynamic HTML / "the DOM" / HTML5, and each layer of abstraction built there was "good enough" to dethrone all challengers - in no small part due to, again, people heeding the advice of the OP quote and not implementing browser-locked software.
It’s worth remembering that Flash was in fact extremely successful. I think we’d still be using it today if not for some combination of (1) Steve Jobs and (2) Adobe mismanagement.
People say Flash was bloated, but try running Slack on 2008-era hardware and let me know how it goes. And there’s no reason Flash couldn’t have been adapted to use responsive breakpoints and such for mobile friendliness.
I agree. I think the two things that killed Flash were "Steve Jobs" (i.e., a lack of foresight around the explosion of the mobile market and no competitive offering in the space) and the continued treatment of security as a technical-debt item rather than a differentiator that buyers were actually concerned about. That is, I don't think Adobe correctly recognized that corporate buyers would start buying intranet applications that were not built on Flash/Flex specifically due to security concerns.
Slack and other ill-optimized Electron apps are not great, but Flash was far worse, at least in my experience. With Flash, a single banner ad in some tab you forgot you had open could consume half or more of your G4/G5/C2D, whipping your fans into a frenzy and roasting your lap (if using a laptop). Slack, etc. are a detriment in terms of heat and battery life too, but not to such an extreme extent, and at least you're getting a reasonably functional app in exchange.
Computers are easily a thousand times more powerful than they were in the year 2000, and Electron apps still cook my thighs. They use more RAM than my computer had hard disk space.
> With Flash a single banner ad in some tab you forgot you had open could consume half or more of your G4/G5/C2D, whipping your fans into a frenzy and roasting your lap (if using a laptop).
And not coincidentally, html5 banner ads will do the same thing on a G4/G5 today in TenFourFox, if you disable the built-in adblocker.
Which is to say, I actually think modern websites are much heavier. What changed is the hardware.
Java or single-operating-system applications were the alternative. Where would we be today if you couldn't swap between Mac and Windows laptops -- or worse, had to have both -- because you depended on applications that had no common operating system?
The reason we can use Linux as a desktop is because of web applications. Now, it's possible that we could have ended up with Web Start or some alternative, but we didn't.
Even today, cross-platform applications are terrible or expensive or both. Even today, it's a giant pain to write cross-platform apps in anything but C or C++. (Qt's documentation is not great for the alternatives IMO, and you end up needing to understand C++ anyway.)
If we want to get back to this, we need a better option. It turns out that nobody is willing to put in the effort when we already have web applications.
I think the answer would have been for Java desktop apps to take off. If we had had a more diverse desktop landscape in the late 90s / early 2000s, I think Java desktop apps would have won. As it was, that was peak Windows, and Java desktop apps were never given the time they needed to mature, since the cross-platform pitch was at its least compelling. The browser pretty much won because it was able to unseat the Windows hegemony without directly attacking Windows the way any of the alternative OSes had to.
Sadly, here we are now with bigger computers, better-performing JVMs, great Java dev tools, and still losing to the browser due to massive inertia.
Web apps won because they didn’t need to be installed on the device. They were vastly inferior to desktop apps built with any framework (even java), but they were “good enough” and IT departments liked them because they were easy to deploy. Anything that required a local install never stood a chance. That advantage for web apps is still there, which is the thing that keeps web app dominance in place.
I feel like this could have been overcome by Java Web Start, which was too little, too late by the time it was released. Even embedding applications as applets could have been a good solution.
For complex desktop apps I don't really think Java was a viable option until the 1.3 release in 2000 (from memory). You also always had the problem that you couldn't guarantee what version of Java somebody would have installed, so you'd have to bundle whatever JRE version you needed into the installer. Deployment was always a challenge.
In the browser, Java never really got past 1.1 because Sun and Microsoft fell out, so you had to install a separate plugin, which almost nobody did, or just live within the restrictions of AWT.
Couple this with the fact that, as you've pointed out, Java Web Start was so late to the party and it's easy to see why it didn't do better, either on the desktop or in the browser.
Until Android came along I really thought Java was a dying technology platform. Obviously since then its fortunes have significantly improved.
I'm not saying there were no problems, and it's not like we haven't gone through massive teething pains to get to where we are with browser based apps either. However I'd argue if all the work that went into trying to turn a hypermedia system into an app runtime had instead been focused on fixing the Java issues, we'd be in a far better place.
What is interesting, though, is that with the Java solutions it was the user who felt the pain, while developers had a decent experience. As the web platform has evolved, users have avoided most of the issues and the pain has shifted onto developers. This would go a long way to explaining why users have never had it so good as they have right now, while every developer I know is trying to figure out how to get out of the industry, or at least out of web dev.
Java Web Start needed all the stars aligned for apps to work correctly.
Ever tried to use HP iLO's Remote Console via Web Start? (For additional nerd points, try doing that on an Apple M1 machine.) Compared to that, the HTML5 version just works.
I don't think Java as an app platform ever really recovered from the reputation it picked up in the 90s in the form of applets (which were notorious for crashing your browser/machine) and in the 00s with desktop apps, due to the high startup time and often alien-feeling UI toolkits.
I recall back in the 2000s there being a few Java apps for macOS that used Cocoa UIs via the Cocoa-Java bridge that used to exist. These tended to be embraced almost as readily by users as full-native Mac apps, and I think that was largely because these apps tended to start up more quickly and fit into the user's desktop better, removing those two barriers to adoption.
>Java or single operating system applications were the alternative.
Not really - you could write cross-platform GUI applications from a single code base in 1996 if you wanted to without using Java. It was unpleasant, but it was absolutely possible.
>Where would we be today if you couldn't swap between Mac and Windows laptops -- or worse, had to have both -- because you depended on applications that had no common operating system?
Those days are still here, depending on the application.
>The reason we can use linux as a desktop is because of web applications.
I wouldn't say that's the reason. Certainly a reason, but not the definitive one.
There are still cross-platform applications built from a single code base. This is actually easier now due to better cross-platform toolkits, which were developed and would have continued to be developed even if web apps hadn't become a thing.
>Even today, it's a giant pain to write cross-platform apps in anything but C or C++
Python and Java are relatively painless in this regard.
>It turns out that nobody is willing to put in the effort when we already have web applications.
Web applications certainly do have the advantage from a monetization, time to market, and network effect standpoint, no doubt.
> Not really - you could write cross-platform GUI applications from a single code base in 1996 if you wanted to without using Java. It was unpleasant, but it was absolutely possible.
Right - as seen with Netscape. We can also talk about CodeWeavers and others, but in '96 those were out of reach for all but the largest applications (PageMaker, Word, etc.). However, the honest truth is that web apps were an order of magnitude cheaper for anything that wasn't desktop-heavy in 1996.
Again, worse is better.
> Python and Java are relatively painless in this regard.
Does your user have the right version of Python? Are you bundling Python with your application? I haven't worked with PyInstaller - it might be better today, but...
And as far as writing Python GUI applications... Tkinter and wxPython are ugly. PyQt and PyGTK, last I looked, were painful and effectively meant knowing C++, and Qt licensing for non-OSS use is rough. Jython just punts to the Java problem above. In what world is this painless?
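To make the cross-platform claim concrete, here is a minimal sketch of the kind of Tkinter app under discussion, assuming a stock Python 3 install. The window itself is the easy part; the pain points above are look-and-feel and packaging. The PyInstaller command in the trailing comment is one hypothetical bundling route, not something prescribed in the thread.

    # minimal_gui.py - a tiny cross-platform window using Tkinter, Python's bundled GUI toolkit
    import tkinter as tk
    from tkinter import ttk

    def main():
        root = tk.Tk()
        root.title("Hello, cross-platform")

        # ttk widgets pick up a roughly native theme on Windows/macOS, a generic one on Linux
        ttk.Label(root, text="The same code runs on Windows, macOS, and Linux.").pack(padx=20, pady=10)
        ttk.Button(root, text="Quit", command=root.destroy).pack(pady=(0, 10))

        root.mainloop()

    if __name__ == "__main__":
        main()

    # Bundling is the hard part the thread complains about; one hypothetical approach,
    # assuming PyInstaller is installed:
    #   pyinstaller --onefile --windowed minimal_gui.py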
I wrote a JavaFX app just for myself a while back because I didn't want the overhead of Electron for my simple app. The experience of bundling it for each platform was not straightforward, especially post-Java 9. jpackage, jlink, and GraalVM exist today, but the workflow around them is not as straightforward as getting a web app up and running.
The experience does not, by its very nature, have to be worse than web applications, but we've had two orders of magnitude more engineering effort poured into making the web application developer experience better. In 1996, even as small as the internet was at that point, enterprise application deployment was so painful that web applications were a giant competitive advantage.
Flutter 2 is trying to tackle this issue by using the same code across mobile, desktop, and web while also being AOT-compiled. Maybe it'll be more efficient than Electron?
It's interesting to me that you're getting downvoted, which signifies that people don't think you're adding to the conversation.
Is this a form of denialism by hardcore "I-Hate-JavaScript"ers? I would like to add that I absolutely hate the subset of modern websites that choose to be slow, but I think the parent is completely right. If we keep making it hard to use useful applications across platforms, then we're going to continue to see pushes toward the lowest common denominator that consumer computers can run. Yes, yes, I know techies like you and I believe installing stuff is trivial if you spend even just a little time, but _this doesn't matter to average users_.
It's a nightmare even installing Python or Ruby on Rails packages properly without juggling a million virtual environments, and those are supposed to be tools you can run with a single CLI command that give you debugging information if things go wrong. It's a bit much to think that average users can survive without any of the tools or knowledge we have, locked in a stupid GUI, at the rate we encounter defective software (it seems like I encounter bugs more frequently as time goes on).
Now we are definitely going to add ARM complexity into the mix, because what Apple pushed (the M1) is just too good and everyone else is jumping on board.
Regarding the comment about consumer Linux, I feel less strongly about your point that web applications are the killer reason Linux desktop users exist. Anyone using desktop Linux is almost certainly a tech-hacker to begin with and isn't necessarily there because of web applications, though they certainly help with getting by on it. There are tons of reasons to use desktop Linux that have nothing to do with the web, but to gain market share it's going to need consumer applications bordering on exclusivity - applications that make Linux first-priority and drop support for macOS/Windows. Many Windows users don't want to switch to macOS or Linux because of gaming.
Windows games kept me tied to the platform until my life became more focused on software. I only left Windows entirely after I learned that programming was a first-class citizen on Linux and macOS. Yes, you can program on Windows, but making software and using other people's software is more enjoyable elsewhere. So, to wrap up this example, what can we learn from it? Well, I knew life/job functions would be easier if I replaced Windows, but my doing software is coincidental. Give people a reason to use Linux. Make their lives easier, somehow, and make it clear that it will only happen if they leave their current platform. Otherwise you end up with systems like Windows Subsystem for Linux, which... I'm not entirely sure why it exists.
I think the oft-repeated ire toward the modern web does not add anything at this point without acknowledging that there are good reasons people prefer to use web applications.
Or, rather, web applications /are/ the common, universal cross-platform applications we theorized about; they just arrived in less ideal ways.
I forgot to mention, though, that the M1 has reduced complexity in "cross-platformness" along one dimension, now that you can run iPhone apps on macOS. People have been sideloading .ipa files from their phones onto their laptops, which is exciting. It seems like a step in the right direction.
Who uses those for desktop applications? Even if I need a VM for something coding-related, it’s almost too much of a pain to set up because, AFAIK, there still aren’t many quick provisioning tools for e.g. VMware. And I can guarantee that 99.9% of users have never even heard of Docker.
Linux users have platforms like Flatpak that sandbox applications to known directories, and have multiple "app store"-like implementations with colorful buttons to make the process look unintimidating and non-threatening.
Another option is to simply use distro packages. Users of good distros who use distro packages have software that's vetted, built from source independently, and compatible with the rest of the system.
> AFAIK there still aren’t many quick provisioning tools for e.g. VMWare
I use VirtualBox. I boot just about any Linux distribution's live Desktop image without any hard drive attached (so, ramdisk only). Then curl a script to download and install any dependencies. After a few minutes I'm off to the races.
I use it too. You have to have that curl script (not a default user experience), manually mount the ISO (and have it downloaded first), and then you risk losing all of your application state if the machine crashes, since there is no storage. That seems like a pretty terrible experience. Why don't we have something like Vagrant, where you can quickly provision a VM from the command line following a template?
> You have to have that curl script (not a default user experience), manually mount the ISO (and have it downloaded)
What? I made my own script. I don't need to mount any ISO.
> you risk losing all of your application state if the machine crashes since there is no storage
That's a feature. Given that apps these days insist on living in a browser with cloud storage, there's no loss here.
> Why don't we have something like Vagrant where you can quickly provision a VM from the command line following a template?
That'd be great, actually. I'd do it if I knew Vagrant better. The biggest blocker to me using Vagrant is that it uses Ruby. Give me Vagrant in something like Bash instead, or like docker-compose.
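As a rough illustration of the "simpler Vagrant" being wished for here, a minimal sketch assuming VirtualBox's VBoxManage CLI is installed and on the PATH. It scripts the same live-ISO, no-disk workflow described a few comments up; the script name, VM name, OS type, and resource sizes are placeholder assumptions, not anything from the thread.

    # provision_live_vm.py - hypothetical sketch: boot a live ISO in VirtualBox with no
    # hard disk attached, i.e. the "ramdisk only" workflow above, from a single command.
    import subprocess
    import sys

    def vbox(*args):
        # Thin wrapper around VBoxManage so failures surface immediately.
        subprocess.run(["VBoxManage", *args], check=True)

    def provision(name, iso_path, memory_mb=4096, cpus=2):
        vbox("createvm", "--name", name, "--ostype", "Ubuntu_64", "--register")
        vbox("modifyvm", name, "--memory", str(memory_mb), "--cpus", str(cpus))
        # Attach the live ISO as a DVD; no disk is created, so all state lives in RAM.
        vbox("storagectl", name, "--name", "IDE", "--add", "ide")
        vbox("storageattach", name, "--storagectl", "IDE",
             "--port", "0", "--device", "0", "--type", "dvddrive", "--medium", iso_path)
        vbox("startvm", name, "--type", "gui")

    if __name__ == "__main__":
        # usage: python provision_live_vm.py myvm ~/Downloads/ubuntu-live.iso
        provision(sys.argv[1], sys.argv[2])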
I work in defense and our stuff only runs with Internet explorer and Java applets. Requirements were written in the early 2000s and the product is still not ready to ship.
Today, to say the web is a “bad” runtime is to ignore the preferences of hundreds of thousands of companies. It may be true, but only along some metric that apparently nobody cares about. Maybe no one listens to this viewpoint because it’s wrong. The web is a fine runtime, not perfect, but solves a lot of the problems that matter (like distribution and cross-platform support). In the 90s, this was a different story, reflected by the industry’s and consumers’ overwhelming preference for native and terminal-based applications.
Yes, the Redmond Middle School Science project called Internet Explorer, now in the dumpster of history where it always belonged, was evil.
Its over-engineered idiocy (monikers? apartment threading? quirks mode? wtf?) slowed down the development of good internet applications by decades, and caused tens of thousands of lost developer and tester hours that we can never get back.
Gates was too greedy to understand why it was a problem. Ballmer was too stupid. Now, Nadella finally figured out that developers might stop hating his company if he finally flushed that toilet. And flush it he did.
Edgium is good. As good as all the takes-16-hours-to-compile stuff running on modern desktops.
IE was not bad at all, at least technically and at least while it was still being actively developed (i.e., well before IE6), compared to its competitors.
The bad things (and what the web page is mostly complaining about) are the political shenanigans from MS; e.g., bundling it with Windows, and then slowing down development significantly once they got to the monopoly position.
And for those who believe MS is now politically better than it was a couple of decades ago, just try a recent Windows 10 build. Not only does it _still_ bundle Edge as if the EU court ruling were a joke, it will periodically nag you about making it your default browser, even in completely irrelevant places such as Settings.
This is all like 20 years ago except that no one seems to care about desktop browsers anymore.
Quirks mode was a necessary measure to ensure backwards compatibility for a majority of websites at the time (circa 2000 - this was IE5).
If they had switched every website to standards mode, most of the internet at the time would have appeared broken, and people would have blamed Microsoft, not the websites.
(To be fair, IE5 through IE7 had a very broken standards mode, but it was much less broken than quirks mode.)
For better or for worse, IE6 lived on, and on, and on for far too long with no changes.
A bigger concern than the changing standards mode in other browsers was the fact that IE6 just wouldn’t die.
I had people challenging me “why are you developing for Firefox first and then adapting your app to work on IE? Shouldn’t you be doing it the other way around?” Over 10 years later those apps I built are still working great on the latest browsers, with no layout fixes since then. Meanwhile, many of those “rich”, forward-thinking apps developed for IE and Silverlight are long dead.
Right. In hospitals there was a very popular medical imaging web information system - from GE, maybe? It rendered with an ActiveX control embedded in IE6. It didn't work in IE7.
So a lot of customers I worked with couldn't move past IE6. "No way, everybody needs to be able to see those X-ray images."
We even tried, without success, to organize the quick development of some sort of replacement for that, umm, "legacy" ActiveX control.
I always thought tabs were an Opera invention, or maybe one of those IE shells like Maxthon or Neoplanet? IE5:mac was a fantastic browser though and was widely regarded as having the best CSS1 conformance of its time, as well as some cool features like an eBay auction tracker sidebar that seem quaint today but were very useful to me at the time: https://alistapart.com/article/ie5mac/
On the flip side, the only time I can remember hearing an entire crowd boo Stebe Jovs was when he announced that IE 4.5 would ship as the default browser in Mac OS 8: https://youtu.be/WxOp5mBY9IY?t=142
Firefox (or the Mozilla browser that became Firefox) gave us our modern idea of tabbed browsing. If I remember correctly, it was initially an experimental feature buried in the settings, but it quickly became a popular mainstream feature that Safari and IE adopted.
Opera had an earlier version of tabbed browsing, but it was tacked on to their existing Windows MDI interface.
In the 90s, MS Visual Studio 6 had a third-party plug-in that would add tabs to its existing MDI interface.
Borland's Delphi and C++ Builder may have been the first apps to provide true tab-based document handling in the early-to-mid 90s.
IE5 for Mac was quite good. It handled transparent PNGs correctly around a decade before its Windows counterpart and was just generally more pleasant to develop for.
I always wondered why Tasman never got backported to Windows to act as a next-gen replacement for Trident.
In UK English usage I'd say it means a "minor fact". Compare planetoid, meteoroid, for other examples of -oid being used specifically to indicate a diminutive or minor form of something.
You can possibly blame Steve Wright, Radio 2 talk show host/DJ for popularising the term with that meaning in the UK.
Ha ha, this reminds me of the first website I made. It was a site about Green Day. Of course, I built it on Angelfire and it was chock-full of GIFs and MIDI music playing in the background. A lot of nostalgia seeing this.
If you feel like stepping back in time, check out a game called Hypnospace Outlaw. You play a moderator surfing around an alternate '90s-like web space, investigating copyright, abuse, and other stuff. It captures that whole feeling from when anyone could put up a webpage.
"And as if forcing IE on all Windows users wasn't bad enough, Microsoft forced Apple to bundle IE for Mac as their default browser instead of Netscape. If Apple refused, Microsoft would discontinue MS-Office for Mac. Steve Jobs was loudly BOOED as he announced that IE would be the default browser."
I really miss when most of the web looked like this, and average non-technical people knew enough HTML to put something out. In retrospect I think Myspace ruined everything.
This is some wholesome caustic indignation I can get behind. Surely today’s social media networks would have classified this content as hate speech and inciting violence.
Man, seeing Bill Gates with horns and in a pentagram sure hits different in 2021 due to COVID/antivax and QAnon conspiracy theories than it did in 1996.
Internet Explorer Is Evil (2002) - https://news.ycombinator.com/item?id=23013001 - April 2020 (75 comments)
Internet Explorer is evil - https://news.ycombinator.com/item?id=4740890 - Nov 2012 (10 comments)
Edit: btw, since http://toastytech.com/evil/ieisevilstory.html references Netscape 7.1 and that was released in 2002 (https://en.wikipedia.org/wiki/Netscape_7), it seems like the previous submission was more likely to have the year right?