Hacker News
Surfing the Modern Web with Ancient Browsers (superglobalmegacorp.com)
220 points by paulgerhardt on Mar 22, 2014 | 89 comments

I think this stuff just inherently matters.

I recently talked to my kids about the American postal service and how you can still send things "general delivery." It's a little used option and you can't use it to replace a regular mailing address, but when I worked in insurance, I did send a check "general delivery" once. I think the family in question was living out of an RV and traveling around regularly. Maybe they were retired. I did not really know. They had me send the check to the nearest post office. These new fangled RV lifestyles would not work so well if you could not occasionally use this very old fashioned thing called general delivery.

While I was working in insurance, every time they upgraded, it introduced new bugs and some upgrades (where we would migrate something to a whole new system) failed to be backwards compatible. This created real problems. It was still necessary to preserve old information and old methods of doing things. For example, most of the claims were done on a computer using digitalized images of the paperwork which had been submitted. Once in a while, it was necessary to do an actual paper claim. Not everyone knew how to do a paper claim but it was information that had been preserved. I repeatedly ran into situations at work where things could potentially just go to hell in a hand basket if some old methodology were not somehow still preserved and available in spite of being outdated.

Someone, somewhere still uses these browsers for various reasons. I am glad someone sometimes works on this type of issue, never mind how silly it might appear to folks who take it for granted that upgrading your system to the latest thing is the norm. It may be for you. It isn't for everyone.

This reminds me of when I filed for divorce in Philadelphia. I did all the paperwork myself instead of hiring a lawyer. There was one document which was required to be filled out with a typewriter. You could not print it out on a printer, or do it by hand. It needed to be a typewriter.

I didn't have a typewriter and I did not know anyone who had one. I called many copy/printing places in the city to see if there was one for rent - not one place had one to use or rent.

I found some that I could buy, but it was $100+, and I didn't want to pay that much to use it once.

Eventually I asked my soon-to-be ex-wife, who worked for the city, and she was able to locate one, which I used to successfully file the document.

Every second hand shop I've ever been to has a whole shelf of typewriters for less than $5.

> There was one document which was required to be filled out with a typewriter. You could not print it out on a printer, or do it by hand. It needed to be a typewriter.

I'm guessing it wanted typed letters but had carbon copies and thus required a struck letter to generate the carbon copies?

There are free, legacy-compatible operating systems (Linux) that have modern browsers and work on a huge range of hardware.

Creating sites that are compatible with super-old browsers is costly, and makes development of things like JS based webapps either complex or not even possible.

Why should we spend money or time fixing things for a tiny percentage of the population who have options?

Well, that's why it's not the case that every application and website should worry about it. Which is why the OP's project is so nice: it abstracts the problem into an emulator, so one person can worry about it and solve it for (to a first approximation) all sites on the one side and all legacy browsers on the other side.

Do you really think these modern browsers work on a huge range of legacy hardware?

A good enough range. They'd run on a $25 Raspberry Pi.

> Someone, somewhere still uses these browsers for various reasons

I am interested to know (pure curiosity) what those reasons are for those folks. I can only think of a few using my imagination (locked down OS in a hospital, crusty old server filling a vital need with no funds to upgrade or maintain), but I bet these edge cases would make for some thoughtful reading. (Perhaps for a separate HN post?)

Perhaps they are annoyed at the attack surface bloat of modern browsers and operating systems. I've been thinking of switching to w3m/links/elinks/lynx.

Right, but if you're worried about security, using a browser that's no longer maintained isn't the answer, so that doesn't explain old versions of graphical browsers.

This article is about pre-rendering a webpage to png, so anyone using a text-based browser would actually find the result even less useful than the original page.

Some people don't have a lot of money and try to make things last as long as they can before they have to spend more money. There are a lot of folks out there who will say "I spent a lot of money on this danged machine 15 or 20 years ago, and it still works perfectly fine, why should I have to replace it?".

Sometimes I think back and ask myself if all of this progress in web standards has really got us anywhere. What can we do now that couldn't be done 20 years ago (putting aside connection speed), and is all of this progress really worth the trouble?

It makes a person want to start over from scratch, keeping it simple and adding as little as possible. Of course, something like that would have to be done extraordinarily well to be worth it at all, to avoid https://xkcd.com/927/

It's a nice dream, but even if you had the power to go back in time you'd probably just make other mistakes.

There isn't an intelligence capable of designing big open standards like the web; rather, they only move forward by natural selection, much like terrestrial life has.

In the end I think we've gotten some pretty remarkable things this way. Remember, before the web there was no such thing as multimedia that was cross-platform, instantly available globally, and accessible to any person with any disability. Few people seem to recognize how much of an accomplishment this is: http://www.youtube.com/watch?v=uEY58fiSK8E

> There isn't an intelligence capable of designing big open standards like the web

I think the problem here is big.

How about small standards with a limited scope? I don't want these huge standards that take millions of lines of code to implement.

That's exactly what the web is! It's made up of hundreds of such small standards plus thousands of ad-hoc implementations, solutions, and deep supporting infrastructure.

Again, the point is it's too big to be designed.

Yes exactly, too big is exactly what it is. And my point was that it could be small. In practice of course we'd never even agree on a definition of what the web is or what it should be, so even the idea of designing a compact standard for it is hopeless.

Yea, that is why WHATWG HTML is a "living standard". In fact, I am thinking that even the "HTML5" buzzword is a misnomer.

We have gotten somewhere -- the web has become a decent program delivery system.

In the process it's degraded pretty badly as a document delivery system. If you start over I recommend separating those two projects.

Well, we tried that for awhile. The web was mainly good for documents and if you had a program it was done with Flash.

And then people were making websites entirely in Flash.

And then people realized that was a bad idea because Mobile, and everything got better.

Until the App Store came along. Fine, so now program delivery has been separated from document delivery, but every document provider that used to just be a website now requires you to use a special program from the App Store instead of using a web browser or some other generic "document viewer". Or, at the very least, you have to tell them you don't want it every time you visit their website.

The problem isn't that the web is a bad document delivery system -- it's not. The problem isn't even that the web is just as good at delivering programs as it is at delivering documents. The problem is that there are a large number of people who know nothing about best practices and are way too enthusiastic about making things look flashy. They create documents and they don't care how you want them to look. If that means they have to create a Program instead of a Document, that's fine with them.

Download our app, and let us read your entire contact list and send out SMS messages as you, and read and post to Facebook as you, and do whatever else we want to anything for any reason.


It just made us lose our sense of where to go, by re-inventing, badly, what was already possible on the desktop with HyperCard and similar systems.

> Sometimes I think back and ask myself if all of this progress in web standards has really got us anywhere.

> What can we do now that couldn't be done 20 years ago (putting aside connection speed) and is all of this progress really worth the trouble.

Well let's see if we can make a short list:

1) Can I look up almost anything? Yes [0]

2) Can I learn a foreign language for free? Yes [1]

3) Can I fund something interesting? Yes [2]

4) Can I communicate with my family and friends for free? Yes [3]

5) Can I learn stuff for free? Yes [4]

Seriously, I didn't even try; yes, progress has got us somewhere. Admittedly, we do now have n+1 standards.

[0] http://wikipedia.org

[1] http://duolingo.com

[2] http://kickstarter.com

[3] Honestly there are so many - email, skype, facebook, whatsapp, etc ...

[4] http://khanacademy.com


As others said, all that could be done in the 90s modulo the bandwidth (specifically important for Khan Academy). But we don't even have to go back that far to get broken sites. IE 8 (released in 2009), forced on many corporate workers, is unable to render many "modern" sites. Some specific examples that I've encountered: Lifehacker & kin - multiple content blocks overlapping, rendering the main content unreadable; NPR - home page totally broken, can't get to different news categories; GigaOM - ARTICLES ARE PRESENTED IN ALL CAPS FOR NO GOOD REASON, AS IF THE AUTHOR WANTED TO SHOUT AT ME FOR USING SUCH AN OLD BROWSER.

And your 4): we could do that in the 90s as well. Email isn't a 21st-century invention and doesn't depend on web technologies. Same with instant messengers. Skype: bandwidth for video is the thing the 90s lacked to support this system. WhatsApp: widespread mobile devices were necessary for this, and the 90s lacked that. And none of them, other than Facebook, is wholly dependent on web stuff; they're internet technologies and platforms.

Which of those wasn't possible in 1995 with a modern Internet connection? I posit that everything you describe would have been entirely possible with a little less polished appearance 15-20 years ago and that there could be real strong value in starting over and keeping things simple.

Duolingo, because it's not just me learning Spanish, it's us translating English-only websites to Spanish. Perhaps I'm not versed enough in the services available in '94; I didn't really start to use the internet much until college, when I got access to the computer labs. Was there something like it around then? I say this because the most impressive thing to me about Duolingo (and the CAPTCHA system designed by its founder before it) is the efficient way it leverages different individual goals to produce really large benefits for both parties: the person wanting to learn a new language for free and the person with an English-only website that needs it translated into multiple languages.

I presume the point is not what was already built but what was possible: Wikis have existed for a very long time, and nothing about Wikipedia seems difficult to do with very old versions of Netscape.

Also, email and IRC were available before HTTP/HTML, so people were definitely able to communicate over the Internet, in realtime or otherwise, even before they were viewing Web pages.

Indeed all those sites mentioned are just manifestations of an idea that is really independent of any "progress in web standards", and had someone thought of the idea 20 years ago, they could've just as easily (or maybe even more easily - because of the relative lack of complexity back then) set up a site for it.

This site actually chronicles the terribleness of the modern web quite perfectly. Warning: NSFW.


> What can we do now that couldn't be done 20 years ago (putting aside connection speed) and is all of this progress really worth the trouble.

Well, I know your 20 years is hyperbole, but let's see:

1) We can display video and play sounds. We can do it using native browser controls now, but in 1994 you couldn't do it at all - plugins weren't introduced until Netscape Navigator 2.0 which came out a year later.

2) We can display updated data without reloading the entire website. This lets us do stuff like chat or use collaborative tools.

3) We can disassociate content from styling, so styling data can be transmitted once and can be changed en masse with ease.

4) Cookies showed up 19 1/2 years ago; before that, you wouldn't be able to store user login data without it showing up in the URL, which makes sending URLs to other people very awkward.

5) Oh wait, you wouldn't be able to log in because forms weren't in the HTML spec yet, HTML 2.0 didn't come out til 1995.

6) We can actually... do layouts of web content. Tables didn't come around til HTML 3.0.

The initial version of the web was pretty much a prettier version of Gopher, with some inline graphics and styling. This site would not be able to exist in 1994, unless you wanted to submit comments via email?

CSS has always been a hack, and the web we have today (the result of competing browsers rendering however the hell they want, based on their popularity and the popularity of any co-developed devices) is just about the only thing that can make me countenance communism as a web developer.

Having said that, it's possible that we might be in for a second era of web design, as MS retires support for XP and the browsers that have dragged down web development.

Worth noting: sufficiently ancient browsers (e.g. Netscape 1.x and most versions of Mosaic) can't even connect to modern websites, as they predate HTTP/1.1 and the "Host:" header.

See http://www.jwz.org/hacks/http10proxy.pl for a workaround.

(Unfortunately SSL was developed in an HTTP/1.0 world and it's taken nearly 20 years for us to get the equivalent functionality with SNI.)
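The Host-header gap is easy to see in the raw protocol. As a sketch of the same idea as the linked Perl script (this is not that script, just an illustrative Python version): a proxy can take the absolute URL an HTTP/1.0 browser sends it and rebuild the request as HTTP/1.1 with the Host header filled in, which is all a name-based virtual host needs.

```python
from urllib.parse import urlsplit

def upgrade_request(absolute_url: str) -> bytes:
    """Turn the absolute URL an HTTP/1.0 browser hands to a proxy
    into an HTTP/1.1 request with the Host header filled in."""
    parts = urlsplit(absolute_url)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {parts.netloc}\r\n"
        f"Connection: close\r\n\r\n"
    ).encode("ascii")

# An HTTP/1.0 browser would have sent only "GET /index.html HTTP/1.0",
# leaving the server no way to pick among name-based virtual hosts.
print(upgrade_request("http://example.com/index.html").decode())
```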

First thing I did when I got a Sun Ultra 10+ running. Of course being a Sun machine, you get to enjoy HotJava, probably the only web browser with a `garbage collect` menu entry. I could display things, as expected, very crudely.

> probably the only web browser with a `garbage collect` menu entry

In Firefox, you can force the global JS engine to GC on the `about:memory` page.

How stupid of me, after following nnethercote's efforts on his blog (https://blog.mozilla.org/nnethercote/category/aboutmemory/) for months ..

And a nice 'meet the people' page :-)

Of course you don't need a Sun box to run it.

Oh, HotJava! I still remember using it years ago. It was surprisingly snappy and effective

Ha? Wikipedia mentions that the first JVMs were not performant enough for that. Which version did you use?

I don't remember the version; it would have been between 1997 and 2000, on a Pentium 2. I didn't find it especially slow or unresponsive, but my image of it may be more idealistic than what it really was. I remember using it on occasion, just for the sake of using different stuff than usual.

It is a revelation how quickly some sites will load when you turn off JS, and by extension much of the modern web. I mean, really, really quick, as you avoid the blocking of third-party services, which is quite common these days.

It does also highlight how many sites actually depend on large numbers of external sites for basic functions on their own site.

As an example - a site would not load at all, because it was trying to load dogshit like Disqus. Sigh!

That's funny, because in Third-Party JavaScript written by folks from Disqus, they partially explain the lengths they went to so that they didn't disrupt the page loading. Of course, in turn Disqus does have to protect itself from the page using iframes, so there's room for junk in both directions. It's perhaps amazing the web works as well as it does, all tangled up across domains.

Hence the point of this image-rendering proxy. I'd be more interested in a proxy that does the reverse -- strip websites down to bare blue links on grey by any means necessary, including OCR of images. That could be fun. For a few minutes. ;-)

I still use links pretty regularly. Not to the exclusion of modern browsers of course, but sometimes it's nice to read longer articles as actual text without all the fluff, and to be able to stop reading and then continue a session on another computer (w/ tmux). The only thing I find mildly annoying is that HN comment threads get flattened.

You might want to consider using pocket.


Sounds like Google Keep.

Keep is a totally different service.

You can compare Pocket with Instapaper, ReadItLater and similar sites.

To explain further: Pocket (and equivalent)

1] strips out all the extraneous cruft from an article, keeping only the text (in a typeface and font size of your choosing) and any included photos/videos;

2] acts as a centralized archive of articles, which can be tagged and favourited at will (so you don't need to bookmark articles in the browser for later reading); and

3] is available across multiple devices (phones, tablets, PCs) for a single account, including offline access.

It's an awesome experience to simply right-click-save interesting articles in your desktop browser during a work break, knowing that you'll be able to read them whenever and however you like, for example on your tablet in bed in the evening.

Pocket is actually ReadItLater rebranded ;)

Is there any reason to switch to pocket over Instapaper?

I remember switching from Instapaper to pocket, don't remember why though.

Give it a try yourself, that's the only sure way to find out.

Hm, I remember there was a thing called Opera Mini (8-10 years ago?). It worked in a similar way, so that you could browse the web using old basic cellphones.

It was quite popular in Eastern Europe, where having a computer with Internet access was something out of reach for a large fraction of the population. I was using a Nokia 3510i for that purpose :)

It's still pretty popular even now. More than 250 million people use it worldwide.

This could also be useful to circumvent censorship in some places, no? A few years ago I saw a site which converted websites to images for that very reason, although it was less sophisticated. Sadly, I can't remember the name.

I don't see the point of providing a life-support system for these old, insecure, outdated web browsers.

It's not really analogous to, say, driving old, restored cars. These browsers are rootkit magnets. The people who use this unmaintained software are inherently less safe.

It's just intended to be a fun hack that works surprisingly well, not a serious solution to the problem of providing support for older browsers.

Although come to think of it, if all the rendering is being done on the server side, and nothing but an image map is being delivered to the older browser, it should be perfectly safe for these ancient browsers to navigate to even the most dangerous websites, provided that the server-side component doesn't get hacked, because it will be delivering nothing but safe images.
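The mechanism that makes this work even in Mosaic-era browsers is the server-side image map (`ismap`), which dates back to the earliest graphical browsers: a click on the image sends raw "?x,y" coordinates back to the server. A minimal sketch of both sides (the URLs here are made up for illustration):

```python
def page_for(image_url: str) -> str:
    """Serve a pre-rendered screenshot as a server-side image map.
    ISMAP is supported by even the oldest graphical browsers."""
    return ('<html><body><a href="/click">'
            f'<img src="{image_url}" ismap></a></body></html>')

def parse_click(query: str):
    """A click on an ISMAP image arrives as '?x,y' appended to the
    anchor's href; the server maps it back to a link on the real page."""
    x, y = query.lstrip("?").split(",")
    return int(x), int(y)
```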

Say you wanted to make a film set in 1997. You could probably buy a period computer easily enough from eBay, and you could probably get Windows 95 on there with Internet Explorer 4, but to display some content? You could make some Photoshop mockups for a period 'Google' or 'CNN' et al., but to actually serve those pages would require a period server. This little rendering trick would make it all slightly easier for the filmmaker. They would not have to hand millions over to ILM just to make a few retro screenshots.

Personally, if I was making a film set in the 1990s I would go for the IBM WebExplorer browser (http://virtuallyfun.superglobalmegacorp.com/wordpress/wp-con...). It has awesome graphics.

If you're making a film, just have your art guys make mockups of the entire thing in photoshop and have it display like a fullscreened slideshow or video on an old monitor. The other old hardware needn't be functional.

This makes everything reproducible for multiple takes, lets your art guys have ultra-fine control over the presentation of everything, and lets the actor focus on acting, not driving a computer.

Of course! Don't even have the keyboard actually attached, have some guy in the gallery push the buttons too... Add those sound effects in post-production as well. No wonder there are no themes that enable your computer to beep and chirp like in the movies.

> "Don't even have the keyboard actually attached, have some guy in the gallery push the buttons too..."

Alternatively have the keyboard set up as one big "anykey" to advance the animation. The advantage of this being that keyboard strikes would be synced with screen updates, if there is ever a shot where you can see both the fingers and the screen (which is typically avoided, but still).

Although this one example might not be a very practical way of doing it, anything that keeps old hardware out of the trash and still in useful service is a big positive in my opinion.

One thing the software industry has been very good at doing is driving the sales of hardware, by requiring more and more resources --- only to do much of the same things as before, maybe with some improvement in specific areas. Many users have no need for the latest hardware nor software, yet they're constantly encouraged to upgrade for security, "new features" they'll never use, etc. (I'll admit that some of these, like security, could be valid concerns.) Upgrading to newer software with higher resource consumption, they wonder "why is it so slow?", and that eventually leads to perfectly fine hardware going to waste. In particular, the extremely fast upgrade cycles of browsers makes their contribution to this gross waste a bigger part than a lot of other software.

> The people who use this unmaintained software are inherently less safe.

A lot of exploits today won't even run on older systems. Older browsers having fewer features is also a reduction in attack surface; e.g. if there was something vulnerable in HTML5 video or CSS3 animation, a browser that didn't support those features would be inherently immune.

I'm interested in using this for my (original) iPad. As time progresses, more and more websites cause the browser to crash. The latest example was when I tried to view the new apple iCar thingy on apple.com.

Never could get it to load.

I don't think that my tablet is a root kit magnet, nor that I'm inherently less safe. I just find that this hardware works well enough for most of my needs, except now for browsing the web, which I never thought would happen...

And can't you just install an alternative browser? Do none exist for the iPad?

Nothing official, as Apple only allows browser shells (which use the iOS version of WebKit), no complete browsers.

Opera Mini does the rendering server side, so it isn't beholden to that rule.

When I tried to install a recent Chrome on my in-laws' iPad, it required a newer version of iOS than was available for that model of iPad.

While there are other "browsers," they are really just different user interfaces wrapped around the same Safari rendering engine.

Most likely the 'crash' is the browser being terminated due to low memory. Alternative browsers would have the same problem.

Any number of people could have their own idiosyncratic reasons for needing, or at least wanting such a hack.

The general diversity of ways people connect to the internet on all kinds of different hardware is enough of a reason, provided there is someone interested in putting the hack together. And there was such a person, so, yay. There's no Court of Hack Justifications one must appeal to, to do this stuff.

Old cars are far more dangerous in a crash than modern ones. Old browsers at least won't kill you when they fail.

I just recently installed eComStation in order to play the OS/2 version of Galactic Civilizations that I have. Browsing around in Navigator 4 was interesting. I'll need to try a bit more.

Could somebody provide some details about the picidae network [1] mentioned in the article? Its focus seems to be circumventing censorship as a proxy, which makes me wonder why they use images instead of just serving the same content.

[1] http://net.picidae.net/

Images probably help bypass text-based censorship (i.e. deep packet inspection, proxies)

I guess HN will pretty much just work fine?

It renders a bit weirdly in w3m (indentation is shown as * and plusses and such), but apart from that, it seems to work just fine, so there is probably little reason why it shouldn’t work in any other browser that understands basic HTML?

(posted with w3m, written in a linked-in Emacs tab…).

Edit: Screenshot: http://chubig.net/t/w3m-hn.png

Good to know that you can log into HN with an unpatched w3m now. w3m sends "Content-length: " instead of "Content-Length: ", which HN used to dislike.
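For what it's worth, HTTP header field names are defined as case-insensitive, so rejecting "Content-length:" was a bug on HN's side rather than w3m's. A compliant server compares field names without regard to case, along these lines (a sketch, not HN's actual code):

```python
def get_header(headers: dict, name: str):
    """Look up an HTTP header by name, ignoring case.
    Header field names are case-insensitive per the HTTP spec,
    so 'Content-length' and 'Content-Length' must match."""
    wanted = name.lower()
    for key, value in headers.items():
        if key.lower() == wanted:
            return value
    return None
```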


I can testify that HN works well in IE6.

It's mostly advanced HTML5 features, scripting, and CSS that aren't supported in older browsers, but basic text formatting and forms shouldn't be any issue.

Very cool. Somewhat ironic that I'm reading this in Firefox Aurora on my VPS, controlled from my iPad :)

Interesting sounding setup. What software are you using for the forwarding, VNC, X11, or something else?

I use xrdp (google xrdp scarygliders for a bleeding edge version) and Jump Desktop, which does RDP over SSH without fuss. Latency over a 3G connection is low enough for coding.

I have a Vagrant template for setting this up here:


...all you need to do is read the Vagrantfile. I use the vagrant-lxc plugin to set up and tear down these environments on Digital Ocean.

Lol @ DNA Lounge.

Also, why, outside of the joy of hacking?

Legacy systems, unfortunately, are here to stay. Not everyone gets to update their browsers whenever they want to. And while I don't have to deal with IE 1.5, I definitely have to deal with IE 6 in my workplace. IT is a bunch of idiots, so we'll probably still be using it for the next ten years. Hell, some of them actually updated to Windows 7, and we're using IE 7 on them. Completely absurd.

So if there are ways to make pages accessible to older systems, that is a very good thing. Of course, I'll never be able to use it, but someone else probably can.

> Legacy systems, unfortunately, are here to stay.

Just because something exists (or can be called "legacy") doesn't mean it should be supported. I would venture to guess that nobody in the world uses any of the browsers shown on any regular basis. And if they do, they have absolutely no expectations of it working correctly.

When IE 1.5 was launched 8MB of RAM was acceptable and people were excited about 28.8k modems.

> I would venture to guess that nobody in the world uses any of the browsers shown on any regular basis. And if they do, they have absolutely no expectations of it working correctly.

It depends on where they work. We have to support IE6/7 because our users are largely accessing the webapp from hospitals... which means they're using computers that are very strictly controlled by the IT department, because upgrades can break other, very expensive legacy software that relies on old IE versions or other particular quirks of old OS versions. And if you break some essential medical software, lives can be at stake, so they can't take it lightly.

To be sure, they don't assume that our webapp will work in the browsers they're forced to use. But if it doesn't, we can't possibly get them as a customer until some distant day when they are ready to lay out the serious investment required to upgrade.

IE6/7 yes. I've had to support those too because they are still in use in some places. That's why I said "any of the browsers shown"...all of which are multiple generations behind IE6/7.

Win7 shipped with IE8.

Couldn't, let's say Google, run this server side and redirect legacy browsers to it? It could potentially allow large websites to drop legacy support more quickly and move to more modern technologies.

Google did similar proxying for mobiles back in the day. If you used Google's mobile search you'd often end up on pages via a proxy. They still do it now, or used to for Windows Phone 8 devices last I checked. Which is odd, since there's nothing restrictive about IE10 on a phone.

I wonder how Google's proxy would work in Mosaic or Lynx?

What I basically want from this is a PNG-browser-VNC: any action you perform (mouse click, mouse over, typing text) gets sent to a QtWebKit process, which renders you back the results.

Now, instead of re-rendering the entire screenshot of the page, it should only render the region where the change occurred. For example, if a mouse moves over a menu, it will figure out which region of the image has changed as a result (onDomChange, probably), render that portion, and send it back to the client.

Still looking for this or maybe I should build it.
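The "only render the changed region" part is the classic dirty-rectangle technique. A minimal sketch of the idea, treating frames as nested lists of pixel values (a real implementation would diff rendered bitmaps and might return several rectangles instead of one):

```python
def dirty_rect(before, after):
    """Return the bounding box (x0, y0, x1, y1; x1/y1 exclusive) of
    pixels that differ between two equally sized frames, or None if
    nothing changed. Only this region needs re-encoding and sending."""
    xs, ys = [], []
    for y, (row_a, row_b) in enumerate(zip(before, after)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if a != b:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs) + 1, max(ys) + 1)
```

A hover that changes a single menu pixel would then yield a one-pixel rectangle, so the client only receives that sliver instead of a full-page screenshot.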
