Why Is Gopher Still Relevant? (floodgap.com)
119 points by pointfree on Mar 13, 2017 | hide | past | web | favorite | 95 comments

The web is like an operating system that you are locked into using. This prevents alternative operating systems such as Plan 9 from ever becoming useful, because building a web browser is such a monumental effort. Even if a decent web browser were built for Plan 9, it would defeat the purpose of using such an alternative OS, because the web binds software to content.

The web has effectively killed innovation in operating systems. A clean and consistent system will be out of reach as long as this web app nonsense continues.

I think this is the wrong way around; Plan 9 prevented itself from becoming useful by changing everything so thoroughly that building an actual web browser was pretty much impossible. It was more or less guaranteed that APE (The "ANSI Posix Environment") would die from neglect. This neglect seemed intentional and people seemed happy to run plain white windows with black borders and congratulate themselves that their system neither had nor needed the massive shared libraries that Unixen of the time seemed to need.

The lesson seems to be, rather - don't build an operating system that can't build up enough infrastructure to run a browser. This is maybe less forbidding than, say, being forced to build an operating system that can support an entire Windows subsystem.

I have trouble understanding what it is that people want when they say they want "innovation" in operating systems. It seems like almost anything that was practical to do in the period when I worked with Plan 9 is now eminently practical and far more performant in user space; so why not go build it on top of a minimal Linux kernel and save yourself the trouble of building a gazillion drivers, boot loaders, etc.?

Web desktops may be the next step. As much as people are joking about running JavaScript on bare metal, the idea of running a universally accepted virtual machine (WebAssembly?) and building a high-level language and operating system on top of it seems to have some merit. See also




The idea of a web desktop doesn't make sense when we have services like Google Docs and Dropbox that perform the roles of the desktop without the overhead of installing software. The desktop was a forced requirement while content had to be centralised on your PC, why bother with it now that we're freed from that constraint?

> The desktop was a forced requirement while content had to be centralised on your PC, why bother with it now that we're freed from that constraint?

Because some of us want content to be centralized on our PCs. We want to own our data, instead of giving it away and then renting back access to it. Related, I consider the push towards abstracting away the file system to be a very bad thing.

I agree with you completely, especially given all the revelations that have come out about dragnet surveillance, but the OP was talking about web desktops specifically.

It's all about namespaces. You can access local files the same way you access web resources and vice versa.

Have you ever thought about the fact that most people don't like innovation and don't need it? They want and need well tested incremental updates to proven technologies.

There is still innovation happening in operating systems, it just doesn't reach mass adoption because most of it doesn't provide real-life benefits to the mass market. Some of the more practical aspects trickle down to mainstream OSes.

We're past the Wild West stage of operating systems. You win some, you lose some.


This is the paper Isaac Asimov researched as an undergrad, and was the basis of his first sci-fi story sold.

There's a lot of material on this topic.


In my view the web is a very successful demonstration of language-based security in the OS. It's quite natural that the web has essentially become a full-featured (and half-baked) operating system, because it has to handle everything in order to keep its language-based security and relevance. Therefore I would argue that innovation in operating systems has not been killed---it has shifted to a different realm.

The web, at least the part most people interact with, is more like a throwback to leased terminals.

The real computation happens on a far-away pile of hardware, with the local computer just being responsible for drawing the updates on screen based on the results coming over the wire.

I disagree; while writing a web browser would be significant work, it would likely be on the same order as writing an operating system functional enough for people to actually want to use it. Porting a browser would likely take even less work (while still being a significant amount of work).

This page talks about what it takes to port webkit to another platform: http://blog.cranksoftware.com/webkit-porting-tips-the-good-t...

Forgive me, but you should more specifically define the terms you use. As it stands, it looks like by "functional enough for people to actually want to use it" you seem to introduce a circular dependency -- as though an OS is useless unless it's easy to port a browser to it.

The GP post mentions Plan 9, which is a great example here. There are real people who use Plan 9 every day (I have met some of them!) but Webkit will never be portable to that OS because there is no C++ compiler. This doesn't make the OS non-functional, but it does mean step one to porting a modern browser is "port GCC or LLVM," which is a massive task by itself.

Both Firefox and Chrome source code repositories are in the hundreds of megabytes, even without the required host OS dependencies vendored in. The document you linked to recommends leveraging freetype, cairo, and all kinds of other things that would require serious graphics infrastructure of a sort that's just not feasible.

Modern web browsers are keystones that cap off arches built of hundreds of pieces of very specific software. As time goes by, they require more infrastructure, not less -- cf. Firefox, where to build the latest version you must also have a Rust compiler that targets your operating system. As the checklist of requirements grows ever longer, the list of candidate operating systems capable of meeting them grows ever shorter.

Five years from now, I predict the only three operating systems capable of interacting with the web in any meaningful sense will be linux, macos, and windows; bsd and haiku et al will have been left by the wayside.

Ten years from now I predict there will be four operating systems, and one of them will be Fuchsia, because only a megacorp like Google has the resources to accomplish that.

> Five years from now, I predict the only three operating systems capable of interacting with the web in any meaningful sense will be linux, macos, and windows; bsd and haiku et al will have been left by the wayside.

That is an interesting prediction. What do you think will cause the BSDs to be left by the wayside in such a short timeframe, when they are more than capable of running the latest browsers right now?

Is the browser becoming a systemd module?

I am a little concerned about the trend to push "hardware acceleration" into everything. Graphics cards and their drivers are the biggest source of grief in PCs for me.

One intriguing direction we could take is for someone to make a self-hosting browser. Doing that would mean taking all these technology dependencies and moving as many as possible into wasm, until you're just left with the kernel, a "barebones" wasm runtime, and a browser binary. I think that is within the reach of one of the big-leaguers, albeit ambitious and full of "interesting" problems and standards that will have to be broken.

Don't forget that browsers like Netsurf might well be enough for your daily surfing.

I'm no expert, but I believe the difficulty in writing an OS lies with the myriad of different hardware configurations you have to support. Writing drivers for poorly documented hardware is difficult. You don't get around that complexity by writing a browser. That's just an additional effort you have to make.

Sure you could port webkit to your new OS, but if all the software you use effectively exists only as web apps what would be the point?

The point seems to be you could now use any system you wanted (as long as there is a compliant browser) and not worry if your needed app is supported or has all the features of the Windows/OSX version.

This lets you take advantage of whatever benefits your desired system provides you without a change of user experience.

I think I understand your argument now. Why do you think browsers as de facto operating systems are wrong? Their programming interface is highly standardized across implementations.

> Their programming interface is highly standardized across implementations.

It may be easier to have everyone use the same software and hardware platform everywhere. After all, we do want interoperability and we do want everyone to play well with others, but standardization and portability push people away from trying new things. The push to standardize everything is an unfortunate consequence of not having a separation of concerns---separate concerns such as content and UI.

Not sure what you mean by standardization vs. exploratory programming. Separation of concerns would only exist if standardized (and enforced).

A smaller contact area between implementations means we need only be concerned about compatibility at that interface and the rest is up to you. This keeps the area of software that we can't change to a minimum.

> A clean and consistent system will be out of reach as long as this web app nonsense continues.

Isn't that exactly what iOS and Android are though? I mean most of the criticisms of iOS are against Apple's obsessive curation of its simplicity and usability, and it 'came to power' smack in the middle of the internet age. People are always complaining that native apps for internet services are a threat to the web. It seems to be exactly the thing you're saying the web makes impossible.

Dunno. Initially iPhone was not supposed to have apps at all.

Apple even released a web framework for making sites that looked and behaved as if they were native to the iPhone.

Only after the jailbreakers enabled third parties to fool around with the guts did we see apps happen. And initially Jobs wanted to sue those app devs and distributors into oblivion.

I wish that myth would die.

Yes, Jobs did a song-and-dance about web apps, but not because they didn't want 3rd-party apps; it was because Apple wasn't ready to support developers on what was a new platform. Apple plays it safe.

For one, it's not consistent with the timeline, and second, as the jailbreakers discovered, there was quite a bit of infrastructure to support apps even in iPhone OS 1.0.

* iPhone release date: June 29, 2007

* iOS SDK Announced: October 17, 2007

* iOS SDK Released: March 6, 2008

* iPhone OS 2.0 release date: July 11, 2008

Apple had such a tough time providing enough support to developers that they restricted access to the paid developer program to just 4000 (Apple said 25000 tried to join).

There wasn't enough time for everything to happen (including relatively sophisticated tools like the iPhone Simulator) if it was just a reaction to jailbreaking.

Furthermore, the OS itself was initially developed for tablet computers, so apps would have been crucial. Adapting it to a phone platform with a more limited initial scope was a fairly late-stage pivot.

The web is a bad reinvention of distributed RPCs on desktop systems, imo.

You mean web apps? Because the web itself was fine till everyone started thinking that anything on the web must be an application. And that web must "win" mobile for some reason.

You'll understand that "some reason" when you want some cool, fancy, useful app and the vendor hasn't developed a version for your current mobile OS.

Few web apps are useful, though, and all suffer a severe performance and UX penalty by the very virtue of running within a browser.

Few of any kind of apps are useful.

The non-purists (which is most people) have decided that availability trumps everything.

I was more of a purist, myself, but after some bad experiences with the lack of native apps, I'm more of a realist these days.

Web apps have been the default approach for over a decade now. The O.G. vision of the web / netscape was to have a virtual OS, platform independent. I think it's time to lay the idea of the webapp as somehow being this alien thing to rest...

The web is a broken and incomplete adaption of Xanadu.

Xanadu is a broken and incomplete implementation of Xanadu, and has been for more than my entire lifespan (child of the 60s baby!).

Which is not enough reason to prefer an implementation which dropped Xanadu's probably most important feature (two-way links) from the beginning.

I blame the person from Porlock.

There's the web, then there are tons of other protocols out there to use on the Internet. And browsing using lynx/links/elinks is quite possible.

Compile a Linux desktop to the web (WebAssembly/jslinux), style your GUI to look like your brand, and have users logged into X11/Wayland as guest by default. Logging in looks like you're on a website, but in fact you're in a Linux session. You code your apps in GTK/KDE (whatever), and HTML/XML (or something more sane) is only used for the layout of those apps.

Gopher might not be popular, and apparently its relevance is disputed. But it is still an interesting bit of internet history. It still has an active community, and a whole lot of information.

There's also a slow movement to add gopherholes to the tor network. A list of the currently known ones can be found here: http://gopher.floodgap.com/gopher/gw?gopher://bitreich.org:7...

I don't get this. It seems to me Gopher basically supports a subset of what is available on the web. All the benefits of Gopher (low bandwidth requirements, no flashy design or interactivity, strict hierarchical navigation etc.) can also be achieved with stripped-down web pages. I see no benefit at all to having a separate protocol.

Gopher is cool as a slice of history though.

You can't be confident of getting a stripped-down web page before you click the link. Trivial example: I can click a gopher link and be confident whatever's on the other end is not going to make a noise.

Human vigilance is fallible. Protocol-level enforcement is necessary.
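To illustrate what protocol-level enforcement could look like: a Gopher client knows each link's media class from the one-character item type before it ever fetches the target, so it can drop noisy or media-heavy types up front. A sketch in Python (the item-type meanings follow RFC 1436, except `s` for sound, which is a common unofficial extension; the menu values are invented):

```python
# Sketch: filter a Gopher menu so only silent, text-only item types survive.
# The item type is the first character of each tab-separated menu line:
#   0 = text file, 1 = submenu, 7 = search (RFC 1436);
#   s = sound, g/I = images (common extensions).
TEXT_ONLY = {"0", "1", "7"}

def filter_menu(menu: str) -> list[str]:
    """Keep only menu lines whose item type cannot make noise or show media."""
    kept = []
    for line in menu.splitlines():
        if line == "." or not line:  # a lone "." terminates a menu
            continue
        if line[0] in TEXT_ONLY:
            kept.append(line)
    return kept

menu = ("0About\t/about.txt\texample.org\t70\n"
        "sTheme song\t/theme.snd\texample.org\t70\n"
        "1Archive\t/archive\texample.org\t70\n.")
# Only the text file and the submenu remain; the sound link is dropped
# before any request is made.
print(filter_menu(menu))
```

The point is that the decision happens client-side, before any bytes of the target are fetched -- something a URL alone cannot guarantee on the web.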

I've been thinking about making a site that only takes links to text/plain and other non-mixed-media content. I haven't been able to find one if it exists. I'd also love search engines to support media type filters.

Are you suggesting there are some people who only browse Gopher?

There are people who prefer Gopher and/or particular known sites. If you can see from the URL that a site will be non-horribly designed then that's fine, and if the only option is HTTP then you put up with it.

You could use a stripped-down client. There are currently compatibility problems (with using Lynx, elinks, dillo, etc), but most of those are probably surmountable. I can easily imagine a browser that guarantees either a reader-mode/AMP-like experience or a graceful failure.

"All the benefits of Gopher [...] can also be achieved with stripped-down web pages"

Can be achieved, but aren't. I'd love a consistent, stripped-down interface. The web doesn't give me this (it doesn't matter if it _could_).

As a client you have no control over this though, because you can't make people provide services using Gopher instead of HTTP.

To put it another way, for someone wanting to provide a service it's daft to choose Gopher 'because it's lightweight'. If you're providing the service, nobody can stop you providing a lightweight HTTP service, and in fact doing so is mind-numbingly easy. Just do it.

Or provide a URL link to an FTP server. Gopher was basically just a nicer to use menu based alternative to ftp, but visual browser ftp clients are pretty much that now anyway.

Having said all that, it is kind of cool that Gopher is still around. I was using it years before HTTP even existed. If Gopher is going to exist then yes, being lightweight is one of its strengths, but pretending that this meaningfully distinguishes it from HTTP doesn't strike me as a very strong argument. I suspect a lot of its attraction is techie cultural exclusivity and contrariness. Not that I have anything against those things whatsoever.

> The web doesn't give me this (it doesn't matter if it _could_).

If it were important, the web would cater to it. It's simple economics.

We could create protocols for all the sub needs and desires that people may have but that would be restrictive.

The market can stay irrational longer than you can stay sane on autoplaying video ad-covered sites.

Popularity =/= Economic Viability =/= Importance.

The market is NOT a mechanism for delivering what is important.

Gresham's law, market for lemons.

It stops designers and developers from being "clever". The user gets pages that are highly accessible, straight to the point, and free from javascript malware.

Not at all; you can do lots of "clever" things in plain text, like ASCII art or aligning table columns with whitespace. This is much less accessible and adaptive compared to semantic HTML.

No protocol will prevent people from doing fun and clever things, or from producing content you think is useless.

The big thing, as best I can tell, is that gopher is built around exposing a file hierarchy and letting the user browse it, rather than masking it behind URLs and links inside the documents themselves.

Gopher as a protocol can serve up any odd file format, HTML included, if one so wants.
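As an aside, the tab-separated menu format that makes this possible is simple enough to parse in a few lines. A sketch in Python (the host, selector, and display values are invented for illustration; the `h` item type for HTML is a common extension, not part of RFC 1436):

```python
from typing import NamedTuple

class MenuItem(NamedTuple):
    item_type: str  # one char: 0=text, 1=menu, 9=binary (RFC 1436); h=HTML (extension)
    display: str    # what the client shows the user
    selector: str   # opaque string sent back to the server to fetch the item
    host: str
    port: int

def parse_menu_line(line: str) -> MenuItem:
    """Parse one menu line: Xdisplay<TAB>selector<TAB>host<TAB>port."""
    display, selector, host, port = line[1:].split("\t")
    return MenuItem(line[0], display, selector, host, int(port))

item = parse_menu_line("hMy blog\tURL:http://example.org/\texample.org\t70")
print(item.item_type, item.display)  # h My blog
```

Note how the structure (type, label, location) is explicit in the menu itself, which is exactly what lets the front-end present it however it pleases.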

Gopher comes without the multimedia annoyances of the over-engineered Web and it "lacks" people whose business model is to be on it. I like to browse Gopher every now and then. Several people I know even export read-only versions of their blog to Gopher.

I planned to write a new Gopher app for Android a while ago but I haven't finished it yet. This article reminds me to try again. Thanks.

I "ported" my blog to gopher as well (gopher://gopher.conman.org/) but it's not a clean port. The gopher client I use (an extension to Firefox) is not UTF-8 aware and thus there are character set issues. Also, there's some loss from the conversion of HTML to plain text (compare gopher://gopher.conman.org/0Phlog:2017/02/27.1 with http://boston.conman.org/2017/02/27.1).

But it wasn't that difficult to do. And yes, I do see a few people hitting the gopher site.

I just tested your blog with OverbiteFF (I guess that's the client you're using), and I had to manually switch the encoding to Unicode (View → Text Encoding). I see there's an "item type" field in the protocol; it's set to 0 (i.e. "plain text file") for your blog entries. The protocol doesn't seem to offer anything to specify the encoding to the client, so Overbite must default to something like plain ASCII.
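Since the protocol carries no charset field, a client has to guess. One reasonable heuristic (a sketch of an assumed approach, not what OverbiteFF actually does) is to try UTF-8 first and fall back to Latin-1, which accepts any byte sequence:

```python
def guess_decode(raw: bytes) -> str:
    """Decode a Gopher payload with no declared charset.
    Try UTF-8 first; fall back to Latin-1, which never raises."""
    try:
        return raw.decode("utf-8")
    except UnicodeDecodeError:
        return raw.decode("latin-1")

print(guess_decode("café".encode("utf-8")))  # café
print(guess_decode(b"caf\xe9"))              # café (Latin-1 bytes)
```

A client doing this would have shown the UTF-8 blog entries correctly without a manual encoding switch.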

Is there an existing gopher client for Android? I couldn't find one on the Play store, but that could be because of search problems.

Yes, the Floodgap guy has his own Android client available on his website. Sadly, it follows the design conventions from Android 1.5 or so.


Oh wow, that's a blast from the past. The author also wrote the bucktooth gopher server, and I implemented a new (semi-OO Perl) version of it that supported Gopher+.

There was always something very satisfying about Gopher, how it made explicit the structure and categories of the data, leaving the front-end to present it how it pleased.

I would also note that others may find the name Cameron Kaiser familiar because he's the maintainer of the TenFourFox (Firefox for PPC OSX [1] ) and Classilla (Mozilla for OS9 [2]) projects.

[1]: http://www.floodgap.com/software/tenfourfox/ [2]: http://www.floodgap.com/software/classilla/

I had no idea what Gopher was and looked it up to understand; this article was helpful: https://arstechnica.com/tech-policy/2009/11/the-web-may-have...

Gopher is way more friendly for the visually impaired than the modern web.

That claim needs some evidence. Is there a good example? All the sites I can find are either toys or archives of university sites from the 1990s.

From those sites:

* There is a lot of completely inaccessible ASCII art.

* Everything is shown in a plain monospaced font, including tabular data. Emphasis can only be done using asterisks, capital letters or similar.

* Images can't be embedded with context, only linked to -- at the end of the document.

* Links can't be shown in context, only at the end of a document.

* Only ASCII text works.

If by friendly you mean ‘irrelevant’. There’s plenty of easy things that developers can do to their sites to make them more accessible that don’t require rewriting them as a gopher site.

Yes. There are things that developers could do to make most web sites better for the visually impaired. But that doesn't mean that most web developers use, or even recognize, these best practices.

Gopher, on the other hand, supports only textual navigation that is easy to feed to a text-to-speech engine. The developer doesn't even need to think about these issues.

Right, but text-to-speech is only half of the problem. Gopher by default just uses plain text with no semantics. Without marking up those documents there’s no way to let users jump around within them. Do you honestly expect a visually impaired user to sit and wait while each document they go to is read out to them in its entirety with no heading structure or ability to navigate within the content?

I think that argument worked well for the original web. But the HTML spec has accumulated a lot of things to make design easier, and they also make it easier to represent a page without useful semantic structure.

You mentioned tables, but there were many years where people were using tables in HTML for formatting instead of just expressing tabular data. So div tags were added. Now people sometimes use divs to represent tabular data.

Search is a pretty good way to navigate content, and is simple for raw text. An index is an even better way, and is supported quite well by gopher.

> people were using tables in HTML for formatting instead of just expressing tabular data

Yep, and there are algorithms that try to distinguish between ‘data’ tables versus ‘layout’ tables for precisely that reason[1][2]. It’s unfortunate that they have to exist, but the web has routed around past mistakes.

> Now people sometimes use divs to represent tabular data.

Yes, and that’s wrong and a simple fix. But regardless of whether some ill-informed people do that it’s still no worse than Gopher, and when they get it right it’s a lot better.

Regarding search, that’s great if you have exact keywords for what you’re looking for. Less good if you just want e.g. a list of headers on the page so that you can narrow in on the part that interests you.

[1] https://dxr.mozilla.org/mozilla-central/source/accessible/ht...

[2] https://cs.chromium.org/chromium/src/third_party/WebKit/Sour...

> Search is a pretty good way to navigate content, and is simple for raw text.

Gamefaqs hosts plain-text documents. Some of them are quite long and well-structured. Since there's no intra-document linking like with HTML, newer documents tend to have hashtags in the TOC to make Ctrl-F searching easier.

I'd much rather have intra-document links I can click on. That way, as a document author, I can direct readers to the dedicated "All The Monsters You'll Ever Fight" section from anywhere in the document without adding the string "#monsters" all throughout the document, generating false positives for the Ctrl-F-using reader.

I think you misunderstood his/her comment. The flexibility that the modern web makes available makes accessibility tools:

  1. Harder to implement
  2. More error-prone.
Implementing accessibility software for gopher would be a breeze in comparison.

I myself am quite an old grumpy fart, but I would prefer to browse most web sites as gopher sites. I stumble on way too many sites that would have been more usable to everyone had they just skipped those extra 800kb and kept their page below 30kb.

Let's take a trivial example: https://en.wikipedia.org/wiki/List_of_open-source_mobile_pho...

Let's say we are both on an unreliable wifi connection (perhaps on a train) and want to find the most recently-released entry in the table. Then we want to sort the data by each of the other columns of the table.

I click a heading and navigate to the bottom row of the table and see 2016. Done. Then, whenever I am ready I click each of the other four headings to sort the data again. This exhausts the effort I need to expend in order to get sorted views of the same data. And no network access was required to do this (which is nice for non-trivial table sizes).

With Gopher, what user actions are needed to get the same views of the data? How much effort do these user actions require?

Edit: also, are those user actions discoverable? At least visually Wikipedia's table gives me two arrows and a cursor icon change to make it apparent I can click to change the view of the data. As far as accessibility, there is a "title" attribute with the value "Sort ascending" that I presume would get read by a screen-reader.

In the age of Facebook, the question should be: is the web still relevant?

I'm sure people were asking that when web portals like go.com were becoming dominant, and they'll be asking it long after facebook is dead.

On that note, I seem to recall reading that Fidonet is still being maintained.

Gopher doesn't support encryption. There is no standard for gopher over tls. This makes gopher irrelevant for encrypted communications. But what makes it great: gopher is simple. You can use curl or even netcat for browsing gopher sites.
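That simplicity is easy to show in code: a complete Gopher request is just the selector followed by CRLF on a TCP connection to port 70, and the response is everything until the server closes the connection. A minimal client sketch in Python (the selector below is just an example; the fetch helper is not run against a live server here):

```python
import socket

def build_request(selector: str = "") -> bytes:
    """A full Gopher request is just the selector terminated by CRLF."""
    return selector.encode("latin-1") + b"\r\n"

def gopher_fetch(host: str, selector: str = "", port: int = 70) -> bytes:
    """Connect to port 70, send the selector, read the reply to EOF."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(build_request(selector))
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks)

# An empty selector requests the server's root menu, e.g.
# gopher_fetch("gopher.floodgap.com").
print(build_request("/0/JOB_OPENINGS"))  # b'/0/JOB_OPENINGS\r\n'
```

That is the entire wire protocol for a basic fetch, which is why netcat works as a browser.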

I used Gopher in 1993 at university, but I don't see it as relevant today. It's history.

Your experience with it is not very recent, then, so it's maybe best to listen to others. :)


This made me laugh: gopher://gopher.frog.tips/0/JOB_OPENINGS

Browsers won't process that request, so for the uninitiated like me:

$ curl gopher://gopher.frog.tips/0/JOB_OPENINGS

* My first ever gopher interaction.

I'm using Lynx, which understood that request just fine.

It's not.

The author didn't say "popular", he said "relevant".

The current state of software development has huge issues with needless complexity resulting from poorly chosen abstractions.

Gopher and IRC make it possible to use any client of your choosing, thus keeping innovation and improvements accessible to anyone with an idea.

> Gopher and IRC make it possible to use any client of your choosing, thus keeping innovation and improvements accessible to anyone with an idea.

I agree with the former and disagree with the latter---they are not correlated. In fact, IRC has demonstrated that it might be hard to adapt to new requirements without breaking many things (whether it's an existing client, a community or the requirement itself). Today the relevance of IRC is mostly due to the existing community and not due to technical innovation, and casual users wouldn't benefit from that. I don't know much about Gopher, but I haven't seen any counterevidence to this observation.

The new(ish) IRCv3 standard does not break compatibility with existing clients though. If they don't support all of its features, they just have fewer things displayed.

I know IRCv3 is in progress, and I wish it success, but frankly speaking this should have been done 10 years ago.

Some of the more recent IRC clients already support most of its new features, including v3.1.

I don't get why this is getting down voted.

I've heard of gopher before but never used it. I have two browsers (Firefox and Chrome) and gopher links failed in both. So I checked Wikipedia:


And it is not supported in any of the 4 most commonly used browsers (except that there is an add-on for Firefox; you can write an add-on for anything anyway). (Edit: 5 actually, I forgot about Safari.)

So yeah, Gopher is not relevant now.

Relevancy is a measure of subjective scope. Things that are relevant to some might be irrelevant to others. Gopher might not be relevant to society as a whole, but it is to a subset of hackers (hi!). Parent's comment is effortless snark, so it deserves any downvotes it acquires.

Something relevant to a negligible minority is usually described as not relevant. Relevancy is not totally subjective, as we don't live alone in basements but in a society. Something relevant to most of society is relevant. Other stuff is called niche, or irrelevant, because it has no effect on the life of society.

Nuclear physics interests only a handful of people, yet it is relevant, as society would have problems sustaining its operations without nuclear power. Gopher interests even fewer people, and it is irrelevant, as society would go on just as well without it.

Low processing-power needs? A $10 Raspberry Pi packs more power than the notebook I saved up for 10 years ago.

A link to all known gopher servers? It is, in the gopher protocol, which I cannot check as none of my browsers support it. Is this what your definition of relevancy is? Because it is not mine.

A webpage written in Nepali is infinitely more relevant to most users, as they can access it and use everyday tools (like Google Translate) to get an idea of the page.

Gopher is about as relevant as the International Toothpick Collector Association's Annual Report on the Proceedings of Toothpick Size Irregularities.

I don't think I've ever read anything so passionate about Gopher.
