
The Rise and Fall of the Gopher Protocol - lindner
https://www.minnpost.com/business/2016/08/rise-and-fall-gopher-protocol
======
dang
This was posted yesterday but looks so good, and spent so little time on the
front page, that we've put it into the second-chance pool (described at
[https://news.ycombinator.com/item?id=11662380](https://news.ycombinator.com/item?id=11662380)
and the links back from there). We wouldn't normally do that for a day-old
post with 28 comments, but I think the HN software penalized this submission
by mistake, which was bad. Plus also, it's the gopher protocol.

~~~
sytse
Good call, I would have otherwise missed this awesome article.

------
dasil003
When I got my first shell account in 1993 it was as a student of the
University of Minnesota, and I remember that the first time you logged in it
didn't dump you directly into tcsh or whatever, there was an easy shell that
let you navigate by number. I recall vividly that both Gopher and the World
Wide Web were on this list (previously I had only had experience with Usenet
and FTP). Initially it seemed like Gopher had all the things, and the WWW was
some weird novelty. Of course within a year I had seen Mosaic running, and
building web pages became a career-long obsession for me. It's hard for me to
gauge the politics (I only turned 15 in 1993), but at least from where I sat,
the multimedia document with embedded hyperlinks was the secret sauce. It was
the same thing that had gotten me obsessed with HyperCard back in the day. The
web just felt like this creative playground with unlimited untapped potential,
and Gopher felt sort of bureaucratic (again, maybe just from where I sat at
the U of M). Ironic that now Gopher is maintained as a labor of love by
hobbyists.

------
andrewstuart
You know how Paul Graham says "build something people want"? Well, somehow I
headed in exactly the opposite direction and designed some strange, nondescript
beast that mashed up linked data (remember that concept? No? Doesn't matter,
you missed nothing) with JavaScript and the web into some sort of Gopher-ish
linked data browser.

Basically, with 15 lines of JavaScript you could navigate any HTTP-based open
data API.

I named it NeoGopher (New Gopher)
[https://www.youtube.com/watch?v=yuSDU0JiI2c](https://www.youtube.com/watch?v=yuSDU0JiI2c)
The madness starts at 1:30 (turn the sound down; the words serve only to
confuse).

Essentially it let you browse through linked lists. Each list item could link
on to another list or to a web page. Each list was created by about 15 lines
of JavaScript which could be loaded from any web server and the data came from
any HTTP web API.
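The actual NeoGopher was those few lines of JavaScript talking to HTTP APIs; purely as an illustration of the linked-list idea (all names, menus, and URLs below are invented, not taken from the demo), a sketch:

```python
# Hypothetical sketch of a "linked list browser": each menu is a list of
# items, and each item links either to another list or to a web page,
# much like a Gopher menu. All data here is made up for illustration.
MENUS = {
    "root": [
        ("Projects", "list", "projects"),
        ("Home page", "url", "https://example.com/"),
    ],
    "projects": [
        ("NeoGopher demo", "url", "https://example.com/neogopher"),
        ("Back", "list", "root"),
    ],
}

def render(menu_id):
    """Render one menu as numbered lines, gopher-style; a trailing '/'
    marks an item that leads to another list."""
    lines = []
    for i, (label, kind, target) in enumerate(MENUS[menu_id], 1):
        suffix = "/" if kind == "list" else f" -> {target}"
        lines.append(f"{i}. {label}{suffix}")
    return "\n".join(lines)
```

In the real thing each list was fetched from a web API rather than hardcoded, but the navigation model is the same.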

Ahhh... so many meaningless words in the demo. Even I, who designed it, found
it hard to explain what it was. Today I would say "It's a linked list
browser", which is what Gopher was.

Of all the stupid ideas I have had (there have been many), re-inventing Gopher
was probably the worst. What was I thinking?

btw this is an edited repost of a comment I made a while back on the Gopher
topic.

~~~
vmorgulis
Very interesting ideas.

It reminds me of Xanadu (without the entanglement):

[http://xanadu.com/Ping'n'Jeff.png](http://xanadu.com/Ping'n'Jeff.png)

My idea for Gopher is to convert webpages on the fly with a text browser like
elinks.

------
protomyth
"Eventually, though, the U did want some money — for itself. At GopherCon ’93,
Yen announced that for-profit Gopher users would need to pay the U a licensing
fee: hundreds or thousands of dollars, depending on the size and nature of
their business."

I wonder how many futures have been destroyed by the desire to profit off
eating the seeds instead of the fruit.

~~~
x2398dh1
The University of Minnesota office of commercialization of technology does
this habitually, asking for far too much money from startup entrepreneurs.
Many of the academics have no idea how business works, and don't care. 90% of
their success has come from one patent. So from the standpoint of this one
institution, you could actually answer that question by auditing their past
performance.

On the other hand, it is a publicly funded institution, funded by taxes and
student fees. So one also has to ask whether it would be fair for a small
few, having little to do with the University, to profit off of investments
made with public dollars and overly high student debt.

~~~
lindner
This is so true. You have to remember that finances were really tight at this
time. The University budget was getting cut left and right throughout the
history of Gopher's evolution. At one point there were plans to outsource
everyone to the Minnesota Supercomputer Institute.

Of course in hindsight obtaining grants or forming a partnership with a non-
profit org or an academic department might have been a better choice,
especially for all the professional services requests.

edit: Also you have to remember that computing was a LOT more expensive then.
I have old quotes for SparcStations and RS/6000s that were in the $20-40k
range, even with an educational discount. The Mac IIcis were not cheap
either: ~$5k when loaded up with RAM.

~~~
protomyth
> The University budget was getting cut left and right throughout the history
> of Gopher's evolution.

Was it actually getting cut or were they not getting the increase they wanted?
I lived in MN off and on since the 90's and it seems like they call a "cut"
every time they don't get the increase they want.

> edit: Also you have to remember that computing was a LOT more expensive
> then. I have old quotes for SparcStations and RS/6000s that were in the
> $20-40k range, even with an educational discount. The Mac IIcis were not
> cheap either: ~$5k when loaded up with RAM.

Looking back, when people see the price of the NeXT cube and freak out, they
forget how much a Mac IIfx was. It's amazing that the era had basically
expensive computers at the high end and machines like the Sinclairs and
Commodores at the very low end.

~~~
lindner
Regarding cuts, yes. We had to make do with less year over year. From Hasselmo's
1991 State of the U address:

> """We lost at least $25 million to inflation, and $16 million through a base
> cut this year. In addition to a potential $25 million loss to inflation next
> year again, the Governor's vetoes of IT and systemwide special
> appropriations cut another $23 million in funding -- for which we are
> aggressively seeking full restoration."""

The mainframe teams had a harder time of things. For Microcomputers we were
lucky - our hardware costs decreased and we had a deal with the University
Bookstore to support their computer hardware sales.

That stuff was still expensive. Here's 1994 educational pricing for a
workstation, even with a substantial discount.

    
                                            list       discount
      IBM model 25T                         $8495      $5400.00
             80MHz upgrade                  $1500      $ 953.50
             64MB upgrade                   $          $2912.00
             2GB disk upgrade               $          $1463.00
                                                       --------
                                                      $10728.50

------
dmd
> “I still remember a woman in pumps jumping up and down and shouting, ‘You
> can’t do that!’ ”

> Among the team's offenses: Gopher didn’t use a mainframe computer and its
> server-client setup empowered anyone with a PC, not a central authority.

File this under "straw-man responses that happened in their feverish anti-
establishment imaginations."

~~~
lindner
Not sure what dmd is implying here. But I will say that the committee was
looking at using either X.500 or CSO protocols instead. Thus it would rely on
a centralized authority and a publishing model that would not allow for
individuals to run their own servers.

Also remember that this was the old internet where we didn't have NAT or
really any firewalling. Most of the University of Minnesota's class B was open
to the world at the time. In fact I used to send email directly to my
workstation instead of the central mail host to avoid delays.

------
pronoiac
A contemporary article about Hyper-G:
[http://much.iicm.edu/projects/hyper-g/9.htm/](http://much.iicm.edu/projects/hyper-g/9.htm/)

My impression was that they tried to monetize Hyper-G immediately, with the
negative results mentioned in the original article. Having an open source
server may have helped avoid that for both; I think Apache was a powerful
force for the web.

------
vmorgulis
Gopher is still alive :)

[http://gopher.floodgap.com/gopher/gw](http://gopher.floodgap.com/gopher/gw)

Veronica-2 is the gopherspace search engine:

[http://gopher.floodgap.com/gopher/gw?ss=gopher%3A%2F%2Fgophe...](http://gopher.floodgap.com/gopher/gw?ss=gopher%3A%2F%2Fgopher.floodgap.com%3A70%2F7%2Fv2%2Fvs&sq=McCahill)

~~~
spc476
And there are even blogs (known as phlogs) in gopher. After reading one
(gopher://sdf.org/1/users/jstg/phlog) I was motivated enough to modify my
blogging engine
([https://github.com/spc476/mod_blog](https://github.com/spc476/mod_blog)) to
support gopher. The gopher protocol itself is _very_ simple so getting a
simple server up and running was rather easy: gopher://gopher.conman.org/
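The protocol really is that simple: the client opens a TCP connection, sends a selector line, and the server replies with tab-separated menu lines ending in a lone dot (RFC 1436). As a rough illustration (not spc476's actual server; the menu contents, host, and port below are made up, and the canonical Gopher port is 70), a toy server in Python:

```python
import socket
import threading

# Placeholder address; port 70 is canonical but usually needs root to bind.
HOST, PORT = "127.0.0.1", 7070

# A menu is lines of itemtype+display<TAB>selector<TAB>host<TAB>port,
# terminated by a lone "." line. Itemtype "i" is info text, "0" a text file.
MENU = (
    "iWelcome to a toy gopherspace\tfake\t(NULL)\t0\r\n"
    "0About this server\t/about.txt\t127.0.0.1\t7070\r\n"
    ".\r\n"
)
PAGES = {"/about.txt": "A one-file demo gopherspace.\r\n.\r\n"}
ERROR = "3Not found\terror\t(NULL)\t0\r\n.\r\n"

def handle(conn):
    """Serve one request: read a selector line, write the reply, close."""
    with conn:
        # The client sends a selector terminated by CRLF; empty = root menu.
        selector = conn.recv(1024).decode("ascii", "replace").strip()
        reply = MENU if selector == "" else PAGES.get(selector, ERROR)
        conn.sendall(reply.encode("ascii"))

def serve():
    """Accept connections forever, one thread per request."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind((HOST, PORT))
        s.listen()
        while True:
            conn, _ = s.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()
```

There is no request metadata, no headers, no content types beyond the one-character item type: that is the whole protocol.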

~~~
zafiro17
The article says there are just over a hundred gopher sites in existence. Yay
- mine is one of them! I've got 200 or so articles on my website and one day I
decided to make them all available over gopher as well. It's easy to do:
install pygopherd on your machine for the server, run your HTML files
through an emacs macro that strips the HTML and outputs 80-column text, and
you're in business!
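The converter here was an emacs macro; as a rough Python equivalent of the same idea (not the commenter's actual tooling), strip the tags and re-wrap to 80 columns:

```python
import textwrap
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect text content, skipping <script> and <style> bodies."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self.skipping = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skipping += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skipping:
            self.skipping -= 1

    def handle_data(self, data):
        if not self.skipping:
            self.parts.append(data)

def html_to_gopher_text(html, width=80):
    """Strip tags and re-wrap to `width` columns. Crude: paragraph
    structure collapses into one flowed block, unlike a real converter."""
    p = TextExtractor()
    p.feed(html)
    words = " ".join(p.parts).split()
    return textwrap.fill(" ".join(words), width=width)
```

A gopher item of type 0 is just plain text, so the wrapped output can be served as-is.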

It feels whimsical to have a gopher site up and running, but it costs me
nothing and earns me nothing. It's there because I like it and because I feel
happy knowing it's there. In a way, that's the sentiment behind the Internet
in its earliest days. And that's what I like about it - the philosophical
purity of doing something for its own sake. And now, for the simple pleasure
of posting a link with that old protocol:

gopher://therandymon.com

How awesome does that look? :)

------
wolfgangh
I miss Gopher. I remember when the Gopher sites started to display: "This
service has been discontinued. Please use WWW instead". I was completely
disappointed.

~~~
x2398dh1
Actually there are still a number of Gopher servers in operation, and you can
use Firefox to visit them.

~~~
vmorgulis
"All the gopher servers (that we know of)"

[http://gopher.floodgap.com/gopher/gw?gopher://gopher.floodga...](http://gopher.floodgap.com/gopher/gw?gopher://gopher.floodgap.com:70/1/world)

SDF still provides a gopherspace inside its free shell:

[http://sdf.org/?tutorials/gopher](http://sdf.org/?tutorials/gopher)

------
angry_octet
I remember discovering gopher through the wonderful TurboGopher program. All
the fiddling around with FTP was revealed to be pointless and dumb. For a
brief period, really until Altavista came along to make Mosaic useful, Gopher
was so cool. But at school we all had our public_html directory being
published to the world, and you could nag the sysadmin to install CGI scripts,
whereas gopher was an institutional thing and focused on files.

Tangent: NNTP is still better than any web forum, Facebook, or Twitter. Too
bad the only apparent way to solve the spam problem is centralised identity
platforms.

------
aswanson
_The most popular protocol, or method of retrieving information from another
computer, was FTP (file transfer protocol), the primitive, labor-intensive
equivalent of knocking on someone’s door and asking if you could carry away
his piano._ lol, I wonder what today's computing equivalent of FTP is.

~~~
digi_owl
Still FTP.

~~~
regularfry
Sftp, hopefully.

~~~
wtbob
> Sftp, hopefully.

Ha! I've had precisely one vendor ask for an SSH public key to set up file
transfer.

The world runs on FTP. It's sad. Even sadder, it's often a Windows FTP server!

~~~
digi_owl
If there were a sane way to set up an anonymous SFTP server (never mind having
web browsers actually understand sftp:// links), things might change.

------
nemo44x
Does anyone have a list of modern alternative (experimental even) protocols
out there?

~~~
lindner
The recent Decentralized Web Summit touched on a number of alternatives:

[http://www.decentralizedweb.net/](http://www.decentralizedweb.net/)

Interplanetary File System (IPFS) was discussed quite a bit. NameCoin was the
most popular DNS replacement. ZeroNet was a really interesting project that
does all of the above.

[http://ipfs.io/](http://ipfs.io/)
[http://namecoin.info/](http://namecoin.info/)
[http://zeronet.io/](http://zeronet.io/)

~~~
nickpsecurity
I also collect old models, centralized or not, in case they're useful in
modern situations (especially intranets) where reliability or security matter.
What do you think of Tanenbaum et al.'s Globe model as a WWW alternative?

[https://cds.cern.ch/record/400321/files/p117.pdf](https://cds.cern.ch/record/400321/files/p117.pdf)

I thought it was an interesting design that provided a nice way of reducing
abstraction gaps and rework in the various layers/techs. They integrated it
with their Amoeba OS, which ran on a cluster of workstations with a single
system image.

~~~
lindner
Wasn't familiar with this so I skimmed the IEEE paper:

[http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=7...](http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=749137)

Interesting that it references the Legion OS work that Greg Lindahl spoke
about at the Summit:

[http://legion.virginia.edu/](http://legion.virginia.edu/)

At first glance Globe feels like a low-level solution for the problem: mapping
objects to binaries via a broker and all that.

The web we have today solved many of these issues in other ways. Anycast DNS
and CDNs allow for content distribution. Storing state in cookies allows for
operational transforms and other eventually consistent techniques.

Thanks for the reference and connecting a few dots!

------
runnr_az
Interesting! I guess I knew the basic outlines of that story, having been an
internet user in 1993... but it's a neat story. Good work, guys!

------
pronoiac
In February, Metafilter brought back their Gopher server, "after fifteen years
of downtime:" [https://metatalk.metafilter.com/24019/Direct-your-gopher-
cli...](https://metatalk.metafilter.com/24019/Direct-your-gopher-client-to-
gopher-gophermetafiltercom)

------
zeveb
> To the curious who stayed behind, Berners-Lee explained that the Web could
> be used to connect all the information on the internet through hyperlinks.
> You could click on a word or a phrase in a document and immediately retrieve
> a related document, click again on a phrase in that document, and so on. It
> acted like a web laid over the internet, so you could spider from one source
> of information to another on nearly invisible threads.

One of the saddest things about the modern Internet is how little hypertext is
used. As an example, I would have expected, 'Mark McCahill' to link to his
personal or academic site[0], 'San Diego' to link to the city government's
site[1], 'Hyatt Islandia' to link to that hotel's site [2] (perhaps with a
note that its name has changed), and 'Mission Bay' to link to an appropriate
page [3] — and that's in the first paragraph alone.

It's also interesting how those slides from 1992 look current as of 2016:
flat, bullet-pointed, sans-serif font.

> But the internet was not yet open for business. It had been built on dot-mil
> and dot-edu, on public funds. Programmers shared source code; if you needed
> something, someone gave it to you. A dot-com address was considered crass.
> It was “as though all of TV was PBS,” Lindner says. “No commercials.”

Those really were the days. I remember the Canter & Siegel spam, and how
appalled we all were to see it. The Internet then was more genteel. Honestly,
I wish it were still non-commercial: available to be used by companies (as it
was even in the 90s), but not as an advertising medium.

> At GopherCon ’93, Yen announced that for-profit Gopher users would need to
> pay the U a licensing fee: hundreds or thousands of dollars, depending on
> the size and nature of their business. Many users felt betrayed. In the
> open-source computing spirit of the day, they had contributed code to
> Gopher, helping the team keep up with the times. Now they were being asked
> to pony up.

As much as it's become fashionable to group-hate esr the last decade or so, I
think that it's important to recognise his major contribution: persuading
folks that free software (under a different name) can be good for business.
Imagine if the university had kept Gopher as GPLed software, rather than
demanding money from its users.

> At its peak, the Mother Gopher consisted of 10 Apple IIci computers. But
> when it was finally euthanized, who knows what shape it was in. There was no
> ceremony. Nothing was carted off to a museum. Gopherspace simply became
> emptier, and the world without the Web became harder to imagine.

That's definitely sad. We need to do a better job of honouring our history and
marking important events. Even in the late 90s or early 2000s, folks ought to
have recognised the historic import of gopher.

[0]
[https://fds.duke.edu/db/aas/ISIS/faculty/mark.mccahill](https://fds.duke.edu/db/aas/ISIS/faculty/mark.mccahill)

[1] [https://www.sandiego.gov/](https://www.sandiego.gov/)

[2]
[http://missionbay.regency.hyatt.com/en/hotel/home.html](http://missionbay.regency.hyatt.com/en/hotel/home.html)

[3] [https://www.sandiego.gov/park-and-
recreation/parks/regional/...](https://www.sandiego.gov/park-and-
recreation/parks/regional/missionbay)

~~~
aaron695
> One of the saddest things about the modern Internet is how little hypertext
> is used.

Right-click, search Google (or however FF or an extension does it), without
the hard-to-read hyperlinking standard (hard to read because it's for
important stuff; you should stop there and think).

Many Wikipedia articles are a joke in this regard, hyperlinking irrelevant
points.

A hyperlink should reinforce the article, not detract from it in a random
manner.

We read linearly, not in a hash.

~~~
zeveb
> Right click, search google (Or however FF does it/extension)

Which of course relies on a third-party, proprietary service.

> A hyperlink should reinforce the article, not detract from it in a random
> manner.

Back in the old days, being able to jump around from article to article was
considered a virtue. It made the Web like a piece of interactive fiction, an
adventure. It was awesome.

~~~
pronoiac
I miss Suck, which felt like it pioneered using links for rhetoric rather
than exposition:

> Cavanaugh pointed out that one particular lasting legacy of Suck's is the
> idea of using a link as a rhetorical effect. "People still used italics to
> make a point in a sentence back then," he said, explaining that the site was
> one of the first to use a link to let readers know what it was writers were
> discussing, or to point to a joke. "That was what knocked my socks off about
> Suck right away, was the idea that oh, the link is this funny thing."[1]

I enjoyed it when the link was like a footnote at odds with the text.

[1] [https://www.engadget.com/2015/09/16/suck-dot-com-20th-
annive...](https://www.engadget.com/2015/09/16/suck-dot-com-20th-anniversary/)

------
lindner
Lots of photos and quotes from the original Gopher team in this long-form
article.

