The web has effectively killed innovation in operating systems. A clean and consistent system will be out of reach as long as this web app nonsense continues.
The lesson, rather, seems to be: don't build an operating system that can't muster enough infrastructure to run a browser. That's maybe less forbidding than, say, being forced to build an operating system that can support an entire Windows subsystem.
I have trouble understanding what it is that people want when they say they want "innovation" in operating systems. It seems like almost anything that was practical to do in the period when I worked with Plan 9 is now eminently practical, and far more performant, in user space; so why not go build it on top of a minimal Linux kernel and save yourself the trouble of building a gazillion drivers, boot loaders, etc.?
Because some of us want content to be centralized on our PCs. We want to own our data, instead of giving it away and then renting back access to it. Related, I consider the push towards abstracting away the file system to be a very bad thing.
There is still innovation happening in operating systems, it just doesn't reach mass adoption because most of it doesn't provide real-life benefits to the mass market. Some of the more practical aspects trickle down to mainstream OSes.
We're past the Wild West stage of operating systems. You win some, you lose some.
This is the paper Isaac Asimov researched as an undergrad; it was the basis of the first sci-fi story he sold.
There's a lot of material on this topic.
The real computation happens on a faraway pile of hardware, with the local computer just responsible for drawing the updates on screen based on the results coming over the wire.
This page talks about what it takes to port webkit to another platform: http://blog.cranksoftware.com/webkit-porting-tips-the-good-t...
The GP post mentions Plan 9, which is a great example here. There are real people who use Plan 9 every day (I have met some of them!), but WebKit will never be portable to that OS because there is no C++ compiler. This doesn't make the OS non-functional, but it does mean step one of porting a modern browser is "port GCC or LLVM," which is a massive task by itself.
Both Firefox and Chrome source code repositories are in the hundreds of megabytes, even without the required host OS dependencies vendored in. The document you linked to recommends leveraging freetype, cairo, and all kinds of other things that would require serious graphics infrastructure of a sort that's just not feasible.
Modern web browsers are keystones that cap off arches built of hundreds of pieces of very specific software. As time goes by, they require more infrastructure, not less -- cf. Firefox, where to build the latest version you must also have a Rust compiler that targets your operating system. As the checklist of requirements grows ever longer, the list of candidate operating systems capable of meeting those requirements grows ever shorter.
Five years from now, I predict the only three operating systems capable of interacting with the web in any meaningful sense will be Linux, macOS, and Windows; the BSDs, Haiku, et al. will have been left by the wayside.
Ten years from now I predict there will be four operating systems, and one of them will be Fuchsia, because only a megacorp like Google has the resources to accomplish that.
That is an interesting prediction. What do you think will cause the BSDs to be left by the wayside in such a short timeframe, when they are more than capable of running the latest browsers right now?
Is the browser becoming a systemd module?
I am a little concerned about the trend to push "hardware acceleration" into everything. Graphics cards and their drivers are the biggest source of grief in PCs for me.
This lets you take advantage of whatever benefits your desired system provides you without a change of user experience.
It may be easier to have everyone use the same software and hardware platform everywhere. After all, we do want interoperability and we do want everyone to play well with others. But standardization and portability are also a push to stop people from trying new things. The push to standardize everything is an unfortunate consequence of not having a separation of concerns, such as between content and UI.
Isn't that exactly what iOS and Android are, though? I mean, most of the criticisms of iOS are against Apple's obsessive curation of its simplicity and usability, and it 'came to power' smack in the middle of the internet age. People are always complaining that native apps for internet services are a threat to the web. It seems to be exactly the thing you're saying the web makes impossible.
Apple even released a web framework for making sites that looked and behaved as if they were native to the iPhone.
Only after the jailbreakers enabled third parties to fool around with the guts did we see apps happen. And initially Jobs wanted to sue those app devs and distributors into oblivion.
Yes, Jobs did a song-and-dance about web apps, but not because Apple didn't want third-party apps; rather, Apple wasn't ready to support developers on what was a new platform. Apple plays it safe.
For one, it's not consistent with the timeline; for another, as the jailbreakers discovered, there was quite a bit of infrastructure to support apps even in iPhone OS 1.0.
* iPhone release date: June 29, 2007
* iOS SDK Announced: October 17, 2007
* iOS SDK Released: March 6, 2008
* iPhone OS 2.0 release date: July 11, 2008
Apple had such a tough time providing enough support to developers that they restricted access to the paid developer program to just 4,000 (Apple said 25,000 tried to join).
There wasn't enough time for everything to happen (including relatively sophisticated tools like the iPhone Simulator) if it was just a reaction to jailbreaking.
The non-purists (which is most people) have decided that availability trumps everything.
I was more of a purist, myself, but after some bad experiences with the lack of native apps, I'm more of a realist these days.
There's also a slow movement to add gopherholes to the tor network. A list of the current known ones can be found here; http://gopher.floodgap.com/gopher/gw?gopher://bitreich.org:7...
Gopher is cool as a slice of history though.
Human vigilance is fallible. Protocol-level enforcement is necessary.
They can be achieved, but they aren't. I'd love a consistent, stripped-down interface. The web doesn't give me this (it doesn't matter if it _could_).
To put it another way, for someone wanting to provide a service it's daft to choose Gopher 'because it's lightweight'. If you're providing the service, nobody can stop you providing a lightweight HTTP service and in fact doing so is mind numbingly easy. Just do it.
Or provide a URL link to an FTP server. Gopher was basically just a nicer to use menu based alternative to ftp, but visual browser ftp clients are pretty much that now anyway.
Having said all that, it is kind of cool that Gopher is still around. I was using it years before HTTP even existed. If Gopher is going to exist then yes, being lightweight is one of its strengths, but pretending that this meaningfully distinguishes it from HTTP doesn't strike me as a very strong argument. I suspect a lot of its attraction is techie cultural exclusivity and contrariness. Not that I have anything against those things whatsoever.
If it was important then the web would cater to it. It's simple economics.
We could create protocols for all the sub needs and desires that people may have but that would be restrictive.
The market is NOT a mechanism for delivering what is important.
No protocol will prevent people from doing fun and clever things, or from producing content which you think is useless.
Gopher as a protocol can serve up any odd file format, HTML included, if one so wants.
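For instance, a minimal gophermap (the menu file used by many Gopher servers; the host and paths here are made up) can point at the same content as both plain text and HTML:

```
0Notes (plain text)	/notes.txt	gopher.example.org	70
hNotes (HTML)	/notes.html	gopher.example.org	70
1More files	/files	gopher.example.org	70
```

The one-character item type tells the client how to treat the payload; the server just ships the bytes.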
I planned to write a new Gopher app for Android a while ago but I haven't finished it yet. This article reminds me to try again. Thanks.
But it wasn't that difficult to do. And yes, I do see a few people hitting the gopher site.
There was always something very satisfying about Gopher, how it made explicit the structure and categories of the data, leaving the front-end to present it how it pleased.
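That explicitness is easy to see in code. Here is a minimal sketch (hosts and selectors are made up) of parsing a Gopher menu, where each line is just a tab-separated record per RFC 1436:

```python
from dataclasses import dataclass

@dataclass
class MenuItem:
    item_type: str  # '0' text, '1' submenu, '7' search, 'h' HTML, ...
    display: str    # label shown to the user
    selector: str   # opaque string sent back to the server to fetch the item
    host: str
    port: int

def parse_menu(raw: str) -> list[MenuItem]:
    """Parse a Gopher menu: one tab-separated item per CRLF line,
    terminated by a lone '.' line."""
    items = []
    for line in raw.split("\r\n"):
        if line in (".", ""):  # a lone '.' ends the menu
            break
        display, selector, host, port = line[1:].split("\t")
        items.append(MenuItem(line[0], display, selector, host, int(port)))
    return items

menu = ("1Software\t/software\tgopher.example.org\t70\r\n"
        "0README\t/readme.txt\tgopher.example.org\t70\r\n"
        ".\r\n")
for item in parse_menu(menu):
    print(item.item_type, item.display)
```

The structure and categories live entirely in the menu; rendering is left to the client.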
From those sites:
There is a lot of completely inaccessible ASCII art.
Everything is shown in a plain monospaced font, including tabular data. Emphasis can only be done using asterisks, capital letters or similar.
Images can't be embedded with context, only linked to -- at the end of the document.
Links can't be shown in context, only at the end of a document.
Only ASCII text works.
Gopher, on the other hand, only supports textual navigation that is easy to feed to a text-to-speech engine. The developer doesn't even need to think about these issues.
You mentioned tables, but there were many years where people were using tables in HTML for formatting instead of just expressing tabular data. So div tags were added. Now people sometimes use divs to represent tabular data.
Search is a pretty good way to navigate content, and is simple for raw text. An index is an even better way, and is supported quite well by gopher.
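For a sense of how simple that is on the wire: a Gopher search (item type 7) request is just the index's selector, a tab, and the query string, terminated by CRLF. The selector below is hypothetical:

```python
def build_search_request(selector: str, query: str) -> bytes:
    # A type-7 search request is the whole client side of the exchange:
    # selector, TAB, query, CRLF. The server replies with a menu of hits.
    return f"{selector}\t{query}\r\n".encode("ascii")

req = build_search_request("/search", "plan 9")
print(req)  # b'/search\tplan 9\r\n'
```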
Yep, and there are algorithms that try to distinguish between ‘data’ tables versus ‘layout’ tables for precisely that reason. It’s unfortunate that they have to exist, but the web has routed around past mistakes.
Now people sometimes use divs to represent tabular data.
Yes, and that’s wrong, and it’s a simple fix. But regardless of whether some ill-informed people do that, it’s still no worse than Gopher, and when they get it right it’s a lot better.
Regarding search, that’s great if you have exact keywords for what you’re looking for. Less good if you just want e.g. a list of headers on the page so that you can narrow in on the part that interests you.
Gamefaqs hosts plain-text documents. Some of them are quite long and well-structured. Since there's no intra-document linking like with HTML, newer documents tend to have hashtags in the TOC to make Ctrl-F searching easier.
I'd much rather have intra-document links I can click on. That way, as a document author, I can direct readers to the dedicated "All The Monsters You'll Ever Fight" section from anywhere in the document without adding the string "#monsters" all throughout the document, generating false positives for the Ctrl-F-using reader.
1. Harder to implement.
2. More error-prone.
I myself am quite an old grumpy fart, but I would prefer to browse most web sites as gopher sites. I stumble on way too many sites that would have been more usable to everyone had they just skipped those extra 800kb and kept their page below 30kb.
Let's say we are both on an unreliable wifi connection (perhaps on a train) and want to find the most recently-released entry in the table. Then we want to sort the data by each of the other columns of the table.
I click a heading and navigate to the bottom row of the table and see 2016. Done. Then, whenever I am ready I click each of the other four headings to sort the data again. This exhausts the effort I need to expend in order to get sorted views of the same data. And no network access was required to do this (which is nice for non-trivial table sizes).
With Gopher, what user actions are needed to get the same views of the data? How much effort do these user actions require?
Edit: also, are those user actions discoverable? At least visually Wikipedia's table gives me two arrows and a cursor icon change to make it apparent I can click to change the view of the data. As far as accessibility, there is a "title" attribute with the value "Sort ascending" that I presume would get read by a screen-reader.
$ curl gopher://gopher.frog.tips/0/JOB_OPENINGS
* My first ever gopher interaction.
The current state of software development has huge issues with needless complexity resulting from poorly chosen abstractions.
Gopher and IRC make it possible to use any client of your choosing, thus keeping innovation and improvements accessible to anyone with an idea.
I agree with the former and disagree with the latter; they are not correlated. In fact, IRC has demonstrated that it can be hard to adapt to new requirements without breaking many things (whether an existing client, a community, or a requirement itself). Today the relevance of IRC is mostly due to the existing community and not due to technical innovation, and casual users wouldn't benefit from that. I don't know much about Gopher, but I haven't seen any counterevidence to this observation.
I'd heard of Gopher before but never used it. I have two browsers (Firefox and Chrome) and gopher:// links failed in both. So I checked Wikipedia:
And it is not supported by any of the 4 most commonly used browsers (though there is an add-on for Firefox; you can write an add-on for anything anyway). (Edit: 5 actually, I forgot about Safari.)
So yeah, Gopher is not relevant now.
Nuclear physics interests only a handful of people, yet it is relevant, as society would have problems sustaining its operation without nuclear power. Gopher interests even fewer people, and it is irrelevant, as society would go on just as well without it.
Low processing power needs? A $10 Raspberry Pi packs more power than the notebook I saved up for ten years ago.
A link to all known Gopher servers? It exists, but over the Gopher protocol, which I cannot check as none of my browsers support it. Is that what your definition of relevancy is? Because it is not mine.
A webpage written in Nepali is infinitely more relevant to most users, as they can access it and use everyday tools (like Google Translate) to get an idea of what's on the page.
Gopher is about as relevant as the International Toothpick Collector Association's Annual Report on the Proceedings of Toothpick Size Irregularities.