
Living without the modern browser - skilled
https://an3223.github.io/Living-without-the-modern-browser/
======
ktpsns
> JavaScript, captchas, and logins, are the main “gotchas” for text-based
> browsing, if the functionality of a site (or part of the site) relies on any
> of these then most likely it will not work.

Really, there is no reason why a form should require JavaScript. When jQuery
was hip, the JS community celebrated "graceful degradation". I frequently get
the feeling this attitude has been lost. That's really sad, because it also
excludes all the handicapped people.

Does anybody even remember good old HTTP Basic access authentication? It is
one of the most accessible ways of protecting resources: it can be consumed by
any HTTP client (!), and it is reduced to the basics.
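
As a refresher (a minimal sketch; the user, password, and URL are
placeholders): Basic auth is nothing more than an `Authorization` header
carrying `base64(user:password)`, which is exactly why any HTTP client can
speak it:

    # HTTP Basic auth is just a standard request header.
    # "user", "pass", and the URL below are placeholders.
    credentials=$(printf '%s' 'user:pass' | base64)
    echo "Authorization: Basic $credentials"
    # → Authorization: Basic dXNlcjpwYXNz
    
    # curl builds the exact same header for you:
    # curl -u user:pass https://example.com/protected
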

One of the worst things is these terrible captchas everywhere. I wonder why we
cannot come up with a standard web interface so that captchas could be
offloaded to customizable, adaptable GUIs rendered by the browsing client.
That way, clients (users) could even declare what kind of captcha they can
solve (blind people obviously cannot read images).

~~~
__ryan__
> Really, there is no reason why a form should require JavaScript.

I’m a web developer who has never done significant forms without JavaScript.

So I’m curious: how would you handle reactive form fields without client side
scripting? For example (perhaps a bad example): the user is entering data in
“rows”. The user clicks “add row” and then a new row of fields appears. The
user can also delete rows on a whim. Their changes should only be persisted
once submit is clicked.

Would the “right” way be a full page reload with the row added/deleted, and
caching all the values?

Not to mention, if fields require cross validation, is it customary to have to
submit the form to get the validation error messages to occur? On some of the
forms I’ve worked on, this would be very complex.

~~~
deanclatworthy
Split the form into pages, and use session variables to maintain state.
Regarding form fields appearing and all your fancy front end logic: none of
this would fly in user testing.

I find it mind-boggling there’s a generation of developers after me that
doesn’t know the basics of server side state. No offence intended to yourself,
and good on you for putting yourself out there to ask :)

~~~
hn_throwaway_99
As a web developer for the past 20 years, I do know how server-side state
works. What baffles me is the attitude of some that, while there is certainly
much to complain about in the state of front-end JS dev today, moving the web
back 15 years would be a good thing.

There is a reason single-page apps became a thing, so I'm not clear why you
think a page refresh for every user interaction is somehow a good thing. The
dismissive attitude that "Regarding form fields appearing and all your fancy
front end logic: none of this would fly in user testing" flies in the face of
how 90% of heavy consumer front ends out there work today.

~~~
malvosenior
Single-page apps became a thing because of lack of development resources to
build native apps or properly engineer server side solutions (as well as a
business desire to centralize and lock down the product so people have to
experience ads, pay subscriptions...). It has nothing to do with creating a
user friendly experience. 100% of the time a user would benefit from a native
app or properly engineered server-side rendered web page over a SPA.

~~~
ScottFree
I agree, but there is one problem: most people would rather use an SPA than
have to download an app and keep it up to date themselves.

I wonder if there's a market for an alternative "browser" that loads "native"
code (or as close as practically possible) and runs it in a sandbox.

~~~
eitland
> I wonder if there's a market for an alternative "browser" that loads
> "native" code (or as close as practically possible) and runs it in a
> sandbox.

Electron comes close, but many don’t like it at all. (I have somewhat mixed
experiences, I love VS Code but I’m not too fond of Slack.)

~~~
ScottFree
Electron isn't native in any sense of the word. I meant something more like
OS-specific binary executables that get loaded and executed in a sandbox.

Mike Acton once said something about having created a browser plugin his team
used that loaded and ran binary exe tools on demand, but that it was also
wildly insecure. At least I think that's what he said; my memory could be
wrong.

------
_0o6v
Recently, there seems to be this ever-growing, romantic vision of turning the
clock back to a time when the browser was simpler.

Certainly, in terms of accessibility, the JavaScript (and browser) landscape
has never been better than it is now. General awareness and tooling for
building online experiences that consider disabled users have made using the
internet infinitely better for those people, overall.

Sure, some people build shitty products that make your fan spin and ignore
disabled users. But that's because they've built a shitty product, not because
of the 'modern browser' itself.

> The web can be a resource hog, sometimes devouring CPU and memory. But it
> doesn’t need to be that way.

It also doesn't mean going back to the horse and cart!

~~~
hn_throwaway_99
Whenever I see lines like that, "the web can be a resource hog, sometimes
devouring CPU and memory", I really want to respond "no one cares".

OK, not no one. And people certainly DO care if a site is very slow, or user
interactions are bogged down and choppy. But there is this somewhat weird
(IMO) belief, very common among developers and tech-savvy users, that heavy
use of resources is a bad thing in and of itself. I certainly understand where
that belief comes from, but it's just not relevant to 98%+ of people if it
doesn't get to the point of being heavily noticeable in user interactions.

~~~
zonidjan
Heavy use of resources IS a bad thing in and of itself (if it's not necessary
for the task being done - I fully expect a game to eat up my CPU and GPU).
We're not on DOS with only one application running at a time. If all my
resources are being eaten up by bloated websites and Discord and Atom and
other crappy Electron apps, there are fewer resources left for other programs
that might actually need them.

~~~
ashleyn
I can't help but get the feeling the only reason my Google phone has an 8-core
processor is so news sites can play more ads.

------
jakecopp
I've started using Brow.sh[1] occasionally - it runs headless Firefox and
pipes the output over a terminal.

Try `ssh brow.sh` for a 5 minute demo (`ssh brow.sh -t https://maps.google.com`
is pretty cool too)

I really need to try out the experimental vim branch[2] to emulate
Vimium/Tridactyl. Firefox rendered to a shell with Vim keybindings would be so
amazing to use!

[1]: [https://www.brow.sh/](https://www.brow.sh/)

[2]: [https://github.com/browsh-org/browsh/pull/264](https://github.com/browsh-org/browsh/pull/264)

~~~
tombh
Browsh author here. I feel so guilty about not having merged that branch :(
It's been there 8 months or so. It's by far the most requested feature. I'm
just so overwhelmed with other things to do. I'd love Browsh to be my main
focus, but I've got to earn money. There's so many more things I want to do
with Browsh.

So basically I just wanted to say sorry to the people that have submitted code
and not had it merged, and sorry to the people that really want this feature,
like the handful of blind users who keep emailing me about full keyboard
support. I haven't forgotten about Browsh, I've just got to earn some money
first.

~~~
IggleSniggle
Are you willing to cede some control and let someone help you maintain it?

~~~
tombh
For sure. In fact there are already 4 people with commit permissions to the
repo.

------
leetbulb
If you use your browser for media consumption, this is a great shell addition
(add to ~/.bashrc):

    function streamer() {
      youtube-dl -o - "$1" | vlc -
    }

Then:

    $ streamer <just about any media url>

Of course, you need youtube-dl and vlc installed.

~~~
LukeShu
A while back I migrated from VLC to `mpv`. I much later learned from an HN
comment that `mpv` automatically calls `youtube-dl` if it's installed; you can
run

    $ mpv <just about any media url>

without setting up any aliases or functions or mucking with calling youtube-dl
yourself.

~~~
ScottFree
Thanks for suggesting mpv! I had never heard of it, but I'm not too happy with
VLC or smplayer on MacOS so I gave it a shot. It's fantastic. Fast, responsive
and it plays everything I've thrown at it.

~~~
shadowoflight
To anyone on MacOS who wants the efficiency of mpv with a nice GUI: IINA [1]
is a slick video player for Mac, with minimal UI, web browser plugins, and
gesture-based controls, and it's FOSS with mpv and youtube-dl on the backend.

[1]: [https://iina.io/](https://iina.io/)

~~~
ScottFree
I gave IINA a shot. It has a nicer UI, but it's not nearly as fast or
responsive as mpv. No sale.

~~~
shadowoflight
Interesting. I’ve found it to be significantly more responsive than VLC while
being much more consistent with the Mac UI, but to each their own!

------
codedokode
What I don't like is that modern browsers tend to trade memory for
performance, i.e. get slightly better numbers in benchmarks by using
significantly more memory. For example, have you heard about the image
decoding cache? A browser can waste as much as 20 or 30 megabytes storing
uncompressed bitmaps, although a better solution would be not to store them at
all. If a website uses too many images, that is not the browser's problem.

Well, wasting 100-200 MB for one tab is fine, but what if I have many tabs
open? The browser quickly uses up all available memory and starts swapping out
other programs, including the shell.

I think browsers could at least reduce memory consumption for background tabs:
they don't need to run while they are in the background. A browser could free
everything that is not necessary, including the image decoding cache, caches
that speed up DOM and CSS, canvases, compiled JS bytecode (it takes more
memory than the source code) and JIT'ted code, any network caches, and the
SQLite cache. They could also gzip text resources like CSS or JS code in
memory.

Note that this approach is better than simply unloading a page like mobile
browsers do. But if the user didn't input anything into a page, I think it
should be fine to unload it completely after some timeout (for example, 30
minutes).

The same approach could be used with iframes. Typically they are used mostly
for ads and analytics. There is no need to waste precious memory for slightly
better animation of an ad banner.

I remember that many years ago Opera could handle tens of tabs with 512 Mb of
memory and not get swapped out.

~~~
saagarjha
> I remember that many years ago Opera could handle tens of tabs with 512 Mb
> of memory and not get swapped out.

Tabs back then weren't full page SPAs, though.

------
void_nill
Most of the time I use my own browser [1], because I set JavaScript to false
there and can therefore "roughly" do without an adblocker. Unfortunately no
social media or other login services work. For YouTube, Twitch and other video
services I also use mpv, which turned out to be a very good and fast solution.

The author is right when he says in the conclusion that it is really difficult
to do without Firefox. Although many services can be replaced with external
programs, these are often only stopgap solutions.

But I also learned to distinguish between the "information" web and the
"entertainment" web, and since I do a lot of research, programming and reading
in data sources, I don't need JavaScript 95% of the time. Other points are
important to me, e.g. fast loading times, low CPU usage and minimalist design.
But you have to do without many features.

Text-based web browsers are cool, yes. But if you're not in the Linux scene
riding on e.g. Manjaro i3, they're incredibly cumbersome and exhausting.

[1]
[https://voidnill.gitlab.io/cosmic_voidspace/alligator.html](https://voidnill.gitlab.io/cosmic_voidspace/alligator.html)

~~~
chewxy
I used to do this many years ago. Then I added JS to upvote and downvote on HN
and Reddit. Then I gave up. Now I just use uMatrix on Chrome or FF.

~~~
pmarin
You don’t need javascript to upvote and downvote on Hacker News.

~~~
void_nill
Whoa! And as I just noticed from your comment, not for the login either. I had
only ever tested the HN search function, which only worked with JavaScript, so
I assumed that was the case with all the other functions too. Thank you.

~~~
icebraining
Yeah, it uses a service called Algolia, and unfortunately their frontend
depends on JS. But you can use [http://hnapp.com/](http://hnapp.com/) for JS-
free search.

(I'm not affiliated)

------
practical_lem
After a workday, I used to be repelled by the thought of using a computer. I
started using it again when I uninstalled Firefox. Now I just use elinks for
basic stuff, and I'm happy to hack or just _use_ a computer again, instead of
a bloated web browser.

If I want to do "fancy" things, like book a flight, I just do that on the
iPad.

~~~
postit
I used to have access to the Amadeus API, and the experience of booking
flights by `curl`ing it was very interesting :P

~~~
Nextgrid
Booking flights via curl sounds like it’s a much better experience than
navigating the endless choices, options & dark patterns on a typical airline
website.

~~~
orcdork
Don't check this checkbox if you want to miss out on not getting amazing
promotional emails, great poems and the occasional bobcat.

------
string
On the subject of JavaScript and NoScript.

I'm a JS dev working with some relatively large institutions building single
page web applications using JS frameworks. Unfortunately, I have often seen an
unwillingness from both management and some developers to invest in
server-side rendering of SPAs. There is always a willingness to support users
who have visual or other accessibility requirements, yet users who want or
need to disable JS are often left out in the cold.

I love JavaScript, but I still believe it is important to provide alternatives
for those who can't or won't use it.

~~~
gruez
> There is always a willingness to support users who have visual or other
> accessibility requirements, yet users who want or need to disable JS are
> often left out in the cold.

Because accommodating disabled people is mandated by the ADA. No such
requirement exists for noscript users.

~~~
string
Exactly! Whilst I understand the reasons, it is nevertheless a shame that the
willingness is born of requirement rather than of a genuine desire to make
things accessible to as many users as possible.

~~~
spookthesunset
The marginal benefit of making a site available to the relative handful of
people who refuse to run JavaScript is minuscule. The marginal cost of doing
so is huge.

------
superkuh
I agree. But you don't need to completely abandon the GUI. There are modern
browsers like Pale Moon that reject the idea of the browser as the new OS
layer and don't implement all the new attack surfaces and CPU wastes. Combined
with NoScript (temporary whitelisting only) you can browse normally but
without most of the resource waste (i.e. RAM usage from multi-process, WebGL,
and the like). I regularly have hundreds (300+) of loaded tabs with less than
2 GB of RAM used.

That said, I still use tools like youtube-dl on most of my computers and IRC
and RSS and all the rest.

~~~
bigato
Agreed on not abandoning the GUI; I'm writing this from Dillo, for example. I
also agree with the idea of not having the browser be the new OS and not
expanding attack surfaces so much. But Pale Moon seems to be a far cry from
that, especially seeing as it keeps NPAPI plugin support.

------
euroclydon
I now use a dedicated Chrome profile for a handful of sites I trust like my
banks and gmail. The profile has cookies and JavaScript disabled by default
and I have to opt in to let each domain use them.

Then I do all other browsing in a Chrome Guest session. It's cut down my web
usage a bit, as I have to log in to sites like HN, LinkedIn, or Facebook in
the guest session to use them.

------
carapace
Just to give a heads-up re: the Dillo browser:
[https://www.dillo.org/](https://www.dillo.org/)

On modern hardware it's effectively instantaneous: click the icon to start it,
and it's up and has rendered the homepage before you release the mouse button.

Many sites don't work well in Dillo, but then many sites do.

~~~
mftrhu
There's also NetSurf ([https://www.netsurf-browser.org/](https://www.netsurf-browser.org/)),
which is in active development and which is _almost_ as lightweight as Dillo,
while having better support for CSS3.

It doesn't really support JS yet - it's included and you can compile it in,
but IIRC there's still no DOM - and 3.8 behaves a bit weirdly with word
wrapping and fonts (I _think_ the latter is because they are compiled into the
executable), but it works pretty well, and it even works outside X: it has a
framebuffer version.

------
dredmorbius
While I do use GUI JS-enabled browsers, when I'm at a real keyboard with a
shell and terminal available (or even in Termux on Android), fast command-line
lookups are useful. Unfortunately, much of the Web formats poorly (HN
included).

I have locally defined bash functions for DuckDuckGo (ddg), the Online
Etymological Dictionary (etym), weather, and stock quotes (well, plus
extensive cruft-stripping for the latter). DDG + bang searches gets me
numerous other services, particularly Worldcat, which wins the award for
least-useful web design: it breaks on _both_ mobile GUI _and_ console devices,
though at full screen rendered by w3m it's passable (I'm considering how to
extract and present useful info bypassing the web design entirely).
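
As an illustration of what such a function can look like, here's a minimal
sketch of a `ddg` for your `~/.bashrc` (my own sketch, not the commenter's; it
assumes `w3m` is installed, uses DuckDuckGo's JS-free "lite" frontend, and
handles ASCII queries only):

    # Percent-encode an ASCII query string for use in a URL.
    urlencode() {
      local s=$1 out='' c i
      for ((i = 0; i < ${#s}; i++)); do
        c=${s:i:1}
        case $c in
          [a-zA-Z0-9.~_-]) out+=$c ;;                # unreserved chars pass through
          *) printf -v c '%%%02X' "'$c"; out+=$c ;;  # everything else becomes %XX
        esac
      done
      printf '%s' "$out"
    }
    
    # Command-line DuckDuckGo lookup, rendered by w3m in the terminal.
    ddg() {
      w3m "https://lite.duckduckgo.com/lite?q=$(urlencode "$*")"
    }

Then `ddg some search terms` drops the results straight into the terminal.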

Third-party utilities give further resources: surfraw for a whole slew of
sites, rtv (no longer maintained :-( ) for Reddit, and a set of Wikipedia
tools. There's dict/dictd for dictionary lookups (including forwarding OSX
queries to my Debian Linux box, which has both a server and far better
dictionaries).

For RSS I wrote a far-too-complex "rsstail-pretty" which reads from a set of
feeds and restructures output -- because there's all kinds of useless and
annoying crap in RSS feed content. Similar tricks for stockquote (a bash
function wrapper around an awk script) and weather (bash function, sed, and
awk -- because features of each are required).

The browser or http agent (w3m, curl, wget, generally) fires once, grabs
content, and gets out of the way in many cases. No 25GB VSS memory allocations
(yes! Firefox) resident for hours or days.

The amount of useless cruft wrapped around desired information payloads makes
the Saturn V launch / return capsule mass ratio look downright sleek.

Particular disappointments: Archive.org, entirely useless without JS.

------
Chirael
When traveling, to keep bandwidth usage down I exclusively use Firefox as my
iPhone browser because of its ability to turn off images. Turns out most web
pages are actually a lot better with only text.

------
Panino
> Probably the most effective thing is going to be an ad blocker.

If for some reason I was unable to use an ad blocker then my web browsing
would drop to such an extent that you could say I no longer used "the web" but
rather a handful of sites. Similarly, if my only TV option was cable with
commercials, I'd just stop watching TV altogether.

I normally browse without JS, and JS-only sites are increasingly common. 95%
of the time I just close the tab, even if I'm researching local businesses. In
the other 5% of the time, requiring JS just to read some text has become such
a reliable predictor of regret that the choice (to just close the tab) is easy
and immediate.

On the flipside, people here often complain about not being able to view a
site or that its format is unreadable, when it works great without JS, so it
goes both ways. You only ever hear that blocking JS makes things fail, but not
that it makes things work.

------
wickerman
I remember disabling JavaScript for a couple of months in Chrome some time
ago, and it barely broke anything. I had a great browsing experience. This
might be because of my own personal preferences when it comes to sites, but I
don't think that JS is a necessary evil in today's modern web.

~~~
chrismorgan
A few months ago I decided to try uMatrix out, which defaults to blocking
third-party JS, but allowing first-party JS. It broke a _lot_ of things. But I
persevered, and manually whitelisted various things as time went by; I decided
to always whitelist by individual third-party for each domain, rather than
just throwing in the towel and letting a site do what it wants. Some things
were very painful to get working; Dropbox login, which I occasionally need to
use for work, was excruciating.

More recently I’ve been playing around with heading in the other direction:
when things break, I try blocking first-party JavaScript instead. This has
often been successful, so that I’m considering disabling all JS by default and
only turning it on when needed.

All this _definitely_ speeds things up. I installed NoScript on my fairly slow
phone (Firefox for Android) a month ago; I don't use it for much, but I do a
little web reading on it sometimes, and disabling JS sped things up a lot.

A few sites do stupid things like visually hiding all content if you don't
have JS. Those I either don't use, use reader mode on, or (once or twice) have
added a user stylesheet for, if I'm going to touch them more than once but
have decided not to whitelist them, for probably capricious reasons.

But this disabling-JS lark is definitely not for the consumer.

~~~
timw4mail
Sites don't "visually hide all content" without JS; they use JS to render the
content in the first place, thus no JS, no content.

~~~
codedokode
No. Many sites have a style like "display: none" on the body element to hide
the content until JS is loaded. But of course they have the content in the
HTML for search engine optimization. So they show the content to the Google
robot instantly, but the human user has to wait.

Google also uses this trick in search results if I remember correctly.

~~~
chrismorgan
Exactly. Here’s an example I encountered today:
[https://status.digitalocean.com/](https://status.digitalocean.com/), all the
content is in a div that is affected by `.layout-content.status.status-index {
opacity: 0 }`. JavaScript code then sets `.style.opacity = 1` on the element
to undo that.

This is a bad technique. If you really want that effect, something friendlier
would be this in the head of the document:

    <script>document.documentElement.className+=' js'</script>

… then prefix that earlier selector with `.js`.

Or another approach: remove the `opacity: 0` from the stylesheet, and
immediately inside the element to be hidden while loading, add

    <script>document.currentScript.parentNode.style.opacity=0</script>

Remember that external JS is never guaranteed to load; even on normal people’s
computers, networks are not reliable and even first-party resources can fail
to load, though it’s ones on a different origin that are the least reliable.

------
megous
One way to avoid the browser is to scrape. These days I mostly use Node.js for
that. I tend to do it for websites that I would otherwise visit often, or
websites that have a terrible UI yet a simple data model.
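
The same idea works in shell with nothing but curl and standard text tools
(the commenter uses Node.js; this is just a toy sketch of the approach, and
the URL is a placeholder):

    # Extract the <title> of an HTML page read on stdin - no browser involved.
    page_title() {
      sed -n 's/.*<title>\(.*\)<\/title>.*/\1/p' | head -n 1
    }
    
    # usage (URL is a placeholder):
    # curl -s https://example.com/ | page_title
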

------
Fnoord
This post is a bit short on information. I like that the take is "living
without the modern browser" instead of "living without a GUI" (hence CLI/TUI).
But you'd be surprised how much stuff is flying over HTTP(S). For example,
consider VLC with its subtitle-download extension: that's just an HTTP client
too (think of Beautiful Soup 4 or similar clients; I didn't check whether it
uses that, it's just a Python example).

Which leads me to the question: when is a browser considered modern? If it has
a GUI? If it supports Web 2.0? If it supports Web 3.0? If it supports
JavaScript? If it supports all JavaScript by default?

"There are multiple text-based browsers to choose from. The oldest and maybe
the most well known is Lynx. Personally I use Links instead, which provides a
similar experience, but also includes a “graphical” mode that is capable of
displaying images, and the default background color is black as opposed to
gray (only for the non-graphical mode, though the background color can be
changed in graphical mode). Here’s a list of text-based browsers from
Wikipedia (not all of them are maintained)."

Every once in a while I check these out. They're all either horribly out of
date, or they do not work with current technologies, or they have some
experimental features but are out-of-date ports from Firefox (with security
vulnerabilities!). Hence I recommend Browsh instead, which is based on
Firefox. But I would not say it fits the narrative of "living without the
modern browser", as it is Firefox under the hood. It fits better in the
living-without-a-GUI narrative.

------
codedokode
Also, back when I was using Chromium, I disabled JS by default and enabled it
only on a number of sites that really need it. But now I don't want to support
Google by using Chromium, and Firefox cannot disable JS without installing
heavy, memory-hungry extensions.

~~~
KryDos
Actually, if you install NoScript it should not eat too much of your memory.
Even if it eats something, it will save you memory by blocking JavaScript,
which is a pretty good trade-off.

------
geekit
I don't think the issue is with browsers; it is with websites getting more and
more complicated. It's ironic that the people (web devs) who are complaining
about browsers are actually the root cause of the problem.

A quick solution could be: if a website is trying to act as an "app", it
should just be an app, and leave the web for reading hypertext and links. Any
lightweight browser can handle that.

But then again, we would have to draw an imaginary line between websites and
apps.

------
hestipod
What would be the leanest Linux browser option if one HAD to use Google Docs?
I am assuming there would be no practical way for a typical user (read: "not a
software engineer") to do that in a text-only environment.
------
notinversed
This is a great read. I spend so much time fighting the web. If you want to
maintain a shred of privacy you then have to regularly read about all the
latest trackers, dns monitors, browser fingerprinting techniques, shady vpns,
blocking social domains, etc, and even then you're probably not very anonymous
or private.

I hadn't ever really considered just giving up on the browser before. But I
think yeah, maybe it's time. The web has become such an awful cesspool of
surveillance capitalism, I'm tired of fighting it all.

------
reshie
there are many websites that are fine and good in a console browser. i hear
about captchas and such, but i never get them; my web browsing behavior may be
different. that said, i do still need a regular browser for specific purposes:
not captchas, but site-specific scripts that a console browser cannot simulate
yet, though that is very rare. this is coming from someone who uses a console
browser as their main browser. i do have a phone that i use for browsing on
the go and otherwise, but i do not consider it my main.

------
xvilka
I hope something like elinks, but built on top of Servo (the parser and
JavaScript interpreter parts), will appear soon. While able to process the
modern Web, it would "render" to the console instead, and disable a lot of
features.

------
chaoticmass
I never knew about the YouTube RSS feed. As someone who watched 36 hours of
YouTube last week and has hundreds of subscriptions, this combination of RSS
aggregator + MPV looks like the way to go.
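
For reference, each channel's feed lives at a fixed URL of the form
`https://www.youtube.com/feeds/videos.xml?channel_id=<ID>`, so the aggregator
side needs no API key. A toy sketch of the mpv half (assumes curl and mpv; the
channel ID below is a placeholder):

    # Print the newest video URL from a YouTube channel RSS feed read on stdin.
    latest_video() {
      grep -o 'https://www\.youtube\.com/watch?v=[A-Za-z0-9_-]*' | head -n 1
    }
    
    # usage (channel ID is a placeholder):
    # curl -s 'https://www.youtube.com/feeds/videos.xml?channel_id=UCxxxx' \
    #   | latest_video | xargs mpv
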

~~~
Arnavion
Yeah, I used to rely on subscription notification emails, but those are
uselessly unreliable. If I didn't watch one video from an uploader, YouTube
would stop sending emails for some of that uploader's new videos that I
_would_ have wanted to watch.

So when I discovered the RSS feeds, I just unsubscribed from everything on the
website and added the feeds to my feed reader instead.

~~~
asdff
Inoreader at least opens the links inline with the full description, shielding
you from having to go to the website at all.

------
scottndecker
If "the most effective thing is going to be an ad blocker", then I'd encourage
people to check out the Brave browser
([https://brave.com/](https://brave.com/)), which has an ad blocker built in
and will actually pay you for viewing ads.

