
Using the Web for a Day with JavaScript Turned Off - mhr_online
https://www.smashingmagazine.com/2018/05/using-the-web-with-javascript-turned-off/
======
everdev
Back in 2006/7 I remember we'd use "progressive enhancement" to make a site
work without JavaScript and then add JavaScript enhancements for those who had
JS enabled.

At some point (maybe after the popularity of Google Maps?) nobody wanted
progressive enhancement and it was totally cool to just ignore users who had
JS turned off. It made web app development so much easier, but probably less
user friendly.

It feels like JS is a hammer to fix the nail of the page reload. I always
thought it was too bad that browsers chose to show a blank page instead of
sending the request and rendering just the difference themselves.

~~~
Androider
It does make development easier, and at some point it just doesn't make sense
to support it any longer.

If the number of people who visit your site with JavaScript disabled is less
than the number of people who visit using the Opera browser, does it really
make sense to add [number of supported browsers] x [JS on | JS off]
permutations to your testing workload? Is spending the time and resources on
creating a non-JS site worth it, or would it be better spent somewhere else
(usability, accessibility, etc.)? Everything is a trade-off.

I personally feel the JS ship has long sailed. I'm more worried about the
recent trend of sites not even working in Firefox anymore, just Chrome. That
is definitely a bit sad.

~~~
reificator
> _If the number of people who visit your site with JavaScript disabled is
> less than the number of people who visit using the Opera browser, does it
> really make sense to add [number of supported browsers] x [JS on | JS off]
> permutations to your testing workload?_

If your site doesn't work for people with JS disabled, you're unlikely to see
significant traffic from people with JS disabled.

~~~
TeMPOraL
Also, if you use JavaScript trackers instead of server logs, guess which group
won't show on those trackers at all...

~~~
scient
Most trackers have a way to handle non-JS clients by rendering a tracking
pixel - a solution that has been out there for a long time now.
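
For reference, the usual shape of that fallback is an image request wrapped in a `<noscript>` tag, so browsers that skip scripts still register a hit; the tracker URL here is a made-up placeholder:

```html
<!-- Rendered only when scripts are off: the browser still fetches the
     image, so the pageview lands in the tracker's server-side logs. -->
<noscript>
  <img src="https://tracker.example.com/pixel.gif?page=%2Farticle"
       width="1" height="1" alt="">
</noscript>
```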

~~~
etatoby
Last time I checked, the default embed code provided by Google Analytics
(which covers 99% of website stats) was a single script tag.

------
jakecopp
Install uBlock Origin and do it every day!

I block all JavaScript, then re-enable it when needed to fix a site.

Often I just need to enable 1st party scripts in the uBlock Origin grid to fix
it.

Make sure you enable advanced mode to see the grid.

~~~
shabbyrobe
If you're serious about giving this a whirl, uMatrix is even better for fine-
grained control. I disable everything except first-party images and CSS by
default and it's like a whole new internet.

There's a bit of a dance you have to get used to when things don't work, but
I've got it into muscle memory now so I don't even think about it. I tend to
click around in there enabling the first few obvious things for about 3
seconds, which works 95% of the time, and for the remaining 5% the URL
gets copy-pasted into Chromium.

Wait, what are all these floating obstructions? I thought I disabled all JS...
hang on, you mean CSS is now full of animations and distractions too?
[https://addons.mozilla.org/en-US/firefox/addon/unstickall/](https://addons.mozilla.org/en-US/firefox/addon/unstickall/) and [https://github.com/gagarine/no-transition](https://github.com/gagarine/no-transition).

What's that? Sticky sidebar full of chum? No problem - uBlock Origin's "right
click, block element" to the rescue.

The internet doesn't have to be shit any more!

~~~
jakecopp
How does uMatrix compare to uBlock Origin with its matrix enabled through
advanced mode? I can't tell much difference but would be willing to jump ship.

uBlock Origin doesn't have CSS or image control per domain but it's the JS I
worry about!

~~~
shabbyrobe
They complement each other. I use them both. uBlock Origin has my back for
those rare times where I get the shits and just disable uMatrix entirely for a
particular site.

Decentraleyes is also worth a look if privacy concerns form a part of your
aversion to the blizzard of garbage your browser spews forth at you. It
contains in-browser hosted copies of all that CDN JS junk, so if you have to
turn on JS for some shared copy of jQuery that Decentraleyes happens to have
bundled, you don't hit the network at all.

------
nabla9
[http://lite.cnn.io/en](http://lite.cnn.io/en)

[https://text.npr.org/](https://text.npr.org/)

Every site should have a plain HTML option, or build their HTML so that it
works fine without JS, as CNN does.

~~~
sjmulder
I'm trying to assemble a list:
[https://sjmulder.nl/en/textonly.html](https://sjmulder.nl/en/textonly.html)
but so far I haven't found many useful sites like this.

~~~
icebraining
There was an HN thread on such sites last year:
[https://news.ycombinator.com/item?id=13337948](https://news.ycombinator.com/item?id=13337948)

~~~
sjmulder
Awesome! I've added a whole bunch of them.

------
godshatter
The author talks about "noscript" and makes it clear that it's not a reference
to the HTML tag but to the idea of surfing the web without JavaScript. I was
waiting for a discussion of NoScript, the popular Firefox add-on that blocks
JavaScript by default and allows re-enabling it on a site-by-site basis.

I surf the web, as I have for many years, with NoScript turned on and as few
permanent domains white-listed as I can get away with for security reasons.

I don't have any numbers to back this up, but my guess is that the population
of people who use the NoScript add-on dwarfs the population of users that
actually disable their javascript in their browser. I'm not sure how someone
like me shows up in their numbers, but I suspect that I would be on the
"blocks javascript" list and then on the "uses javascript" list if I am
intrigued enough about the site to enable some of the domains it requests
javascript from.

I don't know if web developers take that into account or not, but I suspect
they don't, because the number of domains I have to temporarily enable, by
trial and error, just to see some parts of some sites is getting ever more
ridiculous these days.

Maybe someone should design a new protocol that is built for interactivity
from the start instead of one designed around static content with back-flips
needed to make it usably interactive.

------
codedokode
The web works much better without JS: sites load faster and the browser uses
less memory. But sadly many sites use preloaders or plugins that hide the
entire page content until JS has loaded. What an awful idea.

------
kome
I am not a developer, I can write just html+css. For my personal website
([http://mrtno.com/](http://mrtno.com/)) I use just static pages written by
hand. No JS needed, no stupid complexities.

I wonder if I can sell my lack of skill on the market... because the skilled
designers/coders are programming a web that really tends to be so bloated.

I browse the web with NoScript (a Firefox extension), and many websites won't
even load without scripts. I wonder why. For many, many websites I don't see
the need for JS at all.

Can the lack of skills be a skill? :) hire me!

------
ibdf
Oh my lord, this article is nonsense. Had you gone to a website and not been
able to read anything, I would say... good point. But you went to a WordPress
admin to prove your point. Lord have mercy... you're really going to hate your
experience once WP switches to a JS-based editor. Quick, let's try using
Facebook and other apps built on JS frameworks and complain about the
functionality not working. Let's try searching data tables with JS off, and
let's try making AJAX requests to load data over time instead of all at once.
It's not only in your browser; it's in your phones, in the apps that provide
services you can't live without.

These types of articles are a shame and are only written to instigate fights
among developers. The world is using JS; it's in every major app. Get over it
or make something better.

~~~
80386
I know a guy who built a single-page web app for a blog. Really.

It takes a few seconds to load the post titles, it'll make your computer grind
to a halt if you have more than two tabs of it open, it took months to write,
and it doesn't do anything you couldn't do with plain HTML and some AJAX
request handlers... but Brawndo has electrolytes!

I bet it looks pretty good on his resume.

~~~
ry_ry
I know a guy that made a model ship in his spare time. It doesn't carry any
passengers or cargo, doesn't meet even the most basic safety standards, and
frankly doesn't do anything you couldn't do with a canoe.

I bet it looks pretty good on his mantelpiece.

There comes a point where you have to ask yourself - does it _matter_ that he
built (what I assume is a personal) blog in the framework du jour, simply
because he wanted to?

~~~
80386
Not a personal blog -- this was a paid job to replace an old platform that a
community was using but could no longer maintain.

~~~
ry_ry
Oh gawd, if he was paid market rate it's a pretty questionable decision... At
least he didn't write his own css preprocessor or something, I guess!

------
overcast
As much as I'd love for the majority of JS to go away, there is no denying the
usefulness of AJAX in web forms. Doing a full page submit is not a better
experience for the user or the developer. That's basically the only piece I
can't see going without.

~~~
prophesi
Yeah, I only want Javascript for when it actually speeds up a site. Which,
from my experience, is simply:

    AJAX for all form submissions
    Service Worker for caching
    Turbolinks[0] for page navigation

And those are easy to implement as progressive enhancements. If JS is
disabled, the submit button does a regular page submit, the Service Worker is
simply not registered and instead uses your web server's cache policy, and
your links remain as regular hyperlinks.

[0]
[https://github.com/turbolinks/turbolinks](https://github.com/turbolinks/turbolinks)
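
The Service Worker item in the list is opt-in by nature; a minimal sketch of the progressive registration (the `/sw.js` path is an assumption):

```javascript
// Feature-test before registering: no-JS visitors (and old browsers)
// never register anything and simply fall back to the web server's
// normal cache policy.
function supportsServiceWorker(nav) {
  return Boolean(nav && "serviceWorker" in nav);
}

if (typeof navigator !== "undefined" && supportsServiceWorker(navigator)) {
  navigator.serviceWorker.register("/sw.js");
}
```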

~~~
overcast
Honestly wouldn't bother with forms failing over. That sounds like double the
work to support a few people. If you want to participate, enable JS, otherwise
view the site as read only.

~~~
prophesi
Not sure if I follow? For a form submission, you'd set it up like a normal
HTML form. The <input type="submit"> would do its regular thing. If
Javascript is enabled, then you'd hijack the button with event.preventDefault
and do your AJAX.
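
A minimal sketch of that hijack pattern (the form id, endpoint, and field names here are all made up):

```javascript
// Serialize form fields as application/x-www-form-urlencoded, so the
// server sees the same body shape as a classic no-JS form submit.
function encodeForm(fields) {
  return Object.entries(fields)
    .map(([k, v]) => encodeURIComponent(k) + "=" + encodeURIComponent(v))
    .join("&");
}

// Only runs when JS is enabled; otherwise the browser performs the
// normal full-page POST defined by the form's method/action.
if (typeof document !== "undefined") {
  const form = document.querySelector("#signup");
  if (form) {
    form.addEventListener("submit", (event) => {
      event.preventDefault(); // hijack the regular submit
      fetch(form.action, {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: encodeForm(Object.fromEntries(new FormData(form))),
      })
        .then((res) => res.json())
        .then((json) => console.log("server response:", json));
    });
  }
}
```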

~~~
overcast
If I'm checking inputs client-side as well as server-side, with JSON
responses, there is going to be overlapping work. It's particularly annoying
when carrying fields/errors across submit pages. The old-school submit, check,
refresh, show errors flow is an ugly experience for end users.

Of course it can be done; it's just not something I'm going to worry about as
a solo operator developing multiple ventures.

~~~
prophesi
Sure, it requires some careful planning. You'd set up the HTML form with
method="post", then have the AJAX send the request with its content type set
to "application/x-www-form-urlencoded". At least with Express on Node, both
kinds of request will then be handled the same way.

~~~
overcast
I understand how this works; I've built a billion products by this point. It's
much easier and more user-friendly to post the form through AJAX and get a
JSON response on success, or an array of error codes for each field. Handling
error messages and passing form information around is not something I want to
deal with any longer in an OG form submit.

~~~
prophesi
Alright. To each their own.

------
danjme1
Most content-specific sites (which is what we're talking about, right?) and
Wordpress et al. have RSS feeds out of the box. If you're that obsessed with
making browsing _that_ fast, surely aggregating feeds is by far the fastest
way to browse?

------
parvenu74
When the statistical body of internet users who refuse to run JavaScript can
be expressed as a market that custom (JS-free) experiences will induce to pay
money for products and services, we'll begin to see businesses create JS-free
experiences for them.

Are there marketing-specific attributes of anti-JS web users?

~~~
etatoby
We simply get information on what stuff to buy from websites that work and
ignore those that are broken.

My mobile browsing experience has been immensely better since I installed
Brave and set it to block everything by default, including js.

When I'm presented with a blank page (not very often) I evaluate in a split
second whether the several seconds needed to reload the site with js enabled
are worth its content.

90% of the time the answer is no, so I click Back and then try the next
search engine result for the same query. 10% of the time I really want to see
or use that particular website, so I enable js for it on Brave's panel and
wait for it to reload.

Summary: 1 additional click and 1 additional reload (paid only once) on
websites I care about. ∞ less nuisance on everything else.

------
textmode
"I Used The Web For A Day With Javascript Turned Off."

I Used the Web For A Decade With Javascript Turned Off.

I have been using the Web for over a decade via software that has no
Javascript capabilities.

There is a conception one can detect in this article that there is only one
"working" look for any website: the look that the designer intended.

However, IME many websites "work" without ever engaging the "look" the designer
intended. To discuss this, one needs to first specifically define what it
means for a website to "work". I am not sure agreement can be reached on that
issue.

It may be that the definition of "work" varies with each user. Different users
may want different experiences. I know what I want from websites: data. I am
less concerned with design. This may not be the case for another user.

The author cites Amazon as an example website.

When I am only _browsing_ Amazon I do not use a "modern" web browser or
Javascript. Without a browser or Javascript I can download product data in
bulk, convert it to CSV and import it into kdb+ for easier searching and
viewing. Without a browser or Javascript I can download images of interesting
products and embed them into a simple PDF for offline viewing, using pdflib.

When I am ready to _purchase_, then I might move to a "modern" browser with
Javascript enabled.

Other example websites he gives are Wordpress, Github, Instagram, Twitter and
YouTube. I read the content of all these sites without ever using a "modern"
browser or Javascript. If I want to view images or video, I can download them
as in the Amazon case above. I can then view the data in whatever application
I choose on whatever computer I choose on the local network, and that computer
need not be connected to the internet.

For this user, these websites "work" just fine without Javascript. I can
always use a "modern" browser with Javascript if I want to see what the
designer intended. However this is rarely necessary for the experience I seek.

------
JoshMnem
I do that every day with uMatrix[1], only overriding hosts when needed. I also
turn off all CSS animation with Stylus.[2] The Web is becoming unusable
without that.

[1] [https://github.com/gorhill/uMatrix](https://github.com/gorhill/uMatrix)

[2] [https://addons.mozilla.org/en-US/firefox/addon/styl-us/](https://addons.mozilla.org/en-US/firefox/addon/styl-us/)

------
rhacker
The referenced site BlockMetry.com doesn't seem to explain how it calculated
0.2% as having JS disabled. Especially since it attributes a high percentage
to Tor traffic, my guess is that each connection was coming into the site over
a fully wiped HTTP connection, basically disabling tracking. Alternatively,
what percentage of that traffic is simply cache bots for search engines,
etc.?

------
stepvhen
I want to invite others to try to visit Yelp on mobile with JS turned off.
First off, you can't access anything. Then the website provides nonsense
reasons why a user should have JS turned on, and tries to guilt the user into
doing so.

I always have JS off by default, and turn it on momentarily when I feel it
necessary. I do not feel it necessary for Yelp.

------
waldfee
The 600kb isn't even that bad. It's the 2x number of requests that makes it so
much worse.

~~~
addicted
The JS version loads 4 times slower, despite only needing 600kb more.

Goes to show how much effect the large number of requests has.

------
NVRM
W3M!!! Well-done pages run fine in w3m. Think about viewing it in 20 years.
Another good test is to see the rendering on archive.org.

------
iliasku
I do it every day.

------
samirillian
Honestly, I don't see the WordPress backend as a relevant test case. Whoever
else you might be catering to, it's fair to assume that someone building a
website would have JavaScript enabled while developing.

~~~
majewsky
90+% of the people using WordPress backends do not build websites. They just
author content.

~~~
samirillian
Content authors either

------
halite
Might as well try living a day without water.

~~~
TeMPOraL
JavaScript is not water. JavaScript is Brawndo, the sports drink. Used in
business of watering plants. Because it has electrolytes.

------
NoGravitas
Meanwhile:

    requests blocked on this page: 168, or 96%

------
danjme1
RSS, anyone?

------
CodeTheInternet
I'm glad he can code his brochureware site with minimal impact when JS is
disabled. It is 2018 and every site is driven by JS, not just DOM events. If
you disable JS, you know what you're getting into. Do we need to develop for
those that decide to block CSS? Insist on deprecated browsers? You develop for
your audience.

~~~
TuringTest
So you've never heard of the CSS naked day? Or visually impaired users and
accessibility?

~~~
danmg
I disable CSS when it's some article with annoyingly low contrast.

------
Rapzid
Somebody should create a list of JS-free sites. They can sell it at Whole
Foods in the "medicine" aisle. You guys know what I'm talking about; "JS Free
- All Natural!".

I think we've about reached the point of ideology on this. If you are
considering arguing with the other side, you're probably better off arguing
evolution with a creationist, or facts with the GOP.

~~~
andrewmcwatters
What a tremendously rude comment. I can't believe you need so many hundreds of
karma points to downvote what is clearly a non-contribution.

~~~
perl4ever
Well, I'm not sure if it was meant to be helpful, but I _would_ like a "list",
or better yet, a search engine, that excludes all sites with javascript. If
there is no such widely used search engine, then sites without scripting can't
thrive.

I've never been an ideological seeker of "organic, all natural" type foods,
because the criteria tend to seem arbitrary and unscientific, but I've come to
see the label as useful because the other stuff is increasingly adulterated
with garbage I don't want.

~~~
krapp
>If there is no such widely used search engine, then sites without scripting
can't thrive.

Such a search engine would never be widely used, because there's no widespread
desire to limit search results only to sites which don't use javascript.

~~~
perl4ever
It's a chicken and egg question. People don't know they want it, but they
can't find out they want it either. I don't mind javascript or ads _per se_ ,
but the bandwidth and processing requirements of modern ad-heavy web pages are
becoming unmanageable for me. My breaking point is where, even using WiFi at
home all the time, I am still going over 1GB a month for my cell usage away
from home. And equally important, pages are infuriatingly slow on both
laptop and phone, even though I have a relatively recent and high powered
phone.

~~~
krapp
But unless people are specifically searching for sites that don't use
javascript, a search engine that excludes javascript is just going to return
poor results for most queries.

A more useful search engine might be one that shows whether or not a site uses
javascript, and if so, which common libraries, etc. Maybe even whether or not
a site will render without javascript.

------
staticelf
I like that he offers solutions, but honestly, more people are vision-impaired
than have javascript disabled, so instead of doing all that work, put the
effort into helping people with disabilities first.

> There is a danger that more and more sites will require JavaScript to render
> any content at all.

What danger, exactly? While sites that don't require a lot of js are nice, you
will soon notice that there are many sites you _want_ js on. Music streaming
services are one example.

I build API-based sites nowadays for mainly two reasons:

1\. They feel more responsive after the initial load and give the user a
better experience.

2\. You can use the same API for web and mobile experiences.

I personally think that WebAssembly is the future, so the web will finally
just be another compilation target, and I can't wait for it to be widespread.

~~~
wgerard
> I personally think that WebAssembly is the future so the web will finally
> just be another compilation target and I can't wait for it to be wide
> spread.

Hmmm, I'm not sure about that. I've only done toy projects with WebAssembly,
so I could easily be missing something, but it seems to be:

1) Overkill for most websites

2) Significantly more complicated than current web development - which is
saying something given the explosion in complexity of recent web development
practices

Obviously #2 can be lessened as time goes on, but it's hard for me to imagine
#1 ever being false. It's great for performance-sensitive applications like
games, but it doesn't seem to offer much benefit for CRUD apps or static
sites.

~~~
gramstrong
Maybe it's overkill, but hardly any more than the V8 engine is for most
websites. Compilation to WebAssembly will come at little or no cost to the
developer, so it shouldn't matter either way.

