
A day without JavaScript - pmlnr
https://sonniesedge.co.uk/blog/a-day-without-javascript
======
spiderfarmer
One of my clients loads a 4.5 MB bower.js (including Angular with a lot of
components and jQuery), and they also include an extra jQuery script, the full
jQuery UI, and several other scripts _on each pageload_. Nothing is minified.
The bower file alone has 300 KB of comments.

The CSS file is also nearly 1 MB.

It's just a simple website with some forms.

They have 2 developers working on the site, a scrum master, a project manager
and 2 testers but somehow they can't find the time to find out which legacy
code they can remove.

It was only after I pointed out that they have an exceptionally high bounce
rate on their (expensive) AdWords traffic and the slowest page load time of
all their main competitors that I managed to get some priority to optimize
their site.

Serious question: are there any tools that can scan a _full_ website for
unused code and unused CSS?

~~~
deanward81
Chrome 59 has a coverage tool built-in:
[https://developers.google.com/web/updates/2017/04/devtools-release-notes](https://developers.google.com/web/updates/2017/04/devtools-release-notes)

~~~
JepZ
Note: The color-coding is likely to change in future Chrome releases. <-- Well,
someone eventually found out that some people cannot distinguish red and
green, but it was too late :-D

I am a little surprised to find such an issue in Google software as it is a
topic for first semester CS undergraduates ;-)

~~~
uniclaude
Wait, what? CS students learn about color blindness now? I really wasn't in
the good classes!

~~~
Fiahil
I think it's a joke.

~~~
onli
I actually learned that as a CS student.

~~~
kbenson
If it was actually part of the curriculum I sure hope that it was in some
design centered elective class. IMO if your CS degree spent time teaching that
as part of the core curriculum, they missed an opportunity to put more Math,
PL theory, and interesting algorithms in there, because there's more than can
be sanely covered in any one curriculum.

Since it's accessibility based, it's more laudable than teaching CS students
how to center a div, but it's not like it really requires a mentor of some
sort to express the nuances of, right? Or even if it does, it's still design.

~~~
onli
It's been some years now, and there were at least two courses in which it
could have been. One was elective: HCI. The other was not; it was an
introduction to graphics and audio, and since you need to understand the
basics of human perception to understand compression in that area (JPEG, MP3),
they talked about stuff like that.

A good CS degree definitely has the space to teach some basics in that area:
to mention the Gestalt principles, to explain human perception a bit, and to
give an introduction to usability. You do not get a useful developer in the
end otherwise.

~~~
kbenson
> You do not get a useful developer in the end otherwise

Not all developers do stuff with UI, and of those that do not all do anything
with a UI that is actually graphical beyond a terminal.

> since you need to understand basics of human perception to understand
> compression in that area (jpg, mp3), they talked about stuff like that.

That is a good reason to teach it, and counters my overly assertive original
comment.

------
WorkLobster
Having browsed nearly JS-free for the last two-ish years, aside from a bit of
tweaking at the start, I have to say that it has made things a lot faster,
more stable, and just much more of a pleasure to navigate. If anyone is
interested, I have found that two tools make life a lot easier:

- NoScript, which blocks execution of scripts save for those you whitelist.
This means sites that reasonably require JS (e.g. YouTube, Google Maps) can
remain functional.

- A bookmarklet that redirects you to the Google text-only cache of the
current page, which is great for text-based articles that inexplicably require
JS to show content.

~~~
pmlnr
Would you mind sharing that bookmarklet?

~~~
WorkLobster
Sure:

    
    
        javascript:(function()%7Bvar%20loc%3Dwindow.location%3Bif%20(window.location.protocol%20!%3D%20%22https%3A%22)%7Bloc%3Dwindow.location.toString().replace(%2F%5Ehttp%3A%5C%2F%5C%2F%2F%2C'http%3A%2F%2Fwebcache.googleusercontent.com%2Fsearch%3Fq%3Dcache%3A')%3B%7Delse%7Bloc%3Dwindow.location.toString().replace(%2F%5Ehttps%3A%5C%2F%5C%2F%2F%2C'https%3A%2F%2Fwebcache.googleusercontent.com%2Fsearch%3Fq%3Dcache%3A')%3B%7Dwindow.location.replace(loc%20%2B%20'%26num%3D1%26strip%3D1%26vwsrc%3D0')%7D)()
    

Glancing at it, it just seems to drop the webcache URL in front of the current
page location. Credit to
[http://www.localseoguide.com/easily-check-the-text-only-cache-of-a-page/](http://www.localseoguide.com/easily-check-the-text-only-cache-of-a-page/),
if I recall correctly.
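URL-decoded, the bookmarklet does roughly the following (a readable sketch of the same logic, not the exact minified source):

```javascript
// Readable version of the bookmarklet: swap the scheme prefix of the current
// URL for Google's webcache endpoint, keeping http/https matched.
function toTextOnlyCacheUrl(url) {
  const cache = "webcache.googleusercontent.com/search?q=cache:";
  const prefix = url.startsWith("https://") ? "https://" : "http://";
  // strip=1 asks for the text-only version of the cached page
  return url.replace(/^https?:\/\//, prefix + cache) + "&num=1&strip=1&vwsrc=0";
}

// In the bookmarklet itself this is invoked as:
// window.location.replace(toTextOnlyCacheUrl(window.location.toString()));
```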

------
mnm1
This is every day for me. Just whitelist the few sites that actually need it
and most of the rest are fine. This should be the default. What do I care if
some images don't load or CSS is slightly off? The main content is there. For
sites that won't do anything without JS, I can consider whether I want to use
them or not. Mostly not. Fuck other people running code unnecessarily, without
my permission, on my computer. Especially Javascript. It was an extremely
stupid and costly decision to have a scripting language like this run by
default in browsers... though great for doing things against the user's wishes
and making money off of it. And security holes. And privacy issues. Etc etc.
The irony is that AJAX and SPAs were created as a response to the
request/response model that was seen as too slow. Now it's these SPAs that are
unbearably slow and buggy while the request/response model has only improved
as hardware and networking have gotten better. I think if more people
understood how the Internet works and what is actually happening, they would
also turn off JS. No chance of that happening though.

~~~
ZeroClickOk
I fully agree. Did you ever stop to think that Flash/ActionScript became
"obsolete" because of all this? And now we have the exact same problems...

~~~
Touche
flash was obsoleted because 1) iphone didn't support it, 2) iphone didn't
support it, 3) iphone didn't support it, and 4) JS replaced the non-DRM use
cases (and now even replaced that).

~~~
abritinthebay
Well that and Flash was a _massive_ vector of security problems.

Plus it wasn't very good on mobile, even when it was supported (ie - on
Android it was still _bad_ and never got out of beta anyhow)

~~~
Touche
I thought it was fine on Android; I played many games on it that worked well
even on those old underpowered phones. It was never good, but almost no Flash
apps were written targeting mobile, so I wouldn't have the expectation of
_good_.

~~~
abritinthebay
I think "fine" is being generous.

When it worked it was... not great, due to UI issues, but it at least worked.

Trouble is it seemed to not consistently _work_. I worked at a flash game
developer at the time (Zynga) and it would not always load our games even if
it had done so previously - the difference was sometimes just refreshing a
page.

------
jaclaz
If I may, it seems to me like there are several possible points of discussion:

1) Javascript is bad (vs. Javascript is good and variations thereof)

2) The use a lot of websites make of JavaScript is overcomplex, gratuitous,
uncalled for, intrusive, etc.

3) Sites should provide some (even minimal) functionality to people browsing
them without Javascript

#1 is largely a matter of personal opinions

#2 is a known, undeniable fact

#3 is where everyone could contribute

Personally, I have browsed with JavaScript turned off for years (and I use
another browser with the capability to display a site "fully" when really-
really needed).

Particularly when browsing HN, I follow given links and often avoid a lot of
pop-ups, ads, and whatnot, and from time to time, when I find a site that
won't load AND I really think the linked-to site is worth it, I use the
"other" browser.

What is curious is that most "new" products, SaaS, _whatever_ simply fail,
showing just a blank page, which is exactly what a site shouldn't do.

I would gladly accept a "You need JavaScript enabled in order to fully
appreciate this site" kind of message, but only if some content (basic,
simplified as much as you want, even text-only, etc.) is shown nonetheless.

I simply cannot bear the "blank page" or the "Your browser is not supported,
please download Chrome, Firefox, Internet Explorer and try again".

This makes me sure that the people behind the site in the best case have not
understood the "showcase" nature of a website and in the worst case care
nothing about user (please read as customer) experience.

So, if you create a site, consider that some (even small) part of your
visitors may not have JavaScript enabled but are nonetheless potential
customers who are somehow interested in your site's contents. By making their
no-JavaScript experience so miserable, you are turning them away from your
site, sometimes forever, which is the perfect negation of the reason you put
the site together in the first place (to make your _whatever_ content visible
to the world).

~~~
linkmotif
I simply will not spend time tailoring a nascent or even mature product to the
0.0001% or whatever minuscule percentage of the population that turns off JS.
If you turn off JS, you get a blank screen. Why should I accommodate your
cohort? Why write unit tests for someone who takes a standard-ish client with
above 1% market share and does something completely non-standard with it?

~~~
always_good
I find that it's at least double (and usually more than double) the work to
build the JS-free version of something, for a fraction of the user experience.

For example, imagine a forum where clicking the "edit post" button turns your
post into a <textarea> editor and saves with AJAX so that you can continue
scrolling once you make your edit.

To build the JS-free version, you typically need a separate endpoint, a new
template, a redirect, and a less empowering editor. And all this for UI that
99% of your users won't see.

It just doesn't seem like an opportunity-cost-savvy way to build a website.

Then there are the UIs that take some real backflips to accomplish without
JavaScript, like a table that lets you mass-modify rows with checkboxes and a
<select> at the top offering options like Move | Archive | Delete.

You can wrap the entire <table> with a <form> such that all of your checkboxes
submit to your mass-modify endpoint. That works without JS.

But since you can't nest a <form> within another <form> (and form method only
supports GET and POST anyway), your per-row delete forms can't appear in each
row.
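For what it's worth, the classic no-JS workaround here is a single wrapping <form> with named submit buttons in each row: the browser only submits the button that was actually clicked, so the server can tell a per-row delete from a bulk action. A minimal sketch (all names illustrative, not from any framework):

```javascript
// Render one table row inside a single wrapping <form>. The per-row delete
// is a named submit button, so no nested <form> is needed.
function renderRow(id) {
  return `<tr>
    <td><input type="checkbox" name="ids" value="${id}"></td>
    <td><button type="submit" name="delete" value="${id}">Delete</button></td>
  </tr>`;
}

// Server side: decide what was requested from the parsed form body.
// Only the clicked submit button's name/value pair is present.
function dispatch(body) {
  if (body.delete !== undefined) return { action: "delete", id: body.delete };
  return { action: body.bulkAction, ids: body.ids };
}
```

The same endpoint then handles both the bulk <select> action and individual deletes, with no JavaScript involved.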

I have a hard time understanding how so many people in these threads can
suggest that the opportunity cost of building for the 1% is always worth it.
Does everyone just have basic blogs on the mind when they envision the labor
involved in what they preach?

~~~
ams6110
For me, yes. I should be able to see at least the main content of your blog
without JS. I don't care if it's perfectly formatted or styled. Just put the
text in a div.

When I go to a blog post linked from HN and see a white empty page I will just
move on.

~~~
horsawlarway
Good. Please leave. I don't walk into a restaurant and demand that they cook
my food without using a knife. Don't expect modern websites to bend over
backwards for a (frankly rather juvenile) view of a modern web standard.

Code execution is part of the web, it's here to stay, the number of sites that
support non-JS experiences will continue to dwindle.

You're not some special snowflake. Other people make real decisions based on
cost/value models, and you cost way more than you're worth.

~~~
flukus
Where does the cost of making a blog readable without JavaScript come from? It
takes a lot of extra effort to make a blog that requires JavaScript.

As far as I'm concerned, if it's not readable without javascript then you've
got nothing intelligent to say anyway.

~~~
horsawlarway
>As far as I'm concerned, if it's not readable without javascript then you've
got nothing intelligent to say anyway.

Great! Leave please. I have no desire to interact with folks who approach the
world in that manner.

------
VMG
> For me it’s a matter of elegance and simplicity over unnecessary complexity.

Simplicity is having one place where the DOM is created and managed, and
optimizing for the 99% use case.

Complexity is splitting up DOM rendering over two networked systems for
servicing a 1% use case.

Browsers are JS runtimes now. Get over it.

~~~
hobs
Naw, I will keep running noscript and blocking script by default.

If your site doesn't work, I don't care, I will go to a different one.

I may be in the minority now, and while there is some great use of JS out
there, most of it is bloated, slow, insecure, and often privacy-destroying.

~~~
madsbuch
Not all websites not using JS are good:
[http://www.theworldsworstwebsiteever.com/](http://www.theworldsworstwebsiteever.com/)

It is not a question of JavaScript or no JavaScript. It is a question of a
well-developed site.

~~~
eberkund
At least that site loads quickly and doesn't call home with tracking info or
use up all of my expensive mobile bandwidth.

~~~
sp332
According to my uBlock Origin plugin, this page does use Google Analytics and
Easy Counter.

------
godinaa
One thing to consider is that at larger companies, we already have a difficult
enough time testing all of our site with JavaScript enabled. Netflix building
a version that works without JavaScript is like asking them to build a second
website, especially from a QA perspective. When you look at your users and
find that 0.5% of them are not running JavaScript, how do you justify spending
the money on so few people?

~~~
godshatter
The NoScript extension is the 5th most downloaded extension for Firefox
([https://addons.mozilla.org/en-US/firefox/extensions/?sort=users](https://addons.mozilla.org/en-US/firefox/extensions/?sort=users)).
If you want to show a blank screen or a
broken website to those users, that's up to you. Some of them may really want
to see your site, so they might try to figure out which of the twenty domains
they should enable, and which of the thirty they should enable after that one
doesn't help them. Most will likely click on the back icon and see what's next
on the list.

~~~
largehotcoffee
Yes, and Firefox is the 3rd or 4th most used browser, at less than 10% market
share (which is bad). So the subset of actual Firefox users that use NoScript
is absolutely not worth supporting.

~~~
godshatter
Sure, but we're not talking extreme niche here or anything. It was downloaded
by over two million users.

~~~
postozoan
And the vast majority of those, if they want to use netflix, just whitelist it
anyway. The amount of revenue lost must be tiny compared to the costs.

------
FLUX-YOU
>Verdict: Cartography catastrophe.

As much as I hate JavaScript, Google Maps gets a pass. I'm pretty sure the
code behind it is thoroughly tested. We can just suck it up and turn
JavaScript on for one of the most useful tools on the internet -- how hard is
it to find something that whitelists domains to run JS, anyway?

~~~
oneeyedpigeon
There should be _something_ as a fallback, though, surely? Take out all the
functionality, by all means, but at least give me a map image. If you can't
even get link-to-zoom working to display different images, just give me a map
image based on my IP location; give me _anything_ rather than nothing.

~~~
FLUX-YOU
Can you do lazy loading with just HTML and CSS? You can load the map around
the GPS coordinates you start on, but what if you go 20 miles north? How do
you: A) figure out the new GPS coordinates your viewport is looking at, and B)
load the map within the viewport, centered around those new coordinates,
without AJAX?

~~~
Liquid_Fire
How would you "go 20 miles north" without JavaScript? The only sensible way
would be by clicking a link, at which point you can load your new tiles.
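That link-based approach is easy to sketch server-side: each pan link just encodes a new center, and the server renders the tiles around it on the next page load. A toy sketch (route and names hypothetical; the degrees-per-mile conversion is a rough approximation):

```javascript
// Build the href for a "pan north" link on a no-JS map page.
// One degree of latitude is roughly 69 miles, so this is approximate.
function panNorthLink(lat, lng, miles) {
  const newLat = lat + miles / 69;
  return `/map?lat=${newLat.toFixed(4)}&lng=${lng.toFixed(4)}`;
}
```

Clicking the link requests a fresh page; the server recomputes which tiles fall in the viewport for the new center. No AJAX, at the cost of a full page load per pan.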

~~~
boobsbr
I think OpenLayers did this a looooooong time ago.

~~~
marcosdumay
Google maps did this a looooooong time ago.

------
brightball
Say what you will about jQuery, but it made graceful degradation easy (and
encouraged).

I do miss the days when people considered JavaScript an enhancement and not a
dependency.

~~~
pmlnr
It was called progressive enhancement[^1] back in the day, and it was good
for you.

[^1]:
[https://en.wikipedia.org/wiki/Progressive_enhancement](https://en.wikipedia.org/wiki/Progressive_enhancement)

~~~
robin_reala
It still is?

------
thinkxl
It's funny how devs feel threatened by a non-JS trend. Even when their
favorite frameworks are providing tools to accomplish minimal functionality
when JavaScript is off.

Server side rendering has been a big priority in React, Vue, Redux,
ReactRouter, etc.

~~~
joshstrange
I'm not threatened by no-JS, but I'm also not going to maintain 2 code bases
or thoroughly complicate things for such a small percentage of people. Yes, if
you have a news/blog site you should probably support noscript, but for web
apps, I'm sorry, it's just not worth the time and effort.

~~~
thinkxl
There are ways where you don't have to maintain 2 code bases. For example with
React, when doing SSR, you can import your components from the client and use
them on the server, allowing you to keep minimal functionality (links, text,
images, etc.) when JavaScript is off.

There is one function on the server that you need to use to render any
component you imported.
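The idea can be illustrated without React itself: the same pure "component" function produces markup whether it runs on the client or the server. (In real React the server-side function is ReactDOMServer.renderToString; the names below are simplified stand-ins.)

```javascript
// A "component" shared between client and server: props in, markup out.
function ArticleLink({ href, title }) {
  return `<a href="${href}">${title}</a>`;
}

// Server: render the same component to an HTML string, so links and text
// are present even before (or without) any client-side JavaScript.
function renderToHtml(component, props) {
  return component(props);
}
```

The client-side bundle then "hydrates" the same markup to add interactivity, but the baseline content works with JS off.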

------
andrewfong
Are there any JS frameworks that make it easy to build SPAs that degrade
gracefully when JS isn't available?

I know there's a lot of work done on making server-side rendering work with
React / Vue / whatever, but I'm wondering if there's a framework available
that asks for some tradeoffs in how your SPA is structured in order to
maximize the amount of functionality available when scripting is off (similar
to how Redux asks you to make state-management tradeoffs in order to get hot-
reloading, dev tools, etc.).

For example, you'd probably want, at a minimum:

* By default, all code related to fetching or subscribing to data from a server is tied to URL routing.

* By default, all code related to sending data to a server works with standard webforms.

The closest analogue to what I'm thinking about is Turbolinks and Ruby on
Rails, but I'm curious if there's some equivalent in JS-land because you'd get
less context-switching when adding extra JS-only SPA interactions on top of
the core JS-optional form-based interaction.

~~~
madjam002
If you use Redux and React server side rendering, you can quite easily wrap
forms/interactions on your page with a <form /> tag and a hidden input with
the Redux action name, and then handle the actions/reducers on the server.
I've played around with it in the past and managed to convert a SPA which was
using redux-form heavily to work fully without Javascript enabled, and it was
only like ~200sloc.
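A sketch of that pattern (hypothetical names; real code would use Redux's createStore and an HTTP route handler): the form carries the action type in a hidden input, and the server runs the same reducer the client would have run.

```javascript
// Reducer shared by client and server.
function reducer(state, action) {
  switch (action.type) {
    case "SET_EMAIL":
      return { ...state, email: action.payload };
    default:
      return state;
  }
}

// Without JS the <form> POSTs its fields, including a hidden
// <input name="actionType" value="SET_EMAIL">; the server replays the
// submitted fields as a Redux-style action against the current state.
function handleFormPost(state, body) {
  return reducer(state, { type: body.actionType, payload: body.email });
}
```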

~~~
RussianCow
I wouldn't say it's "quite easily" done. The most obvious challenge is
components that depend on data loaded asynchronously: If the only thing the
component renders is a "loading" message while it fetches something from an
API, that's all that the server is going to render as well. So you have to
figure out a way to render asynchronously on the server-side, and then you
also have to come up with a generic way to handle cookies, redirects,
fallbacks for interactive features, etc. So it might be easy for really simple
apps, but it gets incredibly involved as your app becomes more complex.

The nice thing about this, though, is that most of the work happens up front,
and as long as you develop the app within the proper constraints, server-side
rendering is more or less "free" after that.
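That preloading constraint can be sketched like this (hypothetical names; a stand-in for a real API call): the server resolves the route's data *before* rendering, so the component never bakes a "loading" state into the HTML.

```javascript
// Stand-in for a real API call the component would otherwise fetch itself.
async function fetchUser(id) {
  return { id, name: "Alice" };
}

// Component renders only from already-loaded data -- no loading state.
function UserPage({ user }) {
  return `<h1>${user.name}</h1>`;
}

// The server preloads everything the route needs, then renders once.
async function renderRoute(userId) {
  const user = await fetchUser(userId); // preload outside the component tree
  return UserPage({ user });
}
```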

~~~
madjam002
Agreed, but the overhead of coming up with "generic" ways to handle all of
these interactive features like data fetching, redirects, etc, isn't that big
(at least from what I've experienced).

It's good to have a standardised way of doing all of those things anyway, and
when you do, you can make sure they all work on the server.

~~~
RussianCow
I don't agree. Dealing with asynchronous API calls in a generic manner is, in
my experience, non-trivial. It's been a little while since I've tried it, so
maybe things have changed, but when I did, the React server rendering API
rendered the component's immediate output to a string, which means you
effectively have to pre-load all the relevant data outside the React app,
which may be really difficult depending on how you've architected your app.

All the other stuff is pretty easy, though. :)

------
nigma
I turned off JavaScript and Cookies for all sites a few months ago and only
whitelist (or open in another Chrome profile) when they refuse to display or
need an authenticated session.

There are some sites that require JS to render, but the great thing is that
the majority of pages I visit work just fine as plain HTML/CSS. In general I
feel less distracted by popups, overlays, ads, etc.

~~~
TeMPOraL
I use uMatrix as a substitute for NoScript + uBlock, and while it's a pain to
get some sites working manually without bulk-enabling everything on them, I'm
mostly happy about it, and the web looks better for me.

~~~
aninhumer
I tried using uBlock (EDIT: uMatrix) for a while, but I felt like it was a lot
of hassle, and most of the time it still wasn't granular enough to actually
block the things I wanted to without breaking sites. It seemed like usually
all of the bullshit javascript making sites sluggish came from the same domain
as the few bits I might actually want. (Or at least needed to make the site
useful.)

~~~
gorhill
> wasn't granular enough to actually block the things I wanted to without
> breaking sites

I don't understand the criticism: it's no less granular than NoScript. If
considering only the dynamic filtering panel[1], it can be argued it's more
granular given that one can block inline-script tags only, and also one can
block/allow on a per-site basis.

Static and dynamic URL filtering of course allows you to filter on a per-URL
basis.

[1] [https://github.com/gorhill/uBlock/wiki/Blocking-mode:-medium-mode](https://github.com/gorhill/uBlock/wiki/Blocking-mode:-medium-mode)

~~~
aninhumer
> it's no less granular than NoScript.

Sorry, to clarify, I wasn't comparing uMatrix to NoScript, merely commenting
that it didn't seem particularly useful to me when I tried it.

I was considering trying NoScript at some point, but from what you've said, it
doesn't sound like it would be much better for my purposes?

------
dreamcompiler
Mozilla is partly to blame here by taking away the "Disable Javascript"
control, which effectively told websites they could go crazy with JS and
ignore the "no-JS" users.

What might be helpful now is something like a MAX-JS-BYTES=n header that tells
site owners that the useragent will ignore any JS beyond the first n bytes on
the page, and will not request more once that limit is reached. (I know this
is a terrible idea; hard byte limits could never work. But somehow site owners
need to be made to feel more immediate pain when they screw their customers
with too much JS bloat because they've hired lazy developers, or because their
idiot VP of marketing is all about TEH BLING.)

~~~
smitherfield
_> Mozilla is partly to blame here by taking away the "Disable Javascript"
control, which effectively told websites they could go crazy with JS and
ignore the "no-JS" users._

One might say, for other reasons, that Mozilla is 100% to blame for all of it.

------
russdpale
I generally agree with what he is saying, but the web has moved beyond the
point where it's reasonable to expect websites to have JS fallbacks. It is
just too prominent nowadays.

I've always been a JS hater, though I have to code it frequently. That said, I
find the arguments against the language becoming more and more obsolete as the
years go on.

~~~
kevin_thibedeau
It's not unreasonable to expect hyperlinks or images to work without JS.
Fancier stuff sure, but basic navigation through a content only site should be
functional without it. This is what HTML does and you have to be profoundly
lazy to screw that up.

~~~
douche
My favorite pet peeve is JS links, that aren't actually <a> links, but spans
or divs with click handlers, so there's no way to right-click->open in new tab
or ctrl-click->open in new tab. It's like intentionally doing something that
is harder, and works shittier; I don't get it.

~~~
syrrim
My favourite pet peeve is the lyrics wikia, which has some CSS embedded in a
<noscript> tag that blocks out the lyrics and replaces them with a "sorry!
This site requires javascript" message.

------
mikegerwitz
Third-party scripts aren't just a performance problem---consider
privacy/trackers as well. I gave a talk at LibrePlanet 2017, where one of the
things I showed is the drastic effect disabling JavaScript has on third-party
scripts, graphed by Lightbeam:

[https://mikegerwitz.com/talks/sapsf.pdf](https://mikegerwitz.com/talks/sapsf.pdf)
(slide 60 / pg 115, "The Web---Mitigations & Anonymity").

There's a bunch of stuff before that slide that's mitigated by JS, including
things like ultrasound cross-device tracking and hardware fingerprinting.

------
JepZ
Well, while I like what you can build with JS, I often don't like what people
build. For example, npm makes it so easy to import megabytes of dependencies
into your project when you might only need a few kilobytes. And some devs
simply do not seem to care about that.

On the other hand, JS allows us to build great user interfaces which are
better to use than the old-school point and click adventures we used to have.
And if you do it right, the performance is also OK; but it is also not so hard
to fail and ruin the whole user experience by rendering the scene twice every
frame ;-)

~~~
marssaxman
I actually liked the old point and click adventures better, because you always
knew exactly what to expect. There was a fixed suite of tools provided by the
browser and that was that. Now... well... every dumbass designer with a bright
idea is free to inflict some bit of novelty on you. Fads sweep through the
design community and suddenly old sites which worked fine have to be "updated"
to work differently, and you have to learn them all over again, for no
benefit.

------
shirian
Disclaimer: I am, and have been for the last 5 years, a JavaScript developer.

My current off-work project is a web application built with Elixir. I've
written 0 lines of JS. It feels really good. This is my way of internalizing
some of the points made here[0].

[0]
[http://idlewords.com/talks/website_obesity.htm](http://idlewords.com/talks/website_obesity.htm)

------
dmitriz
Ironically, this article confirms that investing in no-JS functionality is
becoming less and less important, with those major sites already forcing the
user into JS. Why then would you invest your time and money to cater to the
"eccentric" 1%?

~~~
robin_reala
Because the large majority of those 1% don't choose to turn it off; they don't
have it available, for a multitude of reasons outside their control.

~~~
Ajedi32
Such as... ?

If you're referring to screen readers, I was under the impression that pretty
much all of them execute JS these days. Am I wrong?

~~~
robin_reala
You’re not wrong about screen readers, no. I wrote a blog post about this last
year[1] but I’ll copy and paste a list of possible reasons from it here:

- script hasn’t finished loading, or has stalled and failed to load completely

- application route is up, but the route to the Content Delivery Network is
broken

- user has chosen to install a plugin that interferes with the DOM

- user has been infected with a plugin that interferes with the DOM

- user’s company has a proxy that blocks some or all JavaScript

- user’s hotel is intercepting the connection and injecting broken JavaScript

- user’s telecoms provider is intercepting the connection and injecting broken
JavaScript

- user’s country is intercepting the connection and injecting broken
JavaScript

- JavaScript code has functions not implemented by your user’s browser (older
browsers, proxy browsers)

[1] [https://gdstechnology.blog.gov.uk/2016/09/19/why-we-use-progressive-enhancement-to-build-gov-uk/](https://gdstechnology.blog.gov.uk/2016/09/19/why-we-use-progressive-enhancement-to-build-gov-uk/)

------
rwmj
I found that Google actually works better without Javascript. It's much more
like the Google of old (c.1998).

One significant enhancement (w/o JS) is that you can type something in the
search bar without losing your current results while you're typing.

~~~
dougb5
To do this without turning off JS, go to Settings > Search Settings > Google
Instant Predictions and select "Never show Instant Results".

------
peteretep
NoScript plugin keeps JS off by default and you can turn it on as needed or
whitelist certain sites. Highly recommended

~~~
digi_owl
Dunno how well it will work after Mozilla drops support for XUL extensions...

~~~
tombrossman
NoScript definitely works without XUL support and is the fourth most popular
extension listed on the compatibility page here:
[https://www.arewee10syet.com/](https://www.arewee10syet.com/)

Tested and working, for a while now.

~~~
blowbaybestban
e10s is multi-process, I thought, which is a different animal from the new
plugin framework they are pushing. I'm pretty sure I read NoScript will have
problems, but that might have been with regard to Tree Style Tabs instead
(which is shown to work in the multi-process branch).

------
emodendroket
What benefit could Google possibly get out of making Google Maps work without
JavaScript? The main takeaway I have here is that older applications are more
likely to support a no-JS scenario, and I'd imagine that's because they can
just fall back to older legacy code.

------
sevensor
I negotiate the javascript issue by using two web browsers -- w3m for reading
hypertext and Firefox when I want javascript. I'm fine with my bank or an
online store expecting to use my browser as a thin client, but if you require
javascript to present your blog, I'm going to find a different blog to read.

This policy makes it pretty much impossible to use social media, because
social media is basically blogs-that-require-javascript. I see this as a
beneficial side-effect.

~~~
awiesenhofer
[http://mbasic.facebook.com](http://mbasic.facebook.com) if you ever do miss
social media this gets you a javascript-free, minimalist version of facebook.
I use it all the time on mobile and get hours more battery compared to their
app or standard mobile site.

~~~
gpderetta
I switched to mbasic the first time the FB application crashed on me on
android.

------
textmode
I have not used Javascript in years. I boot to textmode and use a text-only
browser.

There are exceptions when I use JS: when I have to use someone else's
computer, or manage certain accounts via the web. The latter not being by
choice.

In the 90's, there was an attempt to allow web developers to run their code on
the user's computer via the user's web browser. This was called Java applets.

There was no limit to the pie-in-the-sky promises that developers were making
back then. All based around the web browser and Java.

Not surprisingly, Java applets failed. I think there were some security
issues. Maybe. Not sure.

Javascript reminds me of Java applets.

I am not sure if there are security issues with Javascript. From discussion
surrounding this language, today we are asked to believe there is _nothing_
that cannot be accomplished with a web browser and some Javascript. Unlike
Java applets, this time, it is real. I think.

But, as far as I know, I too can do anything I want to do _without_ using
Javascript. And without using Java applets. If there are websites that _truly_
require Javascript (see below) in order to access data, I cannot find them.

1\. require

Here, "require" means that the data could not be served to the user via any
other, more direct mechanism than through a series of steps that includes
Javascript being run by a web browser. "Require" here does not mean the
website owner's choice to use Javascript in this way, or a message displayed
to users that suggests "Javascript is required". The meaning refers to
technical capability, not design choice.

------
shakna
The fact that so many sites fail to even make use of a noscript block (like
[0]), boggles my mind.

[0] [https://www.google.com/chrome/](https://www.google.com/chrome/)
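
For reference, the kind of fallback block meant here is a few lines of markup. A sketch (the wording and download URL are illustrative, not Google's actual page):

```html
<!-- Rendered only when scripting is unavailable or disabled -->
<noscript>
  <p>This page needs JavaScript for the interactive installer.</p>
  <p><a href="/chrome/download">Download the installer directly</a>
     (hypothetical direct link).</p>
</noscript>
```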

~~~
TeMPOraL
Re [0], note that Google Maps on the other hand seems to handle lack of JS
very well. I guess it can be blamed on different teams in the same big
company.

~~~
tombrossman
If by "seems to handle lack of JS very well" you mean "does not work at all",
that makes sense. Not being sarcastic either, it seems they decided to take an
all or nothing approach and force the user to enable JS to use Google Maps, or
instead see this page:
[https://i.imgur.com/Qc156Ds.png](https://i.imgur.com/Qc156Ds.png)

Maybe the maps product wholly depends on JS for all functionality?

~~~
TeMPOraL
Yeah, my point is that they put actual effort to show you a nice-looking page
when viewing Maps without JS enabled. Compare to the Chrome download page,
which blanks out. This is orthogonal to whether or not a service can/should
work without JS enabled.

------
madsbuch
Simplicity matters, correct:

What is simpler: a full website in Elm, or a website developed in a plethora
of HTML and CSS, each with at least a handful of libraries?

Well, Elm needs javascript..

All in all: there is no need to be religious about the tech stack.

~~~
cousin_it
HTML isn't just simple for the programmer; it's also simple in other ways,
which give it unrivalled accessibility (to disabled folks, different browsers
and devices, web crawlers). Not sure any SPA framework can compete.

~~~
SquareWheel
Web crawlers and screen readers have supported Javascript for years. This is
an outdated argument.

SPAs can also support traditional URLs for bookmarking, indexing, etc.

~~~
cousin_it
Have you tried relying on SPA crawling in practice? As far as I know it's
still messed up and hasn't changed in half a decade, but maybe I'm missing
something.

------
CoryG89
For me, the main take-away is:

> …it’s a sad indictment of things that they can be so slow on the multi-core
> hyperpowerful Mac that I use every day, but immediately become fast when
> JavaScript is disabled.

> It’s even sadder when using a typical site and you realise how much
> Javascript it downloads. I now know why my 1GB mobile data allowance keeps
> burning out at least…

As a developer, I understand not wanting to devote a large amount of time,
catering to some insignificant portion of your audience who disables
JavaScript.

And I love JavaScript, but I cringe at the thought that we are needlessly
slowing things down with MBs of JS that change too often to be meaningfully
cached.

If you are serving MBs of JS after gzip, please make sure that I am not going
to have to download all of it every time I pull up your site on my phone every
week or so.

------
oneeyedpigeon
I just did the same experiment and, coincidentally, also checked feedly, only
to receive a totally blank page. However, a little digging revealed an intent
on feedly's part to handle this case: they have a 'feedlyBlocked' element,
with content, that is simply styled with "opacity: 0". Obviously, without
javascript enabled, they can't then dynamically display that content. Another
approach is required.

As an aside, it's very difficult to contact feedly directly, but they do have
a uservoice account
([https://feedly.uservoice.com](https://feedly.uservoice.com)).

------
gorpomon
I surfed without JS on for a long time. I found that it made the web very very
pleasant!

What totally breaks is when you do a complicated bank transfer or work on some
older site. They often transfer you across multiple domains, so whitelisting
doesn't work, as you can't anticipate what site to whitelist (when you check
your points, bank.com redirects you to bankrewards.com, things like that).

What's really needed is a chrome extension to let you turn JS off and on, and
if you turn it on, it can be set to auto-turn off after some time. What would
be even better is if Chrome offered this behavior in an easily accessible way.

~~~
marssaxman
My credit union's web site hilariously misinterprets my browser's portrait-
mode aspect ratio and lack of Javascript as evidence that I must be using a
phone, and serves me up a "mobile" site which is lighter, simpler, faster, and
generally more pleasant to use than the normal one. Yes, there are a couple of
extra pages to click through when transferring money from one account to
another, but the pages load so quickly that it's still actually faster than
the Javascript-based menus in the desktop site.

Every time I set up a new browser, NoScript is the second plugin I install,
after uBlock Origin. Between the two, the web is _far less annoying_ than it
used to be.

------
wikiwatchme
We regularly advise all our clients to disable javascript.

[http://www.bbc.com/news/technology-35821276](http://www.bbc.com/news/technology-35821276)

------
mstade
The verdict comments on this page are golden. I especially like the NYTimes
verdict:

> _Verdict:_ Failing… to not work. Sad!

Delicious.

------
801699
There is a famed quote that goes something like this:

"An engineer is someone who can do with ten schillings what any fool could do
with a pound."

By this definition one could argue Javascript developers are not engineers.

If the user can get the desired data from a website without having to run 4.5M
of js in a large browser, but the developer "needs" 4.5M of js and a team of
people to deliver the data, then who is the "engineer"?

~~~
kabes
Except the schillings usually are days. And it might take JS to build the
requirements in 10 days.

'Support the 0.1% of people without javascript' usually doesn't make it beyond
the bottom of the backlog.

~~~
801699
This reasoning, i.e. "supporting" people who are not using Javascript, makes
little sense.

It is the Javascript and website complexity, the embellishment of data with
needless garbage, that necessitates "support" i.e. work for developers. It
creates more work.

Serving text without embellishment requires less work, not more.

At some time or place in every website development project, data exists in
plain text or some other raw form.

Some users might just want that data as it is, without embellishment, before
web developers even start working.

This requires little if any "web development" work. Basic HTML can be
autogenerated with ease.

    
    
       <a href="http://example.com/data">Data</a><br>
       <pre>
        Description:  blah, blah, blah
       </pre>
    

Or the user can just use a link to json file and generate the text/html
themselves.

    
    
     # usage: $0 section
     # sections: world, etc.
    curl -4o ".$0" "https://static01.nyt.com/services/json/sectionfronts/$1/index.jsonp"
    exec sed '/\"guid\" :/!d;s/\",//;s/.*\"//' ".$0"
    

For those who "want" and "demand" it (the 99.9% as you would have us believe),
web developers can also create a whiz bang version of the site that
encapsulates this data in a cutting edge "web app".

Meanwhile we are having this discussion on a web site that does more or less
exactly what I am suggesting. It can be easily autogenerated. The HTML could
have been written in the 1990's. It is trivial to remove the tags and have
plain text.

I guess we are the 0.1% that would ever access a website that did not need
Javascript?

The fact that the popular browsers run all manner of javascripts and process
all sorts of tags _does not obligate anyone to use them_. Whether it is web
developers or users.

~~~
kabes
That argument only makes sense for informational websites with heavy focus on
textual data. But the article looks at web apps like google maps. Good luck
creating a static version of that.

\+ I'm not saying 99.99% WANTS javascript, I'm saying 99.99% HAS javascript.
And the economics of supporting the other 0.01% often don't add up.

So people can arrogantly claim here: 'If your site requires javascript, I
don't want to use it'. But it's much more the website owner saying: 'If your
browser doesn't support javascript, I don't want you as a user', since it
probably costs more to have you as a customer than they will make out of it.

~~~
801699
I download static maps. I prefer maps as images or images in a PDF. But I
understand your comment. Indeed it is the textual data that is at issue with
respect to gratuitous javascript use.

A large amount of data, maybe even the majority, is already textual. For this
data, from a user's perspective there is no valid argument that "supporting"
users who want to read text requires additional work. Serving textual data
certainly does not require javascript.

The "arrogance", if any, is displayed by a website owner who for some
unexplained reason does not want to let _any_ users (e.g. 0.1%) read text
without running javascript. As if any user who does not care whether the
website makes use of the latest popular browser features is a user they do
not want. But maybe that is not really the reason.

It is the last sentence in your comment that is the interesting one. Perhaps
the use of javascript is designed to take something from the customer e.g.
personal data via some discreet mechanism that requires running code on the
user's computer. If this is true, then one might argue it becomes clear why
website owners do not want users who do not use javascript. Because the
website cannot take something from the user where the mechanism of extraction
is powered by javascript.

If that is the case, it may inform the user about the website, e.g. any
website that tries to force users to use javascript may be one that is trying
to extract something from the user. And as such may be a website that the
user should avoid.

As a sidenote, websites routinely serve content as text without the use of
javascript because that is how they are most efficiently indexed by Google.
Thus the website must "support" Googlebot. Some users may be quite satisfied
with the "Googlebot version" of the website.

------
onli
Feedly being completely blank might be bad, but I had that debate with myself
when building my feedreader: do you actually want to use a feedreader without
js? Things like marking an article as read would be really cumbersome, and
automatically marking what was just read on screen would not be possible.

------
twhb
I think it's pretty critical to note that, with only one exception, every
website that works well without JS is what HTML was built for - static
documents - and every website that doesn't, what HTML is famously bad at -
apps.

------
b0rsuk
More people should be aware of static blog generators.

It's a kind of software used to make websites where you write your posts in a
markup language like reStructuredText or Markdown, and once done a script
transforms it into HTML and CSS. Various plugins exist, including for
comments, although this tends to sacrifice your "staticness" in varying
degrees, especially if you just use Disqus.

------
w8rbt
I wrote a chrome extension to disable js. It's pretty simple and it works
well:

    
    
        // Requires the "contentSettings" permission in manifest.json.
        var contentSettings = chrome.contentSettings;
    
        // Clear any existing per-site rules, then block JS everywhere.
        contentSettings.javascript.clear({}, function() {
    
            contentSettings.javascript.set({
                    primaryPattern: '*://*/*',
                    setting: 'block'
            });
    
        });

------
scottmf
It's 2017. All modern browsers support JS and it's no longer reasonable to
expect all websites to work very well without it.

~~~
onion2k
_All modern browsers support JS_

While that's true it's not reasonable to assume the user has JS enabled. Ad-
blocking plugins that also block JS are increasingly common.

 _and it 's no longer reasonable to expect all websites to work very well
without it._

The real question is: what does "work very well" actually mean?

For a closed application that requires the user to log in, I don't really
care. Users won't use the app if it doesn't work for them, so it's entirely
up to the developers to build something the users want. Talking about web
applications is silly. For a public-facing website though, I would expect it
to at least _work_ without JS, even if it doesn't necessarily present as good
an experience as it would with JS (which is nothing new, it's just
progressive enhancement). The problem is websites that present an essentially
meaningless page (blank, no content, populated but broken, etc.) to users who
have JS disabled. That's where work needs to be done.

~~~
HappyTypist
> That's where work needs to be done.

Why?

Instead of building for 0.5% of users that _can just enable JavaScript_ , why
shouldn't I invest the time into building a new feature, writing more unit
tests, or writing an app for Windows Phone?

~~~
onion2k
For the same reason we build features in websites that work for blind people
(aria tags everywhere), disabled people (accessible buttons), people who speak
different languages (i18n), VIM users (GMail keyboard shortcuts), etc. We
choose who we build for. If you want to ignore 0.5% of your users then that's
your choice, but don't complain when you submit something to HN and get a page
of responses saying it doesn't work.

~~~
emodendroket
Humoring people who want to turn off JS is not really on the same level as
making the page work for people with disabilities.

------
blunte
This is enlightening. Now I'm going to test each of my commonly visited sites,
and the ones that work well enough without JS will get js-blocked -
[https://support.google.com/chrome/answer/3123708?hl=en](https://support.google.com/chrome/answer/3123708?hl=en)

------
seangrogg
I guess I should expect that an article about not using JavaScript is written
on a page with standard Google Analytics tracking and a JS bundle of...
partially-minified code that is primarily used to apply _styling_.

------
gtramont
> "(…) but I just can’t get down to less than 30 in any one window"

Just like me! I also suffer from the same disease, but on a much more serious
level… lol! Everything just crashed yesterday and the count was on 270+… :-o

------
vortico
I can see the desire to disable JavaScript, but I think it's more useful to
disable CSS on most sites. I'd be interested to see this article except
without CSS instead of JavaScript.

------
sdomino
I lol'ed at the part about tabs/One Tab, because I have that exact same
problem...

I have something like 300 tabs saved in One Tab, which is a huge improvement
on what it used to be (> 1000).

------
dzonga
Noob question: I've always wanted to know, is it possible to build a
functional web app without JS, and what would the approach be?

------
achikin
It's like taking gasoline off the car and using horses to drive. Well,
sometimes it works.

------
12345678qwerty
People should realise that most sites today aren't websites, they're web
applications. HTML is just the template/output language. Without JS you would
have to download an even crappier, probably Windows-only desktop app, since
it wouldn't be possible to implement the logic in the browser.

~~~
TeMPOraL
No, most websites are still websites, and one of the biggest problem on the
web is the plague of people who think _their_ website should be an application
instead.

 _Actual_ web applications, like e.g. office suites, map applications, etc.,
are obviously exempt from the "should work fine without JS" rule.

~~~
pharrlax
Are you really trying to say there's no argument to be made that SPA +
prerendering is a design pattern with advantages over flat html, even for
static content?

Obviously there are disadvantages as well, but it's a tradeoff, and if
executed correctly it's one that, as fewer people turn off JS and browsers
become better at running web apps, is increasingly becoming worth the
complexity for certain use cases.

~~~
TeMPOraL
In theory, maybe there is an argument. In practice, most of the time it'll be
a pretty bad one.

For me it's about how user-hostile those decisions are. I can understand SPAs
when e.g. the service would be infeasible if all the processing was done
server-side. But more often than not, it's just _laziness_. The devs can
deploy a sleek-looking SPA in 5 minutes with their JS toolchain, forever
dooming users to download tons of pointless JS that adds _nothing_ valuable to
their experience. But who cares, today's zeitgeist is "privatize the profits,
socialize the costs".

Think of all those sites with articles and blog posts that blank out if you
have JS disabled. There's _no_ reasonable argument to be made that they should
be SPA. That's just a case of user-hostile laziness.

~~~
drdaeman
> The devs can deploy a sleek-looking SPA in 5 minutes with their JS toolchain

Oh, if only that were true...

Our experiments with React took quite a lot of time (like, weeks) before we
got things in shape. And then some more time before that shape wasn't a pear.

And if the backend is not in JS, server-side rendering is quite a mess.

------
ultim8k
What if you have serverless SPAs?

------
taypo
I too could not get my wife to watch Star Trek Next Generation with me.

------
ojr
The most pleasing UI for users is a website loaded with JavaScript.

------
pwaivers
Why don't you try browsing the web without CSS enabled? Or just go straight to
blocking all HTML?

JavaScript is a core part of website development, regardless if it is not the
only way to do it. It is unfair to judge websites on whether they can run
without JS.

------
progx
A day without my car:
[https://www.google.de/search?q=horse&source=lnms&tbm=isch&sa...](https://www.google.de/search?q=horse&source=lnms&tbm=isch&sa=X&biw=1758&bih=1152)
;)

~~~
scaryclam
A day treating JS like a car:
[https://www.google.de/search?q=horse&source=lnms&tbm=isch&sa...](https://www.google.de/search?q=horse&source=lnms&tbm=isch&sa=X&biw=1758&bih=1152#tbm=isch&q=toy+car)
;P

------
linkmotif
What is the motivation behind this mentality? Sure there are still webpages on
the web, but there are also web apps. Why or even how would you expect a web
app to work without JS? Why would you expect developers to spend time writing
a non JS version of their app? Before taking on this endeavor, I would
prioritize native apps, optimizations of all sorts, adding features, etc.
Unless your app is gmail, why would you ever spend time on creating an
HTML/CGI based web app? I keep seeing these posts and I just don't get why
anyone cares about this.

~~~
niutech
The content is the main thing users care about and it should be available as
soon as possible even on a low-bandwidth connection, or a low-end device, or
using a screen reader. This is what Progressive Enhancement
[https://en.wikipedia.org/wiki/Progressive_enhancement](https://en.wikipedia.org/wiki/Progressive_enhancement)
is about.
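
A minimal sketch of the pattern (illustrative markup, not from any real site — `/search` is an assumed server-rendered endpoint): the HTML works on its own, and the script, if it runs at all, only upgrades it.

```html
<!-- Baseline: a plain form that works with zero JavaScript. -->
<form action="/search" method="get">
  <input type="search" name="q">
  <button type="submit">Search</button>
</form>

<script>
  // Enhancement: if JS runs, intercept the submit and fetch results
  // in-page instead of doing a full navigation. Without JS, the form
  // above still submits normally and the server renders the results.
  document.querySelector('form').addEventListener('submit', function (e) {
    e.preventDefault();
    fetch('/search?q=' + encodeURIComponent(this.elements.q.value))
      .then(function (r) { return r.text(); })
      .then(function (html) { document.body.innerHTML = html; });
  });
</script>
```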

~~~
linkmotif
Progressive enhancement is a rubbish paradigm because it's incredibly
difficult to implement. React isomorphic rendering has made it slightly
easier, but it's still one of those catchphrases people throw around lightly
until they try it.

------
stupidcar
Looking forward to the author's follow up: "A day without a CPU". I am sick of
lazy, profligate coders assuming that my computer is a von Neumann machine.
And that just because I run their software, I am happy with it spending
billions of _my_ CPU cycles. How did we, as an industry, get to this point!?

~~~
nilved
What a dumb comment. "A day without JavaScript" is to websites as "A day
without a CPU" is to paper books. Documents are not software.

