
Progressive Enhancement Is Dead - avolcano
http://tomdale.net/2013/09/progressive-enhancement-is-dead/
======
glesica
"And most importantly: Don’t be ashamed to build 100% JavaScript applications.
You may get some incensed priests vituperating you in their blogs. But there
will be an army of users (like me) who will fall in love with using your app."

This statement needs a huge, HUGE caveat that you should only be building 100%
JavaScript apps in situations where doing so _makes sense_. For example, I
find the new Blogger "web app" infuriating. I shouldn't have to stare at a
loading screen to read a half-dozen paragraphs of text; that's just stupid.
Just serve the HTML. No one is going to "fall in love" with your app if your
app shouldn't exist in the first place because the old-school solution
provided a superior experience.

~~~
jenius
I think that issue, while valid, is entirely different from what OP is talking
about. That is the question of client to server side rendering tradeoff, and
it looks something like this in my view:

- If you have a website that people will come to then spend a bit of time in
different views (like gmail, facebook, or an analytics dashboard), it will
usually pay off to dock the initial load time a bit in exchange for much
faster loads of subsequent views. This is the base tradeoff made by switching
to client-side rendering.

- If you have a website where most people will come check out one page then
bounce (like a news site, a blog, a page with directions or info, anything
where an article will be linked to rather than something that's used like an
app), you won't want to make that same trade, since the extra blow to initial
load time is going to hit most people _and_ they won't see the benefit of
faster subsequent views, which only adds up as you hit more views on the page,
because they will read the info and then leave.

Any page can be built as a single page app, and many people will do this
automatically since it's the new hotness. But the question we should be asking
ourselves is whether what you are building actually will be used as an app or
not. I'll close out with an interesting example:

If you are building a news site, you should not build it as a single page app
because it's likely that you'll have a lot of single page views and a high
bounce rate. Those single pages should be rendered as quickly as humanly
possible. However, if you are building something like a feed reader, the
situation would be the opposite. People will likely spend a bit of time in
your interface reading a variety of articles, so the additional load time at
the beginning will quickly pay itself off in faster renders for each item they
read.

EDIT: An interesting approach for a news site would be to render single
article views straight from the server or have them compiled, but render their
homepage as a single page app. Tailor different parts of your website to the
way that they are viewed. Has anyone written about this? Maybe I should write
about it...
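
Roughly, that split might look like this (a sketch assuming a Node/Express
server; db.findArticle and the template names are placeholders):

    // Sketch: server-rendered article pages, single-page-app homepage.
    var express = require('express');
    var app = express();
    var db = require('./db'); // hypothetical data layer

    // Articles get full HTML from the server: single page views and
    // bounces see content as quickly as possible.
    app.get('/articles/:id', function (req, res) {
        db.findArticle(req.params.id, function (err, article) {
            if (err) return res.send(500);
            res.render('article', { article: article });
        });
    });

    // The homepage is browsed like an app, so serve a thin shell that
    // boots the client-side framework and fetches JSON from there on.
    app.get('/', function (req, res) {
        res.sendfile(__dirname + '/public/app-shell.html');
    });

    app.listen(3000);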

~~~
cia_plant
Template rendering is rarely a bottleneck. When JS-only pages seem slow, it is
often because they are loading large JS libraries or waiting until after page
load to begin fetching AJAX data. By prerendering the initial JSON data for
the page in script tags, and keeping javascript to the minimum necessary to
render the page, pure javascript can render very quickly.
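
A sketch of that bootstrapping technique, assuming the server has emitted the
page's initial JSON into a script tag (the element id and the renderArticle
function are made up):

    // The server emits something like:
    //   <script id="bootstrap-data" type="application/json">
    //     {"article": {"title": "...", "body": "..."}}
    //   </script>
    // so the client can render immediately, with no post-load AJAX round trip:
    var raw = document.getElementById('bootstrap-data').textContent;
    var data = JSON.parse(raw);
    renderArticle(data.article); // hypothetical minimal view function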

------
integraton
Sometimes I wonder whether advocates for heavy client-side JavaScript ever
bother to test on mobile devices, because it very much seems that in the
majority of cases they don't. The vast majority of JavaScript-based "show HN"
submissions I've seen don't work on mobile browsers well or at all, to the
point where I've been trained not to click them. This submission from a few
days ago is an example, and last I checked it caused browser crashes and, even
if you managed to load it, was unusable on an iPad:
[https://news.ycombinator.com/item?id=6270451](https://news.ycombinator.com/item?id=6270451)

Edit: Here is another example from the past few days that's unusable on
mobile:
[https://news.ycombinator.com/item?id=6302825](https://news.ycombinator.com/item?id=6302825)

This is a common problem with media sites. I dread trying to load quartz, usa
today, gawker, etc since every thick-client media site is intermittently a bad
citizen on mobile browsers (for example, gawker in Chrome on iOS currently
perpetually reloads the page). Even when these kinds of sites work, there's
often a multi-second delay where the user has to sit and watch elements
fly around as the page is built.

Edit again: just to be clear, if you are a non-technical product manager or
CEO type, my comment should not be interpreted in any way as an implication
that the web or JavaScript is somehow inherently bad and therefore your
developers must build a "native app." My comment's intended audience is
developers with a deep, working understanding of these technologies.

~~~
leif
I think you just invented an internet forum with a proof-of-technical-
proficiency requirement for each post. Well done, sir.

------
Pinckney
I think the point that's overlooked here is that the offenders aren't the
clever apps that would be impossible to write without javascript. Go ahead and
write those. I'm happy for you, really.

The problem is pages that require javascript to display static content. There
are very few good reasons for an article, or an image gallery, or a homepage
that could have been displayed just fine a decade ago to now need javascript
so it can do some stupid flashy thing that breaks the expected interface
behaviour. And frankly, that's most of what I'm seeing in "Sigh, Javascript."

~~~
leokun
Check out meteor.js for why JavaScript would be needed for "static only
content". Real time updates. That's why. Everything is an app.

~~~
mynameisme
Is this something people even want in content-driven sites, though? Imagine
reddit in real time: it would be a mess, and impossible to keep track of what
you last read.

~~~
kansface
Why should users have to click refresh for pages to load more comments or to
see when someone replies to you? The fact that reddit does not do this is not
evidence that it would be confusing.

~~~
icebraining
Because sites that distract me from what I'm reading to tell me that there are
new comments are extremely annoying.

Sure, you could add, say, auto-loading for new comments when you reach the
bottom of the page, or setting the message icon to orange when you get a
reply, but that hardly requires a full-blown web app; it's a simple add-on to
a static site.
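
For instance (a sketch; the /comments endpoint and the lastCommentId helper
are invented):

    // Enhancement only: the static page still works if this never runs.
    var loading = false;
    window.addEventListener('scroll', function () {
        var nearBottom = window.innerHeight + window.pageYOffset >=
                         document.body.offsetHeight - 200;
        if (!nearBottom || loading) return;
        loading = true;
        var xhr = new XMLHttpRequest();
        // Hypothetical endpoint returning an HTML fragment of newer comments.
        xhr.open('GET', '/comments?after=' + lastCommentId());
        xhr.onload = function () {
            document.getElementById('comments')
                    .insertAdjacentHTML('beforeend', xhr.responseText);
            loading = false;
        };
        xhr.send();
    });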

~~~
camus
Disqus deals with this issue quite well: it will just notify you that there
are new comments, and it's up to you to load them or not.

------
padolsey
> Don’t be ashamed to build 100% JavaScript applications. You may get some
> incensed priests vituperating you in their blogs. But there will be an army
> of users (like me) who will fall in love with using your app.

We all want wonderful experiences as users. The crux is almost a question of
"how we want things to be" and "how we want to get there".

For me, the 100% JS MV* movement is wonderful for a specific genre of app: An
app that is:

* Behind an intranet

* Behind a paywall

* Behind a login-wall

* Prototypes / Demos / PoCs / etc.

But for the open web -- wikipedia, blogs, discussion forums, journalism (etc.)
this movement detracts from the web as a whole, in that it excuses developers
from having to worry about degraded and/or non-human consumption of their
websites' data.

We have to ask ourselves what we, as humanity, want from the web. Do we really
want a web of 100% bespoke JavaScript MV* web-apps with no publicly
consumable APIs nor semantic representations? If that is the intent and desire
of the developers, designers and otherwise educated-concerned web-goers, then
fine, let's do that and hope it works out okay...

But there is an alternative that has its roots already planted deep in the web
-- the idea and virtue of a web where you can:

* Request an HTTP resource and get back a meaningful and semantically enriched representation

* Access and mash-up each-others' data, so as to better further understanding & enlightenment

* Equally access data and insight via any medium, the latest Chrome or the oldest Nokia

So, please, go ahead and create a 100% JS front-end but, if you are creating
something for the open web, consider exposing alternative representations for
degraded/non-human consumption. It doesn't have to be progressively enhanced.

Imagine for a moment if Wikipedia was one massive Ember App... And no,
Wikipedia is not an exception from the norm -- it is the embodiment of the
open web.

~~~
tomdale
Every Ember.js app, by definition, connects to an API that does all of the
things you are asking for.

Seriously, open up any Ember.js app and look at the network traffic. You'll
see a series of requests, usually to very RESTful URLs, that fetch the
document.

The only difference is that, instead of HTML, where you are conflating the
markup and the content, you get a nice, easily-consumable version of the
document in JSON form.

There is literally no change to the web here, other than the UIs are faster
and better, and it's easier for me to consume JSON documents than trying to
scrape HTML.

~~~
padolsey
That could work -- if the JSON is intended for public consumption, and if it
is documented as such. The problem, I'd argue, with JSON is that it does not
intentionally facilitate semantic annotations, unlike HTML(5). I'd argue that
a properly marked-up HTML5 representation of a piece of data is more useful
than a bespoke JSON structure with crude naming liable to change without
notice. The benefit I get with an HTML representation is that it's the exact
thing that was intended for the user to read/consume, whereas JSON is awkward
to divine meaning from without the crucial app-specific view-logic that turns
it into DOM.

How would you reconcile the need for an open-semantic-web with arbitrary JSON
structures with no governing semantic standard?

EDIT: An example of a potential problem: Please take a look at how the Bustle
app you referenced brings its article content to the front-end:

E.g. [http://www.bustle.com/articles/4470-why-we-should-root-
for-l...](http://www.bustle.com/articles/4470-why-we-should-root-for-lamar-
odom)

View source. It's not a public REST API (not visibly so); it's awkwardly
embedded as literal JS in the HTML document itself... That'd be hell to
publicly consume through any kind of automation.

~~~
tomdale
You are building up a strawman against JSON without acknowledging that every
problem you outline applies just as much, if not more, to HTML.

Is the HTML of any popular website publicly documented? Is there any guarantee
that an XPath to a particular value won't change? Is there any guarantee the
data I need is marked up with semantically accurate class names? No.

HTML is intended for public consumption—by a human, at a particular time. It
is not a data interchange format.

Contrast that with things like Twitter or GitHub, which provide a versioned
JSON API that is guaranteed not to change. Your web site becomes just another
consumer of that API.

JSON contains all of the data you need, but in a way designed to be consumed
by computers, and you don't have to do all of that awful HTML scraping.

And as for Bustle not having a public JSON API, well, here you go:

curl -H "Accept: application/json"
[http://www.bustle.com/api/v1/sections/home.json](http://www.bustle.com/api/v1/sections/home.json)

A versioned JSON API that is guaranteed not to change. Can any public site on
the internet guarantee that about its HTML?

~~~
padolsey
A versioned JSON API is awesome, I am not denying that. I also don't deny that
the current state of the HTML markup on most sites is semantically rubbish.

Regardless of this entire PE debate, we would still have a problem, on the
web, of data being out of reach due to walled apps that only serve rubbish
HTML.

The problem of open + semantic data is very relevant to this discussion but
we're pretending that one "side" has all the answers. I want a better web --
more open -- more semantic -- and maybe some shimmer of a truly semantic
web[1] will emerge in the next 20 years.

So, yes, a 100% JS App is 100% awesome if, IMHO, it has:

* A publicly documented and consumable REST API

* Semantically enriched data through that API

* Some kind of degraded state NOT just for search-engines but for older devices and restrictive access (e.g. behind national/corporate firewalls)

I am not interested in being one side or another regarding this PE feud, and I
am sure you're not either. I am trying to question what is best for the web
and humanity as a whole. I don't think we have a silver-bullet answer. I do
think it's necessary to dichotomize walled web-apps and open websites, and the
latter deserve additional thought regarding usability, accessibility and
semantics.

[1]
[http://en.wikipedia.org/wiki/Semantic_Web](http://en.wikipedia.org/wiki/Semantic_Web)

~~~
phpnode
and in fact that particular bustle link you posted is a perfect example of
where using HTML5 + microdata would not only be faster to render and crawlable
but would also allow the underlying data structure to be consumed by
javascript.
There's no reason why

    Bustle.pageData.article.title

couldn't have been extracted from

    <article itemscope itemtype="/article">
        <h1 itemprop="title">Why We Should Root for Lamar Odom</h1>
        ...
    </article>
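
And getting the value back out in javascript is then trivial (a sketch; the
selectors match the markup above):

    // Recover the title from the microdata markup itself,
    // with no separate Bustle.pageData blob to keep in sync.
    var articleEl = document.querySelector('article[itemscope]');
    var title = articleEl.querySelector('[itemprop="title"]').textContent;
    // title === "Why We Should Root for Lamar Odom"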

~~~
iopq
It's a real bitch to implement when your use case is complicated and nested.
From experience, I have no idea how to mark up a document "correctly" because
of the recursive definitions of some microdata types: a flingy can contain a
thingy, and a thingy can contain a flingy.

Should I mark my concrete item as a flingy where some elements are thingies,
or should it be a thingy with some subelements as flingies?

I just did my best and called it a day. Then I spent a lot of time debugging
it in the microdata analyzer tool.

~~~
phpnode
you can mark sub-scopes with their own itemscope property, so you could have
this:

    <article itemscope itemtype="/article">
        <h1 itemprop="title">My Title</h1>

        <div class="related-thing" itemscope itemtype="/author" itemprop="author">
            <h1>
                <span itemprop="firstName">John</span>
                <span itemprop="lastName">Smith</span>
            </h1>
        </div>

        <aside class="incidental-unrelated-thing" itemscope itemtype="/sausage">
            <span itemprop="type">Cumberland</span>
            <span itemprop="ingredients">Mystery Meat</span>
        </aside>
    </article>

in the example above, we define an article that has a related author property;
that author property is its own scope, so it has firstName and lastName
properties of its own. We also define an _unrelated_ itemscope (unrelated
because it has no itemprop) that happens to be nested in the same element, so
this would parse to:

    [
        {
            "type": "/article",
            "properties": {
                "title": ["My Title"],
                "author": [{
                    "type": "/author",
                    "properties": {
                        "firstName": ["John"],
                        "lastName": ["Smith"]
                    }
                }]
            }
        },
        {
            "type": "/sausage",
            "properties": {
                "type": ["Cumberland"],
                "ingredients": ["Mystery Meat"]
            }
        }
    ]

------
kleiba
There are dinosaurs like me who use the web _mostly_ for reading stuff on
websites. I also happen to use an old, quite slow computer as my default
machine.

It aggravates me when a site that I try to open because of its textual content
takes 30 seconds to render since there's too much Javascript going on. Then
I'm typically sitting there thinking: "how hard can it be to display a piece
of text?" Because of this, when I see my CPU spike as I try to open a website
in a new tab, I very often decide to simply close the tab again and do without
the information I originally came there to see. This happens a lot for me with
online magazines, such as wired or techcrunch. One trick is to invoke the
"Readability" bookmarklet if I can get to it fast enough, i.e., before the
JavaScript has frozen my browser completely.

Of course I understand that I am part of a tiny minority. And probably I'm not
part of your target group anyway. And the web is so much more in 2013 than
pages with text on them.

If you _do_ , however, want someone like me to come to your site, you better
remember to keep it dinosaur-friendly.

~~~
trafficlight
In all honesty, why should I care about you, the tiny minority? Why should I
waste any time at all worrying about you?

~~~
mwcampbell
There are two different kinds of minority to consider:

1. People who are part of a minority by choice. For instance, they choose to
use an old computer, old OS, old web browser, etc. even though it is within
their means to upgrade. Or they have current software but intentionally
restrict it with add-ons like NoScript. You might be justified in not catering
to these minorities, because it's just not profitable and you have no moral
obligation to do so. Maybe the GP falls in that category; I don't know.

2. People who are in a minority through no choice of their own, and have no
power to change their circumstance. One example would be a poor student or
job-seeker who's stuck with an old computer and can't do anything about it.
People with disabilities also fall in this latter category. I know someone who
once lost a job because he is blind and some inaccessible software barred him
from doing that job. He described how that felt in this blog post:

[http://blindaccessjournal.com/2006/02/torn-from-the-
collecti...](http://blindaccessjournal.com/2006/02/torn-from-the-collective/)

All that to say that your comment is quite insensitive. Those are real people
in that small minority, and depending on why they're in that minority, we
developers might have an obligation to accommodate them.

EDIT: Yes, I know that JavaScript apps can be accessible.

~~~
trafficlight
I should've phrased it better, but I was asking him as a person of minority by
choice.

I fully understand making a site accessible for those who need it, but I don't
understand the NoScript people.

~~~
psionski
What is there to understand about NoScript people? They just dislike malware,
having their accounts stolen and having their personal data up for sale.

Also, if you don't have an unlimited Internet plan, not having to download 5MB
of trackers, "analytics" scripts and Flash ads can be a pretty significant
advantage.

------
thezilch
_What I’ve found, counter-intuitively, is that apps that embrace JavaScript
actually end up having less JavaScript. Yeah, I know, it’s some Zen koan shit.
But the numbers speak for themselves._

The author does very little to support his claims. The Boston Globe page also
has a lot of scripts to support advertising, and the advertising itself, as
well as entirely different engineering teams and probably different cultures.
There's not even any research into The Boston Globe's use of progressive JS;
there's no reason the two homepages could not have the same JS footprint, with
The Boston Globe continuing to work and Bustle continuing to not work, while
JS is disabled.

I'm all for not supporting progressive JS; Bustle is certainly within its
rights to not work without JS; the author is just caught in a confirmation-bias
bubble. His conclusions don't make sense; our intuitions are right; it doesn't
take [much] more JS to progressively enhance a site.

------
crazygringo
> _Worrying about browsers without JavaScript is like worrying about whether
> you’re backwards compatible with HTML 3.2 or CSS2. At some point, you have
> to accept that some things are just part of the platform._

This is the key bit.

It's a pretty popular attitude on HN to dismiss supporting IE, or IE7, or even
IE8 or IE9 -- despite having significant user bases. But there's still a
strong vocal contingent which argues for webpages to still work fine
without JavaScript, despite it being a minuscule user base. They both seem to
come from philosophical standpoints, rather than anything practical. (Granted,
SEO is a valid consideration, but that's fundamentally a different
conversation.)

~~~
hollerith
It's not just the people with JavaScript turned off: it's people who, for
example, rely on the Readability or Readable bookmarklets or Safari's Reader
functionality.

In general turning documents into programs deprives users of those documents
of a kind of flexibility that they enjoyed when documents were just data.

And does it not bother you that web-browser development has gotten so
complicated and labor-intensive that there are exactly five organizations with
the resources to maintain a web browser?

What hope do operating systems with very small userbases like Plan 9 have of
_ever_ running a web browser capable of correctly displaying the majority of
web sites?

Browser complexity closes off certain opportunities: e.g., about 15 years ago,
a blind programmer named Karl Dahlke wrote a "command-line web browser" called
"edbrowse" that has a command language similar to the line editor ed. Is it OK
with you that the fraction of web pages browsable with edbrowse keeps going
down?

Another way that making the web a richer application-delivery platform reduces
the options available to users: approximately nobody bothers to maintain a
local copy of the web pages they have browsed (which would be among other
things a useful insurance against web pages disappearing from the web) because
it is so complicated to do.

And then there is the loss of consistency useful to readers. For example, when
you click on a link, then hit the Back button, the browser used to always put
you back at the same place in the web page that you were when you clicked the
link. Not anymore: for example, if you click a search result on hnsearch.com,
then hit Back, you are taken to the start of the web page containing the
search results with the result that you have to scroll through results you've
already sifted through just to get back to the state of progress you were in
when you clicked the link.

A possible reply to that is that the maintainer of hnsearch.com should fix his
web site. But the number and variety of "losses" or "regressions" like that
one is so large -- and increasing so fast -- as to make me doubt that
webmasters will ever get around to fixing most of them, particularly since
webmasters on average are less technically knowledgeable than, e.g.,
programmers are.

Selecting an extent of text in preparation for copying it is another thing
that has become less consistent and controllable over time: sometimes when I
just want to select a few words, slight movement of the cursor while dragging
will cause an entire adjacent column of text to be selected or de-selected
according to rules that are essentially unknowable to the reader.

In the past, for about 15 years, the space key consistently meant "scroll down
a screenful" (provided the page as a whole has the focus -- as opposed to,
e.g., a TEXTAREA in the page). The desire to turn the web into an
applications-delivery platform caused the web site to gain the discretion to
map the space key to something else, which is a gain for authors of web apps,
but a loss for readers who used to be able to depend on consistent behavior
from the space key.

In summary, although I am happy that many thousands of applications developers
are now able to make good livings without becoming a "tenant" or a "captive"
of a platform owned by a single corporation, I am sad about how complicated,
tedious and mystifying it has become to use the web to consume static content
-- and how expensive (in programmer time and effort) it has become to put
static web content to uses not foreseen and provided for by the author of the
content.

~~~
dredmorbius
Amen to Readability.

Increasingly, site design does little but piss me off. I use a set of tools,
Readability and Stylebot included (484 styles and counting, several of those
applying to multiple sites) to address the more severe annoyances (H/N is one
of my restyled sites). What's particularly annoying are content-heavy sites
(blogs, online periodicals) which break Readability and/or aren't restylable
with Stylebot (I recently encountered a Blogger template which _navigated to a
different page_ when I tried editing CSS in the Stylebot editor).

In the original article, Tom notes:

 _At some point recently, the browser transformed from being an awesome
interactive document viewer into being the world’s most advanced, widely-
distributed application runtime._

That's pretty much the conclusion I'd reached, though my preference is that
tools _which are useful for presenting and managing content_ would be
developed:
[https://plus.google.com/104092656004159577193/posts/LR7jubsX...](https://plus.google.com/104092656004159577193/posts/LR7jubsXBgu)

Readability is useful, but addresses only a subset of the features I'd like.
I've been collecting a large set of literature through it and using Calibre.
In particular I want bibliographic capabilities and indexing, as well as much
larger tag lists (I ran into Readability's 500 tags per user limit within 3-4
days).

The other problem with JS is that I'm increasingly running into single Web
apps which consume, literally, a gigabyte or more of memory (Google+ is
perhaps the worst of these).

Which means: I could run a lightweight desktop application which provides a
basic set of functionality ... or I can run a browser with perhaps a handful
of tabs open, and absolutely pig out my system.

The browser is a decent rapid-development and rapid-deployment environment,
but it's still seriously wanting for real productivity.

------
shawnz
This author is making an assumption that progressive enhancement exists only
so that people who are browsing without Javascript can have a better
experience. Of course, this isn't true. Progressive enhancement is a good
thing because it encourages you to be as descriptive as possible at every
layer of your technology stack.

Why does it matter in practice? Well, there's more than one reason, but
consider that not every user agent is a browser with a person sitting in front
of it. Your website also should be interpretable by content indexers like
search engines, accessibility devices like screen readers, and so on.

Some services don't fit this model and really are better off being designed
like desktop applications written in HTML and JS. But in my experience, most
services can be modelled more like websites without making the design any more
difficult to reason about, and almost all users' experiences are bettered by
it.

~~~
tomdale
First, the accessibility argument is a red herring that I'm getting frustrated
people continue to try to throw around. Screen readers support JS, okay? Let's
put this argument to bed.

[http://words.steveklabnik.com/emberjs-and-
accessibility](http://words.steveklabnik.com/emberjs-and-accessibility)

Okay, thank you for letting me get that off my chest.

Second, one nice thing about "embracing 100% JavaScript" that I talk about in
the post is that it requires you to implement a really solid JSON API, because
your web site is now a true client that consumes an API. This makes it really
easy to integrate with third-party services that consume your content. I agree
that putting content behind JavaScript sucks; I'm just advocating that the
content be JSON (or some other normalized format), not HTML.

~~~
sergiosgc
Can you give an example of one of your sites with a "really solid JSON API"?
I'm afraid there's a gap in this definition, as good APIs are rare and good
JSON APIs with a single client are virtually non-existent.

~~~
steveklabnik
Here's Bustle's:
[http://www.bustle.com/api/v1/sections/home.json](http://www.bustle.com/api/v1/sections/home.json)

It needs to be run through a prettifier, of course, but seems straightforward to
me.

------
__alexs
"Friendly reminder that "people with JS disabled" includes those on high-
latency networks, bad firewalls, and browsers you don't support." \- @jcoglan

[https://twitter.com/jcoglan/status/370173041193406464](https://twitter.com/jcoglan/status/370173041193406464)

~~~
ebiester
I've thought about this.

If I have a blog or similar media site, and require javascript, I might offer
a $15/month option to allow access to a text-only interface and an RSS feed. No
graphics, no pictures, a very simple link and text interface.

And the moment I start seeing subscriptions coming in, I'll believe in
progressive enhancement again.

(Progressive enhancement doesn't affect me as an application developer in the
web space. But if it did...)

~~~
qu4z-2
Realistically most of your readers will come through links and the like, and
thus not be willing to pay a subscription.

If it were a site I frequent, I'd probably be willing to pay $3-4/month for a
simple, clean, static html version (no need to remove the pictures though -- I
can choose whether to load those client-side). Something like this, say:
[[http://mnmlist.com/unknown/](http://mnmlist.com/unknown/)] I'd only pay
$15-20/month if I could get the entire web like that :)

Most likely, at $15/month I'd just not visit your site.

EDIT: PLEASE don't use the navigation from that site though. It's ... way too
vertical.

------
ritchiea
_At some point recently, the browser transformed from being an awesome
interactive document viewer into being the world’s most advanced, widely-
distributed application runtime._

This is the key sentence in the article and this is why I was motivated to
become a web developer. Recently someone asked me if I felt like I was missing
out by doing most of my programming on the web since desktop apps are "real
programming" and I said no because the web is the best environment for writing
apps today. I don't have to choose whether I want to write for Mac OS which I
use myself, or Windows which most consumers use or Linux which hardcore
techies use. I don't have to choose if my mobile app is iOS or Android first.
Sure there are still tradeoffs, and sometimes a desktop or native mobile app
is still going to be a good choice. But the browser today is an amazing
environment that everyone on the web has access to and it's only getting
better. And we should be excited about leveraging everything modern browsers
can do to make great software.

~~~
rimantas
That key sentence is at least half wrong. The „most advanced“ part at least.
And the web was never the best environment for writing apps. And it never will
be. It's like hoping to transform a fork into a spoon. I have spent one and a
half decades in it, and it may be becoming a bearable environment, but it is
far, far from the best. Unless it's the only thing you know; then it is the
best by definition.

~~~
ritchiea
Maybe we're reading that sentence differently? I read it as "the most widely
distributed runtime that is also relatively advanced." I took it as a given
that Tom Dale does not believe the web is the most advanced programming
environment.

------
andrenotgiant
Progressive Enhancement is still important for CONTENT SITES

Why? Search Engine accessibility.

It used to be that Googlebot wouldn't find content loaded asynchronously, or
links that rely on Javascript. Now it's different - You can confirm that
Googlebot discovers a lot of Javascript links using Webmaster Tools:
[https://www.google.com/webmasters/tools/home?hl=en](https://www.google.com/webmasters/tools/home?hl=en)

BUT - There's still no way to break from the "Page Paradigm" - Google needs
URLs to send searchers to. They don't yet send people to specific states of a
page. That's why I still use Progressive Enhancement: it forces me to ensure
each piece of content has a URL that points to it.

~~~
bluepnume
Having URLs which point to specific resources is still 100% possible for
javascript web apps. Take a look at Ember, as an example; those guys place a
load of emphasis on URLs as first-class citizens in frontend web apps.
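
For instance, a sketch of 2013-era Ember routing (the route and the JSON
endpoint are invented), where every view state is addressable by URL:

    // URLs as first-class citizens in an Ember 1.x app.
    App = Ember.Application.create();

    App.Router.map(function () {
        // /articles/4470 is a linkable, bookmarkable URL, not just UI state.
        this.resource('article', { path: '/articles/:article_id' });
    });

    App.ArticleRoute = Ember.Route.extend({
        model: function (params) {
            // Hypothetical JSON endpoint backing the route.
            return $.getJSON('/api/v1/articles/' + params.article_id + '.json');
        }
    });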

------
thecoffman
This post does nothing to address the biggest reason people might have
Javascript disabled: security. If I'm browsing through Tor (or whatever), I'm
not going to turn on Javascript to use your site. If your site doesn't work
without it, you've lost a customer.

Granted, people who disable javascript are obviously vastly outnumbered, but
just saying "fuck you" to security conscious (and most likely tech-savvy)
users seems like a mistake.

~~~
robbyking
I think the real question is which group is larger: the customers I'll gain by
enabling a feature that requires JavaScript or the customers I'll lose because
they're unable to view a feature that requires JavaScript.

According to developer.yahoo.com[1],

> After crunching the numbers, we found a consistent rate of JavaScript-
> disabled requests hovering around 1% of the actual visitor traffic[...].

I work for a medium sized company that version tests almost every change we
make, and in the end the version that wins is the feature with the higher
conversion rate.

[1] [http://developer.yahoo.com/blogs/ydn/many-users-
javascript-d...](http://developer.yahoo.com/blogs/ydn/many-users-javascript-
disabled-14121.html)

~~~
kintamanimatt
1% is a small percentage but can make up quite a large number of people.

~~~
robbyking
I completely agree, but if a version test wins by more than a few percentage
points, it usually makes sense to implement the feature even if it does make
the page inaccessible to 1% of users.

~~~
kintamanimatt
I honestly doubt you'd be saying that if you were part of that 1%!

~~~
ewang1
Except here, the 1% chose to be in the 1%.

------
wmt
I tried Bustle.com, showcased in the article as a good example of a pure
Javascript website, on my Android browser, and the Javascript was used to
reserve 25% of my screen to show me a "BUSTLE" banner that doesn't go away
when I scroll down.

Don't expect your users to have a mouse. The share of web users on their
mobile phones has grown from 6.5% to 17.25% since June 2011. Any bets on what
the share will be in a year or two from now?
([http://en.wikipedia.org/wiki/Usage_share_of_web_browsers#Sta...](http://en.wikipedia.org/wiki/Usage_share_of_web_browsers#StatCounter_.28July_2008_to_present.29))

------
mynameisme
There's no reason sites like tumblr shouldn't work without javascript.
Period. And while there are some things on the web that are genuine
applications (trello, dashboards, etc.), the vast majority of things are
content driven, which should never require js.

~~~
d0m
You claim facts without explaining your reasoning. Let me put it this way
using your own vocabulary:

    There's no reason sites like tumblr *should work without javascript*. Period.

Why why!

~~~
ris
I rather think the onus is on people like you to come up with a reason that
they _shouldn't_.

~~~
d0m
Well, philosophically, I think having static content served in a backward
compatible way is perfect. Pragmatically, I haven't found a good solution to
do it without duplicating all the code. That by itself might be a very good
reason as to why one would focus the development efforts on the 95% of users.

You say _people like me_. I'm more agnostic here, just interested to
understand the arguments of both sides. I just think that a post saying "This
is black. Period" doesn't add much to the discussion.

~~~
mynameisme
What duplication? Why do you need to use js to render text? I'm not saying
that a user without js should expect all, or any of the bells and whistles.
But to use client side js rendering to show a user a paragraph or two (like
blogger), or literally a sentence (when twitter did client side rendering) is
just insane. It breaks stuff, and is MORE work than just doing it the right
way.

~~~
d0m
Well, without getting into details and irrelevant domain specific examples,
let's say you want to load new content as you scroll down.

---------------- HTML static content way:

1. The new content will be fetched using ajax. Already some problems: for
server-side rendering, it needs to be fetched from the DB, then loaded into
the template. Using django, for instance, the view will fetch it and we'd show
it using {{variables}}.

However, to fetch the data using ajax, it's not just the view making a query;
it needs to have an API endpoint, i.e. /api/fetch-new-data/. So, already,
the code is duplicated. Yes, the server side could use the same API, but
there's always the problem of returning JSON vs django-ORM-queries, etc.

2. Once the data is fetched (say in json), it needs to be rendered. How? Do
you simply do a

    $.get('/api/whatever', function(data) {
        $('.some-div').append('<p>' + data + '</p>');
    });

It's fine if it's just a <p>. But usually we'd have more complex html and
thus we'd be using a client-side template. So, the server-side templates need
to be duplicated. There are various ways to do it, varying in complexity, but
there is clearly some duplication here. And, bear in mind, it's usually not a
simple .append; more javascript needs to run, which can alter the data, etc.

------------------------- Javascript way

1. Load pure html.

2. Fetch initial data and render it using a client-side template. (It can also
be bootstrapped, since it's in the same JSON format.)

3. On scroll, fetch more data and render it using the exact same code /
client-side template.

----------------------
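
Concretely, the appeal is that one template function covers both the
bootstrapped data and everything fetched later. A minimal sketch, assuming
jQuery plus hypothetical itemTemplate, Bootstrap.initialItems, nearBottom and
nextPage helpers:

    // One client-side template handles both the bootstrapped JSON and
    // the JSON fetched on scroll -- no duplicated server-side template.
    function renderItems(items) {
        var html = items.map(function (item) {
            return itemTemplate(item); // e.g. a compiled client-side template
        }).join('');
        $('#items').append(html);
    }

    renderItems(Bootstrap.initialItems); // steps 1-2: bootstrapped data

    $(window).on('scroll', function () { // step 3: same code path for new data
        if (nearBottom()) {
            $.getJSON('/api/items?page=' + nextPage(), renderItems);
        }
    });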

Lastly, bear in mind that it's just a simple example. It's rarely 100% static
content only; i.e., there would be forms with error validation, etc. If you
want to do it the no-javascript way, you need to have the form submit, reload
the view with validation errors, etc. Only then can you add some javascript to
enhance it. Contrast that with simply validating on javascript submit. Yes,
obviously, the server needs to validate it too, but that's part of the api;
nothing needs to be re-rendered.

All in all, if you agree that the same code would do the pre-loading and the
dynamic real-time stuff, you usually save yourself lots of headache and
complexity.

I believe blogger had that problem, as they show different views for the same
content. When you switch between views, it dynamically updates in javascript,
rather than doing a page reload. Could it be made better by having html/css
content for no-js browsers? Sure thing. Would that duplicate the code? Sure
thing. And again, there are varying degrees of complexity. Maybe that example
wouldn't be too complex, but it's more code to maintain, to test, etc.

It's a bit late here, but hopefully you understand what I mean. And by the
way, I'm still not 100% certain about which approach is best. I know that I
used to be pro static-html-first for backward compatibility and no-js users,
with javascript only to enhance the page. But recently, I've tried doing it
in javascript on the client and it's seriously so much faster and cleaner.

And yes, there are some good frameworks to deal with the duplication, but they
add lots of complexity. For instance, see airbnb's Rendr, which tries to solve
the exact problem that I'm talking about, i.e. being able to run backbone.js
on the server during the pre-rendering stage.

------
acjohnson55
... _the browser transformed from being an awesome interactive document viewer
into being the world’s most advanced, widely-distributed application runtime._

If only that were actually true. In reality, we're designing the interfaces
for these applications using a presentation language made basically for
desktop publishing. For interactivity, we essentially have one more or less
shite language ([http://bonsaiden.github.io/JavaScript-
Garden/](http://bonsaiden.github.io/JavaScript-Garden/)) to choose from. We're
still arguing over the very basics of whether we should use callbacks,
promises, generators, etc. for _simple sequential operations_. Hell, we're
still trying to figure out how to get a reasonable call stack record to debug
when working with any of these options. And God help you if you want to use a
modern language that compiles to Javascript _and_ have your debugger too.

But to address the author's original point, I think progressive enhancement is
alive and well. While the majority of browsing is done on the desktop, I just
think it makes way more sense to think first about presenting your basic
content and _then_ enhancing it than how you're going to strip out all the
bells and whistles to get your design across on less capable platforms. In the
long run, the former will probably save you more time and QA effort. It's just
more natural to think about using capabilities when present than working
around their absence.

And no one says your baseline should be a screen reader for all possible web
apps. Just pick the baseline that makes sense for what you're doing, and
enhance from there. At some point, it may make more sense to fork your
platform and have separate implementations for different pieces of your
interface. It doesn't have to be one monolithic project that magically
enhances from mobile phone screen reader all the way up to VR cave.

------
ronaldx
I disagree strongly. Concisely:

JS has many potential UI/UX benefits which should be used for the users'
benefit, although they can also be used to their disadvantage.

If your (static?) website shows a blank page with no JS, I find it unlikely
that you've considered UI/UX at all. I therefore assume that you are more
likely to fall on the disadvantageous side.

------
Xcelerate
I agree with all of his points. I think a lot of the counter-arguments are
centered around "Yeah, but I see websites that unnecessarily use Javascript
when a simple text-based solution will work". That's not a Javascript problem;
that's a site-design problem.

I'm sure you could make a dynamic page with a negligibly different loading
time from a static page displaying the same (static) content, but
it's the way that you do it that matters. Loading a page, that loads a
library, that pulls in another library once the page is loaded, that then
displays spinning gears while pulling in a bunch of static content is of
course the wrong way to do it for a lot of things. But that's a design
problem.

------
dham
First of all, you still get html in the end. So whether you take the slightly
reduced json payload size and use the user's cpu to generate html, or just get
cached html from the server, you still end up with the same thing, and,
arguably, similar response times.

I find the answer is not one or the other it's both. If a certain page
requires interactivity then embrace Javascript and do the interactivity with
Angular or Ember. You end up writing less Javascript. If you do it as
decorating html using jQuery then you will end up with more javascript. Most
pages in web apps don't require this much interactivity on every page though.
There may be a few pages here and there. Most of it is just document viewing.
In that case just send down cached html. Sure Bustle.com is fast, but so is
Basecamp, both take entirely different approaches to display pages.

When I first got into Knockout a few years ago, I was a kid in a candy store.
I wanted to do everything with Javascript and Knockout. Soon I grew up and
realized you just don't need all that crap to display an f'in table. It's just
a table for God's sake. We have been displaying tables since the dawn of the
web browser. In fact you will pay a client CPU cost trying to display a table
in Angular when you could just send it over in HTML.

Now if that table requires heavy editing (not filtering or sorting; that stuff
is easy as decoration), then sure, bring in Angular.

On the other hand if I need drag and drop, validation, on complex forms, I'll
definitely bring in Angular.

Choose the right tool.

------
josephscott
Stating that one approach or the other is always the right way is the problem.
Figure out which one works best for the type of site you are working on.

How your site will be used is often a high level indicator of which approach
will provide a better experience for your users. Gmail, for example: no public
part of the site, and it's not uncommon for users to leave it open in a tab
all day. Often great for an all-JavaScript approach.

Twitter on the opposite end. Lots of public facing pages, performance was
worse when they required Javascript just to render 140 characters on the
screen. This style of site is generally better off with a progressive
enhancement approach.

------
Joeboy
Great, I'm going to be debugging other people's websites in my spare time as
well as at work.

~~~
qu4z-2
I experienced this recently. I had to look through the source of a site
because their image gallery javascript was broken.

I was already looking at a gallery of thumbnails... how hard can it be to make
each thumbnail a link to the image in question, and at runtime attach a
javascript handler to open the image in a pop-up or whatever?
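
That enhancement is only a few lines (a sketch; the markup is assumed to be
plain anchors wrapping thumbnails, and openLightbox is invented):

    // Markup assumed: <a class="thumb" href="full.jpg"><img src="thumb.jpg"></a>
    // Without javascript the anchors still work as ordinary links.
    var thumbs = document.querySelectorAll('a.thumb');
    for (var i = 0; i < thumbs.length; i++) {
        thumbs[i].addEventListener('click', function (e) {
            e.preventDefault();      // only intercepted when JS is running
            openLightbox(this.href); // hypothetical pop-up viewer
        });
    }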

------
andybak
A great javascript app is a wonderful thing, but if you fail, you tend to fail
hard.

An HTML/CSS page + progressive enhancement tends to involve much less
shooting-oneself-in-the-foot.

If you've got the talent, time and budget to do it well (note that is 'AND'
not 'OR'. Gawker being a case of 2 out of 3 not being enough) then please go
ahead.

However - if you have any doubts about your ability to see the whole thing
through to perfection then a half-assed website is much less awful for your
audience than a half-assed app.

------
tlrobinson
If you live somewhere with a decent internet connection and don't travel often
you may have forgotten that lots of places still have slow or unreliable
internet connections. Most of those people probably have JavaScript enabled
too, but every extra request required to use the site is a point of failure.

I'll be the first to advocate requiring JavaScript when doing so significantly
increases value, but for content sites please at least include the main
content directly in the HTML.

------
toddmorey
I pretty much agree with all of the points in this article. I do wonder,
though, why Bustle.com (the example used) is an Ember.js app and why it
displays nothing but a blank page if Javascript is turned off. Skylight makes
perfect sense as a full JS app. But Bustle, a content site, seems to be more
of an "interactive document" (as he mentions).

~~~
tomdale
Bustle actually uses PhantomJS to create static versions of the site, for
serving to search spiders.

The advantage of using Ember.js for Bustle is that it's really, really,
ridiculously fast. Seriously, try it. Go to bustle.com and click around.

They could make it work without JavaScript, but they're a new company with a
long list of technical challenges to solve. They ran the numbers and the
percentage of users with JS disabled is so microscopic it just doesn't make
sense to spend time on it.

~~~
Isofarro
"Bustle actually uses PhantomJS to create static versions of the site, for
serving to search spiders."

What value does using PhantomJS offer above what progressive enhancement gives
you for free?

This feels like a contradiction. Avoiding progressive enhancement, and then
bolting on a hack for search spiders in a way that is less robust than the
technique you're avoiding.

"The advantage of using Ember.js for Bustle is that it's really, really,
ridiculously fast. Seriously, try it. Go to bustle.com and click around."

The initial load is horribly slow. For example:
[http://www.bustle.com/articles/4549-pew-poll-american-
people...](http://www.bustle.com/articles/4549-pew-poll-american-people-dont-
support-syrian-strike) took 10 seconds. That's very slow for a one page
article.

That's gonna hurt people following links to the page. That's exactly the
reason why Twitter walked away from its JavaScript-driven content and went
for progressive enhancement. A substantial portion of new visitors will have a
cold cache, and have this annoying wait for what is effectively a one page
article. (cf. [http://statichtml.com/2011/google-ajax-libraries-
caching.htm...](http://statichtml.com/2011/google-ajax-libraries-caching.html)
)

"They could make it work without JavaScript, but they're a new company with a
long list of technical challenges to solve. They ran the numbers and the
percentage of users with JS disabled is so microscopic it just doesn't make
sense to spend time on it."

Another contradiction. Progressive enhancement isn't a technical challenge; it
is brain-dead simple.

So they figured out the number of users with JavaScript disabled is too small
to warrant supporting. Interesting, except progressive enhancement isn't
solely about people with JavaScript disabled, right? cf:
[http://isolani.co.uk/blog/javascript/DisablingJavaScriptAski...](http://isolani.co.uk/blog/javascript/DisablingJavaScriptAskingTheWrongQuestion)

Also, they ran those numbers under the incorrect assumption that progressive
enhancement only impacts people with JavaScript disabled. But did they run
numbers showing that the share of search spiders wasn't microscopic, to
justify the PhantomJS bodge you initially mentioned? Are you really implicitly
asserting that there are far more search-spider visitors to bustle.com than
visitors with JavaScript disabled? (I'd love to understand the logic that led
to that determination.)

So, what justification was there to spend time on building a PhantomJS site
scraper to provide static content to search engine spiders, and yet fail to
appreciate that progressive enhancement would have served that spider
audience, as well as the JavaScript audience, as well as the variety of issues
that progressive enhancement helps alleviate?

But if bustle.com were a content site, then progressive enhancement would be
the way to go, right? But this is an ember app, so it is not a content site
(clearly). Except when search spiders visit; then there's a need for a static
version of each page. It's very confusing. Is bustle.com an app or a website?

Also, how does bustle.com / ember guarantee perfect delivery of assets other
than HTML to a visitor's browser? How does it guarantee robustness?

For example, when using a CDN, how does it manage when this happens:
[http://www.theregister.co.uk/2012/01/05/google_opendns_clash...](http://www.theregister.co.uk/2012/01/05/google_opendns_clash/)

How does bustle.com / ember protect your JavaScript so that when a third-party
chunk of JavaScript (like Google Analytics, Disqus, Facebook, ChartBeat,
Quantcast, WebTrends) does something funky, or hiccups and causes a JavaScript
error?

------
EGreg
Javascript is great for making more efficient sites. Here's why:

1) Static resources can be cached on a CDN and composited on the client
instead of an overloaded app server

2) You can load data instead of heavy and repetitive HTML over the wire

3) You can cache the data in the client and re-use it later, making for
snappier interfaces
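
Point 3 might look something like this (a sketch; the caching policy is naive
and never invalidates):

    // Cache fetched JSON in localStorage and reuse it on later views.
    function getData(url, callback) {
        var cached = localStorage.getItem(url);
        if (cached) return callback(JSON.parse(cached)); // instant repeat view
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url);
        xhr.onload = function () {
            localStorage.setItem(url, xhr.responseText);
            callback(JSON.parse(xhr.responseText));
        };
        xhr.send();
    }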

That said, you have to watch out for URLs. Just because you can write
everything with javascript doesn't mean you should break URLs. And of course,
crawlers other than google's crawler will probably not be able to execute your
JS.

~~~
mariusz79
1 has been possible since the beginning of time. Every modern browser has a
cache built in.

~~~
EGreg
The browser requests CDN resources directly and Javascript is used to combine
those static files into something that the user sees. Since the code is
executing on the client, the app server is not overloaded.

------
asgard1024
I like to save every document I read online (including discussions, if
possible; these often contain insightful posts). Maybe it's obsessive, but I
do it.

Some documents, especially those using Ajax for loading content or multiple
pages, make this difficult. I hate them. (Hacker News, oddly, does it too -
when a discussion is archived, it becomes paged, which makes it more
complicated to store.)

I wish there were a standard way to store a page offline, including all the
JS changes made to its looks, all the external content, etc.

------
mkilling
Lots of absolute views in this thread.

How about: If it's profitable for your site to offer a non-JS fallback, do it.
If it isn't, don't.

~~~
drdaeman
There are other things besides personal financial profit. Think of them as
taxes to keep the Internet working and flexible.

If you don't pay that tax, you're personally contributing to a future where
simple documents become full-fledged programs, and everyone loses the ability
to easily manipulate and interact with them in any but author-defined ways.

There are cases where you can be exempt from that tax: if your site is not
about documents but about their transformations, i.e. it's more of a process
than data. (Then you should call it an app, not a site.) But the vast majority
of sites aren't.

------
mistercow
Are there good tools out there for helping to ensure that a JavaScript web app
without progressive enhancement is accessible to the disabled (e.g. screen
readers can parse it, speech recognition software can interact with it)?

I ask because I've recently discovered that Google has massively failed in
this department with some of their products, at least as far as speech
recognition is concerned. Google Docs is a great example of what I'm talking
about. If you try to use it with Dragon NaturallySpeaking, buttons and menu
items are often not recognized, text entry is only reliable by using the
separate (and inconvenient) Dragon "dictation box", editing is a nightmare,
and review comments can only be placed by actually copying from a separate
program. Your best bet if you need to collaborate is honestly to just use
Microsoft Word, and then either upload and convert, or copy and paste, and
then accept the fact that a lot of collaboration tools won't be usable by you
or any of your collaborators.

I can't imagine how frustrating it must be to try to use modern web apps as
someone who can't type effectively or read a screen, and it seems like the
problem is only going to get worse as people rely more on canvas without
taking accessibility into consideration.

------
jhh
While I agree that you can assume Javascript is enabled, I really think that
"conventional" web development still has many advantages over making SPAs.

Business logic on the server, HTML generated on the server, conventional MVC
architecture; use ajax and pushState to make it highly interactive.

Fine - you can assume JS being available, but from that it simply does not
follow that you have to throw away the traditional (rails style) dev model of
the web.

------
hcarvalhoalves
You can have your hypermedia on api.<yourdomain> and then your AngularJS or
whatever on app.<yourdomain>, consuming your api. Then all you need to do is
serve HTML from your api when the User-agent accepts html (bonus points: set a
canonical meta tag pointing to your foshizzle app so you don't lose SEO).

Best of both worlds.
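
A sketch of that negotiation with Express (loadArticle and the template name
are placeholders):

    // One resource, two representations, chosen by the Accept header.
    var express = require('express');
    var app = express();
    var loadArticle = require('./articles').load; // hypothetical data layer

    app.get('/articles/:id', function (req, res) {
        loadArticle(req.params.id, function (err, article) {
            if (err) return res.send(500);
            res.format({
                // Browsers and crawlers asking for HTML get hypermedia...
                'text/html': function () {
                    res.render('article', { article: article });
                },
                // ...while the app subdomain consumes the same URL as JSON.
                'application/json': function () {
                    res.json(article);
                }
            });
        });
    });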

PS: All this crazy talk stems from the fact Javascript created an apartheid on
the web. We need to make a clear distinction between the HTTP-web and the
Javascript-enabled-web. The fact the same software (browser) serves this dual
purpose adds to the confusion and allows bad architecture decisions,
intertwining content and rich interfaces inside the same hypertext mudball.

------
dep_b
Well I guess you wouldn't want your monitoring tool indexed by Google anyway,
right? As soon as JavaScript is the only way of accessing some data, it will
be harder to index. There are some solutions provided by Google
[https://developers.google.com/webmasters/ajax-
crawling/](https://developers.google.com/webmasters/ajax-crawling/) but the
situation still is not satisfactory.

For my clients it's usually the case that being found well in Google is a
major part of their business case. PE makes sure that a basic crawlable
version of your website exists with proper titles and tags.

------
tambourine_man
Not a single mention of SEO in the article. I guess Google is dead too.

~~~
elliotanderson
Try viewing meta.discourse.org without Javascript, then look under the hood
- it's a pretty easy problem to solve with some <noscript> tags. Other
approaches use PhantomJS to create a "rendered" copy that is accessible to
primitive clients/scrapers.

~~~
drdaeman
> Other approaches use PhantomJS to create a "rendered" copy that is
> accessible to primitive clients/scrapers.

Then your site does not require client-side JS support.

The point is not about what technologies you use to produce documents, the
point is what data you [can] serve.

------
soljin2000
Only if you never need any referrals from search engines. I know a site with
an awesome locally sourced food delivery/pickup system, connecting consumers
directly with the growers.

Their site is 100% in JS. And if you google for anything even remotely close
to what this site sells, you simply cannot find them.

Only if you run a members-only app would I say progressive enhancement is
dead. Well, that is unless you care about the millions of users on slower
mobile connections with crappy smartphones.

~~~
kmfrk

        It’s a myth that if you use a client side MVC framework
        that your application’s content cannot be indexed by
        search engines. In fact, Discourse forums were indexable
        by Google the day we launched.
    

[http://eviltrout.com/2013/06/19/adding-support-for-search-
en...](http://eviltrout.com/2013/06/19/adding-support-for-search-engines-to-
your-javascript-applications.html)

They are probably a good case to follow, if you want to see what people's
experiences with SEO for JS-based services are.

~~~
asdasf
Why do people who are against server-side templates always propose "well,
just use server-side templates _and_ client-side rendering!" as a solution?
Hooray, I can write my app twice for no reason! Way to sell me on pure
JavaScript for delivering content.

------
pcunite
I'm ready for JavaScript sites ... but notice a very important nugget in the
author's post: "web apps need to have good URLs". This cannot be overstated.

1\. Stop preventing middle and right clicks on JavaScript-enabled links. For
left clicks, sure ... control the flow (see the sketch after this list).

2\. Respect the fact that this is NOT a desktop environment; my view of your
program's "screens" should therefore be on a per-URL basis. I might actually
want to view a list you generated in my own separate "window" or "screen",
with the URL visible, usable, and savable in the browser.
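
A minimal sketch of point 1, assuming jQuery (a[data-view] and showScreen
are made-up names):

    $(document).on('click', 'a[data-view]', function (e) {
      // e.which === 1 is the left button in jQuery's normalized events;
      // middle/right/modified clicks keep native behavior, so
      // "open in new tab" still works and the URL stays usable.
      if (e.which !== 1 || e.metaKey || e.ctrlKey || e.shiftKey || e.altKey)
        return;
      e.preventDefault();
      showScreen(this.href); // hypothetical per-URL screen renderer
    });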

------
cpursley
Wow, the size of the Ember app vs. a typical webapp + JavaScript is
impressive.

Inspecting the Bustle app with the new Chrome Ember Inspector is very cool.

Has Bustle open sourced any of their components or written on how they
developed the app?

~~~
bti
One of the engineers wrote a little bit about it on Reddit[1] a while ago.

[1]
[http://www.reddit.com/r/webdev/comments/1kf84d/bustlecoms_sp...](http://www.reddit.com/r/webdev/comments/1kf84d/bustlecoms_speed/)

------
kenster07
I doubt devs are "ashamed" of making JS web apps. The main issue is that it
takes more effort than building a traditional web app. There are browser-
specific quirks. Frameworks like Rails are so well integrated with the DB
layer that it will be difficult to match that productivity with pure JS apps.
And finally, a lot of devs don't want to take the time to learn when the
current standard is perfectly acceptable for most use cases.

------
pkroll
140 comments in and no mention of TurboLinks? You hit a URL, you get the
content. You click a link, it just fetches the body and replaces it: no
reloading of the CSS or JavaScript, so content gets rendered faster. Search
engines get the actual content from the URLs; users get the speed. Making
your scripts idempotent can be problematic, but it certainly sounds like a
good base.
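
The core idea looks roughly like this (a sketch, not TurboLinks' actual
code; it assumes jQuery):

    function visit(href) {
      $.get(href, function (html) {
        // Parse the response off-screen, then swap only <title> and
        // <body>; stylesheets and scripts in <head> stay loaded.
        var doc = document.implementation.createHTMLDocument('');
        doc.documentElement.innerHTML = html;
        document.title = doc.title;
        document.body.innerHTML = doc.body.innerHTML;
        history.pushState(null, doc.title, href);
      });
    }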

------
netmute
The problem is not Javascript per se. Personally, I like a good webapp.

What annoys me is the tendency of Javascript guys to rebuild every damn
application there is as a webapp, and rave about it like it's the best thing
ever. Javascript has become their hammer, and the whole world looks like it
needs a good pounding.

Just because you can doesn't mean you should.

------
aufreak3
For me, the real indicator that progressive enhancement is "dead" was that
as I began reading the post, I was left wondering "what the &*%^ is this
progressive enhancement that he's declaring dead?" until pretty much the end
of the post.

------
spacecadet
I still teach both "graceful degradation" and "progressive enhancement" to
students in my "Web101 - Introduction to Web Development and Design"
course...

------
BobbyBobby
This post doesn't address the main reasons for PE:

\- Accessibility

\- Spiderability by search engines

If you want to say PE is dead, please explain how these don't matter to most
websites.

------
Theriac25
Javascript is cancer.

------
nraynaud
Yeah, I like to split the world in two: web pages and web apps. For web apps,
I don't hesitate to assume JavaScript.

------
AdrianRossouw
The one thing that I think is still kind of uncovered ground for JavaScript
frameworks is proper i18n and l10n support.

------
KaoruAoiShiho
It has been dead for a while. The people who complain on HN about "I get a
white page" are treated like trolls.

~~~
camus
You are the troll, obviously, calling people who disagree with you trolls.

------
oelmekki
Edit: actually, I'll make an article out of this, because I came late to the
discussion (I'm in a European timezone) and the message probably won't be
heard.

People who think an app should be usable entirely without JavaScript
certainly miss the point. So do people who think progressive enhancement is
only about supporting users who have deactivated JavaScript.

As the author mentioned, the browser is now more an execution environment
than a document viewer. You know what that means? It means that developers
have no control over the execution environment. With server-side code, if it
works for you, it works for everyone. With client-side code, you'll never
know. You don't know what extensions your users run. You don't know how
stable their systems are. You don't know how stable their connections are.
And you can't ask your users to have as carefully crafted an environment as
your servers.

What we should conclude from this is that the more heavily your app relies
on JavaScript, the better it should be at error handling.

How do you handle errors in JavaScript? If an error occurs in a callback
function, clicking that <a href="#"> again and again will simply do nothing,
and your user will get frustrated and yell: "It does not work!"

With progressive enhancement and graceful degradation, it suddenly becomes
simple. Your link has a real href. You can deactivate all event handlers
from the "window.onerror" handler. That way, clicking a link after a crash
will follow it.

You don't even have to implement the feature entirely on the server side. If
your client-side feature can be emulated on the server side, do it (and your
user won't even realize something went wrong); if it can't, simply warn your
user about it. Either way, the JavaScript runtime will have been
reinitialized.

So, for all of this to work and make sense, we just have to use modern
definitions:

* progressive enhancement is ensuring that no link / button / whatever "freezes" if JavaScript crashes

* graceful degradation is ensuring that the interface gets reverted to a useful state when an error occurs (like showing the submit button again for what were Ajax forms). This can easily be done if your page is composed of objects that respond to some kind of destructor method, as in the sketch below.
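
A minimal sketch of both definitions, assuming jQuery (AjaxForm is a
made-up example widget):

    var widgets = [];
    
    function AjaxForm($form) {
      var $submit = $form.find('input[type=submit]').hide();
      $form.on('submit.ajax', function (e) {
        e.preventDefault();
        // $.post(...) the form data here
      });
      // The "destructor": revert to a plain form that posts normally.
      this.destroy = function () {
        $form.off('submit.ajax');
        $submit.show();
      };
      widgets.push(this);
    }
    
    window.onerror = function () {
      // After a crash, tear everything down so links and forms fall back
      // to their real hrefs/actions instead of freezing.
      widgets.forEach(function (w) { w.destroy(); });
    };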

If you think client-side errors do not happen that often, just put something
like this in your code:

    
    
        // window.onerror receives the error message, the script URL and
        // the line number of the uncaught exception.
        window.onerror = function( message, url, lineno ){
          $.post( my_exception_url, {
            error: { message: message, url: url, lineno: lineno, page_url: window.location.href }
          });
        };
    

This will post all exceptions to a server-side URL, where you can relay them
to your exception management system (I simply raise a server-side exception
using the passed parameters) or store them. You'll be surprised.

~~~
oelmekki
Article posted :
[https://news.ycombinator.com/item?id=6319738](https://news.ycombinator.com/item?id=6319738)

------
leokun
The websites-as-applications framing is correct. I see grumblings from some
old-time grumpy folk about why this or that site needs JavaScript. Because
the browser is a runtime now. The web has evolved, and aren't you glad it
did, because Flash is dying.

~~~
gergles
I'd rather have Flash + HTML that I can parse and view in any application
than JS that works only on the list of Approved Browsers that were all
released within the past week.

Nobody would've made a Flash-only site, but for some reason JS-only sites
that shut out anyone with slow computers, old computers, mobile browsers
that aren't Safari, etc. are totally okay.

~~~
kreper
All kinds of wrong here.

------
brokenparser
It ain't dead until Netcraft confirms it.

------
asdasf
Why on earth is he framing it as though it has something to do with time?
Like there was a time when you couldn't rely on browsers having JavaScript,
hence progressive enhancement? Progressive enhancement wasn't because
browsers didn't have JavaScript; it was because people turned it off. They
still do. And horrible JavaScript "apps" that just show some text and
pictures are only going to make that number grow.

------
crassus
What bugs me about web apps is that I need to download a new copy every time
I go back to one (or at least send all the AJAX requests again). Load times
suck.

------
goatslacker
Just for fun I made
[http://sighinternet.tumblr.com/](http://sighinternet.tumblr.com/)

