

Progressive enhancement is faster - bpierre
http://jakearchibald.com/2013/progressive-enhancement-is-faster/

======
rblatz
Progressive enhancement is a luxury, and not everyone can afford it. It vastly
increases your test surfaces, and requires multiple designs for every
page/feature.

If I'm trying to get a product out and I can reach 99% of my audience by
assuming they have JS enabled, then I'm going to do that. I'm not going to
spend 2x as long (at least) to reach that extra 1%.

~~~
jaffathecake
You're misrepresenting progressive enhancement as being "for people with JS
disabled". I guess you didn't get past the first paragraph of the article?

Progressive enhancement actually decreases your testing surfaces by moving
more logic to the server which is under your control, whereas the clients are
running a variety of different implementations.

------
mcgwiz
Websites differ in their basic nature, and one size does not fit all. Roughly
speaking, websites that are document-like (primarily consumption-oriented)
should probably be progressively enhanced; websites that are tool-like
(interaction-oriented, like Gmail, analytics apps, other SaaS) probably should
not. Websites that fall in the middle will have to carefully consider the
user-experience benefits and the available technical/operational resources.

The upside is that progressive enhancement can refer to a spectrum of
techniques. Deliver only above-the-fold content as HTML, inline your CSS/JS,
inline the initial JSON data, omit <form> POST support, etc. Two templating
systems do not need to be supported. It's a straightforward technical problem
to apply JS templates server-side (and I'm speaking as a boring old .NET
developer--Nustache and Edge.js come to mind).
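As a sketch of that idea (hypothetical template and data; a toy `{{key}}` substitution stands in for a real mustache engine such as pystache or Nustache):

```python
import re

# One template, authored once, usable verbatim by mustache.js in the
# browser and by any server-side mustache implementation.
EVENT_TEMPLATE = "<h1>{{title}}</h1><p>{{speaker}} at {{time}}</p>"

def render(template, data):
    # Toy {{key}} substitution; a real engine adds sections, escaping, etc.
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(data.get(m.group(1), "")),
                  template)

html = render(EVENT_TEMPLATE,
              {"title": "dConstruct", "speaker": "Jake", "time": "10:00"})
# html == "<h1>dConstruct</h1><p>Jake at 10:00</p>"
```

The point is that only one template string has to be maintained; which process evaluates it is an implementation detail.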

Btw, another benefit of progressive enhancement can be SEO.

------
gavinpc
It seems like the author is confusing progressive enhancement with progressive
rendering.

Either way, to say that the Dale piece "conclusively" shows progressive
enhancement to be "a futile act" is, in the OP's word, "misrepresenting."

"Exceptional" or not, the key is to recognize when you're dealing with one of
those cases -- sites which, like Wikipedia, could be great on Web 1.0
browsers. In those cases, your focus is likely to be more on the content and
its structure. In the "progressive enhancement est mort" view, you'll have to
spend more time on engineering.

~~~
jaffathecake
Progressive rendering is enabled by progressive enhancement.

I'm not convinced progressive enhancement is more effort unless you make it
more effort. I covered these arguments and more in my previous post
[http://jakearchibald.com/2013/progressive-enhancement-still-important/](http://jakearchibald.com/2013/progressive-enhancement-still-important/)

~~~
dasil003
> _Progressive rendering is enabled by progressive enhancement._

Sometimes, but only by coincidence. The baseline of supporting JS off is that
you have to render the entire page contents server-side. If the progressive
enhancement is addition of content that is only supported by JS then you are
correct. However the ideal progressive rendering is to send a generic static
HTML shell with no personalized content that can be delivered instantaneously
from a web server without any back-end chatter, then load in the dynamic
content from the client side. Facebook is probably the most advanced
implementation of this technique, and it yields a dramatic increase in
perceived performance due to the fact that page starts visually appearing
faster which psychologically extends the user's patience.

~~~
jaffathecake
That doesn't sound like a definition of progressive rendering I've heard
before: [http://www.codinghorror.com/blog/2005/11/the-lost-art-of-progressive-html-rendering.html](http://www.codinghorror.com/blog/2005/11/the-lost-art-of-progressive-html-rendering.html)

~~~
dasil003
Read and learn young grasshopper:
[https://www.facebook.com/note.php?note_id=389414033919](https://www.facebook.com/note.php?note_id=389414033919)

~~~
jaffathecake
Good article, but that's not the definition of progressive rendering

~~~
dasil003
What's the precise definition? The browser rendering partial HTML as it is
streamed down? That is just built-in browser technology to achieve the same
goal. There's no meaningful distinction in terms of the end result.

~~~
jaffathecake
Actually, I misread your earlier post.

Yes, what Facebook does is progressive rendering, but I disagree that it's the
ideal since it's still blocked by JS. The ideal is serving HTML from the
server which can get content on the screen before JS downloads.

"That is just built-in browser technology to achieve the same goal" -
exactly. Tweetdeck does progressive rendering, but it avoids the simplest way
of doing it and instead reinvents the technique with JavaScript. You can see
what this does to performance.

~~~
dasil003
The point is that the perceived performance of Facebook is impossible to
achieve with server-side rendering.

Whether they should optimize for that 99.99% or the other 0.01% on
philosophical grounds is a point you are free to debate. However the facts are
the facts.

------
seanconaty
The thing that always bugged me about rendering things in the client was...

1) supporting 2 templating systems (server & client)

2) no graceful degradation (or "progressive enhancement", depending on your
opinion; i.e. being able to get a page's content with a simple wget)

In any case, since it hasn't been mentioned in this discussion, I'd like to
direct people's attention to PJAX
([http://pjax.heroku.com/](http://pjax.heroku.com/)).

I've found this to be a nice, simple solution to have pages work identically
with and without javascript. The initial page load is rendered by the server
and the HTML of the subsequent sections of the page are rendered on the server
but loaded via ajax and updated with one jQuery .html() call. The app URLs and
the ajax URLs are the same but they return the page's full contents
(<html>...</html>) when requested regularly and the page's partial contents
(<div id="content">...</div>) when requested asynchronously.

Check it out if you haven't.
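A framework-agnostic sketch of that server-side dispatch (the layout markup here is illustrative; `X-PJAX` is the header jquery-pjax sends with its requests):

```python
def respond(request_headers, content_html):
    """Serve the same URL two ways: a bare partial for PJAX/ajax
    requests, the full document otherwise. The client swaps the
    partial into #content with a single .html() call."""
    partial = '<div id="content">%s</div>' % content_html
    if request_headers.get("X-PJAX"):
        return partial
    return "<html><body>%s</body></html>" % partial

full_page = respond({}, "<p>hello</p>")                 # normal navigation
fragment = respond({"X-PJAX": "true"}, "<p>hello</p>")  # ajax update
```

One route, one server-side template, and the no-JS path falls out for free.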

~~~
k3n
That technique is exceedingly inefficient (download-wise) though, at least if
your markup is anything but trivial.

~~~
romaniv
We used to live with "exceedingly inefficient" full page reloads when we had
dial-up, single-core computers and slow servers. And it worked. Now we have
multicore computers, multi-megabit DSL connections, cloud-based hosting and
yet you present it as if a difference of a couple of kilobytes (which can be
reduced to nearly zero by proper design) makes a life-or-death difference in
website performance.

~~~
k3n
> yet you present it as if a difference of a couple of kilobytes (which can be
> reduced to nearly zero by proper design) makes a life-or-death difference in
> website performance.

That's quite the stretch, given what I wrote. I only said it was inefficient.

There's also something to be said for the fact that rendering templates on the
server will make any meaningful client-side caching almost impossible. And
mobile is the new dial-up; while some are fortunate to have mobile broadband,
it's certainly not ubiquitous, and multi-core phones certainly aren't the norm
either unless you only want to consider the HN readership for your sample.

------
kalms
1. I don't want to dabble with templating on a server. It's annoying, and
separates two layers that I don't want separated.

2. It needs to work offline, and to support that I would have to double up on
work, maintaining two levels of templating.

3. No framework has made this easy; in fact, new frameworks seem hell-bent on
making it even harder. See [http://bone.io](http://bone.io)

4. Telling us to "do that" is not going to make it happen. It has to be
easier, and clearly it is not: otherwise more developers would be doing it. I
want someone to convince me, but I have yet to see a post going in depth on
the technical implementation of such a solution (one that adheres to the 3
points mentioned above).

5. Document-oriented sites (blogs, wikis, maybe even forums) should never
have been implemented with only JS in mind anyway.

~~~
jaffathecake
[http://lanyrd.com/mobile/](http://lanyrd.com/mobile/) works offline, without
JS, shares templates between the client and the server for updating pages
async and rendering offline. Doesn't use a framework so it doesn't have a huge
JS payload.

~~~
kalms
Frameworks don't necessarily mean you're delivering a huge JS payload, but
more to the point: how was that template sharing achieved? Would love some
notes on the technical implementation.

~~~
jaffathecake
The templates are mustache, delivered via a single JSON file
([https://m.lanyrd.com/templates.v356.js](https://m.lanyrd.com/templates.v356.js)).
They're used on:

* The server-rendered web (python)

* The enhanced & offline web (javascript)

* The ios app, where native views aren't used

* The android app, as above

The client code is pretty dumb: it knows how to turn a link into an API call,
and the API response basically says "render template x (or equivalent native
view) with this data…"
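A hedged Python sketch of that flow (the bundle shape and template names here are hypothetical, and a toy `{{key}}` substitution stands in for a real mustache engine):

```python
import json
import re

# Hypothetical shape of the template bundle: one JSON object mapping
# template names to mustache source, as in Lanyrd's templates file.
bundle = json.loads('{"event": "<h1>{{title}}</h1><p>{{city}}</p>"}')

def render(name, data):
    # Toy {{key}} substitution standing in for a real mustache engine.
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(data.get(m.group(1), "")),
                  bundle[name])

# The API response names the template and supplies the data; the same
# two fields can drive the Python server, the JS client, and the apps.
api_response = {"template": "event",
                "data": {"title": "dConstruct", "city": "Brighton"}}
html = render(api_response["template"], api_response["data"])
```

Because the bundle is plain data, every consumer that has a mustache implementation can stay in sync by fetching one file.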

------
Joeboy
If your website genuinely has a good reason for requiring js then that's fine.
If your buggy js gratuitously stops me from being able to access textual
content, you're going on my list of people whose fingers need breaking with a
polo mallet before they do any more damage. You'll be next after the
blogspot.com guys.

------
Joeri
Or, in other words, use the right approach for your use case.

The article is missing one optimization in the JS case: the initial XHR can be
inlined as static JSON data in the original HTML page. Of course, if you have a
static original HTML page then it can be cached on a CDN or in appcache, so
really the perf story is not clear-cut.
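A minimal sketch of that inlining step (names are hypothetical; the one subtlety is escaping "</" so the payload can't prematurely close its own script tag):

```python
import json

def inline_bootstrap(page_html, initial_data):
    """Embed the first XHR's payload in the page itself so the client
    app can boot without an extra round trip."""
    payload = json.dumps(initial_data).replace("</", "<\\/")
    tag = "<script>var BOOTSTRAP = %s;</script>" % payload
    return page_html.replace("</head>", tag + "</head>", 1)

page = inline_bootstrap("<html><head></head><body></body></html>",
                        {"user": "anna", "unread": 3})
```

The client-side app then reads `BOOTSTRAP` instead of issuing its initial XHR, which is what narrows the time-to-content gap discussed here.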

~~~
jaffathecake
If you can inline static JSON you're a small amount of effort away from using
server-templating to serve HTML. Do that.

~~~
tuxracer
Hardly. Your backend needs to support both your templating language and
whatever you're using to populate those templates, which means either adding
support for JavaScript to your backend or creating and maintaining redundant
code.

Meanwhile your backend could be Python, Ruby, whatever and can easily inline
the initial chunk of JSON without needing support for any of the frontend
technologies doing the actual template rendering (such as Handlebars+Backbone
for example).

As for still needing to wait for JavaScript to download, it becomes a matter
of how the ROI works out for one's particular use case. Let's say simply
inlining the JSON for the initial content narrows that 'time to initial
content' gap to a few hundred milliseconds. Is removing that gap still a
worthy return on your investment?

------
camus
Progressive enhancement is good practice, period.

