
Server-generated JavaScript responses - steveklabnik
https://37signals.com/svn/posts/3697-server-generated-javascript-responses
======
ryankshaw
as I read that I kept wanting to say "No, we're so past that! I need client-
side everything w/ JSON APIs" (as I'm sure a lot of people and the mindshare
of the interwebs are going) but then I read the last part:

    
    
> If your web application is all high-fidelity UI, it’s completely legit to go this route all the way. You’re paying a high price to buy yourself something fancy. No sweat. But if your application is more like Basecamp or Github or the majority of applications on the web that are proud of their document-based roots, then you really should embrace SJR
    

and I realized that he's completely right: the majority of the web is still
document-oriented "pages". If that's your case, don't try to be an "app"; the
37signals way works just fine for you. In other words, it would not be a good
idea to make a blog be a Single Page App (I'm looking at you, Blogger).

~~~
spankalee
I'm sure there will eventually be a happy medium where a page/app is rendered
on the server, but upgraded asynchronously to be as dynamic as it needs to be,
so we'll see the convergence of apps and pages. The return of progressive
enhancement, I suppose - but even better because of pushState, indexeddb, etc.

That's if there's truly some benefit. Server-side rendering has its place, but
probably isn't important for apps on the very "app" side of the content-app
spectrum. What makes me want very easy to use server-side rendering, even for
apps, is the fact that so many of those apps have embedded content - say an
email in Gmail - that it'd be great to reuse the same templating if it's later
deemed important to have a link to a static version of the content "outside"
of the app.

~~~
jameswyse
I find the best approach is to render the initial view server-side and then
render any future requests client-side with data from your API. That way your
site will be super fast and can be indexed by search engines.

If you're using node.js then you can even use the exact same template
rendering code.

~~~
couchand
Did you read the article? They specifically address this:

 _This means it might well be faster from an end-to-end perspective to send
JavaScript+HTML than JSON with client-side templates, depending on the
complexity of those templates and the computational power of the client. This
is double so because the server-generated templates can often be cached and
shared amongst many users (see Russian Doll caching)._

------
bastawhiz
This is all well and good until you want to handle error responses on the
client. Let's say your user is on spotty wifi and they hit the submit button
for your form. Guess what happens if that AJAX request fails? Probably a whole
lot of nothing, unless you've got some templates client-side to show an error
message. At that point, if you're making your error messages fit into each of
your UIs, you might as well just be using client-side templates anyway and
SJR is moot.

On top of that, you can't update the UI in a meaningful way until you get a
response from the server.

That's also not even touching any security considerations that you need to
make to use this technique: you can't implement a CSP, you've got to make damn
sure you're properly escaping every piece of data [in a special way] that
comes through, and you've got to make sure the response that you're sending
can't be used in a script tag (i.e.: you need to add an intentional syntax
error that you strip off) or an attacker could simply put a script tag on his
own site pointing at a URL on _your_ site that returns sensitive information.
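For that last point, the defense looks roughly like this; the `")]}',"` prefix is the convention AngularJS popularized, and the helper names here are invented for illustration:

```javascript
const PREFIX = ")]}',\n"; // intentional syntax error for <script> inclusion

// Server side: wrap sensitive JSON so a cross-site <script> tag
// pointing at this endpoint throws before anything executes.
function protect(payload) {
  return PREFIX + JSON.stringify(payload);
}

// Client side: your own XHR handler knows to strip the prefix first.
function unprotect(body) {
  if (!body.startsWith(PREFIX)) throw new Error('unexpected response');
  return JSON.parse(body.slice(PREFIX.length));
}
```

Because the prefix makes the body a syntax error when evaluated as a script, a hostile `<script src=...>` pointing at the endpoint dies before exposing anything, while your own AJAX code strips it and parses normally.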

TL;DR: Badasses only.

~~~
crazygringo
This is spot-on.

I worked with someone who helped develop AJAX interfaces before it even had
the name, back when XMLHTTPRequest was brand-new, and they built a product
similar to this in philosophy -- must have been around 2000.

I worked with a similar model as well on a couple apps, until finally
migrating entirely to client-side rendering, and I would never go back. The
complexity of server-side code having to write out client-side to update
things just defeats you after a while, it's hard to keep a separation of
concerns, and things start to turn into spaghetti despite your best
intentions. Refactoring becomes increasingly difficult.

But most of all, it's just like you say -- you can't do error handling, or
change the interface before you receive a response from the server, or deal
with simultaneous requests, or probably 20 other things.

I'm honestly really surprised to see someone recommending this today. Dealing
with huge numbers of stand-alone snippets which update certain parts of pages
under certain conditions is just an organizational nightmare.

~~~
Joeri
I think the reason they do this mostly boils down to wanting to write Ruby. If
there was a native Ruby engine in the browser, I'm pretty sure they would do
everything they could client-side.

~~~
swah
And initial latency/page weight?

Even Wikipedia would work great as a client-side app with JSON, but then you
want to open 30 tabs and you can feel it lagging.

------
chrismccord
I've been experimenting along these lines, making Rails partials real-time
using websockets, and it works quite well for Basecamp/Github-style apps.
Basically you can get "true" MVC while using your existing erb/haml views, and
you get real-time updates for all connected clients. Some web apps that are
replicating desktop-like behavior require full client-side MVC, but I think
many apps can hit a sweet spot and get the best of both worlds: server-side
rendering with real-time updates.

project:
[https://github.com/chrismccord/sync](https://github.com/chrismccord/sync)

~~~
netghost
Hey thanks for sharing that, I think it's a really interesting middle ground.

------
spankalee
I'd much rather use templates that can be rendered server side as an
optimization if necessary, and then updated normally on the client.

As for the criticism of SPAs that the "entire" JavaScript library must be
downloaded up front, this can be mitigated by lazy loading the dynamic bits.
If you look at how Google+ behaves, it renders the page server-side, and then
loads controllers for various parts of the app on demand. For such a complex
app it loads incredibly fast.

It might be a little bit before there's a server-side renderer for custom
elements / Shadow DOM / template binding, plus patterns for deferred loading
code, but personally I think that will be the way to go in the near future.

~~~
Touche
> It might be a little bit before there's a server-side renderer for custom
> elements / Shadow DOM / template binding, plus patterns for deferred loading
> code, but personally I think that will be the way to go in the near future.

There won't be. These guys are just desperately clinging to the past. They
should be writing the Ruby on Rails for Opal instead of trying to extend the
life of something that, frankly, has seen its day.

~~~
porker
> These guys are just desperately clinging to the past. They should be writing
> the Ruby on Rails for Opal instead of trying to extend the life of something
> that, frankly, has seen its day.

Care to expand? I'm confused what you mean.

~~~
Touche
I mean while the rest of the web world has moved on to the client being the
center of the universe the 37Signals guys are busy trying to make sure they
can still work the same way they have since 2005.

~~~
kayoone
I am all for SPAs and love Angular, but in reality >90% of the web world is
still using the document-oriented model of 37signals. That will change
eventually, but it's a long way to go.

------
paulbjensen
There is this interesting pattern of having template compilation be possible
from both the server and the client, such as with AirBnB's Rendr. It gives you
the flexibility to choose where you want that template compilation to occur.

With regards to Twitter's Time To First Tweet being slow under their Single
Page App, I recall Alex Maccaw mentioning that their JavaScript library
execution was inefficient, in that the client had to download a whole bunch of
JavaScript before it would load the part of the app responsible for rendering
the data.

I believe he was suggesting that if Twitter optimised the delivery so that
parts of Twitter's JS were served after the rendering part of Twitter's JS
code, then the time to first tweet could be faster.

In my opinion, delegating template compilation to the client offers a nice
separation of concerns; the server is the API, and the client is the UI.

It would be interesting if there was benchmarking done into comparing these
approaches, to see where server-side template compilation can be beneficial
over client-side template compilation.

~~~
agilebyte
Exactly: use RequireJS if you need to load parts of the client dynamically,
but use plain old HTML for initial _page_ delivery if you have a lot of
results.

It is also what is recommended for Backbone, for example:

[http://backbonejs.org/#FAQ-bootstrap](http://backbonejs.org/#FAQ-bootstrap)

------
dmazin
RJS was the most awful experience of old Rails. This comment has no other
utility, it was just really ugly and depressed me and I've never gotten to say
that before.

~~~
jops
Twas but just a blip in the otherwise excellent timeline of ever increasing
joy and happiness.

~~~
calgaryeng
Ever increasing happiness brought by an Omakase meal.

------
zinssmeister
"The template is just JavaScript instead of straight HTML." I guess that's one
way to make this work. But anyone who is serious about writing a web app with
a slick UI should build things around API calls and move the views to the
frontend (backbone.js & handlebars templates are great).

~~~
lotyrin
Yeah, did you finish reading? He covers that. The cost of doing so might make
sense for your projects, but so many people jump all the way to doing that
when their "app" is still mostly document-based and it's the wrong way to go.

~~~
zinssmeister
Yep, I finished reading, but I disagree with this approach. Even in a mostly
document-based project I would recommend using frontend templates. Fire-and-
forget type actions are just really easy to do and the whole thing just makes
more sense IMO.

~~~
lotyrin
But if you're building document-based CRUD app number one-billion-and-five,
where rendering templates is a significant portion of your app's execution
time, doing that in the frontend means you can't cache the output, so every
render for every user forever has to do all of the work. Sure, it scales well
in terms of user volume, because you will always have roughly as many
resources to process templates as you have users, but it doesn't scale as far
in terms of data size and template complexity, because if it gets slow there's
nothing you can really do.

Doing it serverside with russian-doll caching means that each template only
gets rendered once for a given set of inputs and (at least for most CRUD apps
I've ever built) you have huge cache hit rates, so it'd mostly be people
pulling static HTML out of memcache all day. Crazy fast.
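The caching argument can be sketched as a memoized fragment renderer: key the rendered HTML on the record's id plus its updated-at stamp, which is the essence of Rails' Russian-doll cache keys. Everything below is an illustrative stand-in, not Rails API:

```javascript
const fragmentCache = new Map(); // stands in for memcached
let renders = 0;                 // instrumentation for the sketch

function cacheKey(record) {
  // Rails derives keys like "messages/5-20130801143000" from the
  // record's id and updated_at timestamp; same idea here.
  return `messages/${record.id}-${record.updatedAt}`;
}

function renderMessage(record) {
  const key = cacheKey(record);
  if (!fragmentCache.has(key)) {
    renders += 1; // template work happens only on a cache miss
    fragmentCache.set(key, `<li>${record.body}</li>`);
  }
  return fragmentCache.get(key);
}
```

Touching the record changes `updatedAt`, so the key changes and the stale fragment is simply never asked for again; every render with unchanged inputs is a lookup.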

~~~
mattmanser
This seems like a poor defence.

Why couldn't you just memcache the json response for a given set on inputs?

Template rendering is _never_, _ever_ the slow bit in a CRUD app. Ever. At
its core it's simple string concatenation. Getting the data, the SQL, is
almost always the slow bit unless you're doing some serious computation or
file jiggery, which is still not template rendering.

~~~
way2throw
My Rails CRUD experience is the opposite. Template rendering always dominates
database time. Without caching, rendering templates with nested partials is
generally 5-10x slower than data access. Typical times are 20-50ms for
ActiveRecord, 100-300ms for views. This is in production environments with
production databases, Rails 2-4, even with alternative template engines that
optimize for speed.

~~~
mattmanser
Err, as a comparison: if my C#/ASP.NET MVC responses take more than 50ms I
start looking for reasons why they're going so slowly. An uncomplicated page
will take 20ms or so.

And that's on a small VPS.

But even going to Basecamp, the entire time waiting in chrome (which is
basically the entire server-side run time + a few ms) is between 150-200ms and
that's on what I assume is a fairly busy server.

So I suspect you're doing something wrong.

~~~
way2throw
> going to Basecamp, the entire time waiting in chrome ... is between
> 150-200ms

The numbers in my comment are "without caching". Comparing them to Basecamp,
which caches views and fragments heavily, is apples to oranges. Once I add
caching, my typical response times for a cache hit are in the 10s of
milliseconds.

You made an Amdahl's law argument that view caching is fruitless because
rendering is an insignificant part of total response time. So I responded with
_uncached_ performance to show why Rails needs view caching.

It's no accident that Rails has comprehensive caching support; that the Rails
team has worked hard to refine and optimize caching in each release; and that
DHH writes about it so often (including in this article). You can't have
performant Rails without view caching because rendering is dog slow.

------
schpet
Is this secure by default in Rails yet? I find it surprising that these
techniques are promoted at the same time that vulnerabilities are being
publicly disclosed:

[https://groups.google.com/d/msg/rubyonrails-core/rwzM8MKJbKU/fU28_YloK2MJ](https://groups.google.com/d/msg/rubyonrails-core/rwzM8MKJbKU/fU28_YloK2MJ)

~~~
nfm
I believe the fix for this (checking if the request is xhr) hasn't been
committed yet.

~~~
krapp
Is there a way to check that which can't be faked by altering the browser or a
js framework though?

I was under the impression that trying to validate that was ultimately as
fragile as checking the user-agent string...

~~~
dhh
It relies on a header, which can't be set through the attack vector, so it's
all kosher.

~~~
homakov
Since we are on the same page, could you help me in this discussion with
nzkoz?
[https://github.com/rails/rails/issues/11509](https://github.com/rails/rails/issues/11509)
We're talking about different things.

------
wldlyinaccurate
To me, this seems really convoluted - maybe as a non-rails developer I'm
missing some key information?

Why, for example, would you use this over client-side templating and data-
binding? Create the template once, grab some data with AJAX, bind it to the
template...

~~~
schrodinger
Then your initial page hit requires two round trips: one to download the page
and the JavaScript, another to get the AJAX data, and then also a JavaScript-
based rendering step.

This way, the rendering is done server side so the initial html comes with the
first request, and ajax updates are still generated using the same server side
code path, just sent back inside of some javascript.

~~~
Encosia
It's not much work to include the "first page" of data as a JavaScript
variable declaration embedded in the response to the initial page hit to avoid
that second round trip.
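That bootstrapping trick (the same one the Backbone FAQ linked elsewhere in this thread recommends) amounts to serializing the first page of data into the initial HTML. A sketch, with the variable name invented:

```javascript
// Server side: embed the first page of data in the initial response
// so the client can render immediately, with no second round trip.
function bootstrapScript(messages) {
  // Raw JSON isn't script-safe: a "</script>" inside a string would
  // end the tag early, so escape every "<" (stays valid JSON).
  const json = JSON.stringify(messages).replace(/</g, '\\u003c');
  return `<script>window.__BOOTSTRAP__ = ${json};</script>`;
}
```

The client then reads `window.__BOOTSTRAP__` on load instead of firing an initial AJAX request.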

------
izietto
I'm the author of this comment:
[https://github.com/rails/rails/issues/12374#issuecomment-294...](https://github.com/rails/rails/issues/12374#issuecomment-29408698)

I still don't get the benefit of using RJS (or SJR, or JSR, whatever) over
render :json: with the former you have the JavaScript spread across your
views, with the latter you can organize it inside the assets, which IMHO is a
way better solution.

~~~
rartichoke
It's not really mixed in with your views. You would put in a .js file that has
nothing but javascript in it.

Some benefits I see are:

1. It just works perfectly with and without JavaScript, which means your stuff
gets crawled by search engines and it works great for people without JS. Then
if you have JS enabled you get the enhanced UI.

In #1's case I do this all the time for search boxes. In admin dashboards that
I make with rails I allow people to search for data.

It's trivial with rjs to implement a solution that works with no javascript
and then if you have JS enabled you get your search results shown immediately
without a full page reload.

2. You can tell the RJS code to use server-side (ERB) partials that you
already have created, so you don't need to have templates made for both.

Or in #2's case if you just use jquery without using any template engine it
gets super messy trying to append elements to another element with inline html
strings. All of that is eliminated by using your existing erb partials.

~~~
izietto
I mean that you would have JS in your app/views folder together with the HTML
views, instead of only in the assets folder.

1. That doesn't depend on RJS/render :json, but on the fact that your
controller responds to sync requests in addition to the async ones.

2. You can tell render :json to use server-side (ERB) partials:

    
    
      render json: { html: render_to_string(partial: 'product') }
    

I do it all the time

~~~
rartichoke
I didn't know about #2, that's really cool. How does render_to_string handle
cached partials?

Does it still need to compute the output of the partial's string
representation on every request?

------
krapp
I've done this in PHP as a hack but I didn't realize it was supposed to be A
Thing, with an acronym and everything.

~~~
kayoone
haha, yeah that was my thought exactly :)

------
rdtsc
There is an interesting web framework that takes this approach as well -- N2O.

[http://synrc.com/framework/web/](http://synrc.com/framework/web/)

It can do server side rendering but then pushes it via a websocket (well with
fallback) to the client.

The use case I understand is mobile clients.

------
pilif
The thing about SJR is that it lends itself much better to progressive
enhancement: if you need/want to present a page which also works without JS,
that means that you need to be able to render the page on the server, which
means that by using SJR, you can easily reuse that same view code that you
already need to have for the non-JS clients.

Of course, as JS on the server becomes more and more widespread, you might as
well just use the same templates at both ends, but it's still more
infrastructure than treating the client JS as just part of the view.

I discussed this back in 2011 on my blog:
[http://pilif.github.io/2011/04/ajax-architecture-
frameworks-...](http://pilif.github.io/2011/04/ajax-architecture-frameworks-
and-hacks/)

------
callmeed
> _The combination of Russian Doll-caching, Turbolinks, and SJR is an
> incredibly powerful cocktail for making fast, modern, and beautifully coded
> web applications._

I'm personally not against SJR and I haven't really gone whole-hog into any JS
framework (Angular/Ember/Backbone) ... but almost every RoR developer I've
talked to is disabling Turbolinks on all their Rails 4 apps. I personally find
it slow to load/render in dev mode, which leads me to believe it would be
confusing for users in production.

Is anyone using/liking Turbolinks for Rails 4 apps?

~~~
hayksaakian
I use it in a small app (Rails 3 though) and while it may be faster, the issue
is that it does not follow the typical UX of a website.

Usually people:

- Click link
- White screen / loading (computer is doing something)
- Next page progressively shows up

Turbolinks is different:

- Click link
- Nothing is happening...?
- Next page appears all of a sudden

------
carsongross
Great stuff: we do ad hoc versions of this in a few of our applications. It
deserves a formalization.

In my brief thinking about this, one potential version would use HTML 5
attributes like AngularJS, but use HTTP/restful endpoints as "the model". I
can imagine a few different approaches, but something like:

    <div data-dyna-src="http://myserver/my-div-endpoint"
         data-dyna-method="poll 500ms"> ... </div>

Which would then poll the given endpoint and swap in new content in a
pluggable-but-visually-pleasing manner.
[http://myserver/my-div-endpoint](http://myserver/my-div-endpoint) would serve
up the partial of the div, so everything would be DRY. Basically move the
model back to the server, but still buff up the presentation layer a bit.

You'd probably need a few different patterns: updateable divs, forms,
updatable tables, progress bars, as well as some good default transitions and,
of course, make the whole thing pluggable, and potentially provide for both
HTML and script interchange at the end point (that's what our ad-hoc version
does: it provides a data channel, an html channel and a raw script channel,
but it feels like a hack.)

Anyway, if anything is going to save us from the oncoming javascript-filled
dystopian hellscape, it's probably something like this.

------
ilaksh
I thought that was called JSONP?

Anyway this is clever from a traditional web development perspective but not
from a contemporary one. I like to use AngularJS (with prerender.io when
necessary for server side rendering). I write all the non HTML code for both
the front and back end in ToffeeScript which is derived from CoffeeScript.

~~~
mikkelewis
This is made with an XHR request, not JSONP. JSONP requests are made by
inserting a script tag and defining a callback function to get around the
cross-site request limitations of XHR.

~~~
steveklabnik
... which now should generally be done with CORS, if you can.

~~~
byroot
As much as I love CORS, it unfortunately has a ton of browser compatibility
quirks... And not just from IE.

------
martinaglv
Rails is a great framework. But posts like this and the one discussing how the
team made product decisions around their caching strategy [0] and the horrible
hacks that they did to make it work, for me are telltale signs that Rails is
not the right tool for building web apps anymore.

Web development is evolving, as it ever has, and I believe that the next step
is pushing the view layer to the client, and caching _data_, not HTML, every
step of the way. The major hurdle here is SEO, but I hope that when the right
framework comes, it will offer an elegant solution to this problem.

[0] [http://37signals.com/svn/posts/3112-how-basecamp-next-got-to-be-so-damn-fast-without-using-much-client-side-ui](http://37signals.com/svn/posts/3112-how-basecamp-next-got-to-be-so-damn-fast-without-using-much-client-side-ui)

~~~
ds_
I wouldn't say SEO is a major hurdle; there is prerender.io
([http://prerender.io/getting-started#ruby](http://prerender.io/getting-started#ruby))
and countless other services which help (I made one).

------
lazyjones
Such things were popular more than 15 years ago, when websites had to conserve
bandwidth (for many users with slow links) and JavaScript wasn't considered as
dangerous as it is now (so people didn't use NoScript much and didn't frown
upon JS-only websites like I do).

------
togasystems
In general, is this a good practice? You do lose the ability to re-use
endpoints on mobile.

~~~
dhh
We reuse all controllers and models for our mobile views and for our API. We
just do that reuse at the server instead of through the client. So no loss
there.

------
thurn
Isn't it impossible to optimistically update the UI with this model? You need
to wait for a server roundtrip before you can display the results. There are
certainly cases where that's OK, but it can make an app feel a lot less
snappy.

~~~
molf
Optimistic updating needs a serious amount of code and robust error handling
on the client and is therefore non-trivial. This approach seems aimed at
reducing the amount of work while remaining relatively fast.

~~~
atjoslin
Unless you abstract away the non-trivial part with something like Firebase :-D

------
drinchev
Yep, that's a nice and promising technique. I'm currently working on a NodeJS
project and I'm using EJS and Handlebars templates on both the server-side and
the front-end. The benefit is impressive: I can deliver the same content on
page-load, and using fancy Backbone routing I can change the page dynamically
without duplicating code logic. I'm reusing some components on server-side and
client-side, so my templates will always render with the same context (JSON)
but in different JS environments. Amazing how far we can go, just to satisfy
Google and the SEO guys.

------
jcampbell1
I suppose this is okay, but it can quickly become a mess as soon as you need
to make the modified portion of the dom interactive:

    
    
        $('#messages').prepend('<%=j render @message %>')
          .find('.delete').click(fn)
          .end().find('.reorder').hover(fn,fn)
    

I suppose they work around this problem with event delegation.
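Event delegation handles this by putting one listener on the stable container and matching targets at dispatch time, so freshly prepended messages need no re-binding. A DOM-free sketch (nodes simulated as plain objects, since the principle is just the upward walk from the event target):

```javascript
// One handler on the permanent container; when an event fires on any
// descendant (including ones inserted later), walk up from the target
// looking for a matching class -- no per-element binding needed.
function makeDelegate(root, className, handler) {
  return function dispatch(target) {
    for (let node = target; node; node = node.parent) {
      if (node.classes && node.classes.includes(className)) {
        return handler(node);
      }
      if (node === root) break; // stop at the container boundary
    }
  };
}
```

jQuery's `$('#messages').on('click', '.delete', fn)` is this same idea packaged up, which is why the prepend-then-rebind dance isn't strictly necessary.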

If you use something like angular, the whole mess becomes:

    
    
        $scope.messages.unshift(message);

~~~
artellectual
Actually, there is a gem that solves that problem if you want to use SJR:

[https://github.com/xpdr/transponder](https://github.com/xpdr/transponder)

[http://kontax.herokuapp.com/](http://kontax.herokuapp.com/) - example app
running on heroku

[http://github.com/xpdr/kontax](http://github.com/xpdr/kontax) - example app
on github

------
briantakita
The web development world has been trending toward a Thick Client. This
article is a call back to the Thin Client and centralizing all logic in the
app server.

Most devs are still used to the Thin Client approach, so there is a
productivity benefit in staying with what's familiar.

However, you can have just as many or more productivity enhancing libraries
and practices in client side javascript as in the server side rails-like
frameworks.

------
bsaul
That made me wonder: is there a general consensus that Google will never ever
be able to crawl single page applications? Because after having played with a
client-side framework, I really see no other real reason to go back to server-
side rendering.

Looks like all this is really just a way to compensate for Google's lack of
advanced technology on the crawling side (I just loved that last sentence :))

~~~
TheAceOfHearts
I've gone to a few AngularJS meetups at Google and they seem to think it's not
a big problem. The crawler executes JS just fine, it's just a browser.

------
milos_cohagen
I enjoy dhh articles, but find them a bit hard to understand not being a
rubyist. dhh says:

2. Server creates or updates a model object.

3. Server generates a JavaScript response that includes the updated HTML
template for the model.

Why does a template need to be updated? Isn't that part of the idea of
templates, that they are fixed for changes in the model?

------
vojant
I don't understand what's the advantage of using SJR.

~~~
wakaflaka
I too did not read the article.

------
smegel
> unless you make you’re doing a single-page JavaScript app

Yeah, cuz like no-one is doing that anymore. /s

