
Don't depend on JavaScript to render your page - dwltz
http://blog.donnywals.com/dont-depend-on-javascript-to-render-your-page/
======
meetingcpp
> Error establishing a database connection

Hm, don't depend on a database when rendering your page? _scnr_

~~~
dwltz
wow that's embarrassing! Should be back up now..

~~~
kephra
Don't depend on databases to deliver static content; that could be the lesson
learned.

[http://kephra.de/blog/Make_here_CMS.html](http://kephra.de/blog/Make_here_CMS.html)
<- shameless plug ;-)

This static site generator only requires make and bash. Being bilingual
requires JS at the client, and the picture galleries are created by a
seven-line XSLT/PHP script under Makefile control.

~~~
dwltz
Absolutely a lesson learned. I'm actually working on a Wintersmith version of
my blog...

------
sheraz
I'm amazed at how fast sites load when javascript is disabled. It reminds me
of the early days of broadband. Like the first time I sat at a college campus
computer that had a 100Mb connection -- my face melted off.

I actually have javascript disabled on one of my mobile phone browsers simply
for speed reasons.

------
aikah
You know, if some devs don't know better, all they need is a bit of education.
Sometimes when you build your awesome product, you forget basic things that
can be useful. And it doesn't have to be a huge investment: at least render
the homepage on the backend so people with JS disabled know what it is all
about, and then upgrade to whatever JS your app needs.
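
For instance (a minimal sketch, assuming Node with Express; the markup and
port are made up):

    // Render a real HTML homepage on the backend; client-side JS only
    // enhances it afterwards.
    const express = require('express');
    const app = express();

    app.get('/', (req, res) => {
      // Visitors with JS disabled (and crawlers) still see what it's about.
      res.send(`<!DOCTYPE html>
        <html>
          <head><title>My App</title></head>
          <body>
            <h1>My App</h1>
            <p>A one-paragraph description of what this app does.</p>
            <script src="/app.js" async></script>
          </body>
        </html>`);
    });

    app.listen(3000);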

However, some STUPID developers need to be named and shamed. Like this one:

voxxed.com

that displays A BLANK PAGE ON PURPOSE WHEN JS IS TURNED OFF.

Proof, in their CSS:

    .hidden, .no-js {
        display: none;
    }

That is absolutely revolting.

------
kephra
_ok_ I can finally read your article. And sorry, you are missing the main
point.

Rendering static content client-side using JavaScript is a bad idea, even if
you have a fast network in terms of latency and speed.

The main problem with this AJAX antipattern is that it blocks all spiders and
search engines. So you should not do that unless you are Google Docs, which
can inject the content into its own search engine and wants its content not
to appear on other search engines.

As a rule of thumb: do not use JavaScript for anything that should be indexed
by a search engine. Instead, use JS only when interacting with humans, e.g.
for blog comments, checkout and payment. Hiding comments by making the comment
system AJAX might be a good policy for a blog, if you do not want the comments
to appear in search. But the blog posting itself should be static content.
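
A minimal sketch of that comments policy, assuming the server exposes a
/comments.json endpoint (the endpoint and element id are made up):

    // The post itself is static HTML; only the comments arrive via AJAX,
    // so they stay out of search while the article remains indexable.
    fetch('/comments.json')
      .then((res) => res.json())
      .then((comments) => {
        const list = document.getElementById('comments');
        for (const c of comments) {
          const li = document.createElement('li');
          li.textContent = `${c.author}: ${c.text}`; // avoids HTML injection
          list.appendChild(li);
        }
      });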

~~~
nawitus
Google executes JavaScript for indexing these days.

~~~
technomancy
Sorry, but Google is not the only bot that matters.

------
valuegram
As someone who is considering using a JavaScript framework for client-side
rendering on a new application, this strikes me as an anecdotal over-
simplification. We can find terrible uses of any technology. I'm not sure that
is a reason not to use that technology.

I believe the theory is that your application may take a little longer to load
initially, but subsequent interactions should be much faster, since all
rendering is happening on the client and the server is only serving API
responses. There is also the potential for even the initial render to be
faster, depending on how much server-side rendering logic is transferred to
the client.

~~~
lucaspiller
> subsequent interactions should be much faster, since all rendering is
> happening on the client, and the server is only delivering API interaction

It depends on what you are actually doing and how much you cache client-side,
but in most cases that isn't going to make things faster for the end user.

Take the example of a theoretical blog post that takes 10ms to get from the
database and 50ms to render on the server. You could potentially save server
capacity by shifting that 50ms of rendering* from the server to the client,
but in fact the client (a low-powered mobile device) probably renders it
slower than the server does.

The main thing, though, is that this doesn't take into account the 2500ms
round-trip time it takes the mobile device to make an HTTP request. This is
going to be the same whether you render on the client or the server.
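
Back-of-the-envelope, with the numbers above:

    server-rendered: 2500ms RTT + 10ms database + 50ms template ≈ 2560ms
    client-rendered: 2500ms RTT + 10ms database + serialisation
                     + client-side template ≈ 2560ms or more

Either way, the round trip dominates.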

Now, I agree it makes sense in some cases: if you cache the data on the client
(e.g. an email client could cache the 100 most recent emails) then it will be
faster, but most services that use client-side rendering aren't doing this, so
there is no real benefit to the end user.

(Apologies if I'm just repeating the article, but it's down for me)

*Serialising into whatever format your API uses is still rendering, so you aren't actually saving the full 50ms.

------
dmak
Honestly, if it is taking that long for your Handlebars templates to load,
then you have bigger issues. For example, perhaps the website is waiting for a
huge response from across the world to resolve, and there is no way for any JS
framework to render without the proper data.

------
onion2k
It's a pretty shambolic website, and a good example of why rendering some
things on the server is useful, or better yet, baking to a static site.

But...

Clissold Leisure Centre is run by a charity. They've clearly gone for the
cheap option here. Sometimes the cost outweighs the benefit of spending more
on the site - they needed a feature-rich online booking application and got
one that's a bit slow. It's possible that this is the best they could afford.
It's also possible that the rendering time of the site has no impact on the
signups or bookings that the leisure centre takes. In which case you have to
ask: _why should they spend more on a better app if there's no benefit to
their business?_

------
nordicway
At least in Angular, you can just use ngCloak to avoid the flickering:

[https://docs.angularjs.org/api/ng/directive/ngCloak](https://docs.angularjs.org/api/ng/directive/ngCloak)
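
For anyone who hasn't used it: you put the ng-cloak attribute (or class) on
the element, and a CSS rule hides it until the Angular compiler removes the
attribute. The docs suggest putting the rule in your own stylesheet so it
applies before angular.js loads; a shortened version:

    /* Hide ng-cloak'd elements until Angular compiles them and
       removes the attribute/class. */
    [ng-cloak], .ng-cloak {
      display: none !important;
    }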

~~~
heinrich5991
That doesn't address the issue that the page takes a long time to load; it
just hides some of the symptoms.

------
TickleSteve
This exemplifies a problem with a lot of software these days; the "everything
_must_ be dynamic" attitude.

If something _really_ is dynamic, use a dynamic technique. If something is
static, use a static mechanism.

Simple really...

------
1971genocide
This whole Handlebars-Mustache-Angular way of rendering pages is retarded
(sorry for my strong language).

I am currently using a virtual DOM to render and I have no problem "rendering
my page using JavaScript", even with an insane amount of page data that is
rendered dynamically.

Using pre-computed pointers to your page elements makes so much sense coming
from a background in embedded systems.

I have no idea how web developers think they can get away with adding so much
overhead to something as simple as rendering a page. We have bigger problems
in the world, like hunger and machine learning; why are you guys stuck on
figuring out how to render a page??

~~~
mixonic
Your language is completely inappropriate. It is not strong, it is
inappropriate and you should edit to remove it. Thanks.

[edit] someone gave me my first down-vote for posting this. I'll stand by it.

~~~
picks_at_nits
Good for you to say so, and good for you to stand by it. Right and wrong is
not a popularity contest. If anything, really right things are often
unpopular.

------
maxwerr
Funny how many people here seem to think you can't have a JS site and get
crawled... check the user agent and direct spiders to a prerendered site?
Prerender.io?
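
A sketch of the idea, assuming Express (the bot regex is nowhere near
complete, and the snapshot layout is made up):

    const express = require('express');
    const app = express();

    // Naive user-agent sniff: send known crawlers a prerendered snapshot
    // instead of the empty JS app shell. Sketch only: no path sanitising.
    const BOT_RE = /googlebot|bingbot|yandexbot|baiduspider/i;

    app.use((req, res, next) => {
      if (BOT_RE.test(req.get('User-Agent') || '')) {
        // snapshots are assumed to be baked to disk ahead of time
        return res.sendFile(`${__dirname}/snapshots${req.path}.html`);
      }
      next();
    });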

~~~
technomancy
User agent sniffing? Seriously? I thought that died the death it deserved a
decade ago.

~~~
sfeng
Google runs page javascript now.

------
btbuildem
Yup, still down.

------
matt_oriordan
Looks like you shouldn't depend on a server or TCP/IP either to render a page,
given the website is down ;)

[http://www.downforeveryoneorjustme.com/http://blog.donnywals...](http://www.downforeveryoneorjustme.com/http://blog.donnywals.com/dont-depend-on-javascript-to-render-your-page/)

~~~
dwltz
Yeah... I was not at all prepared for this many people to visit this article.

~~~
SigmundA
The irony is that had you used javascript to render the page, taking advantage
of everyone's CPU, you might have lightened the load on your server.

History seems to repeat itself. First there were dumb terminals, where the
mainframe did all the work. Then the age of the PC, with heavy clients. Then
the age of dumb browsers, where the server did all the work again. Now the age
of heavy browsers running javascript.

The thing is, you have a capable programming language able to utilize
distributed CPU resources in a safe manner; why would you not want to take
advantage of it? Because a dumb spider can't crawl it? The simple fact is this
is happening because it's obvious, and it would be like fighting the rising
tide to deny it, IMO.

~~~
slifin
Would the performance difference between the server sending javascript with
the logic for rendering vs. the server just rendering be so significant as to
have saved his website from a Hacker News hug?

~~~
tinco
It depends on the exact style, but yes. He could have had the majority of his
page statically served or cached, and then loaded the dynamic parts (most
likely the comments) in with JavaScript. At least the article would then have
been readable even if the comments didn't load.
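
Sketched out, assuming Express (the paths and the storage stub are made up):

    const express = require('express');
    const app = express();

    // Article pages are baked to static HTML ahead of time, so a traffic
    // spike mostly hits the file cache instead of templates + database.
    app.use(express.static('public', { maxAge: '10m' }));

    // Only the comments are dynamic, fetched by the page after it loads.
    const loadComments = () => []; // stub for whatever storage you use
    app.get('/comments.json', (req, res) => {
      res.json(loadComments());
    });

    app.listen(3000);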

~~~
vinceguidry
No, that can still bork your server. Too many async requests will kill it just
like too many page loads, only now you've got each client going back and forth
to the server.

The real way to do comments is the same way we've been doing it for years.
Post to a form, load the new ones with a new page load. Better yet, push that
functionality off to a system designed to handle it, i.e. HN.

------
supercoder
Though in this case, rendering the page with JavaScript probably would have
taken the load off your server and prevented it from denying requests.

~~~
pdkl95
The best speed, obviously, would be to simply serve up static pages (possibly
cached from a dynamic template, if necessary).

Rendering _only_ with javascript is only useful when the client _loads and
runs_ the javascript. Those of us who see javascript as a _very serious_
security and privacy issue get the actual page that is served up. I strongly
recommend making that page a _useful_ page in some way.

If a tool or framework doesn't provide that page, then maybe it's time to
start filing bug reports about that tool's broken output.

~~~
krapp
To me, rendering in javascript makes the most sense when you expect
incremental updates. For instance, a forum or comments page where finding and
fetching a couple of new bits of text would obviously take less effort than
rendering and serving the entire document for that same bit of text. I would
agree with you that if you have a page that never changes, and there's no
benefit to fetching assets asynchronously, and you're not doing anything
dynamic, then you might be better off with static pages and caching.

Although, to be fair, the number of people on HN who care about the security
implications of javascript is grossly out of proportion to the general case.
Near enough to 100% of people have javascript turned on by default that
everyone else might as well be a rounding error.

------
dingdingdang
I'm tired of these "javascript is dangerous for rendering sites" articles
(sorry if this is not one, I can't load it at the moment!) - instead there
should be one big-ass article saying: "javascript is fine for incremental
rendering of features; the URL should always reflect these increments, and
hence hard reloads should be able to pull a fully server-side-rendered version
of the site". END OF STORY.

~~~
dwltz
That was exactly the point this article is trying to make: JS shouldn't be
your only rendering method.

