Hacker News

This conversation seems to come up often. Server-side templating is still really important and isn't going away any time soon.

There are still too many unsolved problems for server-side templating to seriously be considered obsolete.

1. Search engines still can't reliably crawl JavaScript-generated content. Hacks like #! don't count.

2. Grade A browsers still don't properly support pushState(), and even where it is supported the experience is buggy in some cases.

3. There's still a decent number of people with JS turned off.

4. Having the client do the heavy lifting for your web site is a bad idea because you can't control the experience. Anyone without a decent computer is going to get a sluggish experience, and mobile performance is still very questionable.

#3 is probably the least important point, but it also depends on what your site is doing. If I'm selling a product with global reach and I ignore #3, I'm basically throwing away money, because India and China are massive markets with pretty high percentages of people without JS or on very old browsers.
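For context on the #! hack mentioned in point 1: Google's AJAX crawling scheme had crawlers rewrite a `#!` URL into a `?_escaped_fragment_=` query URL, which the server was expected to answer with a pre-rendered HTML snapshot. A minimal sketch of that URL rewrite (the function name is mine):

```javascript
// Sketch of the crawler-side URL rewrite in Google's AJAX crawling
// scheme: "http://example.com/page#!state" becomes
// "http://example.com/page?_escaped_fragment_=state", and the server
// must return static HTML for the rewritten URL.
function toEscapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang, nothing to rewrite
  var base = url.slice(0, i);
  var fragment = encodeURIComponent(url.slice(i + 2));
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + fragment;
}
```

Which is exactly why it's a hack: every crawlable state needs a server-rendered twin anyway, so you haven't escaped server-side templating at all.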

Regarding point #1:

Yes, it can be done, but it slows down the process a lot.

Can you go into more details?

Making your web server respond with server-rendered templates when it detects a bot user-agent, instead of serving back JSON as it normally would, isn't really what I would consider a solution.

It's a solution in the sense that you can satisfy both search engines and users, but it requires the developer to code basically 1.5 web sites instead of 1 (it's not really double the work, but it's certainly more work than just wiring up some routes that spit back JSON).
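The bot-detection approach being discussed here boils down to a check on the User-Agent header. A rough sketch (the pattern list is an illustrative sample, not exhaustive, and the render/response helpers in the comment are hypothetical):

```javascript
// Illustrative sketch: decide whether a request should get
// server-rendered HTML (for crawlers) or JSON (for the JS app).
// This bot list is a hypothetical sample, not a complete one.
var BOT_PATTERNS = /googlebot|bingbot|slurp|baiduspider/i;

function wantsServerRenderedHtml(userAgent) {
  return BOT_PATTERNS.test(userAgent || '');
}

// e.g. inside a request handler:
// if (wantsServerRenderedHtml(req.headers['user-agent'])) {
//   res.end(renderFullPage(data));   // hypothetical template render
// } else {
//   res.end(JSON.stringify(data));   // normal JSON for the client app
// }
```

The "1.5 web sites" cost is the `renderFullPage` branch: every crawlable view needs a server-side template in addition to the client-side one.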

I cannot go into technical details because that is a solution I worked out for Nuuton. It is possible, though, by using a helper bot: the crawler does not do the job itself, but gets the data passed to it by another bot.

I don't want to sound dickish, but I can't really give you much technical detail. Sorry.

Sounds like it's not solved. Don't you think Google would happily be crawling JavaScript sites already if the developer didn't have to do anything special?

I don't know what Nuuton is, but from the looks of the home page it seems to be some random no-name search engine that isn't even released.

I wouldn't really call that solved, considering most end users are using Google, Bing or Yahoo.

What "Grade A browsers" don't support pushState()?

My definition of grade A might be different than yours.

Grade A to me is a browser that is part of the "used just about everywhere" group.

Windows 7 ships with IE 8, so I consider IE 8 a grade A browser. Using tools like history.js to offer buggy fallbacks isn't acceptable behavior IMO.

Ah, yes, it is just a question of definitions. I took it to mean something like "has generally good support for standards," in which case no version of IE besides maybe 10 would make the grade.
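For what it's worth, feature-detecting pushState() is the easy part; the debate above is really about what to do in the fallback branch. A sketch of the check (taking the window object as a parameter is my choice, so it can be exercised outside a browser):

```javascript
// Returns true if the given window-like object supports the
// HTML5 History API (history.pushState). Accepting the object as a
// parameter keeps the check testable outside a real browser.
function supportsPushState(win) {
  return !!(win && win.history && typeof win.history.pushState === 'function');
}

// In a browser you would call supportsPushState(window) and fall
// back to full page loads or hash-based routing when it's false --
// which on IE 8 means server-rendered pages anyway.
```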
