There are still so many problems to solve before server-side templating can seriously be considered obsolete.
2. Grade A browsers still don't all properly support pushState(), and even where it is supported, it can be a buggy experience.
3. There's still a decent number of people with JS turned off.
4. Having the client do the heavy lifting for your site is a bad idea because you can't control the experience. Anyone without a decent computer is going to get a sluggish experience, and mobile performance is still very questionable.
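On #2, the usual way to cope is to feature-detect pushState before depending on it. A minimal sketch (the `win` parameter is just a stand-in for the browser's window object, so this can run outside a browser; names are illustrative):

```javascript
// Feature-detect the History API before relying on it.
function supportsPushState(win) {
  return !!(win && win.history && typeof win.history.pushState === 'function');
}

// Usage sketch: fall back to a plain full-page load when pushState
// is missing or untrustworthy, instead of shimming it.
function navigate(win, url) {
  if (supportsPushState(win)) {
    win.history.pushState(null, '', url); // update the address bar in place
  } else {
    win.location.href = url;              // old-fashioned full navigation
  }
}
```

The point is that the fallback path is a real page load, which is exactly the server-rendered case these lists say you can't drop yet.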
#3 is probably the least important, but it also depends on what your site is doing. If I'm selling a product with global reach and I ignore #3, I'm basically throwing away money, because India and China are massive markets with fairly high percentages of users without JS or on very old browsers.
Yes, it can be done, but it slows down the process a lot.
Making your web server detect a bot user-agent and respond with server-rendered templates, instead of the JSON it would normally serve, isn't really what I would consider a solution.
It's a solution in the sense that you can satisfy both search engines and users, but it requires the developer to build basically 1.5 web sites instead of 1. It's not really double the work, but it's certainly more work than just wiring up some routes that spit back JSON.
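To make the "1.5 web sites" point concrete, here's a rough sketch of that kind of branching, with the template layer and data layer stubbed out (all names are hypothetical, and real bot detection is messier than one regex):

```javascript
// Crude user-agent sniff for common crawlers (illustrative, not exhaustive).
const BOT_PATTERN = /googlebot|bingbot|slurp|baiduspider/i;

function isBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// Stubs standing in for your server-side template layer and data layer --
// this is the "extra half a site" you end up maintaining.
function renderFullPage(url) {
  return '<html><body>rendered ' + url + '</body></html>';
}
function fetchData(url) {
  return { path: url };
}

// One handler, two render paths: HTML for crawlers, JSON for the app.
function respond(req) {
  if (isBot(req.headers['user-agent'])) {
    return { contentType: 'text/html', body: renderFullPage(req.url) };
  }
  return { contentType: 'application/json', body: JSON.stringify(fetchData(req.url)) };
}
```

Every route that matters for SEO needs both branches kept in sync, which is where the extra work comes from.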
I don't want to sound dickish, but I can't really give you much technical detail. Sorry.
I don't know what Nuuton is, but from the looks of the home page it seems to be some random no-name search engine that isn't even released.
I wouldn't really call that solved, considering most end users are using Google, Bing or Yahoo.
Grade A to me is a browser that is part of the "used just about everywhere" group.
Windows 7 ships with IE 8, so I consider IE 8 a Grade A browser. Using tools like history.js to offer buggy fallbacks isn't acceptable behavior, IMO.