One can still design good API-oriented URIs with a server-side approach: you're just providing different serializations of the same resource (a nice HTML one for browsers and a JSON one for API access)... so there's nothing fundamental that Twitter has lost or abandoned here. They're just using the web better.
Server-generated HTML: the browser GETs it and renders it as it streams through.
The latter is arguably something that will lead to a better user experience once the page is in a "steady state", i.e. all dependent representations are loaded into the browser and rendered. But relying on it for "first render" makes for a slow experience when (e.g.) clicking on a link to an individual Twitter status.
Also, with client-side rendering you execute more code on the client but less on the server, so in an environment like Twitter, where heavy caching isn't possible (everybody sees something different), you're simply trading time on the server for time on the client. Not faster, not slower.
Server-side HTML generation is not a magical 0ms process.
I'm not quite sure that it's as much of a zero-sum game as you present it. I can easily think of scenarios where rendering on the server is much faster (e.g. using a compiled language vs JS, taking advantage of powerful hardware, granular caching, etc) and much more constant.
In theory it sounds right. However, there are a couple of cases where users will have to load the JS far more often than they should. Since most of the logic lives in the JS file(s), those files will change and be pushed out much more frequently, forcing users to re-download the JS every time code is deployed.
Also, I am not sure what percent of "New users" land on Twitter pages, but they will have to download the JS.
And generally, from a user's perspective, you merge the steps "show site" and "show content" back into "show site with content".
So even if the server takes just as long to generate the HTML (and I don't think it does), the perceived speed is higher when the site loads complete in one step, compared to loading, showing something, then loading again to show the rest.
edit: parasubvert was faster and said the same thing with fancier words;)
Also, it's perfectly possible with client-side rendering to show a blank page until you have all the data. Would it be perceived as being faster? Well, you can't really say that until you test it, can you?
On a side note, it almost seems like you're trolling - given that your own site seems to render content on the server. ;)
> Why would you assume that client-side HTML generation is slower than server-side generation?
* server hardware is assumed faster than my smartphone
* servers can share caches
* http conditional gets can now apply to rendered content (as opposed to templates and data, which would require a re-render client side)
It takes serious balls to admit you were wrong after you kicked off an entire avalanche that has been breaking the web ever since. Yes, a lot of us knew then it was a ridiculously bad idea and we all said so; but for them to actually take this advice after going so out of their way to go completely client-side is just fantastic. Kudos, twitter folk!
Also, as some people already noted, Twitter didn't abandon client side rendering. If you go on the site now, you will still get redirected to the #! page.
You want this: http://engineering.twitter.com/2012/05/improving-performance...
Besides, in many cases I think you can fix performance issues on a one off basis where a page that gets too slow because of heavy JS and/or API proliferation could move server side if that makes sense while the rest of the app rendering stays client side. It doesn't have to be all or nothing.
What does your post add to theirs? Honestly?
Then link to them here as well. Save your own blog post for when you've got some original insight to add, then submit it to HN. Not before.
>I'm looking to get discussion here and on Twitter...while I formulate my own thoughts...and more analysis. That's why I said...your the detail of this post.
That's lazy as shit.
If the initial JSON comes bundled with the page, the only remaining cost is parsing and executing the JS? I thought that was what browsers were good at these days...
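For reference, "bundling the initial JSON with the page" usually means something like the following sketch (markup and names hypothetical): the server embeds the data in the HTML it sends, so the first render needs no extra API round trip, only a JSON.parse on the client.

```javascript
// Server side: embed the bootstrap data in a non-executing script tag.
const initialData = [{ user: 'jack', text: 'just setting up my twttr' }];
const page = `<script id="init" type="application/json">` +
             JSON.stringify(initialData) + `</script>`;

// Client side on load (DOM access stubbed with a regex for this sketch):
const embedded = page.match(/<script[^>]*>(.*)<\/script>/)[1];
const tweets = JSON.parse(embedded);
console.log(tweets[0].user);
```

The round trips disappear, which is why the remaining slowdown has to come from parsing and executing the client-side code itself.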
What Twitter was experimenting with was a thick client. The initial page load would contain little more than the basic Twitter architecture and a URL, and this architecture would send off more requests (individual tweet, profiles, responses, lists) based on the contents of the URL, updating the page to fit what it gets back each time.
In a salad bar, you don't get one plate. There's a whole barrage of vegetation to select from, and you have to walk around yourself to get any of it. You also have to spend time deciding what you want. Metaphorically, the salad-eater is your web browser, and that time it takes to send and receive more requests, as well as make decisions and display the results, turns a 140-character message into a slowly loading behemoth.
That's the essential difference.
In this case it would be moving some of the controller logic (the parts responsible for rendering views and dispatching events from the UI) to the browser, and using an API to communicate with the model (and the rest of the controller).
So, if I'm going to display a page of 10 tweets, in the classic server side architecture, I load the tweets, render a view template based on a context containing those tweets, and then send the HTML to the browser.
In an API-based web application architecture like this, I _always_ send the same HTML, which is cached in memory and is almost a no-op to send to the browser, then the client side scripting looks at the URL of the page and makes a call to a JSON or XML API that will result in those 10 tweets. These tweets are then rendered on the client side by the browser.
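A toy version of that flow might look like this (all URLs and names hypothetical): the static shell ships once, then client code maps the page URL to an API call and renders whatever JSON comes back.

```javascript
// Map the page URL to the API request the client would fire.
function apiPathFor(pageUrl) {
  // e.g. /jack -> /api/1/statuses/user_timeline.json?screen_name=jack
  const name = pageUrl.replace(/^\//, '');
  return `/api/1/statuses/user_timeline.json?screen_name=${name}`;
}

// Render the JSON response into HTML in the browser.
function renderTimeline(tweets) {
  return '<ul>' +
    tweets.map(t => `<li><b>@${t.user}</b> ${t.text}</li>`).join('') +
    '</ul>';
}

// Pretend the API answered:
const tweets = [{ user: 'jack', text: 'just setting up my twttr' }];
console.log(apiPathFor('/jack'));
console.log(renderTimeline(tweets));
```

Note that the server's work per request shrinks to serving a cached shell and a JSON payload; all the template work moves into the browser.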
This is beneficial in a few ways:
* You can re-use the exact same API and expose it to third parties, who can then write apps with all the functionality of your webapp, and you've one less thing to test.
* Rendering a template can be a (relatively) time-consuming operation that may require some I/O. On a single-threaded platform like Node.js, where you want to dispatch each event (request) as fast as possible and move on, template rendering ties up the event loop for a comparatively long time.
* If the time is going to be spent rendering that template anyway, you may as well crowdsource it and let the browser handle it. The browser is probably only processing one page load at a time, whereas your servers could be processing thousands or millions.
The cons of using this system are related to the benefits:
* The API you use is likely to suffer from Abstraction Inversion, or if that's intentionally avoided, a Leaky Abstraction. The reason for this is that there may be some pages to be rendered that require more complex, or less abstract, queries on the data. On the server you could easily issue a JOIN query, but it may not make sense to expose such a thing directly via a RESTful interface. Therefore your application may end up firing numerous API requests and joining the results manually in order to complete a request. If you DO expose this functionality, it will probably end up looking very out of place and very specific.
* The extra time spent by the browser may not be suitable in all cases. It poses a number of accessibility and performance problems on certain devices. I may not be able to use a certain website with a screen reader because of this, or I may not be able to view it on an ancient version of Opera running on a shitty old phone over 3G. Even on a modern smartphone, the time from seeing a Twitter page's header and background with the loading indicator to actually seeing the content was a pain in the ass.
* Depending on your application and architecture, you may be able to deliver even better performance by caching rendered templates than by offloading it to the browser. For example, I believe Reddit pretty much prerenders and caches every single page on the site, which is why most stuff loads instantly, but at peak times it takes a while to load the message inbox, or the 80th page of posts. This might be made even worse by extra API calls and client side rendering.
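The last point, caching rendered templates, can be sketched like this (hypothetical cache, not Reddit's actual setup): hot pages are rendered once and served from memory, so only cold or per-user pages (like a message inbox) pay the render cost on each hit.

```javascript
// A minimal in-memory render cache for server-generated pages.
const cache = new Map();
let renders = 0;

function renderPage(key, items) {
  renders++;                                   // count actual template work
  return `<h1>${key}</h1><ul>` +
         items.map(i => `<li>${i}</li>`).join('') + '</ul>';
}

function getPage(key, items) {
  if (!cache.has(key)) cache.set(key, renderPage(key, items));
  return cache.get(key);                       // repeat hits skip the render
}

getPage('/r/programming', ['post a', 'post b']);
getPage('/r/programming', ['post a', 'post b']);  // served from cache
console.log(renders);
```

A page cached this way is served at roughly memcpy speed, which client-side rendering can never match since every browser must redo the template work itself.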
If I may, I'll employ another Anti-Pattern, Cargo Cult Programming. This means that a team sees some method or 'One True Way' and starts implementing it without understanding why.
It's almost guaranteed to go wrong, and it tends to apply a lot to things that are hip, like this architecture and things like NoSQL.
Not really sure "web application architecture" unambiguously indicates "client side controller", but it does in this dialog when contrasted with "server side architecture".
Are we surprised to see a step towards SOA (presumably from OO that was also at the service layer)?
For me, web/mobile apps architected from a primarily OO perspective can turn out very differently (often with unique OO bottlenecks) from apps architected from a service-first perspective, which use OO to fulfill services.