
Modern Web Applications are Here - craigkerstiens
http://lucumr.pocoo.org/2011/11/15/modern-web-applications-are-here/
======
nikcub
With the last three projects I have started (none live, yet) the backend is
nothing more than a REST API and the frontend a single HTML page with
javascript, javascript, javascript.

This is what web applications are now and will be. The user experience cannot
be compared to the old style of application. If you are still building
applications today that are GET, fetch, pause, render, etc. then you are years
behind. It is awesome being able to click on a link to a 10-page forum thread
or blog comment page and have it render in ~100ms.

I think all of the web server frameworks will have to adapt - from RoR to
Django etc. - since a lot of what they do is being moved to the client (and it
becomes even cheaper to run large-scale web services because of this). There
are also tons of gaps on the client side - from more capable and cacheable
templating engines through to a full-stack framework (something like RoR for
Javascript, but less confusing and hard to use than the current options).

The server now is just db+REST, auth and pubsub - soon enough somebody will
release a generic PaaS that does all this based on a schema. Almost no more,
or very little, server-side code (unless you insist on supporting old HTML
clients).
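To make "db+REST based on a schema" concrete, here's a toy sketch - all names
invented, not any real PaaS - of a generic handler that maps REST verbs onto
an in-memory store, with resources addressed as /&lt;collection&gt;/&lt;id&gt;. A real
service would derive the collections from a schema and add auth and pubsub:

```javascript
// In-memory "database": collections of records keyed by id.
const store = { posts: { '1': { id: 1, title: 'hello' } } };

// One generic handler covers every resource: no per-model server code.
function handle(method, path, body) {
  const [, collection, id] = path.split('/');
  const table = store[collection] || (store[collection] = {});
  switch (method) {
    case 'GET':
      // /posts/1 returns one record, /posts returns the whole collection
      return id ? table[id] : Object.keys(table).map(k => table[k]);
    case 'PUT':
      table[id] = body;
      return body;
    case 'DELETE':
      delete table[id];
      return null;
    default:
      return undefined;
  }
}
```

The client then does all rendering; the server never sees a template.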

~~~
rev087
> With the last three projects I have started (none live, yet) the backend is
> nothing more than a REST API and the frontend a single HTML page with
> javascript, javascript, javascript.

You might be doing that already, but the article mentions (if I got that
right) that if you GET an arbitrary URL, the server will still serve the whole
rendered page - asynchronous loading only happens on subsequent requests.

This way, you don't "break" URLs, pleasing search engine crawlers and allowing
copy/paste of the links, and still delivering a great experience for the user.

I truly believe there is a big demand for a Node.js framework to facilitate
this. There are already a few that allow this kind of architecture (to some
degree), but none are quite "there" yet.

~~~
moonboots
What are some of the frameworks you've seen? I've been looking for frameworks
that allow client-side javascript routing to be reused on the server. So far
I've found two, both using a combination of backbone.js and jsdom to render
pages on the server:

<https://github.com/Morriz/backbone-everywhere>
<https://github.com/developmentseed/bones>

~~~
jordow
Rendering arbitrarily on either the client or server is hard. Other solutions
that I saw forced the developer to think about it, so I developed FaxJs,
which lets the system handle that complexity for you.

<https://github.com/jordow/FaxJs>

The gist is that you just write standard declarative UI structures in pure
javascript and the system disassembles the markup on the server, and
reassembles it on the client with all the events still intact. You wouldn't
know all that is happening just by looking, though.

    
    
        ...
        var twoDivs = {
          className: 'outerDiv',
          onClick: this.outerDivClicked,
          innerDiv: {
            className: 'innerDiv',
            content: 'inner-most-div!'
          }.Div()
        }.Div();
    

Or you can just do it all on the client too, if that's your thing. You'll need
to work to get this integrated into your routing technology as with any
rendering system.

------
bretthopper
I agree that Modern Web Applications are here. The key point is that they
don't _have_ to be architected the same as Battlelog to be considered modern.

Most of us are so concerned (or straight up scared) about the JavaScript/HTML5
revolution that we can't see what is possible right now. Sharing templates
between server and client side is a magic bullet that everyone should be using.

Web applications are only going to get faster, and if you want to keep up,
you're going to have to implement modern solutions like Battlelog and
Google+ do.

I can envision a time when full page refreshes are the exception and that
isn't a bad thing.

~~~
Joeri
You don't need to involve the server in generating the UI at all. You can have
a static index.html, that bootstraps a javascript environment from static js
files, whose first order of business is to contact a web service to fetch the
initial content and configuration as JSON data. This makes all content static
except that which is truly dynamic, which offers amazing CDN and caching
possibilities. The JSON data can be cached in local storage, for easy offline
support without constantly revving manifest files.

Once you get at that point, the dev experience starts to look a lot like
traditional desktop client-server app development. Except desktop apps are
built using components instead of templates, components that combine rendering
and behavior into one encapsulated entity that you can treat like a black box.
There's nothing preventing a web app from using that same architecture. In
fact, ExtJS does just this.
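The bootstrap flow described above can be sketched in a few lines. Here
`fetchJson` and `storage` stand in for an XHR to the web service and for
localStorage - both names are hypothetical, and a real version would be
asynchronous:

```javascript
// Static index.html loads this script; first order of business is to fetch
// the initial content/config as JSON and cache it for offline support.
function bootstrap(fetchJson, storage, key) {
  try {
    const config = fetchJson();             // normally an XHR to a web service
    storage[key] = JSON.stringify(config);  // cache in (local)storage
    return config;
  } catch (e) {
    const cached = storage[key];            // offline: fall back to the cache
    if (cached) return JSON.parse(cached);
    throw e;                                // nothing cached, nothing to show
  }
}
```

Everything except the JSON is static, so the HTML and JS can sit on a CDN.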

But I don't think it stops there. I think we're going to get meta-
configuration languages like mxml to send the layout of these components to
the javascript environment, which will, essentially, mean that we're building
another browser inside the browser. And so all software evolves until it can
render a web page, including web apps.

~~~
ntoshev
If you do this, crawlers can't index your pages. Also, your users see the page
load blank and get gradually populated with content, which is not good UX.

~~~
maccman
Your first point is not correct. Crawlers can crawl JavaScript only websites
using the Ajax crawling API:
[http://code.google.com/web/ajaxcrawling/docs/specification.h...](http://code.google.com/web/ajaxcrawling/docs/specification.html)

Your second point has merit. However I'd counter it thus:

* You can load a HTML page first, containing a representation of the end
content, so you can avoid the flash
* You can show some sort of loading indicator on page load. Users are usually
fine with waiting for the initial page load - it's only subsequent
interactions that need be fast.

In other words, there's no reason why the page need be blank.
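The mapping the crawling scheme uses is mechanical: the crawler rewrites a
"#!" URL into a `_escaped_fragment_` query parameter the server can see, and
the server answers with a rendered HTML snapshot of that state. Roughly (the
exact escaping rules are in the spec; `encodeURIComponent` here is an
approximation):

```javascript
// Rewrite http://example.com/#!/users/1 into the form a crawler requests:
// http://example.com/?_escaped_fragment_=%2Fusers%2F1
function toCrawlableUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url;  // no hashbang, nothing to rewrite
  const base = url.slice(0, i);
  const sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(url.slice(i + 2));
}
```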

~~~
sirn
However by using hashbang URLs, you have to keep that little snippet of
JavaScript forever to maintain permalinks, even after the web has moved on to
another solution.

It's not a future-proof solution.

~~~
fooandbarify
Hashbangs are only a stop-gap at this point - modern browsers ship with the
HTML5 History API.
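With the History API you keep real paths instead of hashbangs: in a browser
you'd call history.pushState(state, '', path) on navigation and re-render on
the popstate event. The route parsing itself is plain string work - the
/users/1 shape below is just an invented example:

```javascript
// Turn a real URL path into a route description. In the browser this runs
// both on initial load (location.pathname) and on every popstate event.
function parseRoute(path) {
  const parts = path.split('/').filter(p => p.length > 0);
  return { resource: parts[0] || 'home', id: parts[1] || null };
}
```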

~~~
quanticle
Does IE8 ship with the History API?

~~~
akavlie
No, IE8 is not a modern browser by most definitions.

~~~
quanticle
Which definitions? I've seen definitions of "modern browser" that only exclude
IE 6 and 7.

~~~
dextorious
They are wrong. They use modern to mean "quite new and available to most
clients" instead of "actually implementing the majority of the latest
standards".

IE 9 comes close, and can be considered modern.

------
azov
I'm not sure I agree with the "work of beauty" statement. A browser plugin?
Rendering all pages on the client via some massive JS framework? Intercepting
all page loads and hooking into browser navigation? Compiling templates into
Python & Javascript? Is all this complexity really justified?

I know that web apps are all the rage these days, but given the native plugin,
the pickiness about browser versions, the fact that they apparently don't care
about being indexed by search engines, and all the trouble they went through to
make it all work together - wouldn't they be better off just implementing the
whole thing as a native app?

~~~
spitfire
This is a throwback to thick client-server computing. Welcome to 1992
everybody!

I'm not a web dev, but I like the client side rendering (even if I hate JS
with a passion). This is a fantastic move as it lets you very clearly separate
presentation from business logic and the database. Also, the rendering HTML/JS
can be easily cached. So you get a normal page load the first time you try the
app; every time after that you get _instant_ results. Nice.

~~~
maxwell
Why do you hate JS?

~~~
robmcm
It's an immature language with lots of design mistakes. When the bible for JS
developers is called 'JavaScript: The Good Parts', it's a worry...

------
Fluxx
While I'm excited about "modern web applications," my understanding is that
they're harder to develop and test (I haven't done one yet). If that's
true, it's worth considering whether the ROI is there to make your app a single-
page javascript app versus how fast you can iterate via the more standard
method. You may not have the traffic where offloading rendering to the client
makes sense. It's likely many actions your app does are "fast enough" as well,
so a rich UI experience isn't going to be a huge improvement. Reserving the
highly interactive bits for where it really counts may be a better idea.

That said, it's only a matter of time before the "harder to develop and test
for" goes away and rich apps will become more of the norm.

~~~
andrewmccall
I think they're only really harder to develop inasmuch as it's more like
developing two applications: one backend that throws out JSON/XML or whatever,
and a frontend in HTML/Javascript.

I actually find that bit easier, because I can separate out the parts and
worry about things a piece at a time. The other benefit of this is if you
chose to develop for iOS, Android or any other platform, you already have an
API. Equally if you wanted a developer program, again it's already there.

The complexity comes when you make the decision as to whether or not to
support clients that can't or won't execute Javascript, if you want to make it
work for them too you're stuck with more work and things do get more
difficult.

As far as testing goes, there are plenty of mature javascript unit testing
frameworks, and if you're inclined to unit test your javascript already this
doesn't add a great deal of overhead. You're really just testing something in
JS you would have otherwise done and tested server-side.

~~~
ma2rten
I have to agree with you that "developing two applications" is actually
simpler, especially as your application matures and when you want to start
developing different kinds of clients (e.g. mobile).

However, my experience is that it still has some major drawbacks. Javascript
frameworks (testing and otherwise) are still less mature than their server-
side counterparts. It is not always as obvious how things should be done.
There are still lots of incompatibilities between browsers. And when an error
happens you cannot log it (or at least it takes more effort).

------
modeless
Interesting, as Battlelog has been criticized by the gaming press including
the Penny Arcade guys, who call it buggy and hard to use.

I have to say, too, that in my experience the more client-side state a web
page keeps, the buggier it tends to be. Building applications this way is
harder, and I hope we don't end up losing the characteristics that have made
the Web so successful in the transition.

~~~
ktsmith
Battlelog by itself would be fine if it was just for statistics and
socializing. On the PC it's also how you launch the game as there's no in game
server browser. When you launch BF3 it launches origin which in turn opens
your browser to battlelog. The context/app switching is annoying and slow as
you go from your browser to the game and then back to the browser when it's
time for a server change. When the game first was released and things were
slow and buggy (in game and on battlelog) it was very painful.

~~~
natesm
I never had any problems, and I have to say it is easily the best server
browser I've ever used. Valve's pre-TF2-update one (still toggleable back on)
is probably the only other one I've actually _liked_ before. It's a huge
improvement over DICE's browsers in the past, which have always been merely
bad at best (the BF2 menu had to _load_ when you pressed escape).

A lot of users complained that there was no feature to allow you to wait in a
queue for a full server. So... they added it. Pushed a server update. No
patch, no new binaries to download. A new checkbox simply appeared.

Now, there's no reason that this has to be in browser. EA and Valve both
clearly have WebKit or IE implementations (I think I heard the "clicking"
noise in the Origin browser) that play nice in fullscreen games. They could
certainly be integrated as part of the game interface, but Battlelog makes it
clear to me that HTML and CSS are the way to go with video game server
browsers in the future.

~~~
ktsmith
During the first few days the game was out I routinely received errors trying
to get the server list at all but as I've said, they have been incrementally
improving battlelog since release. My biggest complaint is that the server
browser isn't available in game. The game completely closes when leaving a
server from in game or closing a game from battlelog. Then it has to be
relaunched when starting the next game. This is really slow for many of us and
there doesn't seem to be a very good reason for it. The use of origin also
doesn't add anything for the player. I also would prefer being able to launch
the game directly without having to open three applications (origin, browser,
bf3).

------
beggi
FYI, Armin doesn't mention it in his article, but the guys that built
Battlelog, ESN, are releasing the web framework behind it:
<http://www.esn.me/product/planet/>. They also have a great service,
BeaconPush, similar to Pusher, only better IMO - both because it supports the
notion of users and because it has both a Flash websocket and an XHR long
polling fallback, whereas Pusher only has a Flash websocket fallback (believe
me, this matters, I've tried both).

------
andreavaccari
We are a startup called Glancee. We built a mobile app (iphone version +
android version) that finds people in your area with friends or interests in
common with you. The apps are native objective-c and java apps, and the
backend is a mix of python, mongodb, and a bit of erlang.

A month ago we decided to build a facebook app to reach users that don't have
a smartphone. We chose not to change one bit of code in the backend, and we
were able to build the web app in 3 weeks with backbone, jquery, and
websocket-js.

You can try it here: <http://apps.facebook.com/glancee>

The app is just one 40-line html page, the rest is javascript (and templates
embedded in js). You never refresh the page when clicking a link, which gives
you the feeling of using something as fast and robust as gmail.

CSS files and JS files are compressed with requirejs before being deployed, so
to load the page you need three requests (plus images). Right now our biggest
bottleneck is the facebook api, which is tremendously slow.

------
andrewfelix
My problem with Battlelog isn't necessarily that it's browser based. My
problem is that I'm forced to run 2 other system based applications on top of
it. Namely Origin, the sole focus of which seems to be forcing me to buy EA
games through EA exclusively.

Battlelog as a stat tracker is great. As a system of convenience run in
conjunction with Origin and the actual Game EXE it sux.

------
Spearchucker
And herewith another example of history repeating itself.

If we make the assumption that:

- The vision of Web 1.0 (mid- to late 90's) was Web 3.0 (the modern web app).
- Web 2.0 really just evolved the technologies and tools.

During the 90's we deplored thick client apps. We had 2/3/n tier on the
desktop, and we wanted web apps.

Now, 15 years later, we're building 2/3/n tier apps in the browser - but we
make the same architectural mistakes we made with thick clients, we ignore
user control and consent, we expose devices to all sorts of attacks that don't
exist in thick client apps...

That's hard-core irony, right there.

~~~
quadform
For those of us who weren't around, could you please tell me why we deplored
thick client apps in the 90's? Was it just that they had to be MS Windows
Win32 or MFC apps? Or is there some other reason?

------
kwamenum86
"All the pages can be rendered on both the client side via JavaScript as well
as the server. How this work I cannot tell you"

There are plenty of templating engines that have been ported to Javascript.
Mustache is the first example that comes to mind:
<http://mustache.github.com/>

Once you have the templating engine the rest of the logic is pretty easy to
build.

Rendering web apps entirely on the client is pretty awesome in general,
although there are two problems:

1) The push state API is not supported in all browsers yet, which forces you
to resort to the fragment identifier + onhashchange to approximate the same
functionality. And of course the fragment identifier only affords you a
fraction of the luxuries of the push state API. And of course onhashchange is
not supported in older browsers.

2) When you fall back to the fragment identifier, rendering on the client is
actually a little bit slower. The fragment identifier is not sent to the
server, meaning the javascript has to be loaded in the browser before
anything at all is rendered. Does this lead to several seconds of delay? No.
But it is noticeable.

At least with push state you have the option of rendering the initial content
on the server and all subsequent requests on the client without increasing
complexity too much, assuming you have a good templating solution in place.

But yes I agree. Modern Web Apps are Here :)
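The reason a logic-less template like Mustache shares so well is that the
template is just a string, so the same one renders wherever a render function
exists. This is NOT the real Mustache library - just a tiny stand-in covering
plain {{name}} tags, to show the idea:

```javascript
// Minimal mustache-style substitution: replace {{name}} with view.name.
// The real library adds sections, escaping, partials, etc.
function render(template, view) {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g,
    (match, name) => (name in view ? String(view[name]) : ''));
}
```

Because this is pure string work, the identical template file can be rendered
by the server on first load and by the client on every navigation after that.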

------
Too
> The real interesting thing about Battlelog however is a Windows PC specific
> component. If you are heading to Battlelog from a Windows PC and you own the
> PC version of Battlefield 3 you can launch into a game right from within the
> browser. How does this work? It works with the help of a browser plugin that
> exposes additional functionality to the in browser client. Namely it has a
> function to start the game and pass it information as well as a general
> purpose function to ping an IP address which is used for the server browser.

Unreal did that back in 1998. They registered the unreal:// protocol in
Windows so any hyperlink to an address such as unreal://127.0.0.1 will launch
the game and connect to that IP. The good thing about this is that it can
also be used by third party websites, such as promoting your clan's server,
and it is completely browser independent. I don't know if it could be abused
to "rickroll-launch" the game, but I haven't heard of any such incidents.
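For reference, a protocol handler of this kind is just a couple of registry
keys - the scheme name matches Unreal's, but the install path below is made
up for illustration:

```
Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\unreal]
@="URL:Unreal Protocol"
"URL Protocol"=""

[HKEY_CLASSES_ROOT\unreal\shell\open\command]
@="\"C:\\Games\\Unreal\\System\\Unreal.exe\" \"%1\""
```

The browser hands the full unreal://... URL to the command as %1, which is
why it works in any browser - but only in one direction, which is the
limitation the_mitsuhiko points out below.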

~~~
marcusf
I was thinking this as well. Why not just register a protocol handler calling
out to the battlefield client. Seems much more x-browser. Why wouldn't that
work, and why is the browser plug-in real interesting? What am I missing?

~~~
the_mitsuhiko
You are missing the two-way communication that is impossible with URL
handlers. A URL handler can only transmit information in one direction, and
cannot even do that implicitly - the user has to click on the link.

------
marknutter
I don't see how this is different than using any of the myriad javascript mvc
style frameworks that are out there in tandem with websockets.

~~~
mafro
I'm glad to see someone on here doesn't think this is quite so sensational.
The only relatively new thing they've done is launching the native game from
the browser plugin - and I also wondered if a custom URL scheme would not work
instead (and be simpler).

In terms of game manufacturers it is light years ahead of what they've done
before. I just don't see why the author is quite so blown away by it..

~~~
logn
I tend to agree with you. XSLT web apps can render on the client and just
transmit XML. Same idea. GWT does this too. And I'm sure there are older
examples.

------
inopinatus
Now that the client is the MVC execution environment with the client-server
interaction used mostly for data replication, plus some extra invokable
server-side behaviours, we can congratulate ourselves on having more-or-less
reinvented Lotus Notes.

------
mark_l_watson
I was surprised to only see one brief mention of GWT in this comment thread. I
use GWT on one of my personal projects and SmartGWT on two client projects,
and even though the development process has some difficulties like long Java
to Javascript compile times, it is great to be able to write/debug both client
and server side code in the same IDE.

Something I read a few months ago: a Thoughtworks paper that described GWT as
"a bad idea, very well implemented". :-)

~~~
mark_l_watson
I should have added for people who are not familiar with GWT: you write your
client app in Java, almost like writing a Swing app, and it gets compiled to 6
different combinations of Javascript. Google's setup code determines browser
capabilities and downloads the compiled Javascript best for your environment.
After that, the only data passed between client and server is model data for
the UI.

------
platonichvn
Companies that properly implement service oriented architectures will be very
well positioned to create these advanced web applications that push the
computational cost of rendering the UI to the client machines. An added
benefit that many overlook is the ability to execute on mobile strategies. I
do not buy the idea that mobile applications will one day all be html based;
client-based applications will always produce a richer, more integrated
experience. But if you have already designed your html application to
function more like a rich client, you already have the api calls that any
other mobile or desktop application would need. You have forced yourself to
design the back-end around services that encapsulate a good amount of
business logic that will not have to be repeated when implementing the
different "views" of your application.

~~~
andrewmccall
That's the big win I think this architecture has: it's already an API, ready
for platform-specific implementations.

It also makes a developer program something that's quick and easy to support,
again the API is ready and waiting.

------
xtacy
This reminds me of Quora's LiveNode stack: [http://www.quora.com/Quora-
Infrastructure/How-does-LiveNode-...](http://www.quora.com/Quora-
Infrastructure/How-does-LiveNode-work)

------
euroclydon
Is this phrased accurately?

 _The framework then hooks into your browser's navigation code and intercepts
all page loads. Instead of letting the browser replace the page with something
new on load it instead does the HTTP request via Ajax and adds an additional
header to the HTTP request: X-Ajax-Navigation._

Wouldn't it be better to intercept the page leave event, rather than load?

~~~
stdbrouw
I figure it probably just intercepts all click events, which usually trigger
loads, hence "intercepting page loads".
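That guess, made concrete: intercept clicks on links and turn them into Ajax
navigations. The decision of whether to intercept is pure logic you can unit
test; the browser wiring (addEventListener, preventDefault, the XHR with the
X-Ajax-Navigation header) is only sketched in comments. All names here are
illustrative, not Battlelog's actual code:

```javascript
// Decide whether a click on a link should become an Ajax navigation.
// Plain left clicks on same-origin links only: anything else (middle click,
// ctrl/cmd-click, target="_blank", external links) keeps normal behavior.
function shouldIntercept(link, event, pageOrigin) {
  if (event.button !== 0) return false;                               // left clicks only
  if (event.metaKey || event.ctrlKey || event.shiftKey) return false; // "open in new tab"
  if (link.target && link.target !== '_self') return false;           // explicit new window
  return link.origin === pageOrigin;                                  // same-origin only
}
// In the browser: if it returns true, call event.preventDefault(), fetch
// link.href via Ajax with an X-Ajax-Navigation header, and swap the page
// content in place instead of letting the browser navigate.
```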

------
baddox
Does anyone have any good examples of a substantial web app using this
architecture that doesn't require purchasing a video game or creating an
account? I'd like to take a look at Firebug (actually, Chrome developer tools)
and experience the snappy performance the author talks about.

~~~
skeletonjelly
I like poking around the javascript "applications" of Twitter and Stack
Overflow. The latter uses jQuery, and both provide for a nice code "book" to
read (granted, the variables are minimised, and the code compressed, but you
can fix the latter by running it through jsbeautify).

------
forgotAgain
Worked on something similar with javascript, nginx, gevent, 0MQ, and a C
backend. The only issues we had were with file uploads for transfers of large
amounts of data.

How do other people do file uploads with an asynchronous web server?

~~~
wildmXranat
I would suggest using the nginx file upload module, which would pass only the
path to the uploaded file on disk to your gevent/tornado app server.
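A sketch of what that looks like with the third-party nginx-upload-module;
the directive names are the module's, but the paths, locations, and port here
are made up:

```nginx
# nginx buffers the multipart upload to disk itself, then forwards only the
# stored file's path to the backend, so the async app server never has to
# hold the file body in memory.
location /upload {
    upload_pass           /after_upload;
    upload_store          /var/tmp/nginx_uploads;
    upload_set_form_field $upload_field_name.path "$upload_tmp_path";
}
location /after_upload {
    proxy_pass http://127.0.0.1:8000;  # gevent/tornado app server
}
```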

~~~
forgotAgain
Thanks, I'll check that out.

------
oh_no_my_eyes
To get Battlefield 3 working on my friend's Windows Vista PC I had to install
215 updates. It took 6 hours. Why? Because the machine had to have IE9
installed, even though that wasn't his default browser and wasn't the browser
started when Origin kicked off Battlefield. So for sure this is cool in
theory - the idea is certainly plausible and fun to talk about - but melding
the complexity of web apps with the complexity of system apps makes me shudder.

