
The Sun is Setting on Rails-style MVC Frameworks - jamesjyu
http://caines.ca/blog/programming/the-sun-is-setting-on-rails-style-mvc-frameworks
======
tomdale
I wouldn't say that the sun is setting on Rails-style MVC frameworks, but I do
think their role in the ecosystem is going to change. Before, most people
could get away with writing the entirety of their app as a single Rails or
Django app.

I think the shift first started with the ascendancy of native mobile apps.
Now, developers had to seriously start considering their HTTP APIs as first-
class citizens and not nice-to-haves. Once that happened, it's not a big leap
to realize that treating your web application as somehow different from any of
your native clients is a bit, well, insane. You can either choose to write a
server that is a mix of JSON API and rendered HTML, with conditionals all
about trying to figure out the right thing to render, or you can pull all of
that logic out into a stand-alone JavaScript application, with better
responsiveness to boot.

I think this approach is a winner. The server guys can focus on building a
kick-ass server, and the front-end guys can build an awesome UI without having
to muck about in the server bits.

One thing that still blows my mind is how _hard_ it still is to get data from
the server to the client. Everyone is writing custom code specific to their
app. As the article says:

 _There's no reason for us to all separately think about these problems and
solve them in a million different ways every time we're confronted with them.
Aside from the years of wasted time this involves, we've also got a bunch of
non-standard and sub-standard APIs to interact with, so all the client code
needs to be custom as well and nothing is reusable._

I think this is a huge problem, and Yehuda and I are doing our part to try to
solve it. Our Ember Data framework (<http://github.com/emberjs/data>), by
default, assumes certain things, such as the names of your routes and how
associations should be loaded. We want to enable people to start building apps
_right now_ instead of writing hundreds of lines of JavaScript that are custom
to their application, and do it in a sufficiently comprehensive way that you
very rarely need to drop down "to the metal." For example, we handle things
like mutating associations on the client-side even before a guid has been
assigned to a record.
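The idea can be sketched in plain JavaScript (this is an illustration of the concept, not Ember Data's actual API): records get a client-side guid at creation time, associations reference that stable guid, and the permanent server id is filled in whenever the save round-trip completes.

```javascript
// Illustrative sketch: a store that assigns client-side guids so records
// and their associations can be mutated before the server has handed back
// a permanent id.
let nextGuid = 1;

function createRecord(store, attrs) {
  const record = { clientId: "c" + nextGuid++, id: null, ...attrs };
  store.push(record);
  return record;
}

// Associations reference the stable clientId, not the server id.
function associate(parent, child) {
  parent.childIds = parent.childIds || [];
  parent.childIds.push(child.clientId);
}

// When the server responds, fill in the permanent id;
// existing associations keep working untouched.
function didSave(record, serverId) {
  record.id = serverId;
}

const store = [];
const post = createRecord(store, { title: "Hello" });
const comment = createRecord(store, { body: "First!" });
associate(post, comment); // works even though neither record is saved yet
didSave(comment, 42);
```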

Personally, I'm excited for how this is going to all shake out. I think Rails
will continue to be an important piece in the toolchain, but it will no longer
be the primary one, and I can't wait to see how it evolves to fill that role.

~~~
jshen
"You can either choose to write a server that is a mix of JSON API and
rendered HTML, with conditionals all about trying to figure out the right
thing to render, or you can pull all of that logic out into a stand-alone
JavaScript application, with better responsiveness to boot."

You get better responsiveness if you send down pre-rendered html for the first
page load. This means you still need the conditionals, or suffer on
responsiveness.

~~~
jashkenas
Not in the slightest. Speaking categorically, JSON + templates are smaller to
transfer over the wire than fully-rendered HTML -- and can be cached in
exactly the same way ... even bootstrapped into a single HTTP request.

If you care about optimizing it, you can get your JS UI to render faster than
the equivalent pre-rendered HTML would have.
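A minimal sketch of the bootstrapping idea (the template and data here are made up): the server inlines the JSON into the initial response, and a template that ships cached with the rest of the app's JavaScript renders it client-side, instead of the server shipping fully-rendered HTML on every request.

```javascript
// A tiny template function, standing in for an Underscore/Handlebars
// template that would normally be precompiled and cached with the app JS.
function itemTemplate(item) {
  return "<li>" + item.name + "</li>";
}

// Data the server would inline into the first response, e.g. as
// <script>var BOOTSTRAP = [...]</script> -- one HTTP request, no extra fetch.
const BOOTSTRAP = [{ name: "Alpha" }, { name: "Beta" }];

// Client-side render from the bootstrapped data.
const html = "<ul>" + BOOTSTRAP.map(itemTemplate).join("") + "</ul>";
// html → "<ul><li>Alpha</li><li>Beta</li></ul>"
```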

~~~
stock_toaster
Don't you still have to render server side for crawlers and screen readers
(accessibility)?

~~~
bill-nordwall
Nope. Screen readers have been able to handle javascript-generated content for
years. According to WebAIM's most recent screen-reader survey, 98.4% of screen
reader users have javascript enabled:
<http://webaim.org/projects/screenreadersurvey3/#javascript>

~~~
mmahemoff
Crawlers, less so. The major search engines are getting smarter about it, but
for now, you still need some kind of HTML output from the server if you really
want to be indexed properly.

It doesn't rule out a pure client-side app at all, but you do have some extra
work involved to output HTML from the server. Which is why NodeJS will ride on
the coat-tails of this approach; less redundancy.

~~~
techwraith
Hah! "The major search engines" - you mean Google, right? ;)

------
run4yourlives
Round and round the mulberry bush we go.

Evolution of computing:

1\. Direct mainframe access (No client)

2\. Networked access via "dumb" terminal (thin client)

3\. Client/Server technology (Thick client)

4\. Client/Server over the internet (thick client)

5\. Browser based applications (thin client)

6\. Device specific apps (thick client)

7\. Hrm.. If I had to guess, device independent thin clients...

~~~
strictfp
Where did AJAX apps go? If you ask me, these are thick clients. The funny thing
is that HTTP explicitly omitted AJAX-style interaction in order to achieve
linkability and thin clients. With AJAX, linkability got thrown out the window
and the clients just keep getting fatter. No wonder the server gets a less
important role. I hope that people understand that they are breaking REST when
they build these types of apps.

~~~
secoif
How are they breaking REST? What does the method of consumption (e.g. AJAX
enhanced pages vs single page apps) have to do with restful API design... at
all?

edit: also possible I misunderstand your statements

~~~
strictfp
I was talking about REST websites, A.K.A web 1.0 websites.

What do I mean? Well, a 1.0 website generally follows REST. REST means
representational state transfer, which means that each application state
should have a representation on the server. Another way of expressing this is
that each possible state on the website should have a URL. No more "go to
example.com and click X, then Y then scroll then Z". Instead, every possible
state has a link, so you just give the link. HTTP was designed to enforce
linkability.

Enter AJAX. Suddenly the server is out of control. You can now deviate from
the linkability principle, and a lot of apps do.

When the linkability constraint is lifted, the client state is allowed to
deviate from the server state. This gives less responsibility to the server.
No wonder that it gets less to do.

That is what I meant.
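The linkability principle can be sketched as a pair of functions that round-trip between application state and a URL (a hypothetical search page; in a browser these would be wired up to history.pushState):

```javascript
// Every client state serializes to a URL...
function stateToUrl(state) {
  return "/search?q=" + encodeURIComponent(state.query) + "&page=" + state.page;
}

// ...and every URL deserializes back to the same state,
// so "just give the link" always works.
function urlToState(url) {
  const params = new URL(url, "http://example.com").searchParams;
  return { query: params.get("q"), page: Number(params.get("page")) };
}

const state = { query: "rails mvc", page: 3 };
const url = stateToUrl(state);        // "/search?q=rails%20mvc&page=3"
const roundTripped = urlToState(url); // { query: "rails mvc", page: 3 }
```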

You can still use a REST API from a web 2.0 website; that is another question.
But a web 2.0 webpage plus a REST API to fetch data makes the total app non-
REST (at least if you don't actively try to make it such).

------
bentlegen
> Front-end frameworks like backbone.js, as well as advances in web
> technologies like HTML5's history.pushState are now making server-free views
> a realistic quality of cutting-edge front-ends.

This is not exactly correct. Many web applications are now foregoing client-
side templating and are back to doing it on the server. GitHub is a great
example of this, and Twitter is going this route too.

~~~
nailer
> Many web applications are now foregoing client-side templating and are back
> to doing it on the server. GitHub is a great example of this, and Twitter is
> going this route too.

They had client side templating, and they switched back to server-side? Got
any links for that, I'd love to read them (seriously, not snarkily).

~~~
scotth
No, they didn't. Pop the inspector's network panel open and watch what happens
on Twitter. All the content comes down as JSON.

~~~
andybak
The claim is that they are in the process of switching back, not that it's
already rolled out.

~~~
lapusta
They were switching from hashbang routing to pushState, not from client-side
to server-side rendering. Although they might do pre-rendering for IE, which
will only get pushState support in version 10.

<http://storify.com/timhaines/hashbang-conversation>

~~~
andybak
Hopefully they are going with pushState with a fall-back to server rendering
for all user agents that don't support pushState.

That's a compromise I can get behind. And it's pretty much progressive
enhancement...

------
stephen
I don't buy HATEOAS--this notion that clients will be adaptive enough to take
advantage of changes in the rel/whatever links seems unlikely.

(E.g. I can't see a client going "oh!, there's a new business function I
haven't seen yet, let me invoke that!".)

With the rels/links, you're just moving the coupling away from explicit URLs
to the names/identities of rels/links in the response.

Unless you anticipate changing your URLs often, I don't see this as being
terribly useful.

~~~
bct
> (E.g. I can't see a client going "oh!, there's a new business function I
> haven't seen yet, let me invoke that!".)

Of course not, nobody thinks that. That notion does not exist.

> With the rels/links, you're just moving the coupling away explicit URLs to
> the names/identities of rels/links in the response.

I suppose, but that's a _much_ looser coupling than the alternative (i.e.
writing in the documentation "the comments URL is
<http://example.com/comments>" - with that, you can't rearrange your URL
structure, you can't start using a different domain (e.g. a CDN) for comments,
existing clients can't use other sites that implement the same API, etc.).

HATEOAS is about building general protocols rather than site-specific APIs.
That it makes it easier to change your own URLs is just a bonus.

~~~
stephen
> Of course not, nobody thinks that.

I'm pretty sure Fielding does, see "improved on-the-fly":

"The transitions may be determined (or limited by) the client’s knowledge of
media types and resource communication mechanisms, both of which may be
improved on-the-fly (e.g., code-on-demand)."

<http://roy.gbiv.com/untangled/2008/rest-apis-must-be-hypertext-driven>

Hypermedia is great for intelligent clients (e.g. humans) who can adapt to,
say, a webpage changing and new fields suddenly showing up in the hypermedia
(HTML) that are now required.

However, for an application, it's going to be hard-coded to either do:

1) POST /employee with name=foo&age=1, GET /employee?id=1

Or

2) GET /hateoas-entry-point, select "new employee" link, fill out the 2 fields
(and only 2 fields) it knew about when the client was programmed (name, age),
post it to the "save employee link", go back to "/hateoas-entry-point", select
"get employee" link, fill in the "id=1". (...or something like that).

In either scenario, the non-human client is just as hard-coded as in the
other: it's either jumping to external URLs or jumping to internal links.
Either way, those URLs/links (or link ids) can't change, and the functionality
is just as frozen.
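The contrast between the two scenarios can be sketched like this (the entry-point document is made up; the point is that the coupling merely moves from URLs to rel names):

```javascript
// Style 1: URLs baked into the client.
function getEmployeeUrl(id) {
  return "/employee?id=" + id;
}

// Style 2: follow rel links from a hypothetical entry-point document.
// The rel names ("employees") become the coupling point instead of the URL.
function follow(doc, rel) {
  const link = doc.links.find((l) => l.rel === rel);
  if (!link) throw new Error("unknown rel: " + rel);
  return link.href;
}

const entryPoint = {
  links: [{ rel: "employees", href: "https://api.example.com/people" }],
};

// The server is free to move /people without breaking this client...
const employeesUrl = follow(entryPoint, "employees");
// ...but renaming the "employees" rel would break it just the same,
// and a rel the client was never programmed for is simply ignored.
```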

Perhaps the benefits of hypermedia would be more obvious if Fielding built a
toy example or two that we could all touch and feel instead of just dream up.
But so far there seem to be a lot of non-HATEOAS REST APIs that are doing just
fine sans hypermedia.

~~~
sopooneo
This has been my feeling for a long time, and I have never seen a HATEOAS
proponent address it to my satisfaction. Frankly I think it is a pretty
important point. Are we expecting automated consumers of a REST API to be
curious and spontaneous the way human users of the web are?

~~~
extension
With the exception of spiders and other AI-like things, no, we are not
expecting clients to spontaneously consume RESTful services in meaningful
ways. REST clients are generic. They don't know anthing about specific
services, thus allowing those services to evolve independently. A client that
is coupled to a specific service is not RESTful, nor is any API that can only
be consumed by such a client.

~~~
sopooneo
So say you want to get data from a RESTful web API: do you have to customize
your generic REST client? Because everything I've ever written that called an
external API had to know what it was looking for in advance. For example, to
interact with Twitter's API, I went to their documentation page and read up on
what URLs to call for the information I needed.

~~~
extension
If you want a client coupled to a specific service then you don't want REST,
you want a classic client-server architecture, which is more or less the
antithesis of REST. But everyone insists on calling it REST when it goes over
HTTP, then they complain that the apple tastes nothing like an orange.

------
shaydoc
Personally, from an ASP.NET perspective, it's the same. To me, ASP.NET is
done. Mixing server-side with client-side UI is just wrong; they should be
decoupled completely. I want to build an HTML5/JS/CSS UI. I want to keep it
clean, and I want to talk to a data service (or a mocked-out data service) to
do what I need. Starting out with ASP.NET, I used to think this was just
totally the wrong way to do things! Then ASP.NET MVC came along, which was
much nicer than all the pain of Web Forms (ViewState etc.), but now, as I said
already, I want my client side to be written with no knowledge of the
server-side workings, just plain old HTML(5)/CSS, leveraging the power of
JavaScript and associated frameworks (I like knockout.js and amplify.js).

It's the convergence on standards and the sophistication of browsers that
really matter. We all use browsers, and all the big vendors now appreciate
this, enabling progression to standard presentation technology. Innovative,
successful businesses are delivering great services and UX using these
standards, and they are showing the rest that they need to move to keep
competing.

Responsiveness and performance are strong drivers for a rich/smart/thick
client; there really are very few advantages to a thin client, in application
terms, from a UX point of view.

Managing upgrades is a cinch with browsers, so if you have a sophisticated
client-side JavaScript UI it's easy to re-release.

All in all, I tend to agree with the poster here!

------
gerggerg
Sure, I guess, if you fundamentally misunderstand MVC, but not in the real
world.

You don't have to render your view on the server; using a JS framework is
perfectly fine in MVC. There are plenty of Rails plugins to make this even
easier, and I wholeheartedly disagree with _"Rails-style MVC frameworks have
a horrible routing solution for RESTFul JSON APIs"_. Rails has simple routing
that works great for RESTful resources and auto-renders JSON that can easily
be overridden.

Having a unified/standardized REST interface for web services would no doubt
be nice, but its existence has little to nothing to do with MVC.

------
jarrett
I'm very much looking forward to (and hoping to help build) thick-client
frameworks with the same level of polish and maturity as Rails 3. Obviously
there's a long way to go, but I think it will happen.

I'm not a Node expert, but I'd be willing to bet we end up with at least one
Node framework at the head of the pack, if for no other reason than it enables
easy sharing between server and client-side logic. (E.g. write your validation
rules only once.)
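A sketch of what that sharing might look like (the rules here are hypothetical, not any particular framework's API): a dependency-free validation function that runs unchanged in Node on the server and in the browser on the client.

```javascript
// Shared validation module: no DOM, no server dependencies,
// so the same file can be required on the server and bundled for the client.
function validateUser(user) {
  const errors = [];
  if (!user.name || user.name.trim() === "") {
    errors.push("name is required");
  }
  // Deliberately loose email check for illustration.
  if (!/^[^@\s]+@[^@\s]+$/.test(user.email || "")) {
    errors.push("email is invalid");
  }
  return errors;
}

// Server: run before persisting. Client: run before submitting the form.
validateUser({ name: "Ada", email: "ada@example.com" }); // → []
validateUser({ name: "", email: "nope" });               // → two errors
```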

You could of course write the backend in any language, and you could even have
some code sharing if you automatically generated JavaScript from your backend
language. (E.g. ClojureScript.) But I'm attracted to the simplicity of just
using JS, and not having to worry about all the weird little things that can
happen when you compile one high-level language to another.

~~~
moe
_at least one Node framework at the head of the pack, if for no other reason
than it enables easy sharing between server and client-side logic_

I'm surprised this still hasn't been seriously tackled. The major node
frameworks (namely express) all seem to follow the old-fashioned rails-model
which seems like an anachronism on that particular platform.

~~~
AdrianRossouw
Our Bones[1] library for node.js does this; in most applications we have
written, probably in excess of 90% of the code is shared between server and
client.

We have written cloud-based hosting systems[2], desktop applications[3], and
more standard web applications with it. I think it wouldn't be too hard to
make it possible to build PhoneGap-esque apps with it too.

It's pretty great stuff, but it opens you up to very new and interesting
problems due to the environment being so different. The client side has
absolutely no concept of a 'request', and there is a lot of stuff that just
can't be done on the client (is this email address unique?). The server side
does not have the long running state that the client does, which causes
another category of problems.

I think it's going to become a more dominant approach, because it's just so
damn convenient, but it's going to take a bit more time to properly 'crack'
it.

[1] bones: <http://github.com/developmentseed/bones>

[2] mapbox hosting: <http://mapbox.com/tour/>

[3] tilemill: <http://mapbox.com/tilemill/>

------
tbatterii
"This probably means that 'controllers' need to go away in favor of
'resources', and routes can be vastly simplified, if not completely
derived/inferred by the framework."

<http://docs.pylonsproject.org/projects/pyramid/en/1.3-branch/narr/traversal.html>

~~~
rbanffy
Also, Zope maps URLs to method invocations on objects or their containers -
<http://server/ob1/ob2/m1> is served by storage['ob1']['ob2'].m1() (there are
ways to mess with that, of course)

~~~
tbatterii
yeah, traversal is one of the nice things pyramid took from zope. Very
powerful if your data model fits in a nice hierarchy.

------
velshin
There's often a distinction between APIs intended for general consumption
(platform APIs) and APIs optimized for JavaScript and/or mobile clients
("private" APIs).

A platform API tends to be stable, versioned, well documented, and
"unoptimized" or strongly RESTful. GETing a resource (noun) returns just that
one representation. e.g. GET /v1/user/123/profile or GET
api.linkedin.com/v1/people/id=123

"Private" APIs tend to return more data in bulk, optimized to reduce the
quantity of remote calls and improve page load times. The responses tend to be
structured in a way that's easier for the client (browser/mobile app) to
render content, usually by including more tangentially related data than a
traditional REST resource would contain. e.g. a browser's load of
<https://github.com/rails/rails> does GET
github.com/rails/rails/graphs/participation

Twitter uses the platform API in the browser. e.g. GET
api.twitter.com/1/trends/1.json

I'd be interested to hear from others leveraging APIs in their browser/mobile
clients what they're using for MVC (e.g. backbone.js vs server side) and
whether they've "optimized" their APIs for the client.

------
zoul
On a related note, it surprises me that client-side frameworks like
cappuccino.org are not used more often for building full-featured web apps. It
seems to me that abstraction from HTML and CSS is quite a desirable thing; I
find the experience of modern desktop frameworks like Cocoa a thousand times
better than fiddling with HTML. Does anybody here have substantial Cappuccino
experience? Why isn't it more successful?

------
simonw
Only if you think thick JavaScript apps hooked up to JSON APIs are a good way
to build for the Web. I don't.

~~~
sopooneo
Why not? I ask sincerely. It seems like a reasonable approach to me for web
apps, just NOT for simple sites that rarely change or are not
"application"-like.

~~~
simonw
Plenty of reasons.

Firstly, web app vs web site isn't a binary distinction - it's a gradient. Is
Flickr an application or a site? It provides extensive tools for uploading and
organising photos... but the bulk of the site is flat pages people use for
navigating vast amounts of content.

Secondly, URLs really, really matter. Twitter now has a big engineering task
on its hands to clean up the mess made when it switched to broken #! URLs. The
ability to link to things is the most important factor in the Web's success.
An application built as a fat JavaScript client that talks to an API is opting
out of the URL-driven web.

Even if something uses #! or pushState to emulate linkable URLs, if I can't
hit a URL with an HTTP client library (not a full-blown JS+DOM implementation)
and get back a content representation, it's not a full-blown member of the web
- it's no better than content that's been wrapped up in a binary Flash
container.

Don't think I'm not pro-JavaScript - I'm a big fan of using JavaScript to
improve user experience etc (heck, I have code in the first ever release of
jQuery). I'm just anti JavaScript being required to consume and manipulate the
Web.

I'll concede that there are plenty of applications for which a fat client does
make sense (image manipulators, data vis tools, possibly interfaces like gmail
although I'm continuously infuriated by my inability to open gmail messages in
new tabs). But the thinking embraced by the original article, that Rails-style
frameworks are on the way out because EVERY site should be built as a fat
client, is dangerously short-sighted in my opinion.

~~~
mattbriggs
Hashbangs are just a hack while we wait for the world's browsers to all
support pushState. The hack is easy to use, well supported, and works really
well. What Twitter was saying is that in a public, content-based app like
theirs, the trade-off of using that hack is not worth it, so they are moving
to pushState and degrading to a worse experience for browsers that don't
support it. Twitter isn't an argument against JS apps; it is an argument
against JS hacks that provide fancy functionality to old browsers.

~~~
swah
But then do we need to use the same set of templates on the server side (on
full page loads) and on the client side (when updating via JSON)? Or do we do
like Quora and generate HTML on the server side?

~~~
mattbriggs
I'm not saying I agree with Twitter, I was just trying to explain their
argument :) I think it is better to go fully one direction or the other:
either don't support IE9 and below, or do full reloads until they feel
comfortable dropping that support (or stick with the hashbangs).

More generally, I use Backbone to make data-driven components. Those
components are always rendered client side, and the layout/static content is
rendered server side. I think duplicating would be the path to madness.
Generally, it's fine to bootstrap initial data on page load and render
everything. But in cases where that takes too long, I have rendered a "dead"
version on the server (greyed out with a spinner) and then replaced it on the
client.

------
dkharrat
I believe SEO is one of the main things hindering thick-client adoption for
apps that depend on their content showing up on search engines, especially
content-driven websites (e.g. Stack Overflow, Quora, etc.). But I agree: all
signs are pointing in the direction of thick-client implementations, and
eventually search engines will solve the indexing problem.

~~~
swah
But Quora loads HTML chunks via AJAX.

------
j_baker
I agree with the author, but for diametrically opposed reasons. It's more that
I don't think my web framework should decide that I need to be using MVC. If I
want to tightly couple my view logic to my model logic, that's my (and my
team's) business. If I want to drink the MVC Koolaid, that's my business as
well. It isn't the business of the person who wrote my framework.

 _It's much simpler to handle views and view logic in only one place, and that
place is slowly moving away from the server side._

I see. So one is simpler than two? That's a bit simplistic. Every situation is
unique, and it's impossible to make such a sweeping statement that's true in
all cases.

------
robfig
So MVC frameworks are going away for thick-client apps because their routing
is not convenient for defining resources? And that their templating abilities
are too powerful (or not powerful enough?).

Not sure I get it.

(Current routing schemes do not seem overly difficult for this, and depending
on your MVC framework, you can plug in your own routing.)

------
Eleopteryx
Here's my anecdote:

1\. I dumped view helpers for decorators (namely, the Draper gem). I got rid
of complex logic in my view templates. Most of the logic I need in my views is
transformed object properties. I need a date to appear as mm/dd/yy. I need a
filesize in bytes to appear as KB/MB. I want a comment formatted as Markdown
to be HTML. So now I'll decorate my objects so I have a nice
post.formatted_date instead of needing to write the helper
formatted_date(post). At worst I have some conditional statements in my views
looking for the existence of a property. CSS also comes into play; the :empty
selector allows me to cheat in some cases.

2\. I write my view templates in a JavaScript templating language. I am a fan
of slim for Ruby, so I went with Jade for JavaScript. I use therubyracer to
call on these precompiled templates within Ruby itself, and render them like
anything else. I then go on to use these same templates client-side. The
reason why I abandoned view helpers to a large extent is because of this. Any
logic that I need within a template would have to be duplicated server-side
and client-side, which is antithetical to the goal. For my use-cases I've been
able to do this successfully. It requires an adjustment to mindset, but is
feasible. And really, my templates are a lot cleaner now than they've ever
been.

3\. When a user lands on a page, they get rendered HTML. Subsequent requests
use AJAX and JSON responses to load things dynamically from there. Best of
both worlds. Also, users who have JavaScript disabled can use the site, albeit
with less slickness.

4\. Using to_json is hell; don't do it. I use the RABL gem for assembling my
JSON, and use the same decorated objects. In the case of JSON, depending on
your API, you might want to include say an ISO8601 date as well as a formatted
date. Not a big deal, just vaguely duplicative.
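The decorator idea in point 1 ports directly from Ruby's Draper to JavaScript; a sketch with hypothetical names: wrap the raw object with presentation methods so templates read `post.formattedDate()` instead of calling a `formatted_date(post)` helper.

```javascript
// Decorate a raw post object with presentation-only methods.
function decoratePost(post) {
  return {
    ...post,
    formattedDate() {
      // mm/dd/yy, as in the example above.
      const d = post.createdAt;
      const pad = (n) => String(n).padStart(2, "0");
      return pad(d.getMonth() + 1) + "/" + pad(d.getDate()) + "/" +
        String(d.getFullYear()).slice(-2);
    },
    formattedSize() {
      // Bytes → human-readable KB/MB.
      const kb = post.sizeBytes / 1024;
      return kb >= 1024 ? (kb / 1024).toFixed(1) + " MB" : kb.toFixed(1) + " KB";
    },
  };
}

const post = decoratePost({
  createdAt: new Date(2012, 1, 20), // Feb 20, 2012
  sizeBytes: 2048,
});
post.formattedDate(); // "02/20/12"
post.formattedSize(); // "2.0 KB"
```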

The downside was how much code I had to write myself. Using JavaScript
templates was a biggie. But this is something that could probably be packaged
as a gem, if I or someone else took the time. The framework (in my case,
Padrino) still provides lots of tools that I need. Ruby ORMs are a big part of
this.

There's still the issue of duplicative routes. I have routes defined in the
app itself, and then within my JavaScript framework (currently Spine, but
previously Backbone) I have to hardcode these same routes. I don't like this,
however, routes are probably the last thing to change in my app if I put any
thought into them ahead of time. This is something that requires additional
thought, but I'm thinking there should be a way to make my app routes
available client-side.
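One way this could work (a sketch, not an existing gem): define the routes once as data, and let both the server router and the client-side framework consume the same table.

```javascript
// Single source of truth for routes, shareable between server and client.
const routes = {
  posts: "/posts",
  post: "/posts/:id",
  comments: "/posts/:id/comments",
};

// A tiny path builder both sides can use.
function pathFor(name, params = {}) {
  return routes[name].replace(/:(\w+)/g, (match, key) => {
    if (!(key in params)) throw new Error("missing param: " + key);
    return encodeURIComponent(params[key]);
  });
}

pathFor("post", { id: 7 });     // "/posts/7"
pathFor("comments", { id: 7 }); // "/posts/7/comments"
```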

------
SideburnsOfDoom
Well, the ASP.NET MVC 4.0 beta, released a few days ago, has better support
for "Web API" (i.e. data endpoints) and for single-page apps, so it seems that
they are aware of these trends.

Links:

<http://weblogs.asp.net/scottgu/archive/2012/02/19/asp-net-mvc-4-beta.aspx>

<http://www.asp.net/single-page-application>

~~~
shaydoc
I saw this and downloaded it earlier this week. It wouldn't build straight
off: there were bugs in the script references, and EntityFramework 4.1
wouldn't resolve for me locally (I had to add it to web.config), etc. Not a
great start (no biggie, just saying). I resolved these issues anyhow.

On review, it's fair enough if you love ASP.NET MVC, but I really don't like
the mixing of the view logic with the model in ASP.NET MVC; like I already
said, keep it clean. Keep the view decoupled, I say; keep any server-side code
out of there. At the end of the day, I have to agree that ASP.NET MVC has rich
resources and is a great offering for developers, no doubt.

And MVC 4 uses knockout.js behind the scenes anyhow, which is really great;
personally I think that knockout goes far enough.

The only reason I say this is because I just refactored about 20 Web Forms
views to pure HTML/CSS/JS on the client side, and it turned out I had no need
for any server side at all except for the RESTful service (hidden away in my
knockout view model; I used amplify.js to abstract the service calls) created
on top of my domain model. So I am thinking: why do I need MVC if I have
RESTful WCF / ASP.NET Web API? Going this way means I am totally decoupled!

I am much happier building a restful service, knowing that any client can
consume this, and I think I am happy building client side anyway that I so
choose and not tying myself into ASP.NET MVC unnecessarily.

So here's the deal for me if I go pure W3C on the client:

1\. No mixing in logic

2\. Don't care about the server or server side code.

3\. I can mock my backend real easy ( e.g. Amplify.js )

4\. I don't need Visual Studio for client side dev.

5\. There's a growing wealth of open source libraries

6\. Makes me think more about the structure of my server side behavioural
domain model.

Anyways, just how I see things..always open to more compelling arguments!

~~~
SideburnsOfDoom
Can you expand more on "the mixing of the View Logic with the Model in ASP.NET
MVC"? I must have missed a trick in how to further separate concerns. I favour
strongly typed views, ViewModels and thin controllers. But are you talking
about how the generated HTML is just output, and contains both page structure
and page data?

Also, with this release of MVC, the RESTful data controllers are added on the
side. Designing this from scratch, it would probably be different: if your
client wants JSON, XML or some other format, then the data endpoint is for
you. If you want text/html, then it's special and you go somewhere else.

We can already serve JSON or XML data off controller endpoints in MVC3, and,
using a bit of extra code, even switch between them by checking the Accept
header. But never mind, it's not done until it's in the framework.
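That Accept-header switch can be sketched in a framework-agnostic way (simplified: it ignores q-value ordering and wildcards):

```javascript
// Pick a response format from the client's Accept header.
function negotiate(acceptHeader) {
  const accepted = (acceptHeader || "")
    .split(",")
    .map((s) => s.trim().split(";")[0]); // drop q-value parameters
  if (accepted.includes("application/json")) return "json";
  if (accepted.includes("application/xml") || accepted.includes("text/xml")) return "xml";
  if (accepted.includes("text/html")) return "html";
  return "json"; // sensible default for an API endpoint
}

negotiate("application/json");                // "json"
negotiate("text/html,application/xhtml+xml"); // "html"
negotiate("application/xml;q=0.9");           // "xml"
```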

If you really are past using MVC views entirely, and have just pages without
server-side markup (except perhaps feeding in a data URL) plus a REST API,
then you are an outlier and can consider other frameworks besides MVC, such as
OpenRasta. Or Ruby.

MVC is not an opinionated framework: you can do things any number of ways. If
it supports a data API plus static HTML websites, which it looks like it does,
that will not be the only thing it supports.

~~~
shaydoc
I am not past anything; I am just saying this paradigm is now kind of defunct
to me. I would prefer to work a little differently.

No doubt MVC facilitates all of the above. If you favour strongly typed,
that's good, continue that way, but it's not necessary (the controller is now
basically going to be a RESTful web service).

"We can already serve JSON or XML data off controller endpoints in MVC3" -
true, and you'll probably be using Web API to do that soon.

I am just saying it's not necessary to mix; I feel it overcomplicates the
client by making you mix in server-side logic a la Razor, Web Forms, whatever.

I don't particularly see the point in it now, though. If you look, for
example, at the mapping plugin for knockout.js, it will take a JSON source and
automatically resolve it into your JS viewmodel. So on the client, all I am
doing is thinking about the client: give me a REST API and some JSON objects
and I am off and running.

All I am simply saying is that pure HTML/CSS/JS can be done without any
knowledge of the server on the client side, just the interface contract, and
like I say, a RESTful WCF service / Web API is good.

I am not an "outlier" when I say I think it's a good way to develop; it's my
opinion. The truth is, I did refactor a load of Web Forms lately, and the
result was:

1\. Html/Js/css - knockout.js and amplify.js

2\. JSON service (C# behind it, serving the data and validating business
rules)

~~~
SideburnsOfDoom
> "We can already serve JSON or XML data off controller endpoints in MVC3" -
> true, and you'll probably be using Web API to do that soon.

Yes, I will as soon as I'm on a released version of ASP MVC 4. I get that
format flexibility at no extra cost using no custom or server-side code. Win.
It enforces the separation of concerns between serving html pages and serving
API data. Which may or may not be good depending on how you think about your
App. For my existing apps I think it's positive.

ASP MVC is a framework that is not opinionated, as I said - it won't insist on
doing things the right way. It is also a framework that is not innovative:
most or all of its features have been pioneered and proven elsewhere. I'm not
saying that as an insult; there are big upsides to that approach and I can see
why MS stays in the mainstream.

------
programminggeek
I don't think Rails-style MVC frameworks are going to disappear for a lot of
content-heavy internet sites, but I do think API-first architectures will
continue to grow in popularity. That is part of the reason I built radial:
<https://github.com/RetroMocha/radial>

The tooling around rapid API development seems very immature right now, but
it's going to keep getting better. Once it is drop-dead easy to write APIs
first, you will see a lot more apps having an HTML5 app, Android app, iOS app,
Windows 8 app, Mac app, etc. as more of the standard and less of an edge case.

------
lukifer
Monoculture is a terrible thing. I don't think there are any frameworks or
methodologies which should never be used, nor any that should always be used.
Learn the advantages and tradeoffs of each, and use the right tool for the
job.

That said, I've personally found that as nice and clean as MVC frameworks can
be, they aren't always necessary. As long as there is a separation between
logic and presentation, it's possible to write clean, readable, maintainable
code without a formal MVC structure.

------
rubynerd
That's all well and good, if you do your API with Rails.

Assuming this is an actual API, consider a rogue script slamming your API:
whoops, your main app is down as well, and nobody is signing up for your
service.

Assuming this is for Backbone/Batman/other-generic-client-side-framework:
nope. Too much JavaScript makes everything seem sluggish, and I really don't
want to implement everything twice (once in Rails for conventional browsers
and people who disable JavaScript, and again in client-side MVC).

~~~
techiferous
> for people who disable JavaScript

I stopped caring about this long ago. I write database-backed web applications
that require authentication, and in this context if a user has JavaScript
turned off, that's _their_ problem, not mine.

~~~
pdwetz
Sort of funny; as soon as I hit a new site that requires javascript, I often
run away! But of course that's my problem; I get why user preference would get
in the way of your clean implementation. :)

~~~
coderdude
Well, when you cripple your own client there's not much you can ask for.

------
singingfish
So all I'm seeing is complaints about inflexible URI routing and excessively
opinionated model classes. This stuff is basically solved in at least one
other "Rails-style" framework
(<https://metacpan.org/module/Catalyst::Runtime>). Web controllers are a rat's
nest anyway, the whole thing is basically broken, and the only sane thing to
do is to settle for an acceptable compromise.

~~~
LewisCr
Are there any write-ups online that explain how Catalyst solves these
problems?

------
amorphid
What I'd love to see is a setup where I log into any computer and my stuff is
all right there. Say I borrow your iPhone to do some stuff. Instead of using
your setup, I log in and all my apps are right there. Some run from the
server; some get pushed to the phone and used locally if that's more
efficient. Let's call it client & server, anywhere-you-go computing, or
something like that.

A client & server setup probably means that you buy access to a mobile
carrier's calling/data plan that isn't attached to a particular device. You
buy a handset, tablet, laptop, yada yada that is a standalone device, or use
one at the local Internet cafe. A set of standards is put in place for how
code runs on a client and server. Your data may remain on a client, but
ideally is stored entirely in the cloud, too. The cloud data is encrypted and
can't be accessed without your permission.

This is my dream and I'd be surprised if I'm the first one to think of it.

------
mltcx
Is this Rails-bashing week or something?

~~~
jrockway
Bashing? Nope. Rails is simply not how people write web apps anymore. Most
people write a UI-independent API layer, and then write the UI in Javascript.
This makes Rails largely unnecessary; you still use the individual pieces, but
the framework as a whole no longer makes sense.

~~~
gnaritas
None of that is true. If you think it is, you're in a sheltered bubble too
focused on the new hotness from the valley. Out in the real world, where there
are millions upon millions of business apps being written, most people are
still banging out apps server side.

~~~
jrockway
I am from a sheltered bubble: one where we enjoy engineering efficient
systems, and one where I work with the smartest people in the industry.

This is how we wrote boring business software at the investment bank where I
last worked, and this is how we write software at Google, where I work now.
People write apps this way now because it's faster, easier, and more flexible.
(For me it means I don't have to tweak UI code anymore, there are lots of
people that like to do that and now they don't have to care about the backend.
Just use my API!)

But sure, I believe that not everyone does this; I still maintain legacy apps
too. Why rewrite millions of lines of code nobody really cares about?

But that doesn't mean you should write a new web app in 2012 like you did it
in 2008. Just like we stopped writing CGI scripts when good frameworks became
available.

~~~
gnaritas
> I am from a sheltered bubble

As long as you're aware.

> But that doesn't mean you should write a new web app in 2012 like you did it
> in 2008.

There's actually plenty of reasons to continue doing things the old way. Just
because new approaches are available doesn't mean they're always automatically
a better fit than the old ones.

You can't overlook the people issue either. Companies don't just hire new
staff because new stuff came out; they use their existing staff who may not
have time to keep up with the latest and greatest, and it makes perfect sense
for them to keep doing things the old way, which works just fine.

------
SkyMarshal
If you poke around the newly redesigned <http://html5rocks.com/>, this is
pretty much exactly where they're going with it: a thick, rich HTML5 app,
cached in the client, passing data back and forth with the server.

~~~
joemoon
Friendly criticism:

The backgrounds are really distracting (and not very aesthetically pleasing),
and there is not enough contrast with the light-colored links. These two
things combined make the whole site difficult to read.

------
AlexeyMK
One thing I've been playing around with is the idea of coupling ORM-style
models with rules about how those models should be presented as a JSON-API.

<https://gist.github.com/1823921>

The approach breaks a lot of 'separation of concerns' rules between the view
and the model, but for certain kinds of APIs (read-only ones, in particular)
this may be fine. I would be interested in feedback and in seeing where
similar projects have gone.
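The gist has the details, but the core idea can be sketched in a few lines (hypothetical names, not the gist's actual code): the model itself declares which fields its JSON representation exposes, so the serialization rules live next to the schema rather than in a separate view layer.

```javascript
// Hypothetical sketch: a declarative list of API-visible fields,
// coupled directly to the model.
function Post(attrs) {
  this.id = attrs.id;
  this.title = attrs.title;
  this.body = attrs.body;
  this.adminNotes = attrs.adminNotes; // never exposed over the API
}

// Presentation rule, declared on the model itself:
Post.apiFields = ["id", "title", "body"];

// JSON.stringify picks this up automatically.
Post.prototype.toJSON = function () {
  var out = {};
  for (var i = 0; i < Post.apiFields.length; i++) {
    out[Post.apiFields[i]] = this[Post.apiFields[i]];
  }
  return out;
};

var post = new Post({ id: 1, title: "Hi", body: "x", adminNotes: "secret" });
console.log(JSON.stringify(post)); // {"id":1,"title":"Hi","body":"x"}
```

The trade-off is exactly the one mentioned above: the model now knows how it is presented, which is heresy in strict MVC but very convenient for a read-only API.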

------
instakill
Yeah, tell that to the overwhelming number of clients worldwide that don't
support client-heavy operations (and probably won't for the next ~5 years).

------
hilti
I've been writing my web applications in a "resource-like way" since 2009.
This small framework supports resources represented as FOOP objects. Look at
the wings example over here: <http://rundragonfly.com/dragonfly_routes>

~~~
hilti
FOOP = a style of functional programming.

See here <http://www.newlisp.org/downloads/newlisp_manual.html#foop>

------
Fivesheep
Recently I had the idea to move all the UI stuff into a Chrome extension. On
the web side, it provides pure data only, like JSON.

------
wavephorm
RPCs over WebSockets are what will replace REST/Ajax.

On the client-side you have code like this:

    clientapp.sendRPC.doSomething(123);

And on the server you write code to receive it:

    serverapp.receiveRPC.doSomething = function(number) {
      write(number);
    };

No Ajax, no URLs, no REST-ful religion. In fact ajax just feels wrong now. I
unfortunately have to add some backward compatibility to my websocket apps,
and I just get this really icky feeling like I really shouldn't be doing this
arcane stuff anymore.
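Stripped of the socket plumbing, the dispatch underneath a layer like this is tiny: each frame names a method plus its arguments, and the server routes it to a handler table. A minimal, transport-agnostic sketch (illustrative names only, no real WebSocket code):

```javascript
// Server-side handler table: one function per RPC name.
var receiveRPC = {};

receiveRPC.doSomething = function (number) {
  return number * 2;
};

// What the client would serialize and send down the socket:
var message = JSON.stringify({ method: "doSomething", args: [123] });

// What the server does when a frame arrives: look up the named
// handler and apply the deserialized arguments.
function onMessage(raw) {
  var msg = JSON.parse(raw);
  return receiveRPC[msg.method].apply(null, msg.args);
}

console.log(onMessage(message)); // 246
```

No URLs, no verbs, no content negotiation: just a method name and arguments, which is the whole appeal (and, for caching and intermediaries, the whole problem).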

~~~
deno
REST is just a layer of indirection over a regular RPC backend anyway. It may
be easier to reason about things like caching and side effects at the HTTP
level. Of course HATEOAS is mostly useless unless you can stay within
general-purpose protocols like AtomPub and its extensions.

> RPC's over WebSockets is what will replace REST/Ajax.

If SPDY takes off, there’s hardly any advantage to WS besides push events.
Stateless protocols are easier to scale.

------
talleyrand
Golly, I wish I understood what this is about...because if it means getting
rid of MVC, I'm all for it.

------
jebblue
The first problem to solve is to completely eradicate JavaScript from the
entire programming world, 100%.

After that, program the server and the client in Java or, if on Windows, C#.
GWT works this way, although it compiles the client code to JavaScript. If we
remove JavaScript we get better performance, more deterministic behavior
across platforms, and we totally eliminate one language we have to know,
making our lives easier and the browser thinner and better performing.

When someone comes out with a browser that does that we will be seeing the
next killer app.

------
lfnik
One question for the author. Have you ever written a framework? Yeah that's
right.

ha.

~~~
lfnik
upvotes are O-V-E-R

~~~
antidaily
is that a Portlandia reference?

~~~
lfnik
Indeed it is. And I was sitting next to the author of this blog the entire
time.

