

Imagine a Beowulf Cluster of JavaScript Frameworks - jtaby
http://tomdale.net/2011/04/imagine-a-beowulf-cluster-of-javascript-frameworks/

======
kinofcain
This is why the closure compiler is so essential.

The ability to remove unused code at compile time means you have access to a
very large, full-featured library but only have to include the parts you
need. You can do something similar with a custom jQuery or YUI build, but not
nearly to the same level, or with the same granularity, that the closure
compiler achieves automatically.

Unfortunately, to take full advantage of the compiler you must run with
advanced optimizations turned on, which in turn means that your code has to be
structured for use with the compiler. You're not going to take an existing
codebase and just throw it into the compiler with advanced optimizations and
have it work out for you.
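
To make that concrete, here's a rough sketch (mine, not from the thread) of the kind of structure advanced optimizations rewards: dotted property access and uncalled functions can be renamed or stripped, while anything exposed to outside code is exported via bracket access, which the compiler leaves alone.

```javascript
// Sketch: code written with Closure's ADVANCED_OPTIMIZATIONS in mind.
// This is plain JavaScript and runs anywhere; the comments note what
// the compiler would do to each piece.

/** @constructor */
function Greeter(name) {
  this.name_ = name; // only dotted access: safe for the compiler to rename
}

Greeter.prototype.greet = function() {
  return 'Hello, ' + this.name_;
};

function unusedHelper() { // never called anywhere: stripped as dead code
  return 42;
}

// Export the public surface with bracket access so renaming doesn't
// break callers outside the compiled bundle.
globalThis['Greeter'] = Greeter;
Greeter.prototype['greet'] = Greeter.prototype.greet;
```

An existing codebase full of string-keyed property lookups and implicit globals breaks under exactly these renaming rules, which is the "structured for the compiler" caveat above.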

But if you start a project with the closure library and the closure compiler,
you can get some pretty amazing optimizations. We were seeing relatively full-
featured apps at compiled, uncompressed sizes of under 10k. Full sites well
under 100k, with a lot of JavaScript, are completely doable.

Couple that with the dynamically-loading module system and you've got a crazy
fast front end.

To top it all off, if you annotate your code, the compiler respects the type
annotations and can warn about mismatches. It's a great way to both document
and check your codebase.
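
For illustration (a minimal sketch, not from the thread): Closure's type annotations live in JSDoc comments, so engines ignore them at runtime while the compiler checks them.

```javascript
// Closure-style JSDoc type annotations. At runtime this is ordinary
// JavaScript; the Closure Compiler reads the comments and reports
// type mismatches at compile time.

/**
 * @param {string} name
 * @param {number} times
 * @return {string}
 */
function repeatGreeting(name, times) {
  var parts = [];
  for (var i = 0; i < times; i++) {
    parts.push('Hi ' + name);
  }
  return parts.join(', ');
}

// repeatGreeting(42, 'twice');  // the compiler would flag both arguments
```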

The closure library and closure compiler are warty enough that I don't think
they'll see widespread adoption the way that jQuery has, but the closure
tools, or something like them, are definitely the way forward.

~~~
rgbrgb
I'd be curious to know if Flow or Twitter or big webapps take advantage of
this.

~~~
kinofcain
Twitter does not; I don't know about Flow. Most people are using one of the
non-closure libraries that only work with closure's "simple" optimizations,
and running one of those through the closure compiler on "simple" doesn't get
you much that one of the other minifiers couldn't do, and the other minifiers
are easier to use.

I'd like to see a library that is more approachable and internally consistent
than the Closure Library. I think something like that, coupled with plovr (a
build tool that wraps some of the craziness of the closure compiler in a much
nicer package), would be a killer next-gen js framework.

I think that's really where we're headed, all this complaining about whether
you should use a big library or build a big custom library is really just a
symptom of the tools being sort of crap.

------
jashkenas

        How much of Flow’s nearly 900k of (minified!) JavaScript 
        do you think is the application developers filling in 
        the deficiencies in Backbone?
    

I'm trying really hard to resist saying something overwhelmingly snarky about
SproutCore apologists, but this article is trying to be a slap in the face of
Backbone.js, and libraries like it ... so if you want to have that discussion
-- bring it. You think 843K of JavaScript is too large for a truly
comprehensive and gorgeous app like Flow? The SproutCore hello world app for a
broken-looking table view weighs 717K, when measured in the same way.

But there's a bigger question here, when picking on a library that's only been
around for 6 months: Has anything on the scale and professionalism of Flow
ever been accomplished by a SproutCore app, in SproutCore's 4 years of
existence?

I think that the proof is in the pudding:

<http://www.sproutcore.com/showcase/> (2 of the 4 belong to the SC team)

<http://documentcloud.github.com/backbone/#examples>

'nuff said.

~~~
maercsrats
I started out working on a fairly complex dynamic form builder for the
research group I work for. They needed a way to build up forms to administer
to possible research subjects. The rest of the app was using pieces from
jquery-ui so I stuck with that and added backbone.js to handle more of the
complex interactive tasks. All of it was written in CoffeeScript.

Early on this approach worked fine, and for simple CRUD stuff it's not too
bad, but as the researchers wanted more and more options it got difficult to
sync everything: DOM interactions, model data, and server data. I started to
investigate SproutCore and thought, "This object model with bindings is just
what I need!"

So, I'd really like to disagree with your comment, but I can't. The reason is
that I can't integrate SproutCore with anything else that's been written for
our app. I can't even get access to some of SproutCore's pieces to try to
integrate them. The entire view layer of our app would have to be rewritten
using SproutCore to use those features, and that isn't viable right now.

As it stands, I'd really like to use SproutCore. There are some great ideas
there, and I think that some of my projects would really benefit from them.
But right now, backbone.js + jquery-ui has hit the sweet spot of
easy to get up and running and looks good. The researchers are really happy
with what I've got done and I'm able to move on to other problems.

------
tlrobinson
I believe the solution to this problem isn't tiny frameworks, it's better
tools and/or languages.

Closure Compiler and GWT do a great job of stripping dead code. Cappuccino has
a tool I wrote that attempts to analyze your application and remove unused
files from the final bundle of code. It could do a lot better though.

One thing these tools have in common is that static typing vastly improves
their ability to remove unused code. Perhaps JavaScript and other very dynamic
languages aren't ideal for large web applications built with large frameworks.

I like the idea of optional static typing. For example, a language in which
you can rapidly prototype without paying close attention to strict type rules,
but later you can "solidify" your code by enabling static typing. Is anyone
aware of a language like that?

 _Edit: now that I think about it, that's basically what Closure Compiler is_
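
A tiny sketch of that "solidify later" idea, using Closure-style JSDoc as the optional type layer (illustrative only; the function names are invented):

```javascript
// Phase 1: rapid prototyping, no type discipline.
function area(w, h) {
  return w * h;
}

// Phase 2: the same logic "solidified" with annotations. Runtime
// behavior is unchanged; a checker like the Closure Compiler can now
// reject a call such as areaTyped('3', 4) at compile time.

/**
 * @param {number} w
 * @param {number} h
 * @return {number}
 */
function areaTyped(w, h) {
  return w * h;
}
```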

------
polotek
Left a comment there, might as well leave it here too.

An interesting debate (minus the personal attack stuff). But I think the
problem is that both sides are trying to warp and narrow the playing field to
suit themselves. The truth is there are plenty of projects that will never
need all the machinery of SproutCore. And there are plenty of projects that
start out too small and then struggle to grow more complex. Choosing your
tools is hard. It should be hard and it should be taken seriously. Neither
side should be trying to convince people that it’s an easy decision.

\- "This is why you should always just start with SproutCore". \- "This is why
you should just start small and integrate micro-frameworks as needed".

Both bogus arguments really.

I’m also quite sure that neither side really wants to disparage the other
side. They just want to make sure their own side isn’t being disparaged or
misrepresented. This is what most internet arguments are made of unfortunately
:)

What would be more awesome IMO is if we started talking about when it’s a good
idea to take either approach. Let’s hear some use cases. And let’s stop using
Twitter as an example. There is only 1 Twitter and their requirements are not
going to be representative of the web dev community at large.

------
asnyder
As co-creator of the NOLOH Framework (<http://www.noloh.com>), it's crazy for
me to constantly read these posts; clearly we don't do a good enough marketing
job. One of the benefits of NOLOH is its lightweight and on-demand nature,
which is what everyone in this thread, and in the blog post, seems to be
requesting without actually identifying it.

In the case of NOLOH, lightweight and on-demand means that the server sends
the client only the necessary highly optimized client-side code for their
application specific to the user's device at that current point in time,
resulting in faster initial and continued loads. Similarly, as the user
continues to use the application, NOLOH continues to send only the necessary
and optimized code. This eliminates the fat-client problem, as the user only
has what they absolutely need, specific to them. Similarly, in the case of
search engine robots, NOLOH sends standards compliant semantically rich
content, without the other baggage.

This is accomplished by implementing different renderers for each target
device, so rather than use a general client-side library that loads everything
it could possibly need for all browsers and situations, NOLOH has specific
variations for each browser, version, and device. The ever growing number of
browsers and devices demands this.

In "Lightweight, On-demand, and Beyond" in this past November's issue of
php|architect, <http://www.phparch.com/magazine/2010-2/november/> (sorry for
the pay-wall, we'll repost it this month as the exclusivity period expires),
we go in-depth explaining the next version of our lightweight and on-demand
functionality, where we make it easier for us to maintain an ever growing
number of target devices, while sending even better code to the client,
without any drawbacks on either the client, or the server.

It's been very interesting reading all the posts regarding event-driven
programming, fat clients, and unnecessary bloat. I remember initially thinking
about these issues in 2005, when Philip Ross and I first created NOLOH; we
were young and naive at the time and, like all good solutions, didn't have the
status quo entrenched in our thinking. So when I read these posts and they
have the same gist as our initial white papers, it saddens me that we haven't
made as much progress as I thought we would have. It also makes me feel old.

------
gaustin
I'm a big fan of MooTools.

It's very modular and consistent. It's easy to pick and choose just what you
want to include. You can do it by hand or use the builders on the MooTools
site that let you roll custom distributions of the library.

It's like having a micro- and full-stack library in one package.

------
troygoode
I was pretty shocked that New Twitter has a meg of custom js - the new UI
doesn't seem _that_ complex...

~~~
ahoyhere
It's not. Twitter has JS management issues. This has been true since the first
time they added JS to the web UI.

------
dgraunke
I think his first link was supposed to point here:
[http://mir.aculo.us/2011/04/11/i-for-one-welcome-our-new-mic...](http://mir.aculo.us/2011/04/11/i-for-one-welcome-our-new-micro-framework-overlords/)

~~~
tomdale
Fixed; thanks!

------
ahoyhere
Tom indirectly calls Thomas Fuchs (author of Scriptaculous, Zepto, Emile, core
committer to Prototype.js, etc.) disingenuous because...

" _Of course_ Mr. Fuchs is able to tell you which JavaScript library will
precisely match your requirements -- _his job is writing JavaScript
libraries_!" [Emphasis, and presumed outrage, his.]

Come now. That's simply false, and if you think about it, silly. "He writes
open source frameworks for free so he's biased against open source
frameworks"? Really? Thomas has never sold a JavaScript framework or written
one for hire, and his libraries are MIT-licensed. It's not his "job".
Furthermore, Prototype.js and Scriptaculous are not micro-frameworks. So
Thomas Fuchs is arguing against major examples of his own work. Yup, must be
some kind of evil hidden agenda.

I called Tom out about this misrepresentation on Twitter and he probably
changed it by now, but I for one think it's important to know about the
history of such things.

Other nitpicks:

 _Mr. Fuchs has apparently never heard of dependency hell._ Very logical
argument, that. Way to slyly insinuate he's a bad and inexperienced
programmer, without introducing any actual facts.

 _Dustin Diaz has done a great job of putting together many of these micro-
frameworks with Ender.js, but as a curator, he has to rely on the original
author if he wants to make a change._

Really? How is that a counter-argument?

I'd be very surprised if someone has read Thomas' little essay on micro-
frameworks and _genuinely_ come away with the idea that what he really
supports is taking a bunch of other people's OSS projects and mushing them
together with an integration layer.

The whole point about micro-frameworks is _you don't have to make them "go
together"_. You wield them individually like scalpels instead of spinning them
en masse like Edward Scissorhands.

There are a LOT of problems with an undertaking like Ender.js, but that's not
the fault of micro-frameworks "not being made to work together." It's the
fault of doing something that, from the outset, is fairly expected to be more
trouble than it's worth - however noble it may be.

One might argue that an integration layer that hooks up a bunch of other
micro-frameworks is no better -- and is in a lot of ways worse, more complex,
and less reliable -- than a monolithic library. So I call "red herring" on the
Ender.js argument. It doesn't really support Tom's attack on Thomas' article
against monolithic frameworks, because it is one... just made from parts, like
Frankenstein.

I could go on, but I'm even boring myself at this point.

What I'd like to know is: Why is the suggestion "You don't need all that code
all the time" considered so radical & threatening?

\--

And yup, full disclosure: I married Thomas Fuchs, but I used Prototype &
Scriptaculous for well over a year before I ever met him.

~~~
slexaxton
Mostly agreed, and well articulated. Though, the point I believe that Tom was
trying to make about Diaz having to ask Thomas to change things was more valid
with the links that were likely stripped from your comment:

<https://github.com/madrobby/emile/pull/7>

While you already argue against something like ender.js being valid, Tom's
argument, which preceded your comment (and which seemed confirmed by Thomas'
post, imo), was that including emile in Ender.js has actually been a pain
because he couldn't get Thomas to answer him, let alone change something for
him. I believe the irony that was being pointed out was that Thomas then gave
ender a shout-out after ignoring it (from an outside and likely wrong
perspective).

I say all of this as more of an objective bystander, rather than as someone
who wants to interject his own opinion on the actual topic. I don't need that
kind of stress.

Full disclosure: I found myself quite attracted to Thomas the few times I met
him. I think it's the accent.

~~~
madrobby
The point is that Emile is 50 lines of code and can be wrapped up for any
purpose in about 2 minutes (export to some object).

Dustin wanted a different API to call upon, so he had to change some stuff,
again relatively easy, because it basically fits on a screen in a text editor.

Let's not forget, all of this is open source, and it's meant for adaptation,
forking, and to be built upon. (Note that Emile was very much a proof-of-
concept, with no emphasis on beautiful, reusable code; it was written as a
teaching tool for a talk on CSS animation I gave two years ago at Fronteers.)

~~~
slexaxton
I agree, but I think the point of Tom originally linking it was to show that
Dustin had integrated a micro-framework (from you), and couldn't get a
response from you (even to say the stuff you mention above) and eventually
closed the ticket.

It's not my own commentary, though. I was just clarifying to Amy why an
argument that seemed entirely unrelated was at least tangentially related.
Personally, I would have just modified it and gone on my way :D

~~~
polotek
Actually Dustin's pull request is a bad example entirely. He wanted to change
the whole API of emile. He liked the functionality but it didn't work for how
he wanted to integrate it into ender. If an API doesn't work for you, you're
kind of screwed, whatever library you're using. You either hack it yourself or
you ask the maintainer. If some API of SproutCore wasn't to your liking, then
what?

Dustin could've written an adapter around emile to expose the API he wanted
for ender. But that would defeat the purpose of ender, which is to cleanly
integrate several great micro-frameworks. He didn't _have_ to wait for Thomas;
he explicitly chose to. It was a goal of his to keep the dependencies pure.
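
As a hypothetical sketch of that kind of adapter (emile's entry point is roughly `emile(el, cssString, opts)`; the `makeAnimateAdapter` name and the properties-object API shape here are invented for illustration):

```javascript
// Hypothetical adapter: wrap emile's css-string API behind a
// properties-object API without modifying emile itself.
function makeAnimateAdapter(emileFn) {
  return function animate(el, props, duration) {
    // Build the "prop:value;prop:value" string emile expects.
    var css = Object.keys(props)
      .map(function(k) { return k + ':' + props[k]; })
      .join(';');
    return emileFn(el, css, {duration: duration});
  };
}
```

The trade-off polotek names is visible here: the adapter keeps the dependency pristine, but every call now routes through an extra translation layer.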

Dependency hell is a problem with integrating several different frameworks.
But I'm not sure this is the best illustration of that.

