Rob has been around a while, but was most recently working on Angular 2.0. However, he's had technical differences with the team and has returned to a framework that matches his approach better.
If I read/watched correctly, what's really different about this framework is:
1. Large focus on ES6/ES7
2. JSPM
3. MVVM as made popular by libraries like KnockoutJS (which Durandal was built upon).
I think this is awesome. While I don't plan on rewriting an app I'm already working on in it, I think the interface to the framework is very very simple (which is refreshing), due in large part to the tight ES6 integration.
Also, I think MVVM is preferable to MVC (mostly because "controllers" are an overloaded term and end up being VMs anyway), but that's just a personal opinion (as is everything else I wrote).
Having tried nearly all the JS frameworks, I really like KnockoutJS. There are a few bits missing from it, but for someone like me who doesn't want a single-page app, knockout fits into a many-page website really well. You can enhance pages as you go. Add a few nice GUI bits, some ajax to load and save... knockout makes it all very lovely.
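To give a concrete picture of what I mean, here's a minimal sketch of enhancing a single server-rendered form (the element id and endpoint are made up for illustration):

```javascript
// One small view model for one widget -- not a whole SPA.
// The endpoint and element id here are hypothetical.
function ContactViewModel() {
    var self = this;
    self.name = ko.observable('');
    self.email = ko.observable('');
    self.save = function () {
        // Plain old AJAX save; jQuery assumed to already be on the page.
        $.post('/api/contact', ko.toJS(self));
    };
}

// Bind just this one form, leaving the rest of the
// server-rendered page untouched.
ko.applyBindings(new ContactViewModel(), document.getElementById('contact-form'));
```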
But in all the frameworks, the view syntax/binding gubbins is horrid. Hard to debug, easy to make syntax errors. Bleugh :(
Absolutely agree. Every time I start a new project, I start with knockout, then realize I need more of the bits, and then migrate to something else with more "built-in".
On an app I'm currently developing, I've adopted the pattern of using knockout with some other libraries (director, mainly for routing) and have actually found relative success.
One thing I have been tossing around in my head though is creating a web development pattern (I'm really just avoiding using the word "framework" here) that does this by default (and intentionally).
My vision for it is a marriage of knockout (for data-binding, templating & components), director.js (for routing; I think pager.js, while well intentioned, is too complex), and RequireJS (for async loading), with everything but knockout being optional.
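Roughly, the glue I have in mind looks like this (a sketch only; the route names and view model are hypothetical):

```javascript
// Hypothetical glue between knockout and director.js -- roughly what
// chainmail would package up by default.
function AppViewModel() {
    this.currentPage = ko.observable('home');
}

var app = new AppViewModel();
ko.applyBindings(app);

// director.js maps hash routes to plain functions; each route here just
// flips an observable and knockout re-renders whatever template is bound to it.
var router = Router({
    '/home':  function () { app.currentPage('home'); },
    '/about': function () { app.currentPage('about'); }
});
router.init('/home');
```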
I'm thinking of calling it chainmail (because the power is in the combination/linking together of the small libraries) -- would you be interested in something like that?
Also, have you checked out the components feature of knockout lately? it's awesome
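For anyone who hasn't seen it, registering a component is about this much work (the component name, markup, and params are just an example):

```javascript
// knockout 3.2+ component registration.
ko.components.register('like-button', {
    viewModel: function (params) {
        var self = this;
        self.count = ko.observable(params.initialCount || 0);
        self.like = function () { self.count(self.count() + 1); };
    },
    template: '<button data-bind="click: like">' +
              'Like (<span data-bind="text: count"></span>)</button>'
});

// Then anywhere in a bound view:
// <like-button params="initialCount: 3"></like-button>
```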
I'm curious if you've used Durandal. What you're describing is essentially what Durandal is. Modular framework built on knockout and require. The router in durandal works well and is one of the pieces which influenced Angular 2.0's router from what I've read. Just glancing through Rob's new framework the router also seems very similar to Durandal 2.0.
So I've actually taken a look and tried to read through Durandal's docs -- I haven't actually used it in a project though (or even kicked the tires on it in a TODO app or something)...
I found the docs really hard to follow, and they seemed really complicated (and I didn't think it should have been)... It is entirely possible (and even likely) that I didn't put in much effort. But knockout (and tools like it) hit such a sweet spot for me because they are (almost) dead-simple.
We've built essentially what you're talking about where I work (though we hand-rolled our router). As the applications grow larger, you really have to be more diligent with your memory management in Knockout, which is the only thing that really gets in my way.
That being said, I'd really be interested in something like you've described.
Glad to hear! It really weighs on me that I'd be adding yet another web framework to the already-too-large pool of them, but if it gets done at all, I will definitely be aiming for simplicity and lack-of-surprises above all.
Right now I see the MV* web framework landscape as running from lightest to heaviest, and I'm aiming for chainmail to fit in between knockout and mithril (I consider it lighter than mithril because it will not include a JS-based DOM structure, and will be using templating from knockout).
I'm excited to see this. Durandal was (and actually still is) a joy to use and coming up with something this advanced with a possible migration path is amazing.
Small confusion: they chose to use 6to5 and jspm, but jspm depends on Traceur and there is even an issue[1] for supporting 6to5 on the jspm issue tracker. (Edit: just discovered that they were answering questions on Gitter[2].)
Despite the claim of supporting "Web Components", I can't see any evidence in the source that it actually does.
There are no calls to document.registerElement() in the framework or templating repositories, and it looks like any element registration is happening against a proprietary registry, so that Aurelia components won't be available in standard web pages outside of the Aurelia framework, which would be the exact same situation we already have with Ember, Angular, React, etc.
Yes. We support web components. That doesn't mean we are built with Web Components at the core. The WC specs have a number of design flaws, in my opinion, and they don't work as efficiently as possible with databinding solutions, etc. I can expound more on this in a blog post. But our binding system works fine with Web Components. So, you can import a Web Component and use it inside a view with databinding and even attach additional behaviors to it without any problems.
Web Components are not yet a standard. They are primarily being pushed by Google without taking much feedback from other vendors. I spent almost a year building frameworks around them...and decided there was no real advantage to using document.registerElement for an Application Framework.
Again, if you want to build Web Components or use Polymer...that is fine. You can use that with Aurelia. But Aurelia's custom elements aren't built that way.
As a side note, Aurelia will be able to "export" its custom elements to Web Components. So, before we hit v1 you will be able to do that and then take an Aurelia Web Component into a non-Aurelia app and use it, the same way you would use a Polymer element. We haven't done that yet because we are focused on the application experience first and foremost.
I should also state that we do use HTML Imports, for view loading, which is part of Web Components and we also use HTML Template Element, which is part of Web Components...and you can optionally use Shadow DOM...which is part of Web Components. What we don't do is use document.registerElement as part of our own custom element implementation because we would give up too much control over critical aspects of binding and resource loading.
So, we use 3/4 of the spec. One of the APIs we don't use...but we will allow you to export your Aurelia elements to a document.registerElement form for v1.
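(For reference, and independent of Aurelia, plain-DOM usage of the Template Element part of the spec looks roughly like this; the ids and markup are illustrative:)

```javascript
// Plain-DOM use of the HTML Template Element -- nothing Aurelia-specific.
// Assumes markup like:
// <template id="row-template"><li class="name"></li></template>
var template = document.querySelector('#row-template');

// template.content is an inert DocumentFragment; clone it before inserting.
var clone = document.importNode(template.content, true);
clone.querySelector('.name').textContent = 'Example item';
document.querySelector('ul').appendChild(clone);
```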
Unlike the frameworks I listed, Polymer creates true W3C custom elements, which are instantiated by the browser, not the framework. Polymer custom elements are interoperable with raw html, Mozilla X-tags, IBM Delite, Bosonic, etc.
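That registration path is the v0 document.registerElement API being discussed here; a framework-free element looks roughly like this (the element name is made up):

```javascript
// v0 custom element registration (document.registerElement).
// Once registered, the element works in raw HTML with no framework on the page.
var proto = Object.create(HTMLElement.prototype);
proto.createdCallback = function () {
    this.textContent = 'Hello from a real custom element';
};
document.registerElement('x-greeting', { prototype: proto });

// Usable anywhere as: <x-greeting></x-greeting>
```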
I hate to sound jaded and cynical, but am I the only one who feels weird about a new framework announcement containing information for getting training and consulting? I understand that there is no money in releasing free frameworks, but it seems a little soon to be selling consulting and training on an early-access preview...
I would actually prefer that a framework be backed by some kind of business. When it's just a guy moonlighting, you never know when he might disappear and you've now got a production app that's being supported by "the community".
I guess it isn't too early when you consider who is actually behind Aurelia: Rob Eisenberg. If you have used Durandal before, then you will know what I am talking about. I have no doubt in my mind that Aurelia is going to shake things up. Just having a play with it earlier gave me a sense of excitement that I remember getting when I first used Angular all those years ago.
I get the impression we're dealing with a framework that is based in part on Durandal, so it is not an entirely new framework in certain aspects (I could be wrong). And on the plus side, Rob has proven himself as an effective leader on projects like this and during his time on the Angular 2.0 team, he was doing some great work.
He has been doing the same thing for his other framework, which has been around much longer. I assume the note about training is directed towards those who have used his Durandal training/consulting and may be interested in doing the same for this new framework.
I think this might have something to do with the fact that he is the CTO of Durandal (the company seems to be built around durandal.io, or maybe the other way around).
I assume that he's currently the CTO of a very small company (maybe even just him), and one of the ways to do that profitably, while building things you love, is to consult and provide training on the things you have built that others want to use.
This looks a bit like Angular 1.3 and 2.0 to me -- DI, binding and repeats look like ng-model and ng-repeat. I'm curious to see more when the full docs are out, though. There's no mention of isomorphism or a virtual DOM, so it seems a bit behind the zeitgeist.
Yeah. I just got done doing a very large SPA using Knockout, and I'm now starting a little side project with React/Flux.
Pain points I found with the Knockout based app: Performance, stupidly messy data flows, hard to test, lack of isomorphism. My choice of React/Flux for my next project was based on that: React makes a big deal about performance (virtual DOM), sensible data flow (flux architecture, one-way data flows, events, etc.), testability, and isomorphism.
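By "sensible data flow" I mean roughly this shape -- a hand-rolled sketch of one-way flow (not the actual Flux dispatcher; the element id and action are made up):

```javascript
// Minimal one-way data flow: actions go through a single dispatch point,
// the store updates, subscribed views re-render from store state.
var listeners = [];
var store = { todos: [] };

function dispatch(action) {
    if (action.type === 'ADD_TODO') {
        store.todos.push(action.text);
    }
    listeners.forEach(function (fn) { fn(store); });
}

// A view subscribes and re-renders from the store;
// it never mutates the store directly.
listeners.push(function (state) {
    document.querySelector('#count').textContent = state.todos.length + ' todos';
});

dispatch({ type: 'ADD_TODO', text: 'Write tests' });
```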
I clicked on the OP's link with some apprehension: is the stack I just chose already outdated? Not according to the blog post. They still need a "bunch of work" on performance and testing (which implies that it's not there yet, and also raises questions about priorities; testing seems like it should be more than an afterthought). There's no mention of isomorphism. And while technical details are sparse, it looks like there's no virtual DOM and no strongly enforced event model (like React has with Flux). Maybe it's there, but...
...given the amount of experience the author has writing JS frameworks, I'd have expected a lot more details on how this framework solves problems the programmer might have with other frameworks. I mean, the very first bullet is "Aurelia is written with vanilla ES6 and ES7 but transpiled and polyfilled to work on today’s Evergreen browsers. It may just be the most forward-thinking framework you've ever seen", but speaking just for myself, I've never ever thought to myself "man, this framework is awesome, but if only it had been written in ES7 and transpiled to work in my current browser!" I'm struggling to imagine why anyone would think that.
Now, maybe it's just a bad blog post. I'll do some more digging. But the message I'm getting so far is "this framework was created to solve the kind of issues framework authors care about, not the issues developers using the framework care about". If true, it's not a good message.
I was quite surprised at how little support there is in browsers for ES6 and ES7. Essentially, every current and near-future browser was developed against ES5.
I don't know why you missed it, but it mentions both isomorphism (code that can be run client side or server side) and the virtual DOM (web components, etc...).
This is not mentioned in either the blog post or the docs, at least not easily findable with a cmd+f
"Use some of the libraries on the server with NodeJS. i.e. DI." - some, and I'd expect an explicit mentioning of isomorphism if it were possible at the moment to render it on the server side.
Shadow DOM != Virtual DOM. I mean the React-style Virtual DOM.
Shadow DOM is a DOM structure that's hidden under a tag (for example, the <video> tag has a Shadow DOM which provides the controls and timecode UI).
Virtual DOM is (typically) a pure JavaScript object-based DOM representation that gets somehow flushed to actual DOM (or to WebGL or anything else). You as the programmer use the Virtual DOM directly and try to avoid talking to the real DOM underneath. Virtual DOM has a few advantages, but the one that stands out in today's browsers is limiting DOM thrashing where you query the DOM (requiring an expensive recalc) and then modify the DOM (invalidating what was recalc'd) in a loop.
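A contrived sketch of the thrashing problem (assume `items` is an array of elements and `container` an existing DOM reference):

```javascript
// Classic layout thrashing: every iteration reads layout (offsetHeight)
// and then writes a style, so each subsequent read forces a recalculation.
items.forEach(function (item) {
    var h = container.offsetHeight;                 // read -- forces layout
    item.style.height = (h / items.length) + 'px';  // write -- invalidates it again
});

// Batching the read and the writes (roughly what a virtual DOM does for you):
var height = container.offsetHeight;                // read once
items.forEach(function (item) {
    item.style.height = (height / items.length) + 'px';  // writes only
});
```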
We probably agree on what "isomorphism" is, but maybe not "support"?
If a platform claims to support isomorphism, I expect it to render server-side with zero drama/workarounds/blog posts. Support means that producing fully-rendered html is a core part of the product and documentation, not something glued on the side or pieces to reassemble on your own.
From the wording, it looks like Aurelia isn't there yet?
Not as far as I know. What I can tell you is that web component polyfills like Polymer end up partially recreating the shadow DOM, and that's a lot of work, unlikely to have been undertaken unless it was necessary.
"I am not saying that Angular 2.0 is going to be a bad framework. What I am saying is that it is no longer fundamentally the same thing I was originally hired to help build nor is it compatible with my vision for the future. It is no longer the best path ahead for the existing Durandal community and it also isn't the best choice for anyone who has used my other frameworks and wishes to migrate to the web."
There are two sides to it: yes, new frameworks and new stuff to learn, and that can be overwhelming.
On the other hand, rapid evolution leads to better ideas, keeps your mind fresh as your learning never ceases, and an opportunity to improve yourself, keep your career in demand, and generate more income for yourself.
Example: when I started web dev, client-side code was basically a giant function containing browser-specific DOM manipulation.
Today, client-side code is arranged in modules with single responsibility, well-defined application architecture, easy to test, easy to modify or extend, the app logic cleanly separated from the UI.
Rapid evolution also keeps the mind fresh. You're continually learning new concepts, absorbing new ideas. This helps me enjoy my job; I'm always learning.
Rapid evolution keeps my skills in demand, too. My last 2 gigs have been because I learned Angular. I'd be hard-pressed to find work if I was still building web apps like it was 1998.
I can agree that it is a bit overwhelming to keep up with the new 0day hotness. But in all honesty it's also exciting. Things are changing constantly, and mostly for the better. I am particularly glad to be a front end developer and enjoy the constant strive to learn the newest thing.
I don't really see why this is a problem. When my team started using Durandal, the learning curve wasn't very steep, and in the end, it became way faster to create new pages/features.
If a framework learns from the mistakes of the past and builds on previous success, then I welcome it.
Same can be said for Ruby, Python, C#, etc. A lot of people have a lot of ideas, preferences and time to burn creating/learning new client side frameworks.
Looks interesting. It would be nice if they were more specific about what browsers the framework supports though. There's a phrase that you can "Leverage the technology of the future but target today's Evergreen Browsers." That's a nice bit of copywriting, but it doesn't instantly tell me which browsers are supported. It would be nice to have an official supported list.
By combining ES6 modules with a simple, yet powerful Dependency Injection Container, we make it easy for you to create highly cohesive, yet minimally coupled code, making unit testing a snap.
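That claim is easier to see with a generic illustration -- this is not Aurelia's actual API, just plain ES6 constructor injection with hypothetical classes, showing why injected dependencies are easy to fake in a unit test:

```javascript
// Generic constructor injection; UserService and the stub are made up.
class UserService {
    constructor(httpClient) {
        this.http = httpClient;
    }
    getUser(id) {
        return this.http.get('/users/' + id);
    }
}

// In a unit test, hand it a stub instead of a real HTTP client.
var fakeHttp = {
    get: function (url) { return Promise.resolve({ id: 1, name: 'Test' }); }
};
new UserService(fakeHttp).getUser(1).then(function (user) {
    console.log(user.name); // "Test" -- no network involved
});
```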
Not trying to snark, but if you need to test JavaScript, use Jasmine. If you want to do acceptance/integration testing (depending on your definition of those terms) as a user clicking around your site, to make sure the things you think happen are actually happening, use PhantomJS (I'd also suggest using NightmareJS on top).
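A plain Jasmine spec needs nothing framework-specific, e.g.:

```javascript
// A trivial Jasmine spec; the function under test is just an example.
function add(a, b) { return a + b; }

describe('add', function () {
    it('sums two numbers', function () {
        expect(add(2, 3)).toBe(5);
    });
});
```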
Phantom is unreliable, barely maintained, and brutally slow. I think one of the reasons Angular has seen such remarkable adoption is because it made testing easy, quick, and idiomatic.
What version of phantom are you referring to? Do you have any specific examples?
"Barely maintained"
They do have a lot of tickets open, but in my uses of phantom (wrapped by nightmarejs), I have not run into many issues that even required filing tickets or needing maintenance. 90% of the use case of something like phantom is rolling around the page, clicking on stuff, filling out fields, and evaluating some javascript, and it does that fantastically most of the time. Maybe I haven't done enough ambitious things with phantom.
"Brutally slow"
Compared to what? Selenium? I hope you're not comparing it to the testing included with Angular, because those are two different kinds of tests. Also, in the end, I think the most important kind of test is the kind you write with something like Phantom/Selenium, because it requires that the system (at a macro level) does what it's supposed to do. If you make a todo app, you can have all your unit tests pass and the app still be broken. You definitely can't have your integration/end-to-end tests pass if the app doesn't actually work (though the individual functions may be terrible, buggy, etc.).
Personally (and I have no data to back this), I doubt Angular has seen remarkable adoption because of easy testing.
[EDIT] - I just noticed that you ignored the mention of Jasmine -- maybe Angular shouldn't require its own testing strategy at all; that seems like a pretty good indicator of a monolith (which Angular is, so I guess that's not really an interesting point).
Most recently, this forced me to upgrade: https://github.com/Medium/phantomjs/issues/161 And that forced upgrade broke the cookie mock/stub code that we wrote. It took almost a day of thrashing to fix the failures, none of which could be seen in a real browser. Super irritating.
I've also had to work around weird JS issues that only affect Phantom, and corrupted screenshots making it rather hard to see what's going wrong. For our team's time-spent vs time-saved, Phantom has been a waste. Our app is very JS-heavy, but that's why I thought Phantom would be a good idea.
I think the most important kind of test is the kind that developers will actually use. If tests are hard to understand, hard to run, unreliable, or slow, then they won't get used (Selenium!).
I just want idiomatic, quick, easy-to-write tests so regressions in our app are much less likely to happen. Angular's approach met those requirements pretty well and that was a big influence to our choice to use it (two years ago, data point of one). It looks like React has a good approach to testing too.
If the Aurelia docs suggest using Jasmine to test the application, especially if they help with setting up HTML fixtures and data, then that's encouraging. But I don't think they do. For now it's all DIY, so if I join your Aurelia team, I have to spend a bunch of time learning how your testing works? In my experience, that means your project will most likely have very little testing.
All good points -- I'll take this into account when recommending phantom in the future. While I haven't run into the same number of issues, I can absolutely see why you guys didn't like it, and the possible issues I could run into some day using it.
Also agreed about writing tests that devs will actually run, though I think that concern is going away with the advancements in orchestration/devops. If buildbot or jenkins can run the tests, then you may not have to rely on a human's willingness to run them. Speed's another matter though.
Oh, and no, they don't suggest using Jasmine to test the application (not that I read) -- that was just my assumption (that you could). And that's a very good point: if they don't pay attention to testing (and make it very obvious/easy), then it will fall by the wayside.
Tests often fail in Chrome/Safari while passing in PhantomJS (and after that I can't trust PhantomJS anymore), and PhantomJS often lags far behind the features modern frameworks need. That's been my experience.
And Angular can be tested not only with Jasmine - use Mocha or something else, if you like.
This seems really nice. Just looking through the first few links and pages, it seems to have all of the nice things I enjoy about Angular. It looks like it may have less of a learning curve too, which would be very nice.
We've reached a point where I don't even need to open links named "Introducing [fancy sounding name]" to know that it's yet another Javascript framework.
How good is the ES6 transpiler used for this? My experience with ES6 transpilers thus far hasn't been great. The feature support seems to be very spotty.
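For context, these are the kinds of ES6 features a transpiler has to cover, and in my experience where support tends to get spotty -- modules, classes, default/rest parameters, template strings, arrow functions (file name is illustrative):

```javascript
// greeter.js -- a grab bag of ES6 syntax a transpiler must handle.
export class Greeter {
    constructor(name = 'world') {        // default parameter
        this.name = name;
    }
    greet() {
        return `Hello, ${this.name}`;    // template string
    }
}

export const greetAll = (...names) =>    // rest parameter + arrow function
    names.map(n => new Greeter(n).greet());
```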
That's how you do a client-side framework (or any framework, for that matter): forward-thinking with backwards compatibility, and no custom nonsense tying it to a specific language (AtScript).
I meant in terms of the design of the framework and the emphasis from day 1 on making everything modular and interoperable with the rest of the JavaScript ecosystem, e.g. jQuery, Knockout.js, etc.
Well, as another example, Mercedes was originally a feminine given name. The company was named after the founder's daughter. Now it is associated almost solely with automobiles, at least in the anglophone world.
https://twitter.com/iamdevloper/status/540481335362875392
"I think I've had milk last longer than some JavaScript frameworks."