
The viability of JavaScript frameworks on mobile - joeyespo
https://joreteg.com/blog/viability-of-js-frameworks-on-mobile
======
BinaryIdiot
This is good. Many frameworks are just too freaking huge for mobile, but even
when you use tiny frameworks, if you dump a bunch of images into them you're
going to have similar issues (at least on load).

Honestly, using the DOM API isn't all that hard. Yes, it's awkward, verbose,
and sometimes cumbersome, but it's still pretty straightforward. I'm actually
really liking React lately, but if I want something done well for mobile I
almost never use a framework.

Another thing this article didn't touch on was latency. When I was doing work
for vehicles that had poor internet access via satellite, every single HTTP
call killed the load (this included fetching CSS, JavaScript, etc.). I can't
stress enough how much better your page can load if you combine as much stuff
as possible, even images if you can display them as backgrounds.

~~~
tomcam
This is heartening. How do you deal with cross-browser differences, especially
considering you're supporting older mobile devices?

~~~
intruder
You deal with them one at a time. It's not like there's magic involved.

Our SPA isn't even using jQuery. Load times are fast, and the app is very fluid.

Many people seem to not understand that when you query the DOM for a node, you
can store that node and reuse it throughout your app.
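For instance, a minimal sketch of that pattern (the element id and function name here are hypothetical):

```javascript
// Query once at startup, keep the reference, and reuse it on every update.
var statusEl = (typeof document !== 'undefined')
  ? document.querySelector('#status')   // hypothetical element
  : null;

function setStatus(text) {
  // No re-querying the DOM here; the node reference is already cached.
  if (statusEl) statusEl.textContent = text;
  return text;
}
```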

------
pygy_
This is where small frameworks like Mithril shine.

The site says 12KB for some reason, but minified and gzipped, the latest
version is closer to 7 or 8 KB IIRC.

[http://mithril.js.org/](http://mithril.js.org/)

~~~
lhorie
Just came back to mention that two of the most well-known Mithril-based
projects, Flarum.org [1] and Lichess.org [2], are open source and have mobile
UIs that you can try out. So if anyone's interested in seeing non-trivial
codebases that use the "lighter approach" alluded to in this article, there
you go.

[1]([https://github.com/flarum/flarum](https://github.com/flarum/flarum))

[2]([https://github.com/veloce/lichobile](https://github.com/veloce/lichobile))

~~~
pygy_
[http://flarum.org/](http://flarum.org/) is especially relevant since it is a
competitor to Atwood's Discourse, which sparked the current discussion.

[http://lichess.org/](http://lichess.org/) is a free online chess platform.

------
buro9
With Microcosm we consciously chose to go the other direction: Do the work
server-side and serve HTML.

Basically... the old way of doing things.

An example site [https://www.lfgss.com/](https://www.lfgss.com/) is 280KB on
first load (and most of that is Mozilla Persona) and subsequent requests are
usually around 10KB.

Even though that company is now dead, I'm still working on it and aim to strip
out Persona (it's deprecated) and to set a first load goal of 100KB (which
should be easy to achieve).

People should feel the power of their devices, and the only way to do that is
to have those devices do less, not more.

~~~
WorldMaker
Persona is not deprecated. It was moved to "community ownership", which is no
more "deprecated" than the average open source project is. It's a bit like
saying that if Red Hat stopped paying its developers to work on the Linux
kernel, then Linux itself must be deprecated.

I wish more sites would use Persona and so I'm heartily disappointed to hear
someone expecting to strip it out of an existing usage.

(Aside, I gave a presentation on the subject, for what that is worth.
[http://blog.worldmaker.net/2015/05/13/mozilla-persona-talk/](http://blog.worldmaker.net/2015/05/13/mozilla-persona-talk/))

~~~
buro9
I'd love to continue using it... but I don't want to run an instance; I want
to point at someone else's instance. If I have to run my own instance then I
need to know I can maintain it and keep it secure, and I am not a Node guy,
and I find the code and all of its dependencies to be too much for me to say
that I am the man to keep an instance secure and maintained.

The thing is that I want it to perform as well as my application. Just look at
Persona's portion of blame for the time, transferred bytes, transfer
waterfall, connections:
[http://www.webpagetest.org/result/151020_X5_17EP/](http://www.webpagetest.org/result/151020_X5_17EP/)
(and that's with connection preconnects working... older browsers have it
worse).

Persona is the biggest performance hit on my web app, and it's holding me back
and I do not wish to run my own instance (just to put it behind CloudFlare and
make the whole thing fly).

Whilst performance hurts, and it does hurt, and whilst I live under the shadow
of "Mozilla aren't owning this and pushing it forward"... I'm very seriously
thinking of using
[https://github.com/go-authboss/authboss](https://github.com/go-authboss/authboss)
and making a centralised front-end for it so that I can achieve a very similar
thing in a simplified way that I can support and maintain.

The key thing though... I need performance from it. I don't have that today. I
have a web account... but it needs to perform and I've seen no improvements on
that front in the entire time I've used it.

PS: I even checked Auth0 and others, but I'm doing so many logins per month
that it would cost 100x the costs to run my entire platform to use any of the
paid services.

~~~
WorldMaker
I had some success in loading Mozilla's JS library asynchronously (using
RequireJS in that particular application) after the initial render/DOM load.
That certainly mitigates some of the performance issues. (You'd notice a bit
of slowness in the UI picking up that you were already logged in on page load,
but that is more and more common on the internet these days with many of the
"SSO" authentication systems, so I don't consider it much of a problem.)
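For what it's worth, the deferred load can be sketched like this (the Persona include URL is the one Mozilla documented; the helper itself is illustrative, and `doc` is injectable only so the function can be exercised outside a browser):

```javascript
// Inject a script tag after the page has rendered, so the third-party JS
// doesn't compete with the initial page load.
function loadScriptAsync(src, onload, doc) {
  doc = doc || document;
  var s = doc.createElement('script');
  s.src = src;
  s.async = true;
  if (onload) s.onload = onload;
  doc.head.appendChild(s);
  return s;
}

if (typeof window !== 'undefined') {
  // Kick off the load only once the page itself has finished loading.
  window.addEventListener('load', function () {
    loadScriptAsync('https://login.persona.org/include.js', function () {
      // navigator.id is available from here on.
    });
  });
}
```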

Also, from the waterfall you've posted here, most of the Persona stuff is
happening in the background/in parallel anyway: the big slowness in getting to
render start appears to me to be your fonts more so than Persona, which is
what I would expect to see. That is, I'm wondering if you are scapegoating a
bit here, as the impression I get looking at your waterfall is that you won't
gain as many milliseconds as you might think by removing Persona.

------
onion2k
Rule #1 for optimisation: "You can't make code run faster. You can only make
it do less." It's true for mobile devices as much as everything else.

~~~
protonfish
Fortunately, a lot of what these libraries do is spin bloated wheels, so there
is a lot of opportunity to do less without losing features. This is why I
write mobile web apps with vanilla JS only. It takes a little more time up
front, but the result is vastly superior to what any library can offer.

~~~
swah
Have you tried using server-side rendering as a viable alternative that lets
you use frameworks like React?

------
thinkcontext
The article is interesting but the Atwood article was mainly about how badly
_Android_ javascript performs vs iOS. If you have a 5x difference in
performance between platforms it will be impossible to maintain feature
parity.

~~~
exelius
It's relevant; iOS may be faster but the user experience still sucks when web
devs rely on heavy JS frameworks. Rather than bemoan how much JS sucks on
Android, devs should realize that JS sucks everywhere and use it as sparingly
as possible.

This is especially relevant since the fragmentation on Android means this
problem won't be going away soon. Even if Google fixed its JS performance
tomorrow, so many phone manufacturers bundle their own browser that it will be
a long time before those users with "old, slow" JS on Android go away. I don't
know how the JS engine is bundled on Android, but if it's bundled with the
core OS then the vast majority of phones would just never get updated.
Developers have to deal with that, so unless you're ok with excluding a large
percentage of mobile devices on the street, dealing with slow JS performance
is simply a fact of life on mobile.

~~~
nostrademons
It's not JS that sucks; V8 performance is actually pretty competitive with
Dalvik, if not faster. It's two other things:

1\. There is a cost to download frameworks, and most webdevs ignore that cost.
A native app has the Android class libraries already on the device, oftentimes
already loaded into memory. A webapp that pulls in 600K of JS code + framework
has to read that all over the network, parse it, and JIT it before execution
can even begin.

2\. The native Android GUI frameworks render views on the GPU. Chromium will
only render views on the GPU if you have a CSS transform or opacity property
set. If you're not very careful, you can easily trigger an expensive layout +
repaint calculation on every frame.
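As a hypothetical illustration of point 2, the same visual move can either stay on the compositor or trigger layout, depending on which property you animate:

```javascript
// Moving an element with `transform` can be handled on the GPU/compositor;
// moving it with `left` dirties layout and forces a repaint per frame.
function slideWithTransform(el, x) {
  el.style.transform = 'translateX(' + x + 'px)';  // compositor-friendly
}

function slideWithLeft(el, x) {
  el.style.left = x + 'px';                        // layout + repaint
}
```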

It's possible to write webapps that perform just as well as native, with fancy
animations and fluid user experiences. I had some proof-of-concept prototypes
when I was still at Google, and some of that research went into the current
Google Search app experience on Lollipop. But it was a huge pain as far as the
developer experience went, and a lot gets lost in translation when
productionizing. It basically involved treating the browser as an OpenGL
canvas, where certain combinations of CSS + HTML would render text and boxes
into a texture, and then other combinations would let you move textures around
and fade them together in the window.

~~~
exelius
JS does indeed suck on mobile; since it's single-threaded by design, all those
blocking operations (you already mentioned some examples) add up. Hence both
issues you mentioned: those refreshes are only necessary because the
single-threaded nature of JS causes a few dozen event handlers in your
framework to fire on every change you make. It also doesn't help that most
mobile devices aren't exactly pushing the envelope on single-threaded
performance.

And let me be clear: JS being single-threaded isn't a bad thing in itself (the
thought of a multi-threaded language in the browser gives me nightmares). It
just makes the user experience of JS suck on mobile. And that's not going to
change any time soon given the glacial pace of patch/update deployment on
Android.

> It basically involved treating the browser as an OpenGL canvas, where
> certain combinations of CSS + HTML would render text and boxes into a
> texture and then other combinations would let you move textures around and
> fade them together in the window.

That approach -- while interesting from a technical standpoint -- kind of
defeats the purpose of a browser :) I mean, you're basically building another
browser inside the OpenGL window...

~~~
nostrademons
Native apps are _also_ effectively single-threaded: all of your code, by
default, runs on the UI thread, and you need to use AsyncTask or explicitly
spawn a Thread to do work in the background. If you need this functionality on
the web, there are Web Workers, so it's functionally the same story.
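A sketch of that story on the web side, assuming Worker support (the computation itself is just a stand-in for real background work):

```javascript
// The heavy computation stays an ordinary function...
function sumOfSquares(n) {
  var total = 0;
  for (var i = 1; i <= n; i++) total += i * i;
  return total;
}

// ...and gets shipped off the UI thread when Workers are available,
// much like wrapping work in an AsyncTask on Android. Inlining the
// worker via a Blob URL avoids needing a separate script file.
if (typeof Worker !== 'undefined' && typeof Blob !== 'undefined') {
  var src = 'onmessage = function (e) { postMessage((' +
            sumOfSquares.toString() + ')(e.data)); };';
  var worker = new Worker(URL.createObjectURL(
      new Blob([src], { type: 'text/javascript' })));
  worker.onmessage = function (e) { console.log('done:', e.data); };
  worker.postMessage(1000000);  // UI thread stays responsive meanwhile
}
```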

CSS transitions/animations, BTW, run on a separate thread in Chromium. This is
the primary difference between them and requestAnimationFrame, which always
runs on the main thread.

------
warfangle
Most of the performance issues people see on mobile (and desktop!) with
JavaScript aren't with JavaScript performance itself anymore (unless you're
doing some crazy stuff), but with the relatively expensive repaint/reflow
browser operations.
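A common vanilla-JS mitigation, sketched here with hypothetical elements, is to batch all DOM reads before all writes so the browser lays out once rather than once per element:

```javascript
// Interleaving offsetWidth reads with style writes forces a synchronous
// reflow on every iteration; separating the two phases lets the browser
// get away with a single layout pass.
function doubleWidths(els) {
  // Phase 1: all reads.
  var widths = els.map(function (el) { return el.offsetWidth; });
  // Phase 2: all writes.
  els.forEach(function (el, i) {
    el.style.width = (widths[i] * 2) + 'px';
  });
  return widths;
}
```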

~~~
Imagenuity
This point is often missed. It isn't JavaScript that's slow; it's the DOM,
with all its overhead and cruft, that's slow. Or it's the connection that's
slow. An important distinction.

------
paublyrne
I think a lot of developers wanted mobile web performance to catch up with
desktop and native mobile so badly that we believed it would happen, and much
faster than was ever really likely. We're used to things getting better
quickly when it comes to technology.

Flagship phones improve things a little each year, but to paraphrase William
Gibson: the future is already here, it's just not evenly distributed.

~~~
mwcampbell
> We're used to things getting better quickly when it comes to technology.

It doesn't help that some influential people in the software industry, like
Joel Spolsky, told us to bet on the hardware improving fast. See for example:

[http://www.joelonsoftware.com/items/2007/09/18.html](http://www.joelonsoftware.com/items/2007/09/18.html)

Particularly this part:

> a couple of companies, including Microsoft and Apple, noticed (just a little
> bit sooner than anyone else) that Moore’s Law meant that they shouldn’t
> think too hard about performance and memory usage… just build cool stuff,
> and wait for the hardware to catch up. Microsoft first shipped Excel for
> Windows when 80386s were too expensive to buy, but they were patient. Within
> a couple of years, the 80386SX came out, and anybody who could afford a
> $1500 clone could run Excel.

By contrast, he describes Lotus optimizing 1-2-3 so it could run in 640K of
memory. Who would want to be today's equivalent of Lotus? So just pump out
features and wait for the hardware to catch up, right?

------
kraig911
Blame the language and not the platform is what you're essentially saying. OK,
so the initial payload can be, what, 600KB per your argument. But at least
once I have delivered that, my actual interactions with the server can be kept
to a minimum instead of re-rendering more HTML and sending it across the wire
to the handset.

That being said, yes, you must develop on a crappy computer if you want to be
known as a dev who makes apps snappy. Is there a lot of fat one can cut from
these frameworks? Yes... but I think Atwood's initial argument is that Android
just sucks with JavaScript. The many cores but single thread per tab just make
the experience anemic with rich web applications.

~~~
exelius
> Blame the language and not the platform is what you're essentially saying.
> OK, so the initial payload can be, what, 600KB per your argument. But at
> least once I have delivered that, my actual interactions with the server
> can be kept to a minimum instead of re-rendering more HTML and sending it
> across the wire to the handset.

Unless your mobile browser aggressively purges its cache due to the limited
memory on most mobile devices. Then you're going to have to reload the payload
every time you bring the page to the foreground; or at a minimum reload the
framework (which usually takes a noticeable amount of time as the framework
creates a bunch of temporary data structures).

------
joesmo
I think this post on mobile performance is relevant:
[http://sealedabstract.com/rants/why-mobile-web-apps-are-
slow...](http://sealedabstract.com/rants/why-mobile-web-apps-are-slow/)

------
kra34
Or you could just write regular JavaScript. Browsers seem to be pretty good at
handling JavaScript, CSS, and HTML. The default size of vanilla JS is actually
0 KB, and that's not even gzipped.

~~~
albemuth
And after reaching a certain level of complexity, what you have is an ad-hoc
framework of about the same size that is a slow, bug-ridden implementation of
half of one of the frameworks in the OP.

~~~
soapdog
That really depends on the work. There are a lot of APIs in the browser these
days that people are not using. For example, there are people adding jQuery to
a project just to have the selectors, when querySelector and querySelectorAll
have been available for some time.
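The usual shorthand for that looks something like this (the $/$$ names are a common convention, not part of any standard):

```javascript
// Two tiny wrappers that cover most jQuery selector usage with the
// native APIs. The context argument is optional and defaults to the
// whole document.
var $ = function (sel, ctx) {
  return (ctx || document).querySelector(sel);
};

var $$ = function (sel, ctx) {
  // querySelectorAll returns a NodeList; convert it to a real Array
  // so map/filter/forEach work everywhere.
  return Array.prototype.slice.call((ctx || document).querySelectorAll(sel));
};
```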

With good care, you can create libraries suited to your problem and keep code
size and complexity down. Don't forget that when adding frameworks you're also
adding their complexity to your project. Imagine having to debug the internals
of Angular... I shiver just to think of it.

The thing is that using frameworks is cool up to the moment you hit a
framework quirk or bug and need to dive into the framework as well; then
you're no longer working in your problem's domain but in generic framework
land, and that can be much trickier than building your own, especially if
yours is a solution made for a single problem.

But yes, I agree with you that if complexity grows enough, you start requiring
a special type of highly skilled developer to keep this bespoke code running
well.

------
aabajian
I know GWT isn't the most popular framework (it isn't really a framework, is
it?), but I've been a fan of it for a while. It solves three critical problems
that have plagued web development:

1\. Cross-browser compatibility.

2\. Code performance.

3\. Code organization (there are many JavaScript frameworks whose selling
point has been making JavaScript more organized).

But in regards to this article, one often overlooked feature of GWT is that
dead-code is automatically removed from the compiled JavaScript:

[https://www.quora.com/How-fast-is-GWT-compared-to-JavaScript...](https://www.quora.com/How-fast-is-GWT-compared-to-JavaScript-and-other-languages-that-compile-to-JavaScript)

Thus you don't have to include the entire Angular/React/Ember/jQuery/whatever
library just to get a few features for your site.

~~~
aikah
Sure, but GWT == writing Java, downloading a heavy and complex SDK, and
writing imperative UIs. Sure, it comes with widgets, but you can find the same
widgets in JS.

Compare with React, AngularJS and co: way more popular, declarative UIs, and
if you need static typing you can use TypeScript, which needs no heavy
runtime.

GWT might make sense for some LOB app for desktop, but not for a mobile
webpage.

~~~
DCoder
Google Closure Compiler [1] can also perform dead code removal, but it
requires extra work annotating all public APIs, and naturally it can't cope
with dynamic/indirect calls.

[1]:
[https://developers.google.com/closure/compiler/](https://developers.google.com/closure/compiler/)

~~~
warfangle
So does UglifyJS, at least for obviously dead code (e.g., if (false) {}).
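What "obviously dead" looks like in practice (the DEBUG flag and log function are hypothetical; with its compress options enabled, a minifier like UglifyJS can drop the unreachable branch at build time):

```javascript
// DEBUG is a constant here, so the whole `if` body is statically
// unreachable and a minifier can strip it from the shipped bundle.
var DEBUG = false;

function log(msg) {
  if (DEBUG) {
    console.log('[debug]', msg);  // never reached; eliminable dead code
  }
  return msg;
}
```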

------
luketheobscure
I think Ember is ahead of the curve on this one. Ember 2.0 introduced no new
features and instead focused only on stripping out deprecations, dead code,
and platform-specific workarounds for browsers they no longer support.

(Disclaimer: I'm heavily invested in Ember.)

~~~
bronson
The article showed Ember 1.9 coming in dead last because of code size. 2.0 is
supposed to be a little better but do you really think it's going to be ahead
of the curve? It seems pretty unlikely... I hope you'll say more.

~~~
elithrar
> The article showed Ember 1.9 coming in dead last because of code size. 2.0
> is supposed to be a little better but do you really think it's going to be
> ahead of the curve? It seems pretty unlikely... I hope you'll say more.

To be fair to Ember here - Ember includes Ember itself, Ember Data and jQuery
in that payload. React is just React - you'll probably want to include Redux
(2KB) or Alt (33KB) and then potentially a whatwg-fetch polyfill or
superagent, normalize and other libs to (possibly) fill out some jQuery
features (Zepto @ 9.1KB), if not jQuery itself.

------
hack_mmmm
Web and mobile have many frameworks (JS/native) that can be used to solve
similar problems. I am wondering how we can write code that can be re-written
quickly on other frameworks, as a metric for performance tests. I definitely
think that developers should test their code/app on slower/older phones. A
performance mindset creates better, more scalable tech; it can save the
company a lot of money in the long run, and at the same time can be at the
core of the user experience.

------
frik
Even the part of Facebook's mobile web that was developed in React, the people
search, feels rather slow on all my mobile devices (incl. high-end iOS and
Android devices).

~~~
aikah
IMHO it's still faster than most webapps on mobile. I use an old Alcatel with
Android 2.3 to benchmark performance on low-end handsets, and Facebook's
performance is OK compared to Gmail or even Google search, which takes 3+
seconds to load just to display a search box. I think Facebook has managed to
do an acceptable job if you compare their app to other web apps.

The truth is, the mobile web space is not as good as developers expected it to
be in 2015, unless users visit websites with expensive devices, which most of
them do not. Making a website responsive, while an improvement, doesn't
automatically make it mobile friendly.

------
daok
I am curious about how Discourse works. I see several calls to
[https://github.com/discourse/discourse/commit/1061a9ed06c500...](https://github.com/discourse/discourse/commit/1061a9ed06c50083ee6d05c7bac72c86f54433a9)
when scrolling the comments of this article. Anyone have an idea why they are
repeating calls to GitHub?

------
towndrunk
I have been using T3 ([http://t3js.org/](http://t3js.org/)); very small and
very easy.

------
vskarine
Would be interesting to see Meteor stats there too.

~~~
rch
How about Sencha Touch? I'm guessing it's a beast.

~~~
aikah
Sencha Touch is pretty fast. When I was doing Cordova development it was still
the fastest JS framework when it comes to UI responsiveness. Does it make
sense for developing mobile websites? I don't know. I think sticky headers,
footers, transitions and stuff like that make very little sense for a website.
What kind of mobile website would you develop with a mobile UI framework?

~~~
rch
Maybe logging in to the admin side of one of my applications to check
analytics, approve new memberships, and the like? I shouldn't need to package
and distribute that as an app, but some of the mobile-friendly widgets are
handy.

------
CaptSpify
Honestly, as an android-user, I just run my browser with JS disabled by
default. My phone is much faster, and I don't really want 90% of the
"features" the JS is trying to give me. Too many sites do it poorly, and it
just ruins the experience.

------
gooserock
JS frameworks don't have to be huge performance killers. I use Chaplin.js, and
it's crazy lightweight and awesome even on my old iPhone 4. The user
interactions are basically the speed of a native app, or even faster.

------
solidpy
What's the status of asm.js on mobile? It would be nice if there were a front-
end framework in some other language that could compile to asm.js.

------
ccnixon
very interesting

