

Web App Speed - tbassetto
http://ignorethecode.net/blog/2015/05/31/web_app_speed/

======
Detrus
Flash sites, apps and games also earned a bad reputation for eating up CPU,
loading too long, doing too much and running poorly. And it was mostly due to
the culture of the developers.

In the Flash vs HTML5 iPhone wars, people argued that if Flash was hated
because of the developers who abused it, what's going to stop the same thing
happening to HTML5/JS/canvas?

It's happening. Usually not on some interactive visualization or game, but on
simple articles. And the response is things like Facebook Instant, which
introduce a babysitter against ambitious developers and dysfunctional
organizations.

But it's really something that should be built into the web. Sites that eat up
a lot of bandwidth and CPU can be marked. Sites that legitimately require more
resources, or are experimental in some way, can be marked by the developer as
such, warning users. An article would rarely be one of those.

If no centralized measures are taken against poor UX, only relying on culture
to maintain UX across the board, the web stack will meet the same fate as
Flash. Competitors like Facebook Instant will route around it.

~~~
olalonde
On a related note, Google penalises websites with longer load times.
[http://googlewebmastercentral.blogspot.nl/2010/04/using-site...](http://googlewebmastercentral.blogspot.nl/2010/04/using-site-speed-in-web-search-ranking.html)

~~~
userbinator
I understand the reasoning but I don't agree with their implementation,
because it essentially favours large sites run by those who can afford the
faster infrastructure, while penalising the small ones that may actually have
more relevant and detailed content but not very fast servers. Sites which load
much faster get a ranking advantage even if they offer only superficial lower-
quality content.

Maybe if they didn't look at server response time, and only counted things
like JS execution time, it would be a fairer ranking to those looking for
substantive content. I'm quite willing to wait a lot longer, if it means I
will spend more time reading a lengthy page once it loads, than to find a
fast-loading page with little content.

That article was written 5 years ago, and while speed might not have been a
huge factor back then, they could've changed it since --- certainly there is
no evidence to suggest otherwise, and my experience with how Google's results
have changed over time agrees.

~~~
anilgulecha
Interesting take.

I'd wager that 95-99% of the Alexa top 1000 sites could be delivered under
30kb -- just content + CSS (assuming ajax/video/fonts are asynchronous in
Google's algorithm and not counted towards load time).

In which case it's easy even for the cheapest AWS server to host and serve
them instantly. I think the bigger sites have a distinct disadvantage -- they
have so many tracking and ad resources that it kills their load time.

~~~
jakobegger
Yup. For my company website I made sure that most pages require just a single
request to render: CSS is inlined and (small) scripts are loaded at the bottom
of the page. This means that even when hosting it on a crappy host, without a
CDN, you get stellar page load performance.
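A minimal sketch of that single-request pattern (the content and styles here are placeholders, not from jakobegger's actual site):

```html
<!-- Everything needed for first paint ships in one HTML response. -->
<!DOCTYPE html>
<html>
<head>
  <title>Example</title>
  <style>
    /* critical CSS inlined instead of fetched as a separate stylesheet */
    body { font: 16px/1.5 sans-serif; margin: 2em auto; max-width: 40em; }
  </style>
</head>
<body>
  <h1>Content renders on the first round trip</h1>
  <p>No blocking external CSS or JS in the head.</p>
  <!-- small scripts at the bottom, so they don't block rendering -->
  <script>/* inline enhancement code here */</script>
</body>
</html>
```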

------
gregweng
The root of all evil, as you commented: "People don’t want to learn
JavaScript".

However, I think it's because the DOM/BOM APIs suck. JavaScript, namely the
language itself, doesn't even specify any event-driven or asynchronous
behaviour. Not to mention that NodeList, horribly, isn't an array, or the
need to write the long and annoying 'getElementsByClassName' rather than a
simple $( ).

(Yes, I know things like 'querySelectorAll' are much better, but it's too
late to undo most of the prejudice.)
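The NodeList complaint is that it's only array-like; ES6's `Array.from` papers over it. A sketch, using a plain array-like object to stand in for what `document.querySelectorAll` would return:

```javascript
// A NodeList has numeric indices and a length, but no Array methods
// like map or filter. Array.from converts any array-like into a real Array.
const fakeNodeList = { 0: 'foo', 1: 'bar', length: 2 }; // stand-in for a NodeList
const nodes = Array.from(fakeNodeList);

console.log(nodes.map(s => s.toUpperCase())); // [ 'FOO', 'BAR' ]
```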

~~~
omouse
Someone at Google or Mozilla should already have been writing an API that's
similar to jQuery's and just "compiles down" to the DOM methods, so that
loading jQuery isn't necessary. I'm surprised no one's pushed for using
jQuery's API as an additional DOM API. On old browsers you load jQuery, on
new browsers you don't, and your code doesn't change. It would cut down on a
few seconds of load time for sure, and it'd be better than using a CDN.
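The core of such a shim is tiny -- a sketch over the native selector API (the fallback-to-jQuery part is omitted; this is hypothetical, not a proposed standard):

```javascript
// Tiny jQuery-flavoured $() built on native DOM methods.
// `root` defaults to the global document when running in a browser.
function $(selector, root) {
  const scope = root || document;
  // querySelectorAll returns a NodeList; convert it to a real Array
  // so callers get forEach/map/filter like jQuery collections.
  return Array.from(scope.querySelectorAll(selector));
}

// usage (in a browser): $('.comment').forEach(el => el.classList.add('read'));
```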

------
supster
People are now using the canvas to achieve fast speeds on the mobile web. See
Flipboard's post on this [1]. The big issue people brought up was that using
canvas instead of the DOM meant losing accessibility. Btw, Flipboard released
an interesting framework called React-Canvas [2] based on their efforts.

1\. [http://engineering.flipboard.com/2015/02/mobile-web/](http://engineering.flipboard.com/2015/02/mobile-web/)

2\. [https://github.com/Flipboard/react-canvas](https://github.com/Flipboard/react-canvas)

~~~
thekingshorses
Flipboard assumed the DOM is slow. I don't even think they created a
prototype to see how slow the DOM actually is.

I implemented this in a few hours. It works on iOS, Android 4+ and WP.

\- Demo for mobile: [http://premii.com/play/flipboard-style-news/](http://premii.com/play/flipboard-style-news/)

\- I am using it in real app here.
[http://reddit.premii.com/#/r/news/](http://reddit.premii.com/#/r/news/)

~~~
squeaky-clean
Your demo works, but I get an average of 10fps on my LG G3 and 20fps on my
iPad, with the occasional freeze. The Flipboard demos are incredibly smooth
with no lag or hanging at all.

~~~
thekingshorses
The original iPad was released 5 years ago. I have an iPad 2 and an iPad Air
2, and both work fine.

I don't have an LG. I have tested on Asus, Samsung S3, and Nexus 5. Try it in
the Chrome browser and see how it works.

Some images (Washington Post, LA Times, etc.) are bigger than 20 MB each.
Every browser freezes when you render those images. My demo specifically
includes those to see how badly it performs on different devices.

One of the benefits of being a big company is that you can control
everything. Flipboard scales those images down to 300kb before sending them
to the browser. I use the original images.

~~~
squeaky-clean
iPad Mini 3 if you really want me to be specific. I was trying it in the
latest version of Chrome on my LG, that's where I get 10fps. Using the stock
browser it freezes every other slide (and won't progress past slide 4).

> Some images (Washington post, latimes, etc) are bigger than 20 MB each

Why not use resized ones for your test? Sort of invalidates it if you can't
tell if it's being janky because of the DOM or because of image size.

------
archagon
1\. Regarding native complexity: surely, using CoreGraphics is just as easy as
canvas?

2\. That game looks like it's running at about 15fps. I'm not sure that
constitutes very good proof of the argument. The best apps and games run at
60.

3\. I'm not sure who's using web MVC frameworks en masse and to what end, but
I would NOT want to write a data-driven web app like Discourse from scratch.

~~~
w0utert
Agree, similar arguments came to my mind when I read the article. Two other
relevant points:

4\. Even if a game like the one in the article performs reasonably well using
HTML5 Canvas + JavaScript, that still doesn't mean it's 'efficient'. A native
application could be much smoother still, use less battery, etc.

5\. The article is titled 'Web App speed', but it's only (somewhat) relevant
in the context of sprite-based games. With WebGL it's possible to do quite
complex 3D-based graphics in the browser as well, but anything non-game
related is a whole different matter. Surely 'boring' apps with lots of
buttons, scroll bars, other widgets etc. will not be easier and more efficient
running inside a web browser, compared to using native controls.

------
jscottmiller
Absolutely agree with respect to ads. While the steady-state performance of
most banner ads is fine, loading new ads often forces large JavaScript files
to be fetched and parsed, as well as needless reflows. On our game building
site [1], we try to refresh ads only during natural breaks in gameplay (level
changes). While this doesn't improve the loading time, it does help to shift
the disruption to a time when it is less likely to be noticed.

1\. [https://www.1dash1.com](https://www.1dash1.com)
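That deferral pattern can be sketched like this (the function names are hypothetical, not from 1dash1's actual code; the refresh callback stands in for whatever the ad network exposes):

```javascript
// Queue ad refreshes while gameplay is running, and flush only at
// natural breaks, so fetch/parse/reflow cost lands between levels.
let adRefreshPending = false;

// Called by a timer or the ad network when a new ad is wanted mid-game.
function requestAdRefresh() {
  adRefreshPending = true; // don't refresh yet; remember the request
}

// Called at a level change. refreshFn is the ad network's refresh call.
function onLevelChange(refreshFn) {
  if (adRefreshPending) {
    adRefreshPending = false;
    refreshFn();
  }
}
```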

~~~
logotype
Well, it's not surprising web apps are slow. As an example, 1dash1.com
mentioned above: 57,480 lines of JS across 8 separate domains. And the
result? The site loads slowly (at least on my 500MBit/s fiber connection) and
the browser is executing a lot of stuff, most of which is not really
efficient or time-sensitive (or required). This is how most sites do it;
there have to be better ways.

------
bla2
Sunspider does next to no processing, it's really not a good benchmark for
judging computation speed of a device. Arguably, it's not a good benchmark,
period. Kraken or Octane are more interesting benchmarks for compute-heavy
workloads such as games.

------
z3t4
Reading this article was a fresh breeze.

The canvas is very easy to work with. You can forget about jQuery and
everything else concerning the DOM and just write pure JavaScript. I've even
started working on a canvas-based text editor.
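The core of such an editor is drawing lines of text yourself instead of letting the DOM lay them out. A minimal sketch (the function is hypothetical; `ctx` would come from `canvas.getContext('2d')` in a browser):

```javascript
// Draw an array of text lines onto a canvas 2D context.
// The editor, not the browser, owns layout: positions, line height, font.
function drawLines(ctx, lines, lineHeight) {
  ctx.font = '14px monospace';
  lines.forEach(function (line, i) {
    // fillText places each line's baseline at (i + 1) * lineHeight
    ctx.fillText(line, 4, (i + 1) * lineHeight);
  });
}
```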

~~~
pavlov
_I've even started working on a canvas based text editor._

Careful! Mozilla tried that as far back as early 2009 (the project was called
Bespin, you can google for "mozilla bespin canvas" to find out more).

The canvas-based approach was eventually abandoned and the project merged into
Ace, a more traditional DOM-based editor.

~~~
z3t4
I did some research before I started and read about Bespin. I don't think
they merged into Ace because of the canvas. It was probably because another
team at the same company also worked on an editor which had the same goals
and aimed for the same market.

------
justin_d
There are issues however with the way that some browsers render canvas,
particularly stock browsers. I've had wildly different fps on different
browsers.

------
amelius
But how would one implement e.g. progressive rendering in the background?
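One common approach (a sketch of the general technique, not something from the article) is to split the drawing work into chunks and yield between them; in a browser the scheduler would be `requestAnimationFrame`:

```javascript
// Run a queue of drawing tasks a budgeted slice at a time, yielding
// back to the scheduler between slices so the page stays responsive.
function renderProgressively(tasks, schedule, budgetMs) {
  function step() {
    const start = Date.now();
    while (tasks.length > 0 && Date.now() - start < budgetMs) {
      tasks.shift()(); // e.g. draw one tile of the canvas
    }
    if (tasks.length > 0) schedule(step); // more work: resume next frame
  }
  step();
}

// In a browser: renderProgressively(tiles, requestAnimationFrame, 8);
```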

