
The famo.us controversy - andreypopp
http://blog.siliconpublishing.com/2013/12/the-famo-us-controversy/
======
bhouston
This article is sort of rambly but I agree with it.

WebGL is the best way to talk directly to the GPU not through CSS3 transforms.
Any experienced developer would be pretty insane to try to supplant a
20-year-old community effort (OpenGL) for efficient GPU communication,
especially one that has deeply involved all the GPU vendors.

NOTE: My company has bet big on WebGL and Three.JS, so I am biased in this
regard: [http://clara.io](http://clara.io)

~~~
lhnz
What I don't understand is why we don't just skip the DOM. And by that I mean,
currently people are using virtual DOMs like that provided by React.js however
this still has to access the DOM when it wants to render, but couldn't we
rewrite a DOM/CSS renderer with WebGL (or canvas)?

Or perhaps just skip that step and come up with something nicer for specific
use cases.

Why aren't we leaving all of this float/clear stuff behind us?

~~~
bhouston
> couldn't we rewrite a DOM/CSS renderer with WebGL (or canvas)?

Of course we can. :) But writing full UIs is a ton of work.

Why are there no OpenGL UI libraries in wide use on desktops? The same
argument should apply, to a degree. Or is it just that HTML5 is way less
efficient than GUI+, Swing, etc.?

~~~
potatolicious
There are "OpenGL UI libraries" :P It's called OSX.

The main reason why we haven't seen more is because OpenGL up until now has
been largely in the sphere of gaming, where standardized UIs aren't a
priority.

That being said, even in gaming there are a large number of libraries
available that help you implement UIs.

~~~
bhouston
Ah, cool. I'm not an OSX developer.

The OpenGL UI libraries I've seen on Windows are pretty horrible/limited.

Those in games are usually based on Scaleform and are also relatively limited
in terms of built-in functionality, although it allows wide freedom of design:

[http://gameware.autodesk.com/scaleform](http://gameware.autodesk.com/scaleform)

------
harwoodleon
I am perplexed. I for one am running a project that would have absolutely
embraced famo.us as part of the UX. There have been lots of smoke and mirrors,
and a few emails dotted in to ensure we didn't think the development team had
died.

I smell fear here, whether it is investor fear or the team's fear of losing
IP and control when (if) they open source the project.

Whatever the issues they are harming the credibility of the platform. I was an
evangelist to begin with - a library built in JS that can silence the compiled
crowd is a sure feat.

Now I am more of a skeptic. If this is how long it takes to release the
codebase, imagine what updates will be like. I know Steve has been trying to
appease us in the newsletter, but all it sounds like to me now is a flash of
arrogance over owning a toy that no one else can play with, in an "I know
you need this, but hang on" kind of way. I know it is not meant that way;
that would be silly.

WebGL is maturing, and if they don't push what they have then they will lose
the traction they have for sure.

Hopefully they will just release what they have soon and be done with it.

~~~
X4
Assuming you're right and the fear is about money, why don't they just let
every developer test it early for a fee? Sum that up and you end up with a
great way to fuel your startup without causing premature investor friction.

Here's a number: $5 × (100,000 × ¼) = $125,000 in free capital.

What that means for their customers (early adopters) is this: $5 + EULA +
(a freemium model of sorts in the future, for loyalty) = WIN

TBH, I thought the same. The decision to willingly lock out so many potential
developers is very bad, or just ignorant. Personally I value their startup
highly as a replacement for <Enter-Mobile-Framework-Here>, but time is not
everything; sometimes you lose a lot of customers because of the bad first
impression you made in the beginning.
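
For what it's worth, the arithmetic in that estimate checks out. A quick
sketch, using the commenter's own assumed numbers (100,000 interested
developers, a quarter of whom pay a $5 early-access fee):

```javascript
// All inputs below are the commenter's assumptions, not real figures.
const interestedDevs = 100000; // hypothetical pool of developers
const payingFraction = 0.25;   // assumed conversion rate (¼)
const fee = 5;                 // assumed early-access fee in dollars

// Total "free capital" raised under those assumptions.
const capital = fee * interestedDevs * payingFraction;
console.log(capital); // 125000
```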

~~~
karellan126
Good thought, but the numbers they're playing with (and hence the pressure
and expectations) are much higher.

They already have $5.1m invested from big VCs:
[http://www.crunchbase.com/company/famo-us](http://www.crunchbase.com/company/famo-us)

VCs were betting on the long shot that famo.us delivers and actually works as
a layer/graphics engine for non-native apps (a $100m+ play for sure). We all
saw those bets and the founder's certainty as a guarantee that this
technology would actually live up to the hype, whereas I think we should look
at this one more like the VCs did: a 1-in-20 shot that changes everything if
it happens, versus expecting it to happen and factoring it into our plans.

Hate to say all this because I'm building a browser limit pushing project
myself that could have used famo.us...

------
United857
I first saw them demo this at the October 2012 HTML5 Developer's Conference.

We attended their last two SF "meetups" in December 2013 where they promised
they would release the code right then (or the day after).

Still no release, and no further mailings. This is effectively vaporware in
my view, and they can't dangle this carrot in front of developers
indefinitely before we get fed up.

------
AndrewDucker
I wonder why they can do things faster through JavaScript than the browser
makers can manage through C++.

Are they cutting corners because they can ignore the DOM and associated
events? Or is there something else going on?

~~~
dmethvin
Generally, this happens through simplifying assumptions. For example, let's
say you often need to check whether an element matches a specific CSS
selector. That's easy with `elem.matchesSelector(selector)`, but the browser
has to assume the selector could be something complex like `#content li > ul
input[type="number"]` just as easily as it could be `#phone`. Plus it has to
parse the selector every time (or maintain a complex cache). If you as the
developer know the selector is always an ID, you can just check for `elem.id
=== "phone"` in JavaScript, and that will whip the pants off the C++ code
that has to deal with an arbitrary selector.
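
That fast-path idea can be sketched in a few lines (illustrative only, not
any real library's code): recognize a plain `#id` selector once, up front,
and compile it down to a cheap property comparison, falling back to the
browser's general selector engine for everything else.

```javascript
// Compile a selector into a matcher function, with a fast path for
// plain #id selectors.
function makeMatcher(selector) {
  const idOnly = /^#([\w-]+)$/.exec(selector);
  if (idOnly) {
    const id = idOnly[1];
    // Fast path: no parsing per call, no tree walking, just a comparison.
    return (elem) => elem.id === id;
  }
  // General case: defer to the browser's full selector engine.
  return (elem) => elem.matches(selector);
}

const isPhone = makeMatcher("#phone");
isPhone({ id: "phone" }); // true
isPhone({ id: "other" }); // false
```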

I'd like to see what assumptions famo.us has made and whether they could have
just been addressed at the CSS level. An example of that would be things like
overlapping elements with rounded corners and drop shadows, which are very
expensive for the rendering engine. So perhaps it was a lack of understanding
of CSS jank factors that led them to decide a total circumvention was in
order? We'd really need to see a side-by-side of the two and understand where
the conventional solution was bogging down.

------
Sephiroth87
The slowness problem when making a hybrid app (at least on iOS) is caused
entirely by the JavaScript engine, since the webview doesn't use Safari's
Nitro, so I don't see how rendering the page that way would make it faster...

Unless they are purely talking about websites, in which case I don't see why
they would even bring up native/hybrid apps...

~~~
dmethvin
> entirely caused by the javascript engine

I've profiled a lot of web pages/apps and this hasn't usually been true in my
experience. Occasionally JavaScript is the culprit, but that is more often
due to really bad O(n²) algorithms and script that forces layout. In
contrast, the rendering engine combined with complex CSS is often the cause
of "janky" page behavior. Do you have some profiler runs that show a
JavaScript bottleneck on a webview that doesn't exist when using Nitro or
some other browser?
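
To illustrate the "really bad O(n²) algorithms" pattern with a generic
example (not taken from any profiled app): deduplicating an array with
`indexOf` rescans the output for every input item, while a `Set` does the
same job in roughly linear time.

```javascript
// Quadratic: indexOf scans the output array for every input element,
// so n items cost about n²/2 comparisons in the worst case.
function dedupeQuadratic(items) {
  const out = [];
  for (const item of items) {
    if (out.indexOf(item) === -1) out.push(item);
  }
  return out;
}

// Linear: a Set gives constant-time (on average) membership checks.
function dedupeLinear(items) {
  return [...new Set(items)];
}

dedupeQuadratic([1, 2, 2, 3, 1]); // [1, 2, 3]
dedupeLinear([1, 2, 2, 3, 1]);    // [1, 2, 3]
```

Both return the same result; only the cost curve differs as n grows.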

~~~
Sephiroth87
[http://blog.dave.io/2012/07/chrome-for-ios-is-crap-but-that-might-have-been-the-plan-all-along/](http://blog.dave.io/2012/07/chrome-for-ios-is-crap-but-that-might-have-been-the-plan-all-along/)

Ok, it may not be ENTIRELY the JS engine's fault, but it's definitely the
biggest part of the problem...

------
cnp
Ugh, this resonates with me completely. Every time I see these guys I think:
OK, I can use GreenSock's TweenMax and get every single bit of performance
these dudes are getting in their crazy-hyped render engine. It's just
totally, completely silly IMO, but that said, I'm also willing to be proved
wrong. (It also makes me think of Mr.doob's periodic table demo that he put
out after the very first famo.us demo, using the Three.js CSS3 renderer that
he wrote. How many lines of code was it, 10, 25? Both clocked at 60fps.)

~~~
electrichead
If you've tried to use GreenSock on a mobile device, you'd see that it
stutters quite a bit depending on the number and size of elements. I've tried
all the famo.us demos on basic mobile phones and for the most part they all
perform well in terms of fps. That is their main advantage, I think.

------
pron
I am not familiar with clientside JavaScript. Could anyone explain what
famo.us is doing and whether it has merit? Their demo looked less impressive
than WebGL ones.

~~~
bhouston
Something like direct CSS3 transforms combined with their own UI library?
Personally, I think they are going to get steamrolled by WebGL and also by
native apps, or, more accurately, they have probably already been steamrolled
by these two technologies.
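
If that guess is right, the core trick would be positioning everything with
compositor-friendly transforms instead of layout properties. A minimal
sketch of that idea (speculative, since famo.us's engine is unreleased; the
helper and values here are hypothetical):

```javascript
// Build a translate3d() transform string. Applying it moves an element
// on the GPU compositor without triggering layout, unlike setting
// top/left, which forces the layout engine to run.
function translate3d(x, y, z) {
  return `translate3d(${x}px, ${y}px, ${z}px)`;
}

// In a browser you would then do something like:
//   elem.style.transform = translate3d(120, 40, 0);
translate3d(120, 40, 0); // "translate3d(120px, 40px, 0px)"
```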

~~~
harwoodleon
You might be correct. It is a h4ck for sure, a really good one, and one that
may have found its way into standards.

------
puppetmaster3
DOM animation is done natively, by the GPU.

With WebGL... you can't even do fonts.

And the DOM is more accessible to creative designers.

------
err4nt
TL;DR: the solution to HTML5 performance is to not use HTML5.

They use JavaScript instead and get great results, but that's hardly fixing
the problems of HTML5 like the article made it out to be. HTML remains slow;
they are just putting forward the idea that JavaScript apps may be as fast as
native apps.

~~~
arethuza
"HTML remains slow"

Approaches to updating the DOM seem to be getting a lot smarter: tools like
ractive.js are smart enough to make the smallest (fastest) changes to the DOM
rather than the traditional deleting and recreating of large chunks of the
tree.

Edit: React also has a virtual DOM for high-performance updates.
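
The minimal-update idea can be sketched in a few lines. This is a toy diff
over flat attribute maps, not React's or ractive.js's actual algorithm:

```javascript
// Compare two "virtual" attribute maps and return only what changed,
// so the real DOM would be touched as little as possible.
function diffAttrs(oldAttrs, newAttrs) {
  const patch = {};
  const keys = new Set([...Object.keys(oldAttrs), ...Object.keys(newAttrs)]);
  for (const key of keys) {
    if (oldAttrs[key] !== newAttrs[key]) {
      patch[key] = newAttrs[key]; // undefined here means "remove attribute"
    }
  }
  return patch;
}

diffAttrs({ class: "btn", title: "Save" },
          { class: "btn", title: "Saved" });
// → { title: "Saved" }
```

A renderer would then apply only the keys in the patch, leaving the unchanged
attributes (and the rest of the tree) alone.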

