
A new approach to web performance - kostarelo
https://www.ampproject.org/how-it-works/
======
ducuboy
What annoys me is how Google is selling this project. At least Facebook is
honest about their Instant Articles.

This isn't an "open standard" guys, and it's not about performance! It's about
the single piece of js that's allowed on AMP pages and the amp-tags such as
<amp-ad> and <amp-pixel> that only Google controls. Performance is just the
story they are selling in exchange for absolute control of the Web. After all,
any publisher can easily clean up their website, as they ought to do anyway to
make it AMP compliant, but using valid HTML tags instead of the invalid
amp-tags. Google could have easily promoted performance by applying penalties
to heavy pages via PageRank. But it's not about performance. It's all about
the money.

~~~
ducuboy
AMP is just broken HTML, hardly a standard.

[https://validator.w3.org/nu/?doc=http%3A%2F%2Fwww.theverge.c...](https://validator.w3.org/nu/?doc=http%3A%2F%2Fwww.theverge.com%2Fplatform%2Famp%2F2015%2F10%2F7%2F9467149%2Fgoogle-
accelerated-mobile-pages-caching-preview)

~~~
ducuboy
What about RSS? That offered good performance, was widely adopted and was a
real _standard_ working for both publishers and consumers.

Remember who actually killed it? Yeah, that was also Google, by discontinuing
their Reader - a very good tool with lots of users but no revenue. That's how
much open standards matter. Too bad @aaronsw is not around anymore, he would
have said something about AMP.

~~~
oneeyedpigeon
I was as annoyed as the next person when Google killed Reader but, to be
honest, RSS was killed off for typical users way before that when browser
manufacturers dropped/hid support for it. I still use RSS every day, as do
many of our customers.

~~~
ducuboy
Indeed, also Google Chrome.

~~~
magicalist
Chrome never had support. RSS support was always (and still is) from
extensions.

------
radicalbyte
Maybe it's a sign of me getting old, but I have never understood the thought
process behind building systems which serve static content - newspapers,
blogs - but require active processing for every request from every user.

These are systems where the read/write ratios are often 1000:1.

Ten years ago the symptom was systems which pulled the same content out of a
database for every single request. We treated the symptoms by
introducing caching servers and horizontal scaling.

In modern times the symptom shows up as heavy use of client-side
rendering. Because, you know, this client-side rendering is "free" for the
publisher.

I don't understand how any "Engineer" can see 1000:1 R:W ratios and suggest
doing all of the work on the R side of the ratio.

~~~
alkonaut
It's easy to forget that HTML isn't the end product; the end result is lit
pixels on some screen (usually). If we send JSON, then that needs to be
converted to HTML, and that HTML needs to be laid out and rendered. If we send
HTML, we just skip the first step. Displaying a web page isn't "free" for the
client just because it's HTML.

The question is this: what kind of overhead is a template rendering task (in
js) for a modern browser, compared to the task of laying out and rendering a
finished HTML page? It will depend of course on the complexity of the data and
the resulting page, as well as the browser, templating engine etc. I have no
clue whether it's 1%, 10% or 50%.

~~~
odiroot
Crazy thought: detect viewport size on first load and serve pre-rendered PNGs
to skip the rendering on client part.

Someone probably already developed that.

~~~
masklinn
That is essentially what Opera Mini does/did: requests went through Opera's
render farm, which would fetch pages and render them (with adaptations) into a
packed binary format and send that back to the device. There was also
rudimentary JS support in the render farm; the device essentially got an
image with an interaction map, and interactions would be sent to the server
for processing.

Not very configurable though: you can't really e.g. customise font size,
observe the page, or pass it through a screen reader.

~~~
uptown
Not to mention the enormous privacy trade-off of having Opera monitor and
screenshot every webpage you visit.

------
betageek
I'm tremendously not keen on this, but am willing to be persuaded otherwise.

I don't want to have to build my sites using Google's AMP framework and their
custom elements just to get good SEO.

I don't want to cache my content in Google's network so they can 1) track
everything my users are doing without any indication on the front end and 2)
start serving ads from the cache and make every ad blocker stop working
overnight.

This looks like Trojans bearing gifts in my (maybe paranoid) mind.

~~~
noir_lord
No, this is exactly my concern with this as well, particularly the ad-blocker
thing (most sites already use google analytics).

------
joosters
I hate it when pages suddenly change their scroll position, it makes reading
the site painful. So any project that aims to fix this gets my approval!

One more pet peeve of mine: websites with giant non-scrolling banners at the
top of the page. Thanks for wasting my screen space just so you can show your
stupid website logo. As well as the pixel wastage, half the sites can't even
make their banner stay still, it wobbles slightly as you scroll and it tries
to reposition itself. Or, it breaks page scrolling - page up/page down will
skip content hidden under the banner.

And while I'm moaning about the web of today: Also annoying are the sites that
hide/shrink the banner as you scroll down the page, but make it pop back up as
soon as you scroll upwards. If you accidentally move slightly too far down a
page, you shift back up a row or two to catch a missing line of text, only to
find a banner popping up and obscuring the bit that you specifically tried to
view. Argh!

~~~
yoodenvranx
> I hate it when pages suddenly change their scroll position

This problem gets worse and worse :(

I also hate the trend to forcefully remap my old trusty middle mouse click
from "open link in new tab" to "open link in current tab" via javascript.

------
gorhill
I personally choose to block ubiquitous 3rd parties by default[1], to fight
both bloat AND privacy exposure.

This AMP project reduces the bloat but at the cost of increased privacy
exposure. If I globally block `ampproject.org` when visiting AMP-enabled web
pages, the pages do not render at all.

My understanding is that with this AMP project, "Google and its partners" have
foiled visitors' ability to protect their privacy -- and all these partners
are given access to my browsing history.

This goes completely counter to "The Internet is for End Users"[2]. I hope
people will boycott by blocking wholesale `ampproject.org` as 3rd party --
this needs to fail.

[1] Examples: `twitter.com`, `facebook.[com|net]`, `gravatar.com`,
`fonts.googleapis.com`, etc.

[2] [http://mnot.github.io/I-D/for-the-users/](http://mnot.github.io/I-D/for-
the-users/)

~~~
tombrossman
I'm curious why you say _" when visiting AMP-enabled web pages, the pages do
not render at all"_. I'm not seeing the same result. The first two 'AMP' pages
I tested loaded very quickly and were easily readable.

I use Firefox with NoScript, which is definitely blocking the JS from
ampproject.org. I also use Privacy Badger and uBlock Origin (thanks btw) and
I'm not having any trouble at all. Here are the figures for the two pages I've
tested with so far:

1- Visiting [https://www.ampproject.org/how-it-
works/](https://www.ampproject.org/how-it-works/) loads a perfectly readable
page weighing 16.05kB in 40ms. It's just one file, the HTML document.

2- Visiting The Verge's demo page
[http://www.theverge.com/platform/amp/2015/10/7/9467149/googl...](http://www.theverge.com/platform/amp/2015/10/7/9467149/google-
accelerated-mobile-pages-caching-preview) also loads a single perfectly
readable HTML document weighing 17.6kB in 184ms.

I've used NoScript for quite a while and while I don't expect every page to
work perfectly with it, the speed gains are incredible! AMP and NoScript
aren't directly comparable but they seem to be accomplishing the same thing
for me. Only one of them is also blocking third-party tracking and delivering
feather-light pages which nearly always render acceptably.

Can you please clarify whether you mean the pages are totally blank, or if
they just don't render the same as if you had allowed them to execute JS?
(obviously if you globally block ampproject.org you can't access even their
index.html page but that's not the info I'm after...)

~~~
gorhill
This is what I get if I block `ampproject.org` (which -- given its purpose --
is meant to become ubiquitous): A blank page.[1]

In reality, the speed gain (which is the advertised benefit of AMP) is just a
matter of avoiding pulling in all the optional 3rd parties.

For example, with a non-AMP random article from the same site[2], the page
loads quite fast when I block all 3rd-party scripts/frames, except those from
`vox-cdn.com` and `voxmedia.com`. The page renders fine -- with the virtuous
"side effect" that 18 more 3rd parties are not logging network requests from
my IP in their server logs (in this specific case).[3]

[1]
[https://cloud.githubusercontent.com/assets/585534/10397383/6...](https://cloud.githubusercontent.com/assets/585534/10397383/6c93aeae-6e75-11e5-918e-39a587ecb888.png)

[2] [http://www.theverge.com/2015/10/9/9486505/california-
calecpa...](http://www.theverge.com/2015/10/9/9486505/california-calecpa-
electronic-surveillance-warrant-law)

[3] The total number of 3rd parties in this specific case climbs to at least
48 without a blocker.

------
mhw
This is obviously Google's response to Facebook's Instant Articles and Apple's
News. Behind all the technology is the main business goal of delivering a
content platform that has the sponsoring company's ad network baked in to it.

~~~
ducuboy
Exactly.

------
tempodox
I suspect that the issue of web performance is less a technical problem than
an organisational one. We have the technology to build performant pages right
now, as well as best practises — they're just not used pervasively enough. As
long as the business processes that produce the slow pages don't change, no
shift in technology alone can save us.

~~~
cramforce
Exactly. AMP is very much about addressing that issue.

------
bryanrasmussen
That the page is itself an AMP HTML document would be really impressive if it
weren't really simple to make a completely valid HTML page that looks the same
as that one and loads extremely fast on every platform.

~~~
TazeTSchnitzel
I guess this makes my lowly blog now advanced technology, because it doesn't
use JavaScript.

------
mattmanser
This really comes across as some sort of elaborate joke. Their solution to a
slow web is to circumvent the standards and use a javascript library?

My only contribution to the conversation is utter bewilderment someone thought
this would be a good idea.

The improvements are because they've removed everything.

Also, side note but I thought there was already a proposal in HTML for
reusable elements?

~~~
ducuboy
No joke, it's all about the money.
[https://news.ycombinator.com/item?id=10358945](https://news.ycombinator.com/item?id=10358945)

------
jeena
I don't seem to understand what this is for. If you want a fast loading page,
don't add slow stuff to it.

What does AMP bring to the table that's new? It seems to be some kind of
framework? Is it code, or a preprocessor, or what is it?

------
bshimmin
"With this in mind we made the tough decision that AMP HTML documents would
not include any author-written JavaScript, nor any third-party scripts."

Quite a bold decision there! Ultimately this will fail, of course, because the
vast majority of sites that attempt to make money will need third-party
scripts in some form. A much better move would be to try and produce a
standard which third-party library authors could adhere to so that their
scripts behaved nicely.

~~~
danieldk
_Quite a bold decision there! Ultimately this will fail, of course, because
the vast majority of sites that attempt to make money will need third-party
scripts in some form._

I don't think it will necessarily fail, if AMP has a good ad story.

I think some publishers realize that ad blocking has been taking off at an
accelerated pace since iOS 9. If they have to choose between no ad revenue
(because of ad blocking) and better-behaved, subtler advertising, they will
choose the latter.

Given that AMP is pushed by Google, it's pretty obvious that they did think
about ads:

[http://www.wsj.com/articles/what-googles-amp-project-
means-f...](http://www.wsj.com/articles/what-googles-amp-project-means-for-
online-advertising-1444247579)

~~~
pjc50
"Google is not yet sure exactly how advertising will work within AMP" says
that article.

~~~
JustSomeNobody
They can say that, but this is clearly targeted at the ad blocking
conversation that is occurring.

------
pavlov
AMP reminds me of "i-mode", a Japanese mobile HTML variant originally launched
in 1999 that was hugely popular in the country for many years:

[https://en.wikipedia.org/wiki/I-mode](https://en.wikipedia.org/wiki/I-mode)

Then, as now, regular HTML pages were considered too slow and complicated for
mobile phones. I-mode was conceived by the large network operator NTT Docomo.

The big difference is that NTT Docomo was the exclusive gatekeeper for
publishing content. Google gives you the option of publishing AMP content on
their CDN, but it's not mandatory.

------
manigandham
What's sad is that this is just optimized HTML + CDN packaged up by Google and
heralded as some big innovation.

If publishers spent a few hours they could make extremely lightweight pages
too (even with media and ads).

Here's an AMP page:
[https://amp.gstatic.com/v/mobile.nytimes.com/2015/10/08/us/r...](https://amp.gstatic.com/v/mobile.nytimes.com/2015/10/08/us/reassurances-
end-in-flint-after-months-of-concern.amp.html?amp_js_v=0)

~~~
stdbrouw
Well, most publishers actually use fairly optimized HTML/CSS and most
publishers use a CDN. But there's only so much you can do if you don't want to
get rid of those 5 ad networks, 7 analytics services, 2 sidebars and just
generally a whole bunch of crap that has nothing to do with the article.

The innovation of Facebook instant articles and AMP isn't technical, it's that
they have been able to convince publishers that getting rid of all the bells
and whistles will actually benefit them – and that if they don't do it, it
will harm them because huge traffic drivers like Twitter won't put up with it
anymore.

~~~
manigandham
I disagree - most publishers I've seen (including large and mid-level) have
some of the most bloated HTML with megabytes(!) of javascript and CSS just for
rendering + all the other stuff you mentioned. They are definitely not
optimized.

I don't think there is any innovation at all with either Facebook or Google;
it's all about keeping the user on their service longer: by hosting the
content themselves, they increase speed and claim more user time. It's a pure
monopoly play with Apple, Google, Facebook having far more leverage than
publishers (who have really none these days). Pubs need traffic to survive
which is why they are willing to work with this, but the revenue potential is
still up in the air so long-term sustainability is not anywhere near proven.
I'm not convinced that this will work as a viable model, beyond the technical
implementation.

~~~
zenocon
> I disagree - most publishers I've seen (including large and mid-level) have
> some of the most bloated HTML with megabytes(!) of javascript and CSS just
> for rendering + all the other stuff you mentioned. They are definitely not
> optimized.

[http://www.nytimes.com/interactive/2015/10/01/business/cost-...](http://www.nytimes.com/interactive/2015/10/01/business/cost-
of-mobile-ads.html)

------
jackgavigan
_> Resources must declare their sizing up-front_

This is _such_ a great idea. Jerky web pages that jump around while you're
trying to click on something are maddeningly annoying.
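
For reference, this is roughly what the sizing rule looks like in AMP markup;
the image URL and dimensions here are made up for illustration:

```html
<!-- AMP requires explicit dimensions on resource tags so the page layout can
     be computed before the image ever loads, avoiding reflow jumps. -->
<amp-img src="https://example.com/photo.jpg"
         width="600" height="400"
         layout="responsive"></amp-img>
```

The browser reserves a 600:400 aspect-ratio box immediately, so nothing below
it moves when the bytes arrive.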

------
peterjmag
Two scrolling behaviors that I found odd on this page and on the NY Times
example[1] on iOS:

\- You can't scroll to the top by tapping the status bar.

\- The scrolling momentum is different. There's less friction, as if you were
scrolling through a list view (e.g. Mail).

Anyone know how/why? I thought it might be some -webkit-overflow-scrolling
trickery[2] with a full-size container div or something, but I don't see
anything too weird like that.

[1]
[https://amp.gstatic.com/v/mobile.nytimes.com/2015/10/08/us/r...](https://amp.gstatic.com/v/mobile.nytimes.com/2015/10/08/us/reassurances-
end-in-flint-after-months-of-concern.amp.html?amp_js_v=0)

[2] [https://css-tricks.com/snippets/css/momentum-scrolling-on-
io...](https://css-tricks.com/snippets/css/momentum-scrolling-on-ios-overflow-
elements/)

EDIT: Turns out there _is_ some -webkit-overflow-scrolling trickery going
on[3]. But again, why?

[3]
[https://github.com/ampproject/amphtml/blob/77d9f31263866e56a...](https://github.com/ampproject/amphtml/blob/77d9f31263866e56a83a7ee7194f1ec46bed65f9/src/viewport.js#L533-L559)
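
For context, the pattern from [2] (and what the viewport code in [3] appears
to set up) is roughly the following; the class name is illustrative:

```css
/* A full-viewport container that owns the scrolling instead of the body,
   with iOS momentum ("list view") scrolling enabled. */
.scroll-container {
  height: 100vh;
  overflow-y: scroll;
  -webkit-overflow-scrolling: touch; /* iOS-only; gives inertial scrolling */
}
```

Scrolling inside a nested container like this would also explain the first
oddity: the status-bar tap-to-top shortcut only targets the main document
scroller, not overflow elements.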

~~~
joosters
The scrolling feels unchanged for me, but the top shortcut doesn't work.

In general though, why sites feel the need to change / reimplement scrolling
is beyond me. Every reimplementation is worse than the native behaviour.

------
JustSomeNobody
I feel this is Google's response to Apple's ad blocking push (maybe push is
too much here, what's a better word?).

Neither AMP nor ad blocking are what is needed. We need sensible ads that are
unobtrusive. We need web pages that do not track the consumer. We need smarter
designs that don't use huge images everywhere. We need JS where it is needed
and not just because it's cool.

We don't need two behemoths fighting over the control of our web.

------
insanebits
TL;DR: I mostly agree with the monopoly idea.

We had WML for mobile devices before mobile devices were powerful enough to
render HTML. Then we got smartphones better than my computer bought 5 years
ago. AMP, in my opinion, is a step back. Yes, you can do a lot of things in
CSS, but in reality there are cases where things need to happen dynamically.
For example, JavaScript input validation: you cannot validate data using CSS.
Are we then going to reinvent JavaScript under some fancy new name? Also, how
about backwards compatibility (IE7-8, I'm looking at you)?

Those companies could spend their resources on better things, like
implementing ways to load viewport-dependent images (it seems that's already
done using the picture tag), because multimedia takes up a huge chunk of most
websites.

And like always there are ways to write performant websites using existing
technology and even more ways to write slow websites using all the "bleeding
edge" technology. It's up to the developer mostly.

------
TazeTSchnitzel
This isn't an HTML subset, it's some freakish pseudo-HTML JS library
weirdness.

Why not define an _actual subset_ of HTML? You know, like XHTML Strict Mode?

Why must everything be a JavaScript library?!

~~~
spankalee
It _is_ an HTML (and CSS) subset.

JavaScript is used to implement custom elements, like amp-image. That's just
how custom elements work.

~~~
TazeTSchnitzel
It's not a subset, it's not a superset. It lacks some HTML things and adds new
ones.

~~~
spankalee
Custom elements are a part of HTML, they can be part of a subset.

Disqualifying custom elements from an HTML subset would be like claiming
function declarations can't be in subsets of JavaScript.

~~~
TazeTSchnitzel
That's not really the same... HTML elements are more like keywords than
functions.

~~~
spankalee
Not really. To continue the analogy: HTML doesn't really have keywords, but
has syntax in the form of brackets, quotes, =, etc. Tags are like function
calls, and the built-in tags are like the standard library. Until custom
elements you couldn't declare "functions".

------
ised
"Performance is just the story they are selling in exchange of absolute
control of the Web."

To be fair, they are not the only ones that use this "story" as a ploy to get
more control over users (and hence gather more saleable personal information).

Phrases like "make the web faster" are disingenuous and should not pass any
intelligent user's BS filter.

The very reason the web is slow is these companies, which need to serve ads
and other crud to survive. That means more DNS lookups, more TCP connections,
more HTTP requests, more JavaScript, longer URLs, more unwanted IMGs (e.g.,
beacons), more tags, etc., etc. The list is so long I cannot even hope to
capture it all. The end result is simple: staring at a screen waiting for the
computer to respond. Not to mention frequent leakage of personal information.

As a user of netcat and text-only browsers that retrieve pages in
milliseconds, I am astounded at how long users today are willing to wait for
their content (i.e., "page loads"). I also run my own web servers at home to
serve content to my family's mobile devices. I am well-aware of what (i.e.,
who) slows down "the Web".

The user does not start from the assumption that she needs to (down)load
"resources" (as in Uniform Resource Locator) from a number of advertisers for
every page she views. Those are the assumptions that the web company starts
with. Those are the constraints they must work within. Not true for users.

I do not need Google DNS. I run my own locally, primed with all the domains I
routinely visit. No remote DNS cache is going to be faster than my loopback.

Nor do I need HTTP/2. I just use HTTP/1.1 pipelining to retrieve hundreds of
pages of content, usually 100 pages at a time. HTTP/2 is something the web
companies may need to accomplish their goals of serving ads and collecting
personal information. But it is not something users need.

To drive adoption they must convince users that users need these
"improvements". So the web companies purport to offer "solutions" to the
problem they themselves created: a slow, bloated www.

~~~
Certhas
So I as a user put up with all that loading of resources, because I don't have
the time and money (resources) to set up my own version of the web. And
because I consume a lot of content I don't otherwise pay for.

To pretend that the infrastructure that allows companies to pay for the
content I want to consume is incidental to the true purpose of the web (text
documents??) is pretty narrow.

------
coldtea
I have a 45 Mbps line (with low latency).

Even so their crappy webfonts took like 1-2 seconds with a FOIT before that,
while all the text was available almost immediately.

------
z3t4
The problem is that publishers want to know everything about their users, so
they add tons of "spyware" that slow down the site.

Who already knows EVERYTHING about your users? But can they sell it?

startup idea: Make a blazingly fast CMS that tracks everything and shows
pretty graphs.

------
AnbeSivam
With custom element and web component support not being complete in other
browsers, will the polyfills provide as much benefit there as they do in
Chrome?

------
Dirlewanger
The only thing that's getting accelerated here is the filter bubble.

The Internet and all of its neutral standards are being eviscerated by
faceless avarice, and no one gives a shit.

------
ramon
It's not 100% on Google PageSpeed Insights, it's 75%. They now need to focus
on render blocking, spacing between characters, and gzip.

------
gildas
I am wondering why v0.js is compiled with Babel if Google really wants the
JavaScript execution to be fast.

------
est
WML reborn?

~~~
ducuboy
Oh the good old days..

I was rendering WML/CHTML/XHTML with Wurfl depending on UAs from hacked
ASP.NET.

Then iPhone appeared and ruined everything ;)

------
anuraj
Nothing new - ever heard of CHTML, WBXML, etc.? '90s standards.

------
mtgx
Does this mean Google will get to centralize tracking through its servers even
more?

~~~
ducuboy
Yes.
[https://news.ycombinator.com/item?id=10358945](https://news.ycombinator.com/item?id=10358945)

