It's prerendered (via a static site generator). In total, it loads 692 KB (I didn't do anything to optimize it; the images are quite large, etc.). It loads from a small server, and images are loaded from Twitter, meme.com, etc.
Here's an AMP page: https://www.google.se/amp/s/www.usmagazine.com/celebrity-new...
It loads a whopping 2.9 MB, and keeps loading as you scroll down. If you open it from Google's search, it opens instantly, because parts of it were already preloaded on the search page. And the page itself (including almost all images) is served by a ridiculously powerful, geographically distributed CDN.
1. How is that fair to people who actually build their pages and host them on their servers?
2. What is open about this web?
3. How will Web Packaging solve this issue if I can't afford to build a geographically-distributed CDN on par with Google's for my own cache?
That 2.9 MB actually changes on every reload. The lowest number I've seen is 1.6 MB, but then, in a second or two, it starts loading additional stuff, going up to at least 2.2 MB.
So much for "small AMP pages". Actually, as I'm clicking around, rarely is a page below 1 MB, even for pages that are not that different from mine: only images and text.
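If you want to spot-check numbers like these yourself, a rough sketch with curl (the URL here is just a placeholder) reports the over-the-wire size of the HTML document alone. Note this does not count images, scripts, or anything loaded afterwards; for a full tally like the 2.9 MB figure above you need browser devtools or a tool such as WebPageTest.

```shell
# Print the number of bytes actually downloaded for a single response.
# -s silences the progress bar, -L follows redirects, -o /dev/null
# discards the body, and -w prints curl's size_download variable.
curl -sL -o /dev/null -w 'downloaded: %{size_download} bytes\n' 'https://example.com/'
```

Summing subresources as well would mean parsing the HTML and fetching each referenced asset, which is exactly the kind of accounting a browser's Network panel already does for you.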