Being forced to load everything you read via Google is worse for privacy. There are already plenty of solutions for blocking 3rd party scripts, like Privacy Badger and uMatrix.
Google already has your IP; it's their page. Preloading resources from its own CDN doesn't tell them anything they don't already know. Preloading resources from someone else's domain would.
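To make that distinction concrete, here is a minimal, hypothetical sketch (the URLs are illustrative, not real Google markup): a results page can speculatively prefetch a document served from its own cache without revealing anything new, whereas prefetching from the publisher's origin would ping that publisher before the user ever clicks.

```html
<!-- Hypothetical sketch; URLs are illustrative. -->

<!-- Prefetching from Google's own cache: the request goes to a domain
     that already sees your IP, so nothing new is revealed. -->
<link rel="prefetch" href="https://publisher-example.cdn.ampproject.org/c/s/publisher.example/article.amp.html">

<!-- Prefetching directly from the publisher: publisher.example now learns
     your IP and which article you might read, before any click happens. -->
<link rel="prefetch" href="https://publisher.example/article.html">
```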
Google doesn't have a list of everything you do online unless you're loading resources from their servers after you leave their site. AMP always loads from Google. If you block those 3rd party scripts, AMP pages literally take 8 seconds to load.
With an extension such as uMatrix or NoScript, blocking first-party scripts will cause `noscript` tags to be rendered, and one of these tags disables the CSS animation, causing the page to appear immediately.
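For reference, the required AMP boilerplate looks roughly like this (vendor-prefixed duplicates trimmed for brevity): the body is hidden behind an 8-second animation that the AMP runtime cancels once it loads, and the `noscript` fallback cancels it only when JavaScript is disabled entirely.

```html
<style amp-boilerplate>
  /* Hide the page for 8 seconds; the AMP runtime removes this once it loads. */
  body{animation:-amp-start 8s steps(1,end) 0s 1 normal both}
  @keyframes -amp-start{from{visibility:hidden}to{visibility:visible}}
</style>
<noscript>
  <!-- Applied only when JavaScript is disabled entirely: cancels the delay. -->
  <style amp-boilerplate>body{animation:none}</style>
</noscript>
```

A user who blocks `cdn.ampproject.org` but leaves JavaScript on gets neither escape hatch: the runtime never cancels the animation and the `noscript` styles never apply, hence the 8-second blank page.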
When I found out about this, I tried to find the rationale for this artificial delay in the AMP documentation, but couldn't find any valid justification for it.
The net result, unfortunately, is that most users who want to block `ampproject.org` out of privacy concerns will feel compelled to whitelist it just to "un-break" sites that use it.
Just how do you imagine they'd "CDN people's non-AMP content"? There's no mechanism by which they could tell the browser to load nytimes.com but to replace the URLs of random resources with different ones.
They'd need to host the actual page on Google.com. And after solving all the problems that doing this introduces, you've pretty much got AMP already.
Even if you can't package up and ship all of your traditional site to Google's CDN, you could do most of the burdensome/heavy bits. But then Google doesn't get to control your website and define the way it's allowed to look, which is what AMP is really for.
So it was not possible when AMP launched, is not possible now, and might or might not be possible sometime in the future after some specs are finished, but only in some browsers. Doesn't sound very practical, to be honest...
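For what it's worth, the spec usually pointed to here is Signed HTTP Exchanges (part of Web Packaging). A rough, hypothetical sketch of how that would work, with illustrative URLs:

```html
<!-- Hypothetical sketch of Signed HTTP Exchanges; URLs are illustrative. -->

<!-- The publisher signs a snapshot of its page with a certificate carrying the
     CanSignHttpExchanges extension, producing e.g. article.sxg, which is served
     with Content-Type: application/signed-exchange;v=b3. -->

<!-- Any cache (including Google's) can then serve and prefetch that snapshot... -->
<link rel="prefetch" href="https://cache.example.com/publisher.example/article.sxg">

<!-- ...and a browser that supports the spec verifies the signature and displays
     the publisher's own URL, as if the page had come from the publisher's origin. -->
```

That's what "only in some browsers" means in practice: without support for the format, the browser just sees a cached copy on someone else's domain.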
I also can't imagine the amount of shit Google would have taken if they'd started just randomly doing that kind of thing for existing web pages. Instead they introduced a totally new mechanism (i.e. AMP) where the caching was a core concept from the start.
> Even if you can't package up and ship all of your traditional site to Google's CDN, you could do most of the burdensome/heavy bits. But then Google doesn't get to control your website and define the way it's allowed to look, which is what AMP is really for.
But "heavy/burdensome bits" are exactly the things that matter the least for this use case. Ideally they would not exist at all. If they do, they should not be speculatively prefetched.
It'd also mean that these pages are now tied to Google's CDN, no matter what. If a user clicks through to the link from somewhere other than a Google search result, they'll still end up loading the resources from Google. Is that really what you want?
That's true. I don't see any reason why they couldn't cache non-AMP content using a combination of speed benchmarks and schema markup checks. When you think about it that way, it seems like they are more concerned with controlling the user experience than with speed improvements.