Hacker News

AMP does not seem like a walled-garden tech. Any crawler can find and use the AMP version of a page through this tag:

<link rel="amphtml" href="https://www.example.com/url/to/amp/document.html">

Thus all Google is doing is rewarding sites that have implemented this and as a result load fast, no? IMO, it's drastically different from Instant Articles because you can never ever find the Instant Articles version of a web page: it's private and only Facebook knows it.
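As a concrete illustration of that discovery mechanism, here is a minimal sketch (Python, standard library only) of how any crawler could pull the AMP URL out of a page; the sample HTML is just the example tag from above, not a real site:

```python
from html.parser import HTMLParser

class AmpLinkFinder(HTMLParser):
    """Collects the href of any <link rel="amphtml"> tag."""
    def __init__(self):
        super().__init__()
        self.amp_url = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "amphtml":
            self.amp_url = attrs.get("href")

html = """
<html><head>
<link rel="amphtml" href="https://www.example.com/url/to/amp/document.html">
</head><body>...</body></html>
"""

finder = AmpLinkFinder()
finder.feed(html)
print(finder.amp_url)  # https://www.example.com/url/to/amp/document.html
```

Nothing here is Google-specific: any search engine or browser extension could do the same lookup.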




I am not saying that Google is evil exactly, and maybe they don't even have bad intentions (heck, we are talking about a corporate entity here; how far can we apply morals?). But what I am thinking is that AMP and its rivals (such as Instant Articles) hurt the free & open Web. For instance, Google Search is invaluable, but as a side effect, a website that doesn't end up on the first page of a Google Search result is nearly non-existent.

Also, the notion of hyperlinks is crucial to the Web, and these projects break it. Hyperlinks originate from a 1945 article, "As We May Think"[1], and the term itself was coined by Ted Nelson in 1965 for Project Xanadu. That's how you create a web, which is now called the Web. That's also what Google's PageRank algorithm depends on, or at least did at the beginning. Now, by letting Google host your content under a link such as `https://www.google.com/amp/www.example.com/amp/doc.html`, you are breaking hyperlinks: they no longer link different websites together.

[1]: https://www.theatlantic.com/magazine/archive/1945/07/as-we-m...
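To make the hyperlink-breaking concrete, here is a rough sketch of the URL rewriting described above, assuming the simple prefix scheme shown in the example link (the real AMP cache applies additional rules, e.g. for HTTPS and query strings, so treat this as illustrative):

```python
from urllib.parse import urlparse

def amp_cache_url(original_url):
    # Simplified sketch of the rewrite quoted above: the publisher's
    # host and path are folded under google.com/amp, so the visible
    # link no longer points at the publisher's own domain.
    parsed = urlparse(original_url)
    return "https://www.google.com/amp/" + parsed.netloc + parsed.path

print(amp_cache_url("https://www.example.com/amp/doc.html"))
# https://www.google.com/amp/www.example.com/amp/doc.html
```

The point of the complaint is visible in the output: every such link starts with `www.google.com`, regardless of who published the content.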


I would expect that content producers will start putting abbreviated versions of their content in AMP format to satisfy the AMP requirement as minimally as possible, and then add a link back to the original site for the rest.


I'm going to guess Google would penalise that behaviour for showing different content, justifying it as "harming the user experience" or something.

If they don't do this already, they probably will if they keep going down the AMP rabbit-hole.


I believe reddit does something like that where they include only a portion of a thread with a big “view full thread” button.

I think that’s a good idea, but I fear some users won’t realize there’s more or may not understand the abrupt change in layout/design/etc.


That's exactly what allrecipes.com used to do, and they have fairly high Google juice for random recipes. I thought they still did, but when I went to verify it didn't show up. It'd be interesting to know why they stopped.


Sites that are traditionally called walled gardens (Facebook, AOL, CompuServe...) often combine four features: (a) all the content is hosted under the walled garden's domain; (b) the walled garden designs key parts of the navigation and layout, understandably acting in their own interests; (c) the walled garden has content policies and declines to host certain content; and (d) users must have accounts and be logged in to see the content.

AMP does (a) and (b); I don't know about (c); and doesn't do (d).

Still, I can see why people see echoes of walled gardens in AMP.


The feature that defines a walled garden is the wall. AMP doesn't have that wall: it is perfectly usable by any other search engine/spider to replicate the experience. The website itself can link to the AMP version if that makes sense for users. Mobile browsers, or web servers, could check for and redirect to the AMP version.

There's a slightly higher wall for advertisers, I believe. Google is rightfully afraid of antitrust issues and seems eager to get advertisers besides themselves signed up, but it is a wall in that it requires whitelisting, as far as I remember.

There's a lot to criticise about AMP, but I don't think the "walled garden" metaphor fits.


Here's the wall: could any competitor to Google itself ever be on AMP? (E.g., a competitor to search, Gmail, YouTube, or adsense/adwords.)

Or would it have to be something on the open web? And if only AMP pages get ranked well on Google in the future, how much harder would it be for such competitors to ever get noticed in the first place?


> And if only AMP pages get ranked well on Google in the future

Currently, the only thing on Google results explicitly limited to AMP is the mobile news carousel. Presumably AMP pages also do well on the "loads fast" evaluation that affects normal rankings, but in theory it should be possible to do equally well without AMP. If this is changed, or turns out to not hold up in practice, then there is cause for concern, but I have not seen evidence of either.

News sites, of course, don't generally compete with Google's core services (unless you count Blogger, but in that case all news sites do). The one thing I can think of is that they might embed videos hosted on competitors to YouTube. When it comes to that, AMP is a mixed bag. Unlike on the open web, video players that use custom controls/iframes (like YouTube) need to be explicitly approved, since there's not much alternative without granting a blanket license to put arbitrary sites in iframes, which would (mostly) defeat the purpose. So Google acts as a gatekeeper. On the other hand, the spec [1] already lists like a dozen random video hosts you've never heard of; that's not evidence that Google wouldn't try to block a more serious competitor to YouTube, should one ever spring up, but there's certainly no evidence that they would.

[1] https://www.ampproject.org/docs/reference/components/amp-vid...


Unless I'm mistaken, they did experiment with various attempts to highlight their own video offerings beyond what popularity dictated before finally giving in and buying YouTube.

And we all remember the kinds of crap Google did to try to foist Plus on users who had no interest in it. At one point 1/4 of the annual bonus of everyone at the company was tied to it.

http://www.businessinsider.com/larry-page-just-tied-employee...


What other CDN than Google's can host AMP content and still get in the Carousel?

Hint: none.


AMP has the wall.

Can I get the same display in Google Search with or without AMP? No.

That’s the wall.


Then is HTTPS also a wall, given that HTTPS results are favoured by Google search?


It would be if you had to use Google as your CA, root or intermediate, whatever.


What? With HTTPS, I get full control.

With AMP, I get a different search ranking if I use Google’s AMP version, or if I self-host the AMP scripts (to prevent users being tracked by Google).

I have to allow every user of my site to be tracked by Google if I want to get the AMP ranking advantage.

I can’t fork AMP.

The ranking advantage is given only if you use Google’s AMP cache.

How can you seriously compare this with HTTPS?


Can you elaborate? Sites that don't implement AMP still appear in Google News, and even above AMP results as far as I can tell.


It only affects the carousel at the top of the site


Which, on mobile, takes ~40% of the available page. Often catapulting a page 13 result to #2


Try regular google search, instead of news.


I don't think it's fair to claim that (a) applies here, certainly not like Facebook at least. Facebook makes an effort to keep user-created content inside of their platform. Google does not make any effort like that with regard to AMP, other search engines could use the AMP data just as well. Serving it from their own domain is just a technical matter.


But it doesn't do (a), either. Google hosts a cache of the content, but the original is hosted by the publisher, and Google's cache isn't exclusive. Microsoft, Facebook, and Twitter, IIRC, operate their own caches, and so can anyone else who wants to (and there's a fairly strong incentive to if you host something that functions as a high-volume portal).

AMP makes it harder for real walled gardens to provide an attractive performance advantage to end-users, making it not only not a walled garden, but also a potent weapon against walled gardens.


So, how do I get my page into the carousel without using Google’s cache?

How do I get my page into the carousel when using a fork of AMP that reduces the JS load?


> AMP does not seem like a walled-garden tech. Any crawler can find and use the AMP version of a page through this tag

So "any crawler" who is willing to slavishly follow whatever Google is currently doing, and currently doing in the open (who knows how long that will last), can technically get access to the same content so it's not a "walled garden".

To me it sounds like Google doing yet another non-standard thing without asking the rest of the web-community for input, and using their weight as a way to push this through in their usual, hostile manner.

And here on HN people are defending it as perfectly reasonable. The mob clearly has something to learn from these guys.


So let's say AMP was not invented. Web pages have been notoriously slow on mobile for a decade. Where is this open standard to make pages load faster? Where is the W3C or Mozilla? It's been a decade. If there were a competitive open standard I would be against AMP, but there isn't.


> Where is this open standard to make pages load faster?

Not filling a page of what is effectively 80KB of static text content with 20MB of JS, fonts, tracking code and other "assets" would be a good start.

The web isn't slow. People deliberately create slow web-pages because they care more about user-tracking and "fancy" technology than they care about user-experience.

And making Google a pre-requisite for any web-page is NOT the solution to that problem. That's the wrong way around. We need less tracking. Less JS. Less Google.


At some point... god, most of a decade ago now, I guess, it seems like the kinds of people doing web design changed and this new crop didn't care about or understand bandwidth constraints like the old ones did. Strangely, this was around the same time "digital native" instead of "print influenced" design got big, so you'd think it'd have gone the other way, but it definitely did not.

In fact, I'd say the much-derided Flash aesthetic/inefficiency kind of won, right around the time we were all celebrating killing Flash. Everything's whiz-bang shiny and so damn slow and resource-hogging. Plus flat design is a disaster in all but the most capable hands, so I'd say UX generally has suffered over the last decade.


> At some point... god, most of a decade ago now, I guess, it seems like the kinds of people doing web design changed and this new crop didn't care about or understand bandwidth constraints like the old ones did.

This is a cyclical problem because people tend not to measure performance until they notice a problem, which means it's a function of both technical factors and user expectations. The rise of mobile added a confusion point since it basically dropped back to dial-up/DSL-class networking after the overall web community had had a decade to get used to cable modem-grade performance and made wasting bandwidth a direct cost rather than just inefficiency. In the mid-2000s, using a big JavaScript toolkit wasn't great but it wasn't so bad when you could assume that most users had better latency than even LTE delivers and user expectations hadn't adjusted for the post-IE era.

The other big factor is the ongoing decline of advertising as a viable business model. The worst offenders I see are either ads or the measurement tools publishers use to document their site's performance, and that's been getting continually worse as everyone keeps chasing diminishing returns.


You also need to account for all that JS having a runtime impact.

More JS will make a slower site, especially on mobile.


> Not filling a page of what is effectively 80KB of static text content with 20MB of JS, fonts, tracking code and other "assets" would be a good start.

I imagine if you expand and codify this idea, you will come up with something very close to AMP. I'm not sure if you're against the idea of AMP/AMP-like standard, or against Google's execution of it.


josteink is probably against the idea of a big corporation hosting the publishers' websites, but in favour of the design guidelines of AMP.


Hopefully tech people will start to ditch companies with such bad behaviour.

> https://twitter.com/aral/status/877998804678524928


And until Google and AMP came along, there was no stick to make authors do this.


Actually, there is a competitive standard to AMP: Mobile Instant Pages (MIP) from Baidu - https://www.mipengine.org


Since that standard only targets the Chinese market, I don't think it is a viable standard for the rest of the world.


When you load an AMP page in your browser, you never actually connect to the original publisher. That seems like a walled garden to me.

Personally, as a consumer, I absolutely hate the fact that I can't quickly check my browser's domain bar to see what website I'm reading (to assess its credibility for example).


> Thus all Google is doing is rewarding sites that have implemented this and as a result load fast, no?

No. And Google couldn't care less if pages load fast -- it could incentivize making pages faster with traditional methods instead. It's all about capturing more eyeballs and proxying more traffic.


But why? AMP allows other advertisers. They can track clicks just as easily with a redirect. They could quite easily change the UI of the Android browser to allow for fast switching through search results. Nobody cares what CDN the bytes are served from if the content is identical.

I agree that it's somehow yucky, but I don't see how it can be ascribed to malicious intent, instead of an effort to improve the user experience that maybe goes too far.


They already did that, and web "developers" did not do the work. The period in which Google incentivized faster page loads was the period of some of the worst page bloat growth.


It depends what "market" you are looking at.

In some areas people understand and care about such things, and in those areas we did and still do see some optimisation effort. Far from everywhere of course, even within that subset, but I'd wager there was improvement overall (or at least things got worse less than they otherwise would have).

In other areas though the available audience either doesn't understand or doesn't care (because their connection and CPU are fast enough to cope, for instance, and they aren't on an expensive metered link) or both. I say "available audience" because for these sites the "target audience" is every-human-alive-several-times-over-good-god-we-need-more-ad-impressions-and-clicks - the likes of Buzzfeed and the other click-bait-filled "news" and "entertainment" sites where four sentences of content can be stretched over sixteen pages of large, slow-loading, auto-playing, obnoxious adverts.

The people who care about load time (and/or relevant technical or privacy matters) simply don't follow those links. The people who do follow the links aren't worth making page-load optimisations for because they simply won't care or notice, so from the site's PoV the time is better spent on adverts-hung-onto-each-morsel-of-content optimisations instead.

And that was when Google ranking was the main thing that mattered. More recently the operators of such sites are more interested in distribution via social media, so Google ranking you lower for a slow load is less of a concern; or if it is a concern, there are tricks (not all of which Google can easily filter for - it is an ongoing war of attrition) to show Google a different version of the content that does load quickly.


>They already did that, and web "developers" did not do the work.

It's not up to them to decide after that.


I would argue that they didn't do that, or at least not enough. I obviously have no insight to how they sort, but IME they aren't putting small and fast sites at the top of the list. Those are usually 3 or 4 pages back, and the first results are usually the slow and bloated ones.



