AMPstinction (adactio.com)
134 points by valeg 4 months ago | hide | past | web | favorite | 72 comments



There is growing confusion among web developers about what a webpage actually is; the conflation with web applications is getting rather harmful.

If you are building a webpage (a web document), it's probably the user's goal to go there, read some information, and move on. They aren't there for an "experience", and if your JavaScript weighs more than the useful amount of text on the page, you are probably doing something very wrong.

Oh well, you reap what you sow, I hope you're happy with your new proprietary web.


That doesn't really have a whole lot to do with AMP, though.

95% of the problem with web page performance today is ads. Google dominates the online ad business, and if they wanted they could have solved this problem from the ad provider end long ago. It wouldn't have been easy, but it would have been a tremendous use of their market clout.

Instead they decided it was an opportunity to squeeze all publisher content through a new shiny pipeline they created, that also coincidentally solves the issue of poorly performing ads. And then manually manipulated their search results page to pressure publishers into using their new format.


> 95% of the problem with web page performance today is ads.

No. I use extremely strict ad blocking and many websites' performance is still shit, both in load times and in CPU hogging.

The problem with web page performance is people believing that by distributing all their (mostly unnecessary) assets over two dozen unrelated origins they speed things up. They don't. That's what makes things slow.

The problem with web page performance is people believing they need some mirrored btree-reverse-hash-indexed shadow DOM, 1.5 MB of Angular, four to five more JS frameworks and 20000 NPM packages with an "ng build" time of no less than 50 seconds to show a news article.

The problem with web page performance is people believing a page would render faster if, instead of sending HTML, you send a giant blob of JS which, after almost choking a parser to death, then does an XHR to actually fetch the content and then starts to render templates into actual HTML.

(You know, before everything went apeshit, we used to tell people to put <script> tags last in the document, because that way the browser can render your page without blocking on downloading, parsing and executing your scripts — no one would notice if the menu bars lacked soft animations for 0.3 s after a page load, but people would of course notice if the page stayed blank for another third of a second.)
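To illustrate that old advice, a minimal hypothetical page (file names and content are made up for the example):

```html
<!doctype html>
<html>
<head>
  <title>Article</title>
  <link rel="stylesheet" href="style.css">
</head>
<body>
  <article>
    <!-- The content is plain HTML and renders immediately;
         nothing below blocks it. -->
    <h1>Headline</h1>
    <p>The article text is right here in the document.</p>
  </article>
  <!-- Scripts go last (and/or carry the defer attribute), so parsing
       and first paint never wait on downloading and executing JS. -->
  <script src="menu-animations.js" defer></script>
</body>
</html>
```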


> The problem with web page performance is people believing that by distributing all their (mostly unnecessary) assets over two dozen unrelated origins they speed things up.

Ironically that is exactly what web developers were told to do for years:

https://developer.yahoo.com/performance/rules.html#split

I can't deny that a lot of web pages bundle far too much JS, but if you've been using a strict ad blocker for a long time you perhaps don't realise how bad the ad code has gotten. React loads in the blink of an eye compared to the multi-second load times for five-level embedded popover ads.


That link recommends no more than 2-4 domains and gives static.yourdomain.com as an example; it's hardly suggesting embedding 20 third-party scripts.


I assumed the OP wasn't talking about third-party assets, since no one chooses to distribute those across different domains; that's just how they have to be used.


Then perhaps I've misunderstood what they mean by mostly unnecessary assets.

I was thinking of the choice to embed third party scripts or not, rather than what domain they're on.


> The problem with web page performance is people believing a page would render faster if, instead of sending HTML, you send a giant blob of JS which, after almost choking a parser to death, then does an XHR to actually fetch the content and then starts to render templates into actual HTML.

"It's asynchronous, and that makes it soooo fast!"

So many people drank that async kool-aid, but didn't understand that the network is slower.
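A back-of-the-envelope sketch of why the extra round trip hurts (the numbers here are invented purely for illustration; the point is the structure of the delay, not the values):

```javascript
// Crude model of time-to-first-content. Every extra round trip needed
// before content can render adds latency that "async" doesn't hide.
function firstContentMs({ rttMs, downloadMs, parseMs = 0, extraRoundTrips = 0 }) {
  const initial = rttMs + downloadMs + parseMs;
  // Each XHR for the actual content costs another round trip + download.
  return initial + extraRoundTrips * (rttMs + downloadMs);
}

// Server-rendered HTML: one request, content arrives in the response.
const serverRendered = firstContentMs({ rttMs: 100, downloadMs: 200 });

// Client-rendered: download + parse a JS bundle, then XHR the content.
const clientRendered = firstContentMs({
  rttMs: 100, downloadMs: 400, parseMs: 300, extraRoundTrips: 1,
});

console.log(serverRendered, clientRendered); // → 300 1300
```

Whatever numbers you plug in, the fetch-then-render path can never beat shipping the content in the first response, because it strictly adds terms.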


I don't think that anybody believes that Angular/React/whatever are fast. They may be fast relative to another framework, but certainly not relative to a simple webpage that has almost no JS.

The problem is mostly that most clients nowadays don't want a webpage; they want a web application, something similar to what they have on their phone, computer, and so on. But JS is no C++, and so Angular cannot be Qt. I used to make Qt applications; now I work as a web dev. I really do see how sluggish web apps are and how they tend to lag everywhere. But the thing is, most clients don't really mind. What they see is that they can access the app from all their computers, phones and tablets, and that matters a lot more to them.

Plus, we will probably never go back to the era where most rendering was done server-side; it just costs too much.


To be fair, front-end heavy web apps decrease a lot of human delay: it used to be that I clicked "submit" and then the screen went white and when it came back I had to recontextualize and connect the results to what I had previously inputted. Modern one-page applications with edit-in-place textboxes and content that reacts to draggable maps, etc. are much easier on the human brain.


> if your javascript weighs more than the useful amount of text on that page

Addendum: ... or if your content-based site fails to display more than a blank canvas without loading a bunch of scripts from third-party origins ...


Damn right, what the hell happened to progressive enhancement?


According to thought-leaders, it isn't dead, but it smells funny.

https://nolanlawson.com/2016/10/13/progressive-enhancement-i...

[Note: I very strongly disagree with this position.]


It's seen as a waste of time that is holding back innovation. When arguing for progressive enhancement, I've been essentially told "We're focused on customers who have a real browser, not old nerds who have javascript disabled".


It would be a good argument if this "innovation" actually had any value for the users, but it does not. I'll take a static webpage any day over some shit full of Javascript.


"I hope you're happy with your new proprietary web."

Absolutely, and it is close to impossible to discuss rationally without the conversation being dominated by rhetoric.

https://dennisforbes.ca/index.php/2017/09/05/embrace-amp-or-...

The point of that piece is that AMP is filling a role -- a minimal web for document publishing -- and denying it, as always happens in AMP discussions, simply makes an easy road for AMP's domination.


I agree, and looking at boards frequented by young "web developers" [1], there's the daily clueless "What framework should I be using?" question by novices (or bots?). Any attempt to tell them "it depends on whether you're doing a content-driven site vs an app" ends in downvotes, "it's 2018, dude" arguments or worse by wannabe developers too much in love with their framework of choice (currently React).

But as with anything built on belief, social affinity, and fear of professional irrelevance, the tides are turning.

[1]: https://old.reddit.com/r/webdev/


> AMP is filling a role -- a minimal web for document publishing

I always thought HTML 1.0 filled that role.


HTML has had the potential to fill that role since day one, but while today we're talking about sites sitting blank for 5 seconds because they're downloading megabytes of garbage, back then we had pages sitting blank for 60 seconds because they were downloading 101KB+ of totally insane HTML and animated GIFs of dancing babies and nuclear explosions.


I'm helping a friend with a site which needs some 'cleverness' (by my lowly standards) playing audio whilst pages are browsed.

Checking out a competitor's site, I was flabbergasted to see its source was about 8 lines of HTML, one of which linked to a megabyte's worth of entirely inscrutable and indecipherable JavaScript, leaving me with no means of working out how they did what they dun.

Coupled with crap like Instagram, with its locked-in walled-garden content you essentially need to sign up for to view (and then, practically, install an app to update (yes, I know there are workarounds!)), we've somewhere over the last few years encouraged a large chunk of the internet to become quite obnoxious and antithetical to everything I understood it was supposed to represent.

Bah blast and humbug!


> leaving me with no means of working out how they did what they dun

I guess it doubles as a DRM mechanism preventing competitors from easily copying it. ;)


Sounds like the "competitor" made the right call then, wouldn't you say?


Ha, I suppose you could look at it that way. I was simply trying to work out what presumably off-the-shelf tech they used.

When I say 'competitor' I more meant 'equivalent'. It's a broad and unremarkable field, and the 'competition', as such, is actually fairly magnanimous.


> If you are building a webpage (a web document) its probably the goal of the user to go there and read some information and move on...

I'm surprised with how vehemently I disagree with what you're suggesting. A big chunk of my web development experience was in content marketing, specifically in the top 5 most contested ad keyword space.

This is where 1) the most money is being spent on online content, 2) performance most matters, and 3) more time on site directly translates to more conversions.

People browsing the web for legal matters, car purchases and rehab aren't looking for small bits of information and to move on, they're looking for information and reassurance to help them make major decisions. Many of these users will spend hours on your site reading.

AMP is bad, like directly-harmful bad, for all of these businesses. So is Google's direction towards only giving you their top recommended result.


"Meet the new boss, same as the old boss."

Google is the new Microsoft. Embrace, extend, and extinguish is alive and well, it just found a new home[1].

[1] https://en.wikipedia.org/wiki/Embrace,_extend,_and_extinguis...


I share this view.

"Sorry this website only works in Chrome"

"Google Hangouts / Google Talk will no longer support XMPP / Jabber"

"Try our not-so-open, source-available mobile OS." We'll follow you across the web and in real life, because our "open source" platforms are published as proprietary.

"Try our proprietary browser, for which we release the code but heavily modify it before it gets to you"

At least with Microsoft's open source projects, what they share is what gets published: no crazy surprises, telemetry aside for the .NET tooling (which IMHO is seriously minuscule compared to what it could be).


I have to agree; Google gives new meaning to "open source". Open source as in "we share some source, but the actual meat is in a closed proprietary codebase that you cannot inspect". I still remember the AOSP maintainer (a Google employee) leaving because he couldn't do his work properly due to Google's politics w.r.t. Android.

Or was that the open source that RMS warned us about?


Free as in freedom and not free beer, indeed. I mean, I don't mind some proprietary uses of open source (how else will you fund the project!), but there should be limits to how you do this. I prefer extra paid features / paid support. Google could have made Android a proper Linux distro and charged manufacturers for OS package releases / updates or something of the sort, something reasonable. I wish Android were more Linux-like, where I could install whatever I want without being forced into anything specific, and replace the UI completely as I desire.


Android is really a good image of what happens when Google designs something from top to bottom according to what they want. There's a tiny piece of it open-sourced, the bare minimum, and everything else around it is closed source. They made sure that the whole ecosystem is complicated and convoluted enough that you can't easily modify it for yourself.

We could have had a Linux distribution tailored for mobile, instead we get that mess that is Android.


To put it less cryptically, the AOSP maintainer left because AOSP was capable of running on exactly zero Android devices, including Google's own flagship devices.


> Or was that the open source that RMS warned us about?

Exactly


And VS Code, which is actually not free software when downloaded from the official website, if I understand things correctly.

Here is the license: https://code.visualstudio.com/License/

And yes, VSCode from this website includes tracking.

Though they clearly advertise it as "Free. Open source.", which seems wrong to me. In my opinion, it's based on open source code at best.


Really? You’re going to complain about an Electron app like VSCode? AMP is way worse, and Microsoft has made a lot of strides in OSS, for example by acquiring GitHub.


I was just replying to the last paragraph. I haven't said anything else and please be assured I'm not going to complain.

I'm not attacking Microsoft and I'm not endorsing Google. I actually agree with the rest of the parent's comment and with AMP being bad. I do think subtly selling anything as open source when it's not is shady, and I won't comment on which is worse; I'm not in a position to compare. And VS Code being an Electron app isn't relevant to me.

I also won't comment on Microsoft buying GitHub; I'm not a fortune teller. As far as I'm concerned, I chose not to host my stuff on GitHub from the beginning anyway, since the platform is not open source. I won't rely on any feature of GitLab that is not in the Community Edition, for the same reasons. I wish GitLab sold support and hosting instead of proprietary software.


[flagged]


Troll alert much?

See:

https://github.com/Microsoft

https://www.microsoft.com/net

http://www.typescriptlang.org/

https://github.com/Reactive-Extensions/RxJS

https://github.com/Microsoft/BashOnWindows

https://github.com/Microsoft/ChakraCore

and so on... not to mention their own contributions elsewhere including Electron itself. I know some of their Azure Cloud solutions are open source as well.


>"Sorry this website only works in Chrome"

Because the Chrome team is consistently the fastest at supporting newly introduced standards.

A more accurate reading would be: "Sorry, this website only works in Chrome right now. Try Firefox in two weeks, and Edge and Safari in about six months".


Even so, some people have changed their Firefox user agent and found that the web page still works under Firefox. Chrome isn't the only browser doing things to extend the web; Firefox and Mozilla work on new features too. Chrome didn't come up with WebAssembly, Mozilla developers did. Edge isn't as far behind as it once was either; I'm pretty sure they called it Edge for "Bleeding Edge", and you can see the source for their JavaScript engine on GitHub.


>Chrome isn't the only browser doing things to extend the web. Firefox and Mozilla work on new features too.

I don't disagree at all. It's actually extremely respectable how much collaboration is involved in the process.

>Even though some people have changed their Firefox user agent and found that the web page still works under Firefox.

I'd blame site owners for that one.

>Edge isn't as behind as it once was either

Mixed thoughts. Edge is 100% better than IE used to be, but it still largely lags behind the other vendors.


> I'd blame site owners for that one.

I installed a Firefox plugin to support my YubiKey for two factor authentication. It worked fine for github, but Google login just locked me out saying Firefox doesn't support it.

So yeah, I blame the website owners, which means Google.


> Because the Chrome team is consistently the fastest at supporting newly introduced standards.

Yeah, that's an interesting way to see it when Google/Chrome dominates/finances the whole WHATWG web "standardization" process, directly after "what WebKit is doing". Not saying there isn't bona fide work, but there are many ways to derail meaningful standardization efforts other than by not playing ball: for example, by making it so fscking complicated and prohibitively expensive that nobody can compete. HTML is over 25 years old; there should be no need to rush features all the time. If that's happening anyway (WHATWG's "living standard" nonsense), then something is seriously wrong with the scope and organization of WHATWG's work.


For many publishers (I work for one), AMP pages aren't any faster than their pages from a cold start. The advantage AMP has is that it doesn't load from a cold start; most visitors to AMP pages come from search where google starts pre-loading & pre-rendering the page with service workers. There is no real way for a publisher to do that on their own. Discussed here in the context of the difficulties of measuring AMP performance: https://www.ampproject.org/latest/blog/measuring-amp-perform...

I agree AMP is an abomination and despise google for introducing it. They've abdicated their duty to nudge the advertising world forward.


AMP being marketed on "performance" was always a hook primarily to sway end-users to associate AMP with performance, build mindshare, and cut down on opt-outs, a strategy helped by the fact that every developer, publisher, and decision-maker is also implicitly an end-user.

In truth, once you peel away layers of marketing, AMP's purpose becomes clear [1] as an answer to Facebook's Instant Articles, which tries to chart a have-your-cake-and-eat-it-too course for Google to build consensus around lightweight payloads on the wild [2], open web, while morphing more and more of their products [3][4] to serve as windows to others' content.

It is useful for them to position it as a publishing platform, in a sense, because it furthers the ecosystem they're trying to encourage.

[1] https://news.ycombinator.com/item?id=14529691

[2] https://news.ycombinator.com/item?id=14465801

[3] https://news.ycombinator.com/item?id=14529691

[4] https://news.ycombinator.com/item?id=16367197


I hope publishers understand they're only courting their own extinction by looking to Google/AMP, and are undermining their own efforts to outlaw link previews of news articles on SERPs and aggregation pages in the EU ([1]). To those that don't: good riddance. I don't care what you have to say when you're displaying this kind of utter media incompetency and hostility towards an entire generation's work on Web standards; same with "publications" on Facebook (hello, EU state-owned broadcasters).

[1]: https://news.ycombinator.com/item?id=17260148


It's the prisoner's dilemma. If they were all not participating it would be fine. But because some are, and Google is ranking them higher-up, all of them now have to participate.


This is surely a monopolies issue, Google using its search dominance to weigh into a separate market (newspaper/magazine publishing).


AMP crossed over into monopolistic territory when Google's search engine decided to rank AMP pages better. In fact, as a user, I always found Google Assistant opening a news article as AMP annoying, because I couldn't read the comments on the article. Worse still, there was no link to the original article in AMP. So, when the search engine ranks them better, I wonder what the criteria are. Do users actually want the AMP articles instead of the original ones? Not in my case. If that's the case with everyone, then it is in no one's interest except Google's [1].

[1] https://www.politico.eu/article/google-amp-accelerated-mobil...


There's a simple remedy: use DuckDuckGo and Firefox (like all people on the civilized Web). I have yet to see a single AMP search result. How come we're being brainwashed into thinking it's a good idea to rely on search results produced by the world's largest ad network? Because Google would never send you to the pages with the most ads, would they?

Of course this doesn't address Google's potentially anti-competitive behaviour, which is a case for an antitrust investigation, not a witch hunt.


> like all people in the civilized Web

Friend, I use DDG and Firefox exclusively and still feel like this is not a useful thing to say


We were "brainwashed" by the fact that Google is the best search engine.


As always, discussion around AMP is missing anything about users. I've had friends tell me they always click AMP links first because they're so much faster. Why wouldn't users love AMP?

Also as always, discussion is void of alternate solutions.

I find it funny that so many of these threads fall back to, "If publishers would just stop stuffing megabytes of Javascript into their web pages we'd be fine!"

Isn't that what AMP is about? It's an open standard, one Javascript codebase that can be delivered once and cached.

AMP also doesn't "break normal links": https://news.ycombinator.com/item?id=13467736

If you think the future of the web --- even just reading articles --- is plain HTML and CSS, then it's no wonder you see AMP as a bad thing and not a good thing for users.

I'd love to hear alternate solutions. How can we deliver article content to users lightning-fast, and still deliver them the things they want like recommendations, sharing, image carousels, etc?
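For context on "one Javascript codebase that can be delivered once and cached": a minimal AMP document looks roughly like the sketch below (reconstructed from memory of the published AMP HTML spec, with the required boilerplate CSS elided; treat details as approximate).

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <link rel="canonical" href="https://example.com/article.html">
  <!-- The single shared runtime: one async script, served from a
       cache common to every AMP page, so it's usually already warm. -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <style amp-boilerplate>/* required boilerplate CSS elided */</style>
</head>
<body>
  <h1>Article headline</h1>
  <p>Article text goes here; custom author-written JS is not allowed.</p>
</body>
</html>
```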


Users loved AOL and Compuserve too. Doesn't mean it was good for the web as a whole, and ultimately users were better served having an open web.


I hear your argument, but my issue with it is that AMP is in fact open. They're accepting pull requests on Github[1] and other providers have already implemented AMP on their own, no Google involved[2].

HN is just bursting with NIH on this issue. I'm not seeing any substantive arguments, no alternatives offered, and no consideration for what users actually want.

This article in particular seems to be droning on about "intentions", "messaging", and "long-term solutions".

To all of HN: if you don't like AMP, shut up and code :)

[1] https://github.com/ampproject/amphtml

[2] https://blog.cloudflare.com/accelerated-mobile/


I would have zero issues with AMP if it wasn't hosted on Google.com.


AMP can be and is hosted from multiple locations. Anyone can launch an AMP cache.


The issues with AMP aren't about whether it's open source. The issues with AMP are that:

1) You either use AMP or you lose out massively in Google search rank (as in the links that appear at the top of the page, regardless of what semantic games you want to play in identifying them).

2) By using AMP, you hand over all your search traffic to Google.

No amount of "shut up and code" is going to fix this, because it is a business decision by Google, not a technical decision that they are going to entertain arguments about.

The open source aspect is a total smokescreen.


As mentioned multiple times elsewhere in this thread, AMP doesn’t have that much going for it in terms of performance, that’s more of a marketing play.

It’s not that hard to build a page using any of a handful of modern frameworks that can outperform it. The trump card is Google’s preloading in search results, which AMP makes easy but which wouldn’t be impossible without it. If they stopped enforcing their own CDN/caching, wrapping the pages with their own UI, and rewriting links, you could argue it’s a good thing: a standard that anyone could adopt without committing to one company. Until then, it’s just a path to vendor lock-in.


AMP is so obviously anti-competitive. The page rank algorithm should treat pages that load equally fast as equals; as it stands, a plain HTML page with no JS or ads would rank lower than a fancy AMP page, because Google can't track people on it.


>obvious anti competitive

AMP pages are only perceived as 'fast' when you access them through Google's search. When you do that, all the AMP links' assets are pre-loaded in the background, so it seems fast when you click through. But AMP pages themselves are just the same speed as anything else, or slower, when accessed without Google's monopoly-position pre-load.


Google claims they will prioritize any fast pages, not only AMP pages. https://mashable.com/2018/01/17/google-search-speed-update-r...


I have not looked into AMP very thoroughly, but doesn't AMP simply make your webpage lean and mean? That can be done without AMP, simply with some common sense.


> The same process is almost certain to occur with React—it’s a good bet there will be a standardised equivalent to the virtual DOM at some point.

That makes no sense...


To play the devil's advocate: it took a decade for jQuery to "make itself obsolete". AMP is still young.


Stop crying and use duckduckgo, problem solved.


This does not solve the problem that your webpage will be buried under a ton of AMPified results when most of the people search for it.


So DuckDuckGo needs to prioritize non-AMP pages?


How does that solve the problem that 99% of your (potential) website visitors do not use DuckDuckGo?


So people want to use Google's platform but don't want to use their technology. This doesn't make any sense.


Why doesn't it make sense? People rely on being in Google results for better or worse, that doesn't mean they have to like everything Google does or demands to shape those results.


"People rely on being in Google results for better or worse." Use DuckDuckGo if they don't like it. Google doesn't owe anybody anything. It's like wanting to use Facebook's resources but not wanting to use their SDK and ToS. The entire conversation about AMP is pathetic. Don't like it? Don't use it.


Again, what does the site owner using DDG change about their visitors?

And "Google doesn't owe anybody anything." is just an incredibly lazy argument and not a good reason for why people shouldn't criticize what they do.


So the way it works... You create a site, submit it to a search engine. You do this so your site can be found. Why is this so hard for you?


No, the way it works is that if you want people to come to your site from Google, you have to spend time making a version in their own new web format, which they control, because of their claims that it is faster (it isn't; they preload all AMP pages as soon as you see the result in the search).



