It's pretty easy to make usable web sites with regular HTML, and it's almost certainly possible to make just as shitty AMP sites. However, AMP solved the political/bureaucratic problem by providing a strong carrot to make a non-shitty version of the web site, and made it easier to make a non-shitty one than a shitty one.
The publishers already have to spend time remaking their web site, the goal they're given is "keep it simple and lightweight", and doing shitty things is made hard/expensive. So they grumble and do it, getting it mostly right. (Although they seem to have realized the "potential" and started shittifying the AMP pages too.)
They could have a normal experience that is very similar. Yet, they still serve the non-AMP, ad- and tracker-laden main page that takes 20x as long to load, jumps around while doing so, and tries to blast you with an autoplaying video that remains stuck to your screen while you scroll.
I don't get the hate for AMP - it somehow succeeded in the impossible quest to get publishers to make less shitty web sites, and it does improve user experience by preloading the pages.
It's just like with the "acceptable ads" program - some web sites participating in it serve significantly less shitty ads to people who only allow "acceptable ads", while serving a worse version _of the same ad_ to everyone else.
So maybe, to answer your question: Build a standard (or extend the AMP standard) that works just like this, but is enforced by the browser. Much harder to do and unlikely to get enough momentum/incentive to actually succeed, though.
I hate privacy abuse as much as anyone, but I thought it was a standard for less insane HTML pages that any “renderer” could show. Didnt cloudflare etc do this?
Basically can someone ELI5 where the actual hate comes from (aside from being annoyed because you use google search)? Because I am all for news sites being anything other than what we get today (20mb of js ads and tracking crapware etc)
edited for clarity in my questions
Is that proven? I highly doubt it.
"The Top stories carousel requires that your content be published in AMP. For more information, see AMP with structured data."
I talked about this and more here: https://unlikekinds.com/article/google-amp-page-speed
I would stress that most publishers are focused on the user experience now, unlike a few years back (but yeah, there is still no shortage of bad news sites).
It's a little bit of a bitter pill that Google, responsible for all the ads and ultimately the shitty ad experience, invents a new format to restrict publishers from implementing the shitty ad experience that Google allows, and then gets the publishers to build it.
"AMP for Ads" makes sense. AMP in general does not.
That being said, it is the fault of these publishers that they underinvest in developers. It's like not investing in the printing press.
Bad actors are usually flagged manually, and there are typically a ton of genuine mistakes that happen (someone loads unoptimized images, etc.)
An AMP-like framework would catch that.
uMatrix and NoScript do a rather coarse-grained version of this, as you can configure them to block JS by default (and whitelist some domains, optionally default-allowing JS served from the same domain as the page, or one of its subdomains).
But when you tell someone their website doesn't work without JS, most of the time they either ignore it or tell you it's 2019 and you shouldn't disable JS.
How do these sites survive when it takes 30 seconds to load the page (which freezes your browser) and then another 20 seconds to find the content hidden between creepy ads from ecommerce sites you recently visited?
The last time I remember the internet being this horrible was the search engines back in 1999 and that was quickly fixed when Google came along.
>uMatrix and NoScript do a rather coarse-grained version of this
What about something like a window.MAX_ALLOWED_JS_STATEMENTS value, set to e.g. one million? Users could set this value in settings for each domain/subdomain, and developers could decide based on that value what should be executed and what not.
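As a sketch of the idea: no such `window.MAX_ALLOWED_JS_STATEMENTS` property exists in any real browser, but if one did, a page script could branch on the user-set budget roughly like this (the tier names and thresholds are made up for illustration):

```javascript
// Hypothetical: browsers do not expose a JS-statement budget today.
// This only shows how a site might degrade gracefully if one existed.
function pickFeatureTier(budget) {
  // budget: user-configured statement allowance for this domain
  if (budget >= 1_000_000) return "full";   // rich interactive experience
  if (budget >= 100_000) return "lite";     // basic interactivity only
  return "static";                          // plain HTML, skip the scripts
}

// Read the (hypothetical) browser-exposed value, defaulting to zero.
const budget =
  (typeof window !== "undefined" && window.MAX_ALLOWED_JS_STATEMENTS) || 0;
const tier = pickFeatureTier(budget);
```

The interesting design question is enforcement: unless the browser actually counts and halts execution, this is just an honor system, which is exactly the problem AMP's restricted component set tries to route around.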
I haven't had a single instance where I was happy with AMP. The very best scenario I've run into is that it's neutral, but most of the time it's an annoyance, because it forces me to constantly use desktop mode on my phone so I can actually get usable search results.
AMP is utter garbage. It's so bad that I switched search engines purely so I wouldn't get AMP results. I don't want a competing standard, because I want my search results to lead to the REGULAR WEBPAGE. You know, the one that actually works, unlike most AMP pages.
We have a wonderful standard for documents - simple HTML.
But Google are pushing their ugly AMP solution instead.
I think this shows one clear area Google is evil: they are showing the colour of their money: advertising is far more important than content.
Clearly users also choose rich applications, but that is not what I was talking about here. I was talking about text articles.
You sound as if "most people", aka non-technical users, are like kids who like shiny things and cannot appreciate clean and elegant design.
You create that audience for yourself, that way.
This targets more of the performance side of problem than the tracking & privacy side, but it does assign cost and works as signaling mechanism back to publishers when they do something that harms the user experience.
You realize that for such a standard to succeed it would have to be implemented by Google Chrome, right? And they are definitely NOT going to do that because it's against their best interest, so no, I don't think that is a solution; the only slight chance would be to make AMP illegal so Google is forced to use some kind of alternative protocol.
I've never been blasted with an autoplaying video that stays stuck, on my phone. Is this even possible?
Here's a very recent link that does it on both my phone and computer: https://www.independent.co.uk/news/uk/politics/brexit-no-dea...
Though I live in a rich country, I used to have "only" 500MB/month of mobile data. I could open many small text/image websites, but open one of those autoplaying videos (which sometimes serve even high definition on small screens) and my mobile data for multiple days is gone in seconds. I can only imagine how worse it must be for people in poor countries.
Ironically, much better because they tend to have a much better mobile infrastructure. For example, in India someone pays "less than $3 a month for unlimited free calls. With that he gets 42GB of 4G data, at 1.5GB a day, which he uses for viewing videos and for WhatsApp calls to his family and friends in the state of Bihar"
Even if you multiply that by 4 to account for purchasing power parity (with the US), that's dirt cheap for such a plan.
Challenge for WordPress: make it work faster without AMP than with AMP.
Should be table stakes for sites nowadays.
Been competing against google since 2005, I’ve seen a thing or two.
AMP isn't popular because users love it. It's popular because Google shows AMP-only results for the first screen (often two or more screens) of search results on their site. You can make as beautiful an experience as you want, but users won't see it when they search, so they won't ever get to experience it.
Users search for news/article content with Google because the experience is qualitatively better than many of the alternatives. If there was a news website that showed trending and search-based results (and there are several), the presentation of the article after SERP clickthru would still need to be comparable.
Facebook did a more AMP-like thing of migrating the article content onto Facebook, providing the least offensive (if you are already using Facebook) presentation.
News websites in 2019 almost all auto-play video upon load and many times the video is unrelated to the article content. They load clickbait partner links in the footer. They still have dozens of ad networks and tracking beacons sprinkled on every page. I can't reliably trust my ad blockers and tracking beacon blockers or Reader Mode to work all of the time.
As much as I would like competition with Google+AMP and Facebook+articles, the current state of the web is gross and these two clean it up a little.
The problem is all the people and big websites that don't seem to care at all about their own content pushing amp as some sort of great solution for website speed. Sad, but it seems to have died down.
Right now, nine of the first ten results for "Walmart shooting" in stock Chrome incognito mode on Android are AMP articles.
Do that search, if you're not seeing those AMP articles then you're doing something very unusual with your setup.
So major news outlets are still on the AMP bandwagon? And they complain about revenue...
Reddit uses AMP. If you search via Google for a Reddit thread, you'll get an AMP link. I sometimes see Redditors posting AMP links to other Reddit threads, presumably because Google search is more effective than Reddit's own search.
> or if it's not time sensitive, it will be in the news 2-3 weeks later, at which point it's probably worth reading
Google shows AMP for all search results, not just the current-news carousel. If I search for 'Sandy Hook shooting' on my phone, I see a Wikipedia onebox, the first search result which is Wikipedia (no AMP), a handful of news articles from the last few weeks, and then the remainder of the actual search results: a Business Insider article from December 2018 (AMP), an ABC article from 2014 (AMP), a Reuters article from a few weeks ago (AMP), an NBC article from a few weeks ago (AMP), a CBS News article from 2017 (AMP), a Britannica article (no AMP), and two YouTube videos (no AMP).
I tried that search, none are AMP for me. Tried "walmart shooting", none are AMP, even CNN live updates. Android/Firefox.
Tried Chrome, not signed into Google, no AdBlock, still no AMP.
Strange... Maybe it's Internet speed related?
They don't have a choice. If they aren't then Google won't show them, and they won't get any traffic.
Since then I've only seen two more. All three times I've clicked away almost immediately.
I think the reason I don’t see them much is because I don’t use google for search often, and I never use google news.
I'd expect the Safari feature to work quite similar.
I really don't think this is the reason Google has spent millions on making this.
A more realistic explanation is that this situation gave someone the idea, and management accepted it because it would give them yet another way to corner the market.
There's a real Stockholm syndrome prevalent here. Is the request so unreasonable?
And it's even worse. I browse with NoScript, no JS by default. Often I'm presented with a blank web page. I simply do View -> Page Style -> No Style in Firefox and usually I get a perfectly readable page.
Which means that countless sites go out of their way to be hostile to people not using JS. That's now acceptable?
Basically I get more done faster with google search, however much I hate the goog panopticon. I’ve also switched to Apple News and Maps so I’m very much trying to dump goog, but search is where they still really excel.
I do wish I’d come back to Firefox sooner though, absolutely love it post quantum.
I switched to google and instantly grabbed some ones from 2019
This happens way too often, and I like to think not because of personalized results.
And when I do use !g, the results (especially tech-related ones) are not that good! Maybe because Google doesn't have enough info to put me in the correct "bubble"?
DDG does the job for me.
Just to be contrarian... maybe there's some useful signal there about why you might prefer technologies without so much churn.
If anyone here is at Adobe, please for the love of god hammer it into the heads of your PMs to do three or four sprints dedicated only to resolving bugs and crashes.
Stop using Google, everyone! It's not necessary and you'll be tracked far less. Plus you're helping to save what is left of the web from one of the most odious attacks on it ever.
Last I checked, most news sites still throw dozens of ad networks and tracking beacons on their pages. If they use Google Analytics, Google Fonts, Google Maps, or other Google web APIs, Google still tracks (or has the capability to) many of your movements across the web.
I'm not against making incremental changes to my own life, but there's no clear value in doing this yet.
I'm not personally as worried about being tracked by Google APIs, but UMatrix will still block the majority of them -- even Youtube embeds and many Google fonts. That means you get to choose which sites are allowed to load those APIs, which is a huge improvement over just giving Google access to everything.
Honestly, while fingerprinting is still a huge concern, blocking the majority of Google's tracking on 3rd-party sites is pretty easy, since Google tracking is served from consistent domains.
The remaining pain points are stuff like reCaptcha and AMP, which are obviously a big problem, but are not used universally enough to erase the privacy gains you'll get by switching off of Gmail and Google search.
If you're already running a (good) adblocker, you will genuinely be tracked less if you switch off of Google. There are ways to mess that up and erase those gains (cough running stock Android with default settings cough), but it's very feasible for most tech-literate people to reduce Google tracking right now.
There is lots more that could be done but small, incremental changes do add up over time.
I’ve had the same experience. I’ve wondered what the cause of using !g less is.
Is it because I'm giving google less and less of a diet, so its ability to cater to me is eroding?
Or perhaps DDG is just truly getting better?
Or is there a subtle feedback loop between man and search engine where we learn how to search better and better given a certain tools behavior?
Would love to hear what others suspected.
I mainly use DDG, but whenever I have to go to !g, I find I get better results if I phrase it like a brain damaged lunatic who can't spell and didn't start thinking about the query until two words in. I believe it keeps Google happy, if you throw it a little bone by letting it second-guess your query. Make it feel all smart and artificially intelligent, so it gives you the good results.
In DDG I usually just enter some keywords.
Older G users also rely on it, but to a lesser extent. After a couple of weeks of appending !g when a search query doesn't match, you eventually hit the spot of "hey, they actually don't know what I'm thinking now".
When using DDG a couple of years back, I noticed that after a while, throwing in a !g became my default so I figured I might as well switch back to Google. Don't get me wrong, I'd love to love DDG but for my searches, I would simply have to revert to Google too often.
That some techies avoid Google, won't kill AMP. That does not mean I am against it, I use DDG as default engine and fall back to Google only if I tried different queries and scrolled down and still can't find it, and I recommend DDG to everyone. But this question is about killing AMP, which this won't significantly help towards unless we can get non-AMP search engine market share a lot higher than it currently is.
Google is a data-led org, where metrics are king. If AMP’s metrics go the wrong way it’ll be dropped.
Google just has really good results; it's not that they're customized for me.
It's not just academic journal articles either: their results are, by and large, less complete - it's what initially led me to do the experiment in the first place.
People go with what their “expert” friend recommends all the time. If I want to buy a car, I ask a friend who is super into cars for advice, if I want a blender I ask a foodie friend. The same effect can help promote other technologies.
Change the default search provider on every family computer you do tech support for.
That gets us up to 2 or 3%.
That + ease of doing backups and migrations on an unlocked device.
Similar setup here, but recently switched back to iOS. It's a shame that extension support is impractical because of WebKit.
I run a leaderboard of major news publishers (mostly English-language ones). It relies on WebPageTest.org and tests 60+ article pages nightly on 3G and 'Fast 3G' speeds. The claim that you cannot have a fast web page AND ads on it is a myth. Several organizations do it and do it well (DotDash dominates the board with their sites).
Before I hit API limits on WPT I was also testing against the AMP version of each page. The speeds of the regular page load and the AMP page load were often very similar. I recall that in some cases (Quartz, Guardian, NYT) regular pages loaded faster than AMP.
That aside, assuming a regular web page took 10 seconds to load (a top-10 article), you would expect the AMP page to be much faster, say down to 2 or 3 seconds of load time, to justify the effort of supporting yet another template/format and re-implementing analytics, paywalls, and ads.
Very often it was a saving of only a second or two. It all adds up, but as someone who works with resource-strapped publishers, that's not worth the resources. That's especially true when I could have spent all that time optimizing our regular pages instead of this other project.
Why do people think AMP is faster? Pre-caching by Google.
The thing is, Google (IMHO) could pre-cache regular web pages too. They don't. They don't even issue guidelines on how to make your site cache-friendly; instead they insist on this whole specification/implementation, insist on hosting it remotely, and create all sorts of barriers.
For example, here is an example conversation at a web company before AMP (and this is not really hypothetical - I had more than a few conversations like this):
Marketers: We need you to add these 623 tracking pixels from these 300 ad networks to the page.
Developers: But that will kill page performance!
Marketers: But you guys are smart, make it work!
And after AMP:
Developers: Get bent. AMP doesn't allow that, and without AMP our SEO positioning will tank.
Marketers: Oh, OK.
I'm not saying Google doesn't have perverse incentives (and with AMP, they do). But it wasn't fixed before Google and AMP is one of the counterweights to web ads + tracking beacon overload.
Well, yes, I did say that.
> in a world
This is a huge part of it. It is not unusual for an HTML page to load faster than the AMP version when loaded directly by a browser (render blocking on the third-party AMP JS bits can result in it taking longer to start rendering).
The part that often allows AMP to win is that Google caches it to their own CDN and Chrome will pre-fetch it from there. What is interesting about that is that none of that has anything directly to do with format/spec of AMP pages.
If the server is the same as the linking page, for example Google search result -> Google Cache, there is no new information transmitted. That server already knows the user did query X and it knows that the page was going to fetch cache page Y, since the query page X instructs the browser to do so.
If the server is distinct from the linking page, for example Google search result -> https://healthsite.example/, then the browser will make a request to the healthsite server without the user having clicked the result. The healthsite server will learn the IP of the user, and some information about the query from which page is loaded, all without the user ever "visiting" that site. This is a major privacy violation.
AMP pages solve this by (a) being loaded from Google's cache and (b) guaranteeing no off-cache subresources will be loaded before the page is navigated to. (a) requires Google's cache to serve the page. (b) requires that the document author gives up some control over scheduling resource loading in the prefetch.
Until recently, AMP was the only game in town that could achieve this. Chrome recently shipped with Signed Exchanges, which is a network-level technology that could allow prefetching arbitrary content from a cache. This would still involve a Google cache, but does not require coordination with the document loading. Google AMP now supports this (https://webmasters.googleblog.com/2019/04/instant-loading-am...), but it would also work with non-AMP pages.
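The leak described above is easy to picture in plain markup (the health-site domain is illustrative, per the example earlier in the thread):

```html
<!-- On a search-results page: the browser may fetch this before any
     click, so healthsite.example learns the user's IP (and a referrer
     hint) even if the user never visits the site. -->
<link rel="prefetch" href="https://healthsite.example/article">

<!-- AMP sidesteps this by prefetching from the aggregator's cache
     instead, so only Google's server sees the pre-click request: -->
<link rel="prefetch" href="https://cdn.ampproject.org/c/s/healthsite.example/article">
```

Signed Exchanges aim to give the second pattern's privacy property to arbitrary content, by letting the cache serve a copy that is cryptographically attributable to the original origin.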
1. Can serve dynamic content
2. Doesn't leak any user data or metadata to a third party before they explicitly consent (in other words, executing a search on Google shouldn't ping cnn.com, this makes things like link rel=preload not work).
3. Allows caching pages client side for "instant" loading
I've challenged a few users this way, and they always construct something virtually equivalent to amp.
I think what has sucked most publishers in is the belief that if they don't play ball and implement AMP, they'll suffer in Google's rankings.
Google has said over and over that AMP is not a ranking factor. If someone could definitively prove that it's not, I think publishers would be less likely to bow down to AMP. If someone could definitively prove that it _is_, then we'd have evidence of Google both lying and promoting their own technology.
Of course the "Top Stories" carousel _is_ an unfair advantage for those who use AMP -- more people just need to call out Google on this. It should either be renamed "Top AMP stories" or they should not prevent non-AMP stories from appearing in it.
AMP is a defense against the increasing noise about copyright violation by including snippets and caching. It lets them say "But, Your Honor, they provided an AMP version knowing that it's intended specifically for this purpose."
What I think actually needs to happen is a standard for deferred navigation where the UA can be told to load a bunch of resources and then choose one of them to actually navigate to (basically what AMP does). The problem here is that Google is (as we type here) actively coming up with horrible standards like signed exchanges so that they never have to send users away from their own domain, so I don't think they would be fans of a standardized system that killed AMP.
So content-producing organisations are increasingly strong-armed into building websites the way the platforms want.
Also, having large tracts of the web built using the same small set of severely limited components means we get dull, samey websites susceptible to the same hacks or bugs where things like interactive features are much more difficult or impossible.
I wrote about this and more here: https://unlikekinds.com/article/google-amp-page-speed
No, it is clearly bad. You seem to even agree. The stated reason for creating AMP may be good, but the technology and standard itself is bad.
It is? What do you like about it? What does it help you do more easily?
AMP is in no way necessary to build "user-first" fast loading pages.
AMP was created to allow aggregators to ensure they could cache and serve content without the user leaving the portal.
As publishers are desperate for the traffic, they have adopted AMP to make the aggregators happy.
The user is never given a choice, they just end up stuck with a cached AMP page and confused about how to reach the publisher's real website.
This "user-first" framework is really more "user-last"
"We are a community of individuals who have a significant interest in the development and health of the World Wide Web (“the Web”), and we are deeply concerned about Accelerated Mobile Pages (“AMP”), a Google project that purportedly seeks to improve the user experience of the Web."
Do so with good arguments why it's a bad move, not with "omg Google is evil".
Do the same thing as a user: if you are using a site that has AMP deployed, write them. Tell them why it's bad for them and for their users, and how this pushes you to other alternatives.
"AMP keeps users within Google’s domain and diverts traffic away from other websites for the benefit of Google."
No, it keeps users within the link aggregator's domain, whether that is Google, Bing, Baidu, or a link aggregator made by a third party. The publisher is OK with this because it makes loading from all of those link aggregators instant, just like when they publish for Apple News; but in this case, they can publish once and support multiple aggregators.
Compared to Apple News? It serves the same purpose but gives more control to publishers.
Find some way to get a charismatic US Senator spun up about AMP.
AMP pages are hosted on a google.com URI. Get some kind of controversial content up on an AMP page, and start a grassroots "OMG, Google is hosting this bad content via AMP" campaign with some tech-ignorant group. It maybe works better if AMP is fronting some other Google-controlled user-content domain.
The "promise" of AMP is fast delivery of the page and ease of use on mobile devices.
Once we (as in "content creators" or whatever) start building our sites that way, AMP will get used less and less.
One side-effect is AMP now provides a reason not to do anything about your regular crappy pages.
That is the sad part and why it needs to die - it won't stop. Google prioritizes their own system regardless of your work.
I don't think it's possible to block AMP in Safari on iOS.
As a coder, if I'm honest, money speaks louder, unfortunately.
If they pay me I'll do it, as others obviously do, but that's something we could have a hand on changing collectively.
Agreed. The company I work for is doing AMP and are getting real business value from it.
As such money does speak louder and if the users of our product are engaging with it more because of AMP, why would we go against it?
Philosophical reasons aren't enough. To give people and companies a reason to oppose AMP you'd need to frame the reasons in a way that includes business considerations.
Do you have an idea how they are monetizing the extra traffic? And how much more traffic is it? If Google is strongarming people into using AMP, isn't that fundamentally breaking net neutrality?
How would it be unstoppable?
Google isn't a government. They don't own or control the web, nor do they have the force of law or men with guns pointed at web developers, forcing them to use AMP. It's no more likely to "seize the entire web" than ActiveX or Silverlight.
Really? Name a long-tail website that doesn't do SEO for Google.
And they don't even really control the visibility of online businesses, so much as have outsized influence over it. It's possible to run a business and not be the first search result on Google, and other search engines do exist, as well as other venues for advertisement and promotion.
It's not important, 2 or even 5 more seconds of load speed is irrelevant (and you probably don't want the visitors who can't wait 5 seconds anyway) for good content, and you lose most of your control over your own stuff.
Stop playing into the hands of Google, Facebook, Microsoft, et al
Without regulation, I'm fairly convinced the big tech companies will continue to find ways to abuse their power.
For other cases you do need some scripts and stuff, so I have suggested a "widget" attribute and a <widget> element. Both <script> and <widget> elements would support the widget attribute. If recognized by the browser and enabled by the user, the element and its contents are replaced by something implementation-dependent (and not necessarily representable in normal HTML); otherwise, a <widget> element acts like <span>. This also improves speed, as the browser can skip loading a script when it has its own version, possibly a native-code version specific to the browser or an extension. It also allows better user customization: for example, if a special text-editing widget is used on the web page (rather than a normal <textbox>), it can be replaced with one that has vi or emacs key bindings, or with one that runs without animations if the user prefers.
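As a sketch, the proposed markup might look like the following. This is entirely hypothetical: no browser implements a "widget" attribute or a <widget> element, and the attribute values are invented for illustration.

```html
<!-- A script the browser may skip in favor of its own implementation: -->
<script widget="code-editor" src="/js/editor.js"></script>

<!-- A fallback element: behaves like <span> unless the browser (with
     the user's consent) swaps in a native widget, e.g. one with
     vi key bindings or with animations disabled. -->
<widget type="text-editor">Plain editable text as the fallback.</widget>
```

The appeal of the idea is that the page still degrades to ordinary HTML, while users who opt in get faster, customizable native replacements.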
The other alternative, for stuff that doesn't need HTML and HTTP, is to use Gopher. Then, no need to consider what kind of user interface is used and any other kind of accessibility; it already is!
In other words if you use a content blocker AMP can be slower than the page would be without it.
They didn't win publishers over because of their wonderful implementation; they are literally blackmailing them with deranking if they don't comply (I suppose I could use a carrot analogy, but I'm more suspicious).
There is so much evil that can happen if AMP becomes the de facto walled garden of the web. I imagine if that happens, then we'll end up needing Google's review before websites can even be reachable. Or a government could order Google to effectively shut down a bunch of sites instantly. The current version of AMP is worthy of destruction before it grows to become indestructible.
AMP is an admission that the open Web has failed. The number of trackers, tag managers, popups and other crap to "monetize" pages has gone out of control and made majority of pages painful to open (at least without some crap filter, like an adblocker — or AMP).
If you manage to restore the health of the open Web, it'll be hard to sell a walled garden instead.
*gets thrown out of the window*
If anyone were to propose this it’d probably be Bing or DDG.
But finding the right article to read may require browsing through many pages, and after dozens those seconds can add up.
Also the solution cannot be to restrict all JS and web pages. It needs to be an open and opt-in framework.
I'm one of the few people who doesn't have too much of a problem with AMP (obvious bias aside). Take this advice AMP devs who might be reading: Listen to the community, and communicate back without being condescending. A bit of humility goes a long way.
Your developer marketing over AMP is awful and you need to fix it. Not working with the tech community is causing immeasurable damage to the AMP/Google brand for the people who want to work at your company (engineers) who feel personally attacked by this.
Here’s a ridiculously frustrating example of how it ruins the web experience I ran into yesterday:
I was looking for supplements, searching google with something like “site:reddit.com best supplement site”. Google directs you to amp reddit. All good so far.
So you find a thread full of links in comments. You scroll down and click one and it takes you off reddit. Good. Now you swipe back. What happens? You lose your scroll position entirely.
In a huge reddit thread full of links you literally are re-scrolling the page over and over as you try different ones.
It completely breaks the web. From the company that literally should be the champion of the web. It’s so backwards and hostile to the platform it’s hard to fathom what they are thinking. Amp could easily achieve all its goals by not forcing that stupid frame, and by letting it be hosted anywhere (google can monitor if they are using a decent CDN based on simple heuristics).
The problem is that it is tied to Google.
We don't need to destroy it, we need to make it into an open standard that all browsers and search engines can implement for the same benefits.
That pretty much kills the "open" bit.
Perhaps the right place is to lobby the WHATWG for an explicit exception in the definition of the script tag to handle these ampproject.org URLs in an API-compatible way.
Just keep doing your own thing and let Google turn itself into a platform for idiots.
Until you can accomplish that, destroying amp will only make the internet worse.
Google gains control by finding problems and solving them their way. You can't gain that control back by dismissing those problems. if you want to reduce Google's control you have to solve a problem (that affects other people, not just you) before they do, or in a better way than they can.
Solving a problem has never been justification for a solution that brings even more problems.
For example, an totalitarian dictatorship would solve a ton of problems with the US political system today. But that doesn't make it an acceptable solution.
When your solution brings more harm than it resolves, it's better not to have it in the first place.
Also you can create an extension for yourself which rewrites the URL and opens the slow version of the same page.
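The core of such an extension is just a URL rewrite. A minimal sketch: the `https://www.google.com/amp/s/<host>/<path>` pattern below is the common Google AMP viewer URL shape, but other cache shapes exist, so treat this as a starting point rather than a complete de-AMP implementation.

```javascript
// Rewrite a Google AMP viewer URL back to the publisher's own URL.
// Non-AMP URLs are returned unchanged.
function deampUrl(url) {
  const m = url.match(/^https?:\/\/www\.google\.[^/]+\/amp\/(s\/)?(.+)$/);
  if (!m) return url; // not an AMP viewer URL: leave it alone
  const scheme = m[1] ? "https" : "http"; // "/amp/s/" marks an https origin
  return `${scheme}://${m[2]}`;
}
```

In a WebExtension this could be wired into a `chrome.webRequest.onBeforeRequest` blocking listener that returns `{ redirectUrl: deampUrl(details.url) }`, so AMP links open on the original site.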
At the end everyone is free, for the good or for the bad.
Does this require lawyers and negotiations with publishers?
edited to add links and further comments.
I didn't know what it was, so I did a quick search and found this rosy picture painted in 2016.
And here is a Register link decrying it as the worst thing ever.
It may sound like science fiction, but I think we need to replace the web browser with an intelligent agent which can "scan" content for us and only show us what is new, interesting, non-toxic, etc.
Your cure is worse than the disease.
Your solution would appear to require me to only publish static documents which are "new, interesting, non-toxic, etc," under some arbitrary and proprietary guidelines, because that's all people would (or should) be allowed to see.
No thank you. I'd rather have freedom from all of the gatekeepers of web culture, be they FAANG or contrarian hackers.
With NNTP, at least, you can scan what is new, and program other criteria yourself (it can't automatically know what is interesting, so you will have to program in your own criteria for that).