Bang on. Saying that "there isn't anything out there anymore" is missing the point: Google's algorithms created this situation, intentionally or not. Before Google, people linked to what they wanted and communities would naturally cluster around topics of interest. Google came in and made reputation into a currency, which effectively destroyed all these communities by incentivizing selfishness.
"When a measure becomes a target, it ceases to be a good measure"
-- Goodhart's Law.
Google's algorithms didn't create this situation; people chasing high Google rankings did. Had Google used completely different algorithms yet still become equally dominant, people would still have poured their hearts and souls into getting higher rankings.
Basically, an application of the tragedy of the commons. Or: "why we can't have nice things".
But that's taking for granted that Google would have become dominant. Perhaps if they hadn't chosen the algorithm they did then they wouldn't have been as overwhelmingly successful. Instead, I could imagine a world in which there are multiple search engines and none of them are all that good. In fact, that's the world I remember from before Google existed. Search was bad but communities were strong and life was good.
Then Google came along and we all found it a lot more convenient than the bad search engines we were used to. And of course, we all know where that led. In some sense, Google built an 8-lane superhighway and bypassed all the small towns.
We all traded away paradise in exchange for convenience. Now we have neither.
On the glass-half-full side of this: we're getting those communities again! Here on HN, on reddit, for certain topics on various social media (there are pearls there too), on Mastodon, various blog authors, Ars Technica, Quanta, etc. [1]
It's just fragmented - i.e., catering to a specific group. Because if it isn't, it's awesome for 5 minutes and then monetization rot sets in.
[1] None of these works for everyone; conversely, each of them is seen as a great thing by some people, who prefer it over the others for its quality.
The trouble is, you are no longer "surfing" the Web; you are digging through your RSS feeds, links to interesting sites, fediverse subscriptions, etc. That's not good UX, period.
>Google's algorithms didn't create this situation; people chasing high Google rankings did.
You're technically right. You'd be more right if you said people chased the highest spots on search engines for the widest breadth of queries.
If there were an implicit alphabetical ordering of search results, I guarantee you'd end up with a bias toward A's, Z's, or whatever else, from people trying to get top spots.
How is transparency worse than the smoke screen that we have today? For example, healthy, good websites could rank according to good content, good optimization, a variety of multimedia content, decent design and UI, etc. You can't have too many good things and qualities. That would be like writing a book that's too good or making a product that's too good.
Because the rank algorithms are subjective heuristics, not absolute metrics. All rank algorithms always have been. It started with the link metrics, then people started gaming that. It's been a signal/noise war ever since.
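For the curious, the original link metric was roughly a PageRank-style power iteration. Here's a toy sketch of that idea (the graph, damping factor, and iteration count are made up for illustration; this is obviously not Google's actual ranking code):

    # Toy PageRank-style power iteration over a tiny link graph.
    # Each page's score is redistributed along its outbound links every round.
    def pagerank(links, damping=0.85, iterations=50):
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                if not outgoing:  # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / len(pages)
                else:
                    for target in outgoing:
                        new_rank[target] += damping * rank[page] / len(outgoing)
            rank = new_rank
        return rank

    # Pages with more inbound links score higher -- exactly the signal
    # that link farms were built to fake.
    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    print(pagerank(links))

Once rank flows along links like this, every extra inbound link is worth money, and the gaming follows.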
It's also dangerous to ask for the exact criteria because they are ever changing. Google et al don't want to be prescriptive about what a good site is, they want to recognize what a good one is. You make a good one, they'll figure out how to recognize it.
They can't sit down and publish "The Definitive Guide to a Good Website". That's just not their role and it will be out of date before it's published.
A big problem is that a lot of community content went behind Facebook. Instead of creating webpages or forums, people started using Facebook pages and Facebook groups. This is the main reason I have been anti-Facebook for over a decade. Not because of privacy reasons, as many are, but because I saw that Facebook would put the web behind its closed doors. Even today, some of the best reviews of any product or service are usually in enthusiast community forums. But a lot of that activity has gone behind the closed doors of Facebook and now reddit. Most of the currently thriving forums are those that predate Facebook.
That's why mechanism design [1] exists as a field of study. The whole idea of that field is to provide the proper incentives to steer the participants towards your objective. Yes, considering they will try to "game" the system however they can.
I'm pretty sure Google could do strictly better (i.e., better by all reasonable accounts) than they do now if they focused on the users' experience instead of revenue for a couple of terms.
Paul Graham says Google doesn't want to follow anybody down that road (human intervention in search). But ISTM the problem is that even though they don't, they can just throw a giant pile of money at it if they need to crush a competitor. Also, VCs will refuse to invest in anybody doing it, because Google.
I wonder if punishing presence of advertisements would filter out most pages that are SEO'd to the max and instead promote "labors of love" type pages.
This is an interesting idea because it would create a type of non- or anti-commercial SEO that could counteract the commercial one. However, Google would never do it because they sell most of the ads that would be (not) hosted on these sites.
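To make the idea concrete, here's a rough sketch of what such an anti-commercial re-ranking heuristic could look like. Everything here is hypothetical: the ad counts are assumed to come from some crawler-side ad detection, and the penalty weight is arbitrary.

    # Hypothetical re-ranking that demotes pages in proportion to detected ads.
    # 'results' is a list of (url, relevance_score, ad_count) tuples.
    def rerank(results, ad_weight=0.2):
        def adjusted(item):
            url, score, ad_count = item
            return score / (1.0 + ad_weight * ad_count)
        return sorted(results, key=adjusted, reverse=True)

    results = [
        ("labor-of-love.example", 0.70, 0),   # no ads, modest relevance
        ("seo-farm.example",      0.90, 12),  # higher raw relevance, ad-heavy
    ]
    print(rerank(results))  # the ad-free page now outranks the SEO'd one

Of course, the moment such a penalty became known, the war would just move to hiding ads from the detector.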
Wouldn't it be reasonable for Google to show how their ranking algorithms work, so that all webmasters and content creators know how to behave on the web? Right now we have a black box that's causing confusion and misdirecting websites and web users.
I don't think so. If the problem is people gaming the system, making it easier to game isn't going to improve the situation. It's not going to put good content creators on a level playing field with spammers, because good content creators simply don't care as much as spammers about search engine gaming.
But people are already gaming the system, and on any search from product reviews to code snippets, I see SEO-optimized spam populating the top results. Good content creators don't have the time or the technical inclination to reverse-engineer the ranking algorithm (or to brute-force it by creating tens of thousands of sites and seeing what sticks in a giant multivariate test).
Knowing the actual rules might give them a fighting chance, since the bad guys already know these rules anyway.
Some would even say it killed the web by centralizing all the content in the hands of a few [0].
Which is the direct consequence of everybody optimizing to better show up on Google/Facebook/Amazon/Microsoft and ultimately even migrating all their hosting to these companies.
So everyone worried about SEO became afraid to link to anything except:
1) Their own website
2) High-reputation sites like NYTimes, etc.
It's sad. Makes it harder to navigate the web.