
Google begins penalising domain leasing - LewisMCYoutube
https://www.seroundtable.com/google-begins-penalizing-domain-leasing-28137.html
======
AlexMuir
I'm often searching for information on auto parts, and Google's results are quite
literally 100% webspam. Here's an example - the entire first page is the same
provider squatting domains:

[https://www.google.com/search?q=scrambler+50cc+parts+format%...](https://www.google.com/search?q=scrambler+50cc+parts+format%3Apdf&oq=scrambler+50cc+parts+format%3Apdf)

Wiring diagrams in Google images - literally every result is webspam in the
same format split across squatted domains.

[https://www.google.com/search?q=mazda+miata+wiring+diagram+s...](https://www.google.com/search?q=mazda+miata+wiring+diagram+stereo+pin&tbm=isch&source=univ&sa=X&ved=2ahUKEwjrgYyl8KrkAhVso4sKHXwfBtUQsAR6BAgJEAE&biw=1440&bih=728)

These aren't even particularly niche searches, nor is it hard to detect that
all these sites are the same.

~~~
gregschlom
Curious: what did you expect from a query such as "scrambler 50cc parts
format:pdf"? Why the PDF filter?

You expect that some companies publish a parts catalog in PDF format?

The query just seems odd to me, so the garbage results aren't too
surprising.

~~~
aspaceman
Actually, about two years ago this is precisely the query I would have chosen
too.

You could always find a parts catalog, service manual, that sort of thing
scanned and posted online. PDF (used to) select for the scanned documents. My
main use of Google used to be finding service manuals for autos and
electronics.

Nowadays the spam just pretends to be a PDF, because we used to always search
for them.

------
ohashi
Now could they penalize all the paid links that are 'sponsors'?

Apache Foundation and many others are even in on it. They specifically say
they are selling a do-follow link.

Look at the companies at the bottom buying them:
[https://www.apache.org/foundation/thanks.html](https://www.apache.org/foundation/thanks.html)

LeoVegas Indian Online Casino Online Holland Casino Start a Blog by Ryan
Robinson The Best VPN Top10VPN PureVPN HostChecka.com HostingAdvice.com Web
Hosting Secret Revealed

It's just paid links. Apache isn't the only one doing it; a lot of projects
are. I'm glad they're getting money, but these are paid links, sold pretty
openly, on super highly ranked pages.

~~~
namirez
Is this problematic because of the links? FOSS is funded by donations, and
it's common for non-profits to acknowledge their sponsors. For example, the
EFF does the same thing [1], while the FSF displays only the logos, with no
links [2]. Are you advocating for the FSF approach?

[1] [https://www.eff.org/thanks](https://www.eff.org/thanks)

[2] [https://www.fsf.org/patrons](https://www.fsf.org/patrons)

~~~
troydavis
> FSF displays only the logos with no links [2].

That's why FSF's listed sponsors actually care about sponsoring the FSF. The
ASF "sponsors" pasted by the first comment, and many of the listed EFF
"sponsors", aren't donating money because they care about the foundations.
They're purchasing links in order to rank higher [1]; a link just happens to
require a donation.

[1]: EFF uses rel=nofollow on all links to sponsors, so a listing on this page
probably doesn't actually do what the buyer intended. Google covers this in
[https://support.google.com/webmasters/answer/66356?hl=en](https://support.google.com/webmasters/answer/66356?hl=en).
In a sense, EFF gets free money from people who think they're buying better
ranking but actually aren't. OTOH, ASF's decision to specifically sell
followed links (no rel=nofollow), which do pass PageRank, is surprising.
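For anyone unfamiliar with what that distinction looks like in markup, here's a minimal sketch (Python stdlib only; the sponsor URLs are made up) that scans a page for anchor tags and reports which ones carry rel=nofollow, i.e. which ones would not pass PageRank:

```python
from html.parser import HTMLParser

class SponsorLinkChecker(HTMLParser):
    """Collect every <a href> and whether it carries rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, is_nofollow) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel_tokens = (attrs.get("rel") or "").split()
        self.links.append((attrs.get("href"), "nofollow" in rel_tokens))

# Hypothetical sponsor page: one nofollow link, one plain (followed) link.
page = """
<a href="https://sponsor-a.example/" rel="nofollow">Sponsor A</a>
<a href="https://sponsor-b.example/">Sponsor B</a>
"""

checker = SponsorLinkChecker()
checker.feed(page)
for href, is_nofollow in checker.links:
    print(href, "passes no PageRank" if is_nofollow else "passes PageRank")
```

Sponsor A's listing is the EFF-style acknowledgement; Sponsor B's is the kind of followed link the comment above is describing.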

------
bencollier49
How would Google differentiate between a subdomain or folder that I use on my
website for a different purpose and a leased subdomain? And given that, why
are they describing this as "penalisation" rather than just "treating the
subpart as its own entity"?

~~~
turc1656
It's possible that the description Google used is a bit misleading. They
might actually be doing what you said: splitting out the subdomains as their
own entities, with the natural effect of a separate ranking. That makes it
look like sites got penalized for using a subdomain, when really they were
just reclassified.

~~~
pizzapill
I had a brief stint in SEO in 2012, and I'm pretty sure Google treated
subdomains as their own entities back then. I'd be surprised if this has
changed in the meantime.

~~~
Implicated
I spent a lot of time in the blackhat SEO world prior to 2014 or so... I can
assure you that subdomains reaped a bountiful reward from their association
with the main domain.

They certainly weren't equal, but they got more than a little help
(trust/authority/ranking - call it whatever you want) from their parent
domain.

It got to the point that this was the case:

If I bought a brand-new, never-used .com, built 500 pages of spammy content,
and linked those pages from thousands of other super-spammy pages... that
domain would be de-indexed within two weeks, and the pages never really
ranked or got organic traffic worth the cost of the domain name.

If I built those same pages on a subdomain and linked them in the exact same
manner, but the parent domain was very strong, the pages would rank and
traffic would flow steadily, indefinitely (as long as the parent domain's
footprint/linking was large enough to absorb whatever penalty my spammy
pages brought).

It's still happening today: people buy up established domains with good link
profiles, go to archive.org and download the site's old content, re-host it,
and keep it alive. Then they build spammy subdomains on the parent domain,
and profit.

~~~
pizzapill
I read that in multiple docs back then, but since you observed otherwise it
could've been false. My rationale was that many sites have completely
independent subdomains, or technical resources like assets served from
subdomains. Buying established domains and scraping old content doesn't rely
on subdomains, though: affiliate links, ads, or your own product can be
placed on the root domain. I don't doubt you, and I think I'll try it
sometime. Old-school fun.

------
chiefalchemist
Penalise is the wrong concept here. What The Google is doing is getting
smarter, and that intelligence is being applied. Call it a correction if you
need a name for it.

Long story short, Google just wants to be human [1], but it wants to do that
automatically and at scale. It seeks to reflect what a human would approve or
reject. If you look at the history of SEO and the search algorithm, this is
the nature of the progression. Obviously.

These sites aren't being penalized. Hiding in a blind spot and then being
discovered is not a penalty. It's simply reality; a reckoning; a day that
only the naive would believe was not coming.

[1] Actually, Google wants to be you. The attempt to personalize search is
Google wanting to recommend to you as if you could recommend to yourself.

------
chatmasta
I'm confused. Does this affect the ranking of sites like alice.github.io or
bob.netlify.com? Does it penalize github.io and netlify.com? Both?

~~~
pluma
I think github.io is explicitly registered as a shared second level domain
(similar to co.uk), which also ensures browsers don't allow cookies spanning
multiple subdomains.

~~~
chatmasta
Registered with whom? How does one register their website like that?

~~~
tialaramex
The Public Suffix List.

There are two halves to the list. The first ("ICANN") half is suffixes that
work like a TLD from the perspective of the DNS registry. Nominet handles
registration in .co.uk, .org.uk, or .net.uk the same way another country
might choose to handle its whole TLD.

The rest are names whose registered owners do the same thing as a registry
for the public, or some subset of the public: Blogger, for example, or a
cheap bulk hosting site where you don't pay for a name.

PSL listing has a variety of effects. Don't do it hoping to achieve one of
them and then be disappointed by the others. Do it because you have a real
public suffix.

For example, many browsers won't let cookies escape a suffix, so
companyA.some.example and companyB.some.example can't share cookies if
some.example is on the PSL. Let's Encrypt rate limits care about the PSL, so
companyA.some.example certs wouldn't share a quota with
companyB.some.example. DMARC won't work on a public suffix, but HSTS
preloading does.
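All of those behaviours hinge on the same suffix-matching step. A toy sketch in Python (the three hardcoded suffixes are a stand-in; real code should load the full list from publicsuffix.org):

```python
# Stand-in for the real Public Suffix List; real code should fetch and
# parse https://publicsuffix.org/list/public_suffix_list.dat instead.
PUBLIC_SUFFIXES = {"com", "co.uk", "github.io"}

def registrable_domain(hostname):
    """Return the public suffix plus one label ("registrable domain"),
    or None if the hostname is itself a public suffix or matches nothing."""
    labels = hostname.lower().split(".")
    # Try the longest candidate suffix first: the whole name, then shorter tails.
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in PUBLIC_SUFFIXES:
            if i == 0:
                return None  # the hostname IS a public suffix
            return ".".join(labels[i - 1:])
    return None

print(registrable_domain("www.example.co.uk"))  # example.co.uk
print(registrable_domain("alice.github.io"))    # alice.github.io
```

Because github.io is on the list, alice.github.io is its own registrable domain, which is exactly why browsers scope cookies there and why alice's pages don't automatically inherit github.io's reputation.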

------
unreal37
There is some significant disagreement that Google is in fact doing anything
here.

There's somebody who wants them to do something, but Google themselves seem to
say subdomain leasing is OK. [1]

[1] [https://medium.com/@loish/a-response-to-danny-sullivan-
and-g...](https://medium.com/@loish/a-response-to-danny-sullivan-and-google-
about-white-label-sites-in-the-search-results-25d0dc2931f8)

------
celeritascelery
Maybe I just don’t understand the issue here. What is wrong with subdomain
leasing?

~~~
bencollier49
I assume that previously, Google's ranking algorithm would attach a score to a
domain name, and by leasing out subsections of the site, the sub-owner would
get an artificial ranking boost.

So now perhaps the Googlebot notices if a subsection of a site is radically
disconnected and treats it as an independent publication.

~~~
criddell
That's not really a penalty then.

~~~
amelius
Anything their algorithm does that reduces the rank for a webpage can be seen
as a penalty.

But it's not a penalty in the legal meaning of the word.

~~~
criddell
If a bank incorrectly deposits $1 in your account and then removes it later,
you aren't being penalized.

------
Animats
Does this include sites hosted on Google Drive, Google Spreadsheets (yes, you
can host a site on Google Spreadsheets, by putting HTML in a cell), Google
Docs, etc.?

~~~
ehsankia
But are those spreadsheets/docs indexed, and do they show up in search results?

~~~
luckylion
Yeah.

 _site:docs.google.com_ finds approximately 4m results.

~~~
ehsankia
I see, but I assume Google applies custom ranking, since I never see them show
up in any search result; they clearly aren't getting google.com's rank score.

~~~
luckylion
Absolutely. Subdomains may get _some_ extra credit, but they still need
incoming links. "Leased" subdomains and directories don't come naked, they
usually get lots of links from the main site, often site-wide in
header/navigation, sidebar or footer.

Since the publishers get paid for leasing out the subdomain _and_ linking to
it, you could argue they are essentially selling links. That's apparently not
what Google is saying, though, because then they'd penalize the sites selling
the links too. Yet only the leased-out subdomains/directories have been
affected; the rest of the sites are still ranking as they always did.

------
buboard
“Google fixes problem it created itself”

