
A technical guide to SEO
https://ma.ttias.be/technical-guide-seo/
======
jonesb6
I had an essay for a Philosophy course in college (titled Critical Thinking)
where the professor provided the following slide, aptly named "Who should I
mistrust".

Point two of four said: "Mistrust the first internet sites that pop up when
you search for debatable issues".

Hurray for SEO.

------
rcarrigan87
Solid resource, but ultimately this kind of stuff plays a very small part in
rankings. Targeting the correct keywords and building backlinks are what it's
all about. This article came out not long ago; I thought it was really good.

[http://www.siegemedia.com/strategy/increase-website-traffic](http://www.siegemedia.com/strategy/increase-website-traffic)

~~~
davemel37
With all due respect...if search engines can't crawl and index your
site...everything else you do is worthless for SEO. Get the technical part
right and you are a mile ahead of the pack.

Building backlinks? Targeting the correct keywords? That advice has been
obsolete for half a decade already.

With Hummingbird and RankBrain, Google understands context, topics, entities
and their relationships, etc...

There is only ONE sustainable SEO strategy outside of getting the technical
parts right, and that's to be an indispensable resource for your target
market...with content that cannot be found anywhere else...and that your users
cannot live without!!!

~~~
rcarrigan87
I wish that were true.

------
facepalm
I think AdSense used to warn of possibly lower earnings when switching to
HTTPS, because somehow fewer ads are available over HTTPS. Is that still the
case?

~~~
dazc
I've switched a handful of sites from HTTP to HTTPS over the past 7 months,
and the drop in AdSense earnings has been negligible.

~~~
Mojah
Same scenario here (I'm the author of the original post): moving from HTTP to
HTTPS did hurt my revenue for the first couple of months (I switched 2 years
ago), but advertisers seem to have caught up.

For tech-oriented sites, I don't see a difference in revenue between HTTP and
HTTPS anymore. In fashion or other markets, that may still be a different
story.

~~~
brianwawok
Despite it being 2016, Bing can't read an SSL sitemap served over HTTPS with
SNI. It's supposed to be fixed soon, I guess...

------
graeme
Great overview, thank you. This gave me several things to fix I hadn't been
aware of, and my site generally already does well with on-site SEO.

Actually, what do you recommend for ongoing monitoring of the things flagged
by the wget scripts? I could assign it to my programmer, but I also think Moz
or similar services handle that automatically. It might be more effective just
to outsource it to them?

My current solution has been to avoid monitoring it, which isn't optimal.

~~~
wtracy
Maybe it's not what you're looking for, but it would be easy to just write a
cron job that runs wget and emails you if the error output is non-empty.
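A minimal sketch of that idea, assuming GNU wget's log wording; the site URL and mail address shown in the comments are placeholders:

```shell
#!/bin/sh
# Email-on-errors sketch: wget's spider mode logs every URL it checks;
# we only want mail when that log mentions a failure.

# has_errors LOGFILE: exit 0 when a wget spider log reports an HTTP
# error or a broken link, non-zero otherwise.
has_errors() {
    grep -q -E 'ERROR [0-9]+|broken link' "$1"
}

# Intended cron usage (untested placeholders, commented out so the
# sketch stays self-contained):
#   LOG="$(mktemp)"
#   wget --spider -r -o "$LOG" https://example.com/
#   has_errors "$LOG" && mail -s "site check failed" you@example.com < "$LOG"
#   rm -f "$LOG"

# Demo on a canned log line, so the logic is visible without a crawl:
LOG="$(mktemp)"
printf 'Remote file does not exist -- broken link!!!\n' > "$LOG"
has_errors "$LOG" && echo "would send mail"
rm -f "$LOG"
```

Run from cron, the script is silent when the crawl is clean, so any mail at all means something broke.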

------
Maarten88
> Content is, as always, still king. It doesn't matter how technically OK your
> site is, if the content isn't up to snuff, it won't do you much good.

Shameless plug: friends of mine have been building Webtexttool [1], a tool
aimed at helping content writers (who are often unaware of SEO technicalities)
write optimized content. They are still adding features but it already works
quite well.

[1] [http://www.webtexttool.com/](http://www.webtexttool.com/)

~~~
davemel37
Shameless rebuttal: it's 2016...isn't it about time people stopped polluting
the Internet with content written for search engines?!

~~~
obstacle1
This isn't a "rebuttal" of anything. It is just an expression of your distaste
for SEO.

~~~
davemel37
I do SEO for a living! No distaste here for anything other than spam and
decades-old strategies that are a net negative for the world and the SERPs.

~~~
techdragon
I literally cannot agree with you more. I've noticed that over roughly the
last 2 years, my searching behaviour has required a constantly increasing
level of "aggressive specificity" to fend off the SEO spam: multiple quoted
segments, parentheses, OR keywords, the insanity that is the required-term
flag, all used with increasing frequency to get decent results out of the
increasingly irrelevant pile of garbage I get when I perform basic "simple
searches" for facts, references and other topic-related information while
working.

Don't get me started on the disgusting number of Stack Overflow clone sites I
frequently see rated higher than SO itself, thanks to content-spam tactics,
keyword stuffing and other grey-hat SEO methods that make those sites more
"appealing" in technical search results.

~~~
davemel37
5 years ago we could easily differentiate between thin content (e.g. Mahalo)
and quality content...but now there is so much "not quite spam, but overly
redundant" content, regurgitated listicles and other keyword-targeted content
spawned by content writers instead of subject matter experts, that I can't
seem to find the really good content through search anymore.

Carefully curated content is how I find the good information now. Search used
to be a convenience; now it adds layers of complexity to my search needs.

------
brightball
Great list. Can't think of much you're leaving out there from a technical
standpoint.

------
imaginenore
Note that forcing HTTPS may reduce your ad revenue. Google itself warns about
it:

 _"Some important things to know about the SSL-compatible ad code:
HTTPS-enabled sites require that all content on the page, including the ads,
be SSL-compliant. As such, AdSense will remove all non-SSL compliant ads from
competing in the auction on these pages. If you do decide to convert your HTTP
site to HTTPS, please be aware that because we remove non-SSL compliant ads
from the auction, thereby reducing auction pressure, ads on your HTTPS pages
might earn less than those on your HTTP pages."_

------
atirip
Google plays Go (we learned that yesterday), develops self-driving cars and
robots. And it still can't understand which piece of text on a simple website
is the header without SEO. Right?

~~~
gk1
Why make it play Go with your site when you can make it play checkers?

Nobody's claiming Google can't figure out the visual structure of a page or
determine the canonical page within a set of duplicates, but evidence[0][1]
has shown that making it easier for Google to do those things has a positive
effect.

[0] Search Engine Ranking Factors 2015: [https://moz.com/search-ranking-factors](https://moz.com/search-ranking-factors)

[1] Anecdotal evidence: Just two weeks ago one of my clients' sites had a
temporary issue with 301 redirects, causing some old URLs to return 404
instead of redirecting to new URLs, and the www version of every page to stop
redirecting to the non-www version. Search traffic nearly halved the following
week (this past week). I patched up those issues and we're already seeing
things return to normal.
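A hypothetical spot check for exactly these failure modes (the URLs are placeholders, not the client's): fetch the response headers and verify the www host answers 301 with the expected Location, rather than a 404 or no redirect at all.

```shell
#!/bin/sh
# check_301 HEADERS TARGET: exit 0 when HEADERS contain a 301 status
# line and a Location header starting with TARGET. tr strips the \r
# that real HTTP header lines carry.
check_301() {
    printf '%s\n' "$1" | tr -d '\r' | grep -q '^HTTP/[0-9.]* 301' || return 1
    printf '%s\n' "$1" | tr -d '\r' | grep -qi "^Location: $2" || return 1
}

# Intended live usage (curl -sI fetches headers only; URL is a placeholder):
#   check_301 "$(curl -sI https://www.example.com/)" "https://example.com/" \
#       || echo "www redirect is broken"

# Demo on canned headers, so the logic runs without a network call:
H='HTTP/1.1 301 Moved Permanently
Location: https://example.com/'
check_301 "$H" "https://example.com/" && echo "redirect OK"
```

Run for each old URL and for the www variant of a few pages, this turns "traffic halved, wonder why" into an alert the day the redirect breaks.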

------
known
Good one.

------
pibefision
Search Engine Optimization starter guide from Google:
[http://goo.gl/zTsclA](http://goo.gl/zTsclA)

~~~
tbirdz
That link is broken. Here is the real link:
[https://static.googleusercontent.com/media/www.google.com/en...](https://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf)

~~~
watchdogtimer
Thanks for the correct URL.

FWIW, I find the original article easier to read than this SEO guide from
Google. Google's guide has so many different ways of highlighting important
body text (different combinations of bold, blue/dark blue/red, underline, and
yellow background!) that it's difficult to read. It reminds me of amateur
desktop publishing back in the '90s.

