A technical guide to SEO (ma.ttias.be)
239 points by dividual on Mar 12, 2016 | 26 comments



I had an essay for a Philosophy course in college (titled Critical Thinking) where the professor provided the following slide, aptly named "Who should I mistrust".

Point two of four said: "Mistrust the first internet sites that pop up when you search for debatable issues".

Hurray for SEO.


Solid resource, but ultimately this kind of stuff plays a very small part in rankings. Targeting the correct keywords and building backlinks are what it's all about. This article came out not long ago; I thought it was really good.

http://www.siegemedia.com/strategy/increase-website-traffic


With all due respect...if search engines can't crawl and index your site...everything else you do is worthless for SEO. Get the technical part right and you are a mile ahead of the pack.

Building backlinks? Targeting correct keywords? That advice has been obsolete for half a decade already.

With Hummingbird and RankBrain, Google understands context, topics, entities and their relationships, etc...

There is only ONE sustainable SEO strategy outside of getting the technical parts right, and that's to be an indispensable resource for your target market...with content that cannot be found anywhere else...and that your users cannot live without!


I wish that were true.


I think AdSense used to warn of possibly lower earnings when switching to HTTPS, because somehow fewer ads are available over HTTPS. Is that still the case?


I've switched a handful of sites from HTTP to HTTPS over the past 7 months, and the drop in AdSense earnings has been negligible.


Same scenario here (author of the original post): moving from HTTP to HTTPS did hurt my revenue for the first couple of months (I switched 2 years ago), but advertisers seem to have caught up.

For tech-oriented sites, I don't see a difference in revenue between HTTP and HTTPS anymore. In fashion or other markets, that may still be a different deal.


Despite it being 2016, Bing can't read an SSL sitemap served over HTTPS with SNI. It's supposed to be fixed soon, I guess...


Thanks, that sounds encouraging!


Great overview, thank you. This gave me several things to fix I hadn't been aware of, and my site generally already does well with on-site SEO.

Actually, what do you recommend for ongoing monitoring of the things flagged by the wget scripts? I could assign it to my programmer, but I think Moz or similar services handle that automatically; it might be more effective just to outsource it to them.

My current solution has been to avoid monitoring it, which isn't optimal.


Maybe it's not what you're looking for, but it would be easy to just write a cron job that runs wget and emails you if the error output is non-empty.
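For the curious, a minimal sketch of that approach. The cron schedule, filenames, addresses and log lines below are all illustrative; adapt the grep pattern to whatever errors your wget invocation actually reports. The crawl step is shown as a comment and replaced with a canned log so the filter logic is self-contained:

```shell
# Crontab entry (illustrative): run the check nightly at 06:00.
#   0 6 * * * /usr/local/bin/crawl-check.sh

# crawl-check.sh would run something like:
#   wget --spider -r -nv -o crawl.log https://example.com/
# Here we fake a crawl log so the filter step below can run on its own:
cat > crawl.log <<'EOF'
2016-03-12 06:00:01 URL:https://example.com/ [1270] -> "index.html" [1]
https://example.com/old-page:
Remote file does not exist -- broken link!!!
EOF

# Mail only when the log actually contains errors (echo stands in for mail):
errors=$(grep -c 'broken link' crawl.log)
if [ "$errors" -gt 0 ]; then
  # Real script: mail -s "crawl errors" you@example.com < crawl.log
  echo "found $errors broken link(s)"
fi
```

The `-s` / non-empty test is the whole trick: cron stays silent on clean runs, so the only mail you ever get is an actionable one.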


> Content is, as always, still king. It doesn't matter how technically OK your site is, if the content isn't up to snuff, it won't do you much good.

Shameless plug: friends of mine have been building Webtexttool [1], a tool aimed at helping content writers (who are often unaware of SEO technicalities) write optimized content. They are still adding features, but it already works quite well.

[1] http://www.webtexttool.com/


Shameless rebuttal: it's 2016...isn't it about time people stopped polluting the Internet with content written for search engines?!


This isn't a "rebuttal" of anything. It is just an expression of your distaste for SEO.


I do SEO for a living! No distaste here for anything other than spam and decades-old strategies that are a net negative for the world and the SERPs.


I cannot agree with you more. I've noticed that the bulk of my searching behaviour over roughly the last 2 years has been a constantly increasing level of "aggressive specificity" to fend off the SEO spam: multiple quoted segments, parentheses, OR keywords, the insanity that is the required-term flag, all used with increasing frequency to get decent results out of the increasingly irrelevant pile of garbage I get when I perform basic, simple searches for facts, references and other topic-related information while working.

Don't get me started on the disgusting number of Stack Overflow clone sites I frequently see rated higher than SO itself due to content-spam tactics, keyword stuffing and other grey-hat SEO methods that make the sites more "appealing" in technical search results.


5 years ago we could easily differentiate between thin content (e.g. Mahalo) and quality content, but now there is so much "not quite spam, but overly redundant" content, regurgitated listicles and other keyword-targeted content spawned by content writers instead of subject-matter experts that I can't find the really good content from search anymore.

Carefully curated content is how I find the good information now. Search used to be a convenience; now it adds layers of complexity to my search needs.


Great list. Can't think of much you're leaving out there from a technical standpoint.


Good one.


Search Engine Optimization starter guide from Google: http://goo.gl/zTsclA


That link is broken. Here is the real link: https://static.googleusercontent.com/media/www.google.com/en...


Thanks for the correct URL.

FWIW, I find the original article easier to read than this SEO guide from Google. Google's guide has so many different methods of highlighting important body text (different combinations of bold, blue/dark blue/red, underlined, and yellow background!), it's difficult to read. Reminds me of amateur desktop publishing back in the 90's.


404


Google plays Go (I learned that yesterday), develops self-driving cars and robots. And it still can't understand which piece of text on a simple website is the header without SEO. Right?


Why make it play Go with your site when you can make it play checkers?

Nobody's claiming Google can't figure out the visual structure of a page or determine the canonical page within a set of duplicates, but evidence[0][1] has shown that making it easier for Google to do those things has a positive effect.

[0] Search Engine Ranking Factors 2015: https://moz.com/search-ranking-factors

[1] Anecdotal evidence: just two weeks ago one of my clients' sites had a temporary issue with 301 redirects, causing some old URLs to return 404 instead of redirecting to the new URLs, and the www version of every page to stop redirecting to the non-www version. Search traffic nearly halved the following week (this past week). I patched up those issues and we're already seeing things return to normal.
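That kind of regression is cheap to monitor. A hedged sketch using curl's `%{http_code}` and `%{redirect_url}` write-out variables; the function name, URLs and expected targets are placeholders for whatever your own site's redirect map says:

```shell
# check_redirect URL EXPECTED -- report OK only if URL answers 301 to EXPECTED.
check_redirect() {
  url=$1; expected=$2
  status=$(curl -s -o /dev/null --max-time 10 -w '%{http_code}' "$url")
  target=$(curl -s -o /dev/null --max-time 10 -w '%{redirect_url}' "$url")
  if [ "$status" = "301" ] && [ "$target" = "$expected" ]; then
    echo "OK   $url -> $target"
  else
    echo "FAIL $url returned $status (-> $target)"
  fi
}

# e.g. verify old URLs and the www variant still 301 where they should:
# check_redirect "http://www.example.com/" "https://example.com/"
# check_redirect "https://example.com/old-page" "https://example.com/new-page"
```

Run a handful of those checks from a cron job against your known redirects and you hear about the 404s before the search-traffic graph does.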


Note that forcing HTTPS may reduce your ad revenue. Google itself warns about it:

"Some important things to know about the SSL-compatible ad code: HTTPS-enabled sites require that all content on the page, including the ads, be SSL-compliant. As such, AdSense will remove all non-SSL compliant ads from competing in the auction on these pages. If you do decide to convert your HTTP site to HTTPS, please be aware that because we remove non-SSL compliant ads from the auction, thereby reducing auction pressure, ads on your HTTPS pages might earn less than those on your HTTP pages."



