Point two of four said: "Mistrust the first internet sites that pop up when you search for debatable issues".
Hurray for SEO.
Building backlinks? Targeting the right keywords? That advice has been obsolete for half a decade already.
With Hummingbird and RankBrain, Google understands context, topics, entities and their relationships, and so on.
There is only ONE sustainable SEO strategy outside of getting the technical parts right, and that's to be an indispensable resource for your target market: content that cannot be found anywhere else, and that your users cannot live without!
For tech-oriented sites, I don't see a difference in revenue between HTTP and HTTPS anymore. In fashion or other markets, it may still be a different deal.
Actually, what do you recommend for ongoing monitoring of the things flagged by wget scripts? I could assign it to my programmer, but I believe Moz or similar services handle that automatically; might it be more effective just to outsource it to them?
My current solution has been not to monitor it at all, which isn't optimal.
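Not the OP, but before paying for a Moz-style service it's worth noting that the "flag the obvious problems" step is easy to script in-house. A minimal sketch, assuming you already have (url, status) pairs parsed out of a wget log; the URL list, issue labels, and `find_issues` helper are all made up for illustration:

```python
def find_issues(results):
    """results: iterable of (url, http_status) pairs from a crawl.
    Returns a list of (url, reason) for anything that needs attention."""
    issues = []
    for url, status in results:
        if status == 404:
            issues.append((url, "broken: returns 404"))
        elif status >= 500:
            issues.append((url, "server error %d" % status))
        elif url.startswith("http://www.") and status != 301:
            # www variants should 301 to the canonical non-www URL
            issues.append((url, "www variant not redirecting"))
    return issues

crawl = [
    ("http://example.com/old-page", 404),
    ("http://www.example.com/", 200),
    ("http://example.com/", 200),
]
for url, reason in find_issues(crawl):
    print(url, "->", reason)
```

Run that from cron against a nightly crawl and mail yourself the output; that alone covers a lot of what the paid tools alert on.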
Shameless plug: friends of mine have been building Webtexttool, a tool aimed at helping content writers (who are often unaware of SEO technicalities) write optimized content. They are still adding features, but it already works quite well.
Don't get me started on the disgusting number of Stack Overflow clone sites I frequently see ranked higher than SO itself, thanks to content-spam tactics, keyword stuffing, and other grey-hat SEO methods that make those sites more "appealing" in technical search results.
Carefully curated content is how I find good information now. Search used to be a convenience; these days it adds layers of complexity to finding what I need.
FWIW, I find the original article easier to read than this SEO guide from Google. Google's guide has so many different methods of highlighting important body text (different combinations of bold, blue/dark blue/red, underlined, and yellow background!) that it's difficult to read. Reminds me of amateur desktop publishing back in the '90s.
Nobody's claiming Google can't figure out the visual structure of a page or determine the canonical page within a set of duplicates, but evidence has shown that making it easier for Google to do those things has a positive effect.
Search Engine Ranking Factors 2015: https://moz.com/search-ranking-factors
Anecdotal evidence: just two weeks ago, one of my clients' sites had a temporary issue with 301 redirects, causing some old URLs to return 404 instead of redirecting to their new URLs, and the www version of every page to stop redirecting to the non-www version. Search traffic nearly halved the following week (this past week). I patched those issues, and we're already seeing things return to normal.
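For what it's worth, the www-to-non-www half of that failure mode is cheap to catch automatically. A hedged sketch (the function name and the single-hop assumption are mine, not from the article): given the first hop of a redirect, check that it is exactly one 301 to the same path on the non-www host, using only the standard library:

```python
from urllib.parse import urlsplit

def www_redirect_ok(url, status, location):
    """True only if a www URL answers with a single 301 hop
    to the same scheme and path on its non-www host."""
    if status != 301 or not location:
        return False
    src, dst = urlsplit(url), urlsplit(location)
    return (src.netloc == "www." + dst.netloc
            and src.path == dst.path
            and src.scheme == dst.scheme)

# The breakage described above (404 instead of a redirect) fails the check:
print(www_redirect_ok("http://www.example.com/a", 404, None))
# The healthy case passes:
print(www_redirect_ok("http://www.example.com/a", 301, "http://example.com/a"))
```

Feed it the status and Location header from a daily crawl of a few sample www URLs, and you'd have caught this regression the day it happened rather than after a week of halved traffic.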
"Some important things to know about the SSL-compatible ad code: HTTPS-enabled sites require that all content on the page, including the ads, be SSL-compliant. As such, AdSense will remove all non-SSL compliant ads from competing in the auction on these pages. If you do decide to convert your HTTP site to HTTPS, please be aware that because we remove non-SSL compliant ads from the auction, thereby reducing auction pressure, ads on your HTTPS pages might earn less than those on your HTTP pages."