Never, ever, let marketers send newsletters and promotional e-mails from the same IPs that the websites are hosted on. A rogue employee who violates the CAN-SPAM Act can get the entire website blacklisted.
But the problem is that all someone needs to do is resolve the website's IP from the link in the email and blacklist that too, so a separate sending IP won't help much. The best protection against blacklisting is cloud hosting with many IPs, so the odds of all of them being blacklisted are slim. Amazon S3, for instance, serves from something like 100+ IPs.
That's why so many startups and agencies fail miserably with their blogs: their content is third-rate crap that no one really wants to read. They wonder why, say, Moz or Sitepoint gets all the traffic, and you want to say "have you actually looked at your own work?" given that those two companies offer useful, actionable advice and they don't. It's like they expect the magic Google fairy to take their uninteresting work and, hey presto, it suddenly appears on page 1 and gets millions of views.
Okay, it's not quite that simple. Sites with a bigger marketing budget tend to do better, as do sites and companies willing to put a lot more time into getting content out on a regular basis. And yes, some people will get temporary (or sometimes semi-permanent) advantages through dodgy things like private blog networks and paid links.
But actually trying and putting in the effort/treating the SEO side as a full-time job is really the minimum here, and many don't seem to even bother.
If you have content that people dig, Google will automatically send you more traffic.
If you have a mediocre, run-of-the-mill startup content blog, you are shit out of luck, because you are doing the same thing everyone else who failed is doing.
I don’t get what it means for the entire website to be blocked from an SEO perspective. Google doesn’t remove you from the index just because an IP address ends up on a spam list and that’s also where your website is hosted, right?
Also, you’re only hosting on S3 if you’re hosting a totally static website, which very few people do.
I think this is the danger of the SEO industry. On the one hand you have bad actors trying to game the system, but on the other, people selling you legitimate SEO are often claiming rules as fact with little or no evidence, because it’s difficult to verify.
Oh, and make sure you actually use heading tags properly. God, so many sites and apps seem to be designed with no thought put into heading structure, with h2s and h3s strewn about at random. Especially don't forget the h1 tag, and try to make sure it (and the title tag for the page) are both unique on every page.
Basic, I know, but come on. Too many SPAs seem to be built with no thought put into even the basics of HTML structure or on-page SEO.
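To make the heading/title point above concrete, here's a minimal browser-console sketch (not any official tool, just an illustration) that flags an empty title, a missing or duplicated h1, and skipped heading levels on the current page:

```typescript
// Quick single-page audit: one <h1>, a non-empty <title>, no skipped heading levels.
// Run it in the browser console; checking that titles and h1s are unique *across*
// pages would still require crawling the whole site and comparing.
function auditHeadings(doc: Document = document): string[] {
  const problems: string[] = [];

  if (!doc.title.trim()) problems.push("Missing or empty <title>");

  const h1Count = doc.querySelectorAll("h1").length;
  if (h1Count === 0) problems.push("No <h1> on the page");
  if (h1Count > 1) problems.push(`Multiple <h1> tags (${h1Count})`);

  let previousLevel = 0;
  for (const h of Array.from(doc.querySelectorAll("h1, h2, h3, h4, h5, h6"))) {
    const level = Number(h.tagName[1]); // "H3" -> 3
    if (level > previousLevel + 1) {
      problems.push(`<${h.tagName.toLowerCase()}> appears without an <h${level - 1}> before it`);
    }
    previousLevel = level;
  }
  return problems;
}

console.log(auditHeadings());
```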
My only objection would be this bit (below). Specifically, "demands." How many engineers are in a position to demand anything? The truth is, for the most part, it's not engineers who are ruining the internet with bloat and privacy infringements. It's designers and marketing. They're more likely to forget the end user and opt for some nasty excess.
> "Demand that someone take ownership of each tracking pixel and tag that's added to a page, then, make these stakeholders justify their tags every six months. If you don't, people will ask you to add junk to the page until your team gets blamed for a slow site."
And to counter your point, I often see sites built by engineers that ignore the end user by not employing any of the best practices the UX/UI and marketing communities have uncovered over the years.
No argument from me. There's plenty of title inflation, and turds wearing lipstick.
re: "And to counter..."
I have to push back. Unless there's a glaring technical omission, then whatever you see was likely not the engineer's decision. That all happened prior to construction. And even a technical faux pas might not be the engineer's. More than once I've stuck my neck out and said, "...that's not ideal..." and was overruled and marginalized.
A marketer's job is brand building. Could UX/UI be part of it? Sure, but as long as feeling X is associated with product Y, they have done their job.
A designer's job is to express that feeling through design.
UX/UI could be part of it, depending on the group targeted. But it's like a red tie: it could help, but it isn't required for that important business meeting.
Can someone expand on that one? The blacklist would only apply to the IP address being sent from and not the domain?
You can sidestep these issues by using something like Mailgun or Sendgrid as your delivery mechanism for server-generated emails (password resets, account registration confirmations, etc). And always set up your SPF records to include these services as permitted senders.
You can use the same services to send marketing emails or there are of course services like MailChimp / Constant Contact / Campaign Monitor / etc that provide all the UIs for marketers. Same thing applies though, you gotta set up SPF records.
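For anyone who hasn't wired this up before, here's a rough sketch of the transactional side using Mailgun's HTTP messages endpoint (the domain, API key, and SPF values below are placeholders; check the providers' current docs before copying anything):

```typescript
// Sketch: send a transactional email through Mailgun rather than the web server's own IP.
// MAILGUN_DOMAIN and MAILGUN_API_KEY are placeholders for illustration only.
// DNS side (values per the providers' docs; verify before use):
//   example.com  TXT  "v=spf1 include:mailgun.org ~all"    // Mailgun
//   example.com  TXT  "v=spf1 include:sendgrid.net ~all"   // or SendGrid
const MAILGUN_DOMAIN = "mg.example.com";
const MAILGUN_API_KEY = process.env.MAILGUN_API_KEY ?? "";

async function sendPasswordReset(to: string, resetLink: string): Promise<void> {
  const body = new URLSearchParams({
    from: `Example App <no-reply@${MAILGUN_DOMAIN}>`,
    to,
    subject: "Reset your password",
    text: `Use this link to reset your password: ${resetLink}`,
  });

  const res = await fetch(`https://api.mailgun.net/v3/${MAILGUN_DOMAIN}/messages`, {
    method: "POST",
    headers: {
      Authorization: "Basic " + Buffer.from(`api:${MAILGUN_API_KEY}`).toString("base64"),
    },
    body, // fetch sets the form-encoded content type for URLSearchParams bodies
  });

  if (!res.ok) throw new Error(`Mailgun rejected the message: ${res.status}`);
}
```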
Why wouldn't they blacklist the domain if they know that IP is associated with your domain?
Would the domain blacklist only happen after you've had several IP addresses blacklisted?
I know that spam email is coming from an IP; I don't necessarily know that the IP is officially associated with a given domain. Sure, I could do a dig to see if the A record matches the IP, but blacklisting an entire domain is a pretty draconian step. I need my blacklist to be accurate, with a minimum of false positives, or its market value diminishes. Adding domains based on IP association is going to be more likely to induce false positives and increase the administrative overhead involved with my blacklist, even if that overhead is mostly automated through "get me off this blacklist" forms and the like. /speculation
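For what it's worth, that dig check is trivial to script; a minimal Node sketch (the domain and IP below are made-up placeholders):

```typescript
// Check whether a domain's A records include a given (spam-source) IP address,
// i.e. roughly the same information as `dig +short A example.com`.
import { resolve4 } from "node:dns/promises";

async function domainUsesIp(domain: string, ip: string): Promise<boolean> {
  const aRecords = await resolve4(domain);
  return aRecords.includes(ip);
}

domainUsesIp("example.com", "203.0.113.7").then((match) =>
  console.log(match ? "A record matches the spam source IP" : "No match"),
);
```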
As an aside, I've noticed that if a cloud provider recycles IPs (Digital Ocean) the chances of pulling a blacklisted IP off the heap is pretty good. At this point, even if all the server is doing is sending the occasional password reset email to a handful of internal staffers, we run the email through Mailgun. It just isn't worth dealing with the hassle of trying to get an IP off a blacklist.
So it feels like a piece of JS that checks that everything that should load actually loaded OK, then reports home along with the page, seems a good idea.
Probably quite doable - can JS see network loads the way the web developer tools do?
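It can, roughly: the Resource Timing API lists what loaded, and a capture-phase error listener catches resources that failed. A minimal sketch (the /page-health endpoint is hypothetical):

```typescript
// Rough in-page monitor: record resource load failures, then after the page
// settles, report what loaded (Resource Timing API) plus any failures to a
// hypothetical /page-health endpoint via sendBeacon.
const failures: string[] = [];

// Resource error events don't bubble, but a capture-phase listener on window
// still sees them for <img>, <script>, <link>, etc.
window.addEventListener(
  "error",
  (event) => {
    const el = event.target as Element | null;
    const url = el?.getAttribute("src") ?? el?.getAttribute("href");
    if (url) failures.push(url);
  },
  true,
);

window.addEventListener("load", () => {
  const resources = performance
    .getEntriesByType("resource")
    .map((entry) => ({ name: entry.name, duration: Math.round(entry.duration) }));

  navigator.sendBeacon(
    "/page-health",
    JSON.stringify({ page: location.pathname, loaded: resources.length, failures }),
  );
});
```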
Even if it does, in-page problems don't affect your results, not even by 1%. 99% is backlinks and their quality (of course).
Example: check techcrunch.com and debug/inspect their homepage. 15 JS errors, 6 MB in size, no Description/Keywords, awful <title>, bad HTML elements. Can you even remotely beat them for the keyword "startup news"? No way. No matter how hard you try to "SEO" your website.
I'll say it again: "SEO" was always a gimmicky marketing thing. It's crazy that even now people don't understand that the only thing that matters is quality backlinks, as a result of quality content.
It’s much harder to write a guide on how to engage and delight your audience, and how to get the attention of other sites and the authority and trust to get linked to.
"Checked his post history. Russian bot confirmed."
Maybe black hat/spammy SEO shouldn't exist, but at a certain level, good SEO is basically just about designing a good product and marketing it well.
My unsolicited advice is to list some specific SEO fixes and improvements in the app description. It's not clear to me what it does and the screenshots are hard to parse.
"Consider using the "304-Not Modified" response code on large websites with lots of pages that don't change very often."
This is the only thing that, from an SEO perspective, I'd challenge. I can't even begin to see where the supposed benefit of this would be.
For context, I've only come across an HTTP 304 status once in 9 years of SEO and crawling websites on a daily basis. I've no first-hand experience of them being deployed this way at scale on a live website, and so haven't seen any server log analysis that demonstrates the efficacy of their usage, etc. But it's an interesting idea nonetheless.
With dynamically generated content you’re more likely to just see 200s, but I think Nginx sets ETags automatically on static content, so it’s common to see 304s there.
I’m pretty surprised you haven’t seen it often, but I’d guess it’s more to do with whatever crawlers you’re using (they’d need to be caching content and headers), rather than the scarcity of the status code.
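For anyone who hasn't seen it in the wild, this is roughly the mechanism: hash the response into an ETag, and when the client or crawler sends the same value back in If-None-Match, reply 304 with no body. A bare Node sketch just to make the flow concrete (nginx and most frameworks do this automatically for static files; renderPage is a stand-in for whatever generates the page):

```typescript
// Minimal ETag + 304 handling: recompute the ETag, compare against the
// client's If-None-Match, and skip sending the body when nothing changed.
import { createServer } from "node:http";
import { createHash } from "node:crypto";

function renderPage(path: string): string {
  return `<html><body>Content for ${path}</body></html>`; // placeholder renderer
}

createServer((req, res) => {
  const body = renderPage(req.url ?? "/");
  const etag = `"${createHash("sha1").update(body).digest("hex")}"`;

  if (req.headers["if-none-match"] === etag) {
    // Content unchanged since the crawler/browser last fetched it: headers only.
    res.writeHead(304, { ETag: etag });
    res.end();
    return;
  }

  res.writeHead(200, { ETag: etag, "Content-Type": "text/html" });
  res.end(body);
}).listen(8080);
```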
That irked me a little. I'm assuming this includes fonts & images (not a bunch of ajax requests), but it still seems high.
robots.txt stops crawling, not indexing; a disallowed page can still show up in the index if other sites link to it.
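Put differently (minimal sketch, made-up paths): if you actually want a page out of the index, let it be crawled and send a noindex signal; a Disallow only stops Google from fetching the page, so it never even sees the noindex.

```typescript
// Sketch of the crawl-vs-index distinction in a bare Node handler.
import { createServer } from "node:http";

createServer((req, res) => {
  if (req.url === "/robots.txt") {
    // Disallow only stops *crawling*; a disallowed URL can still be indexed
    // from external links (typically shown as a bare URL with no snippet).
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end("User-agent: *\nDisallow: /internal-tools/\n");
    return;
  }

  if (req.url?.startsWith("/unlisted/")) {
    // To keep a page out of the index, let it be crawled and send noindex.
    res.writeHead(200, { "Content-Type": "text/html", "X-Robots-Tag": "noindex" });
    res.end("<html><body>Not for search results</body></html>");
    return;
  }

  res.writeHead(200, { "Content-Type": "text/html" });
  res.end("<html><body>Normal indexable page</body></html>");
}).listen(8080);
```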
In other words, publish content people want to link to, and get big players to link to it.
That’s it. Other SEO advice is usually a distraction or snake oil.
For example "SERVER STABILITY AND DOWNTIME".
How come taking care of your server stability is SEO???
SEO doesn't exist because the ONLY way to optimize your website is to write good content and get quality backlinks. To be precise, 99% of "SEO" is backlinks and 1% is your HTML quality (in-page SEO).
To give you an example, you can have the worst HTML, no internal linking, no sitemap, no nothing, but have 10-20 quality backlinks and still be #1 in Google.
I've been doing internet marketing for more than 15 years now and the proof is out there. Quoting something else from the article:
So the author claims that a relative path to a URL (inside a JS call) will affect your organic results. I wonder if anyone understands that this makes no sense, especially SEO-wise.
Google has at least 50-100 variables it checks and assigns different weights to.