Minimum Viable SEO (priceonomics.com)
197 points by omarish on Jan 6, 2012 | hide | past | favorite | 67 comments

I feel like I'm taking crazy pills every time I read something about SEO. I keep expecting there to be more to it or that I'm missing something. As far as I can tell, SEO is just making sure you have reasonably semantic markup and clean URLs. Isn't that just plain old best practice anyway? What am I missing? I swear there must be something!

Do you know what keywords are generating the most traffic for your competitors, what keywords you'll most likely be able to rank for, what keywords have the highest conversion ratio, are you using video sitemaps, etc.

It sounds like you're constantly reading material aimed at beginners rather than diving deeply into the subject.

Semantic markup and clean URLs are a big part of SEO. But there's a lot more to it. There's obviously the whole incoming-links thing, as a few other comments have mentioned.

The biggest issue that I see on clients' sites is site architecture, organization and internal linking. Most commonly that means multiple pages with the same content and different URLs. Amazon is a good example of this:

This URL floats around internally - http://www.amazon.com/gp/product/1416571760/

But its rel="canonical" is http://www.amazon.com/Greater-Journey-Americans-Paris/dp/141...

Amazon is essentially passing internal link juice around to non-canonical URLs, then "fixing" it with a rel canonical hack. Not that Amazon needs the link authority; it's just an example.

Another good one is folks who append things like ?ref=category to a URL to track visitors through the site. That's a unique URL to Google, though you can specify which URL parameters are legit in Webmaster Tools. If you really need to do something like that, use onclick events and keep the href attribute a pretty, canonical URL.
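A minimal sketch of that kind of normalization, in Python (the parameter names here are just examples of tracking-only params):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters used only for visitor tracking (hypothetical names)
TRACKING_PARAMS = {"ref", "utm_source", "utm_medium", "utm_campaign"}

def canonicalize(url):
    """Strip tracking-only query parameters so internal links
    always point at a single canonical URL."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    # Drop the fragment entirely; rebuild the query from surviving params
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

print(canonicalize("http://example.com/boats/?ref=category"))
# → http://example.com/boats/
print(canonicalize("http://example.com/boats/?page=2&ref=home"))
# → http://example.com/boats/?page=2
```

Run something like this over every href your templates emit, and the tracking can live in onclick handlers instead.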

Those don't seem like big issues individually, but imagine a 3,000,000 page site that never links to a canonical URL anywhere internally (this happens).

Internal anchor text can also play a role in what you rank for. If you happened to make it through all that stuff up there, here's an SEO gem: use breadcrumbs, but your home link doesn't need to have the anchor text "home". Instead, use a keyword that's relevant to your site's purpose. If you sell dog food, use "dog food" as the anchor text of the "home" breadcrumb link.
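A sketch for the dog-food example (the markup, class names and category pages are made up):

```html
<div class="breadcrumbs">
  <!-- home link carries a keyword instead of "Home" -->
  <a href="/">dog food</a> &gt;
  <a href="/dry-food/">dry food</a> &gt;
  <span>grain-free kibble</span>
</div>
```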

An example: do a search for "online shopping" on Google. You'll probably see overstock.com in the top results. Why? Are they really better known and more authoritative than Amazon in that arena? Every home breadcrumb on overstock.com has the anchor text "online shopping" - millions of internal links with perfectly optimized anchor text pointing to a single page.

You're definitely missing the incoming links part.

Ok, but what the hell does that have to do with building the site? Inbound links are obvious. Wow, really, is that all? And there are entire communities devoted to this? Seriously, I really want to know what I'm missing, because I feel like I must be whiffing pretty hard here not to see it.

Backlinks are everything. Nowadays, I think you have to try pretty hard to produce markup that Googlebot can't interpret.

I rocketed my forum community to the top of Google by generating backlinks. Some people consider it grunt labor, but I found it a lot of fun, especially if you get a program that keeps track of your rankings. It's addictive.

And this is forum software that is rife with duplicate content, has URLs like `/?forumdisplay=22952920&v=92348232`, and is so unSEO that a third-party company started up to sell an expensive plugin that retrofits SEO onto the software.

Guess what: despite all the little marginal tweaks and SEO obsessions I don't sweat over, Google miraculously figured stuff out. Backlinks are obvious and they work. It's the reason you can overtake incumbent #1 rankings that have no backlink campaigns.

That's sort of my point. I mean, getting links to your site isn't exactly a secret, this has been key since Google first came on the scene. Sure, it's hard work to have a reputable site and to promote it correctly, but it's not rocket science. In fact, it's pretty typical common sense.

I get the point about duplicate content, that's something I had heard but not really delved into because, well, decent architecture usually means not having that sort of stuff around.

The more I hear SEO "experts" go on about things, the more I think it's a bit of a scam. Scam is probably the wrong word, as I'm sure many SEO people are well meaning, but the more I look the more it all seems like snake oil.

I think what you're getting at is that the majority of SEO is pretty straightforward - I know what you mean about rarely discovering something new from SEO sites. The final x% is a bit more challenging.

At one point I put together a pretty basic site with one backlink and barely any attention to content, and accidentally ranked #1 in Australia for "make money". One particular page on that site is four paragraphs I threw together, and it's made $24k in four years for no effort since. I've had a few other events along those lines too, mostly leveraging properties I'd already built up and using them for backlinks.

You might be looking at the wrong places or listening to the wrong people...

There are plenty of great search resources out there. I would suggest starting at http://searchengineland.com to get a better sense of the landscape, especially everything Danny Sullivan writes.

It's a lot more complicated than you are letting on.

What you are saying is like saying architecture is just drawing diagrams or good journalism is just writing about stuff.

Like many industries, there's an art to SEO and plenty of nuances far beyond clean coding and user friendly design.

I think many SEO type people have done a terrible job of marketing their field, because the place they occupy in most people's minds is not a particularly nice one.

I think emphasizing statistics and analysis would make a bit better impression on some of us. It turns on a light that says "aha, there is something real there!".

the problem is there are so many hacky, cold-calling SEOs that the majority of people you're likely to run into in SEO suck. when you really get into it (obviously biased) i think the combination of statistics, data mining, creative content and competitive analysis of your position in a perpetually changing market is a fascinating blend and you can learn as much as you want and take your skills and strategies in a ton of different directions.

it'd be nice if we could be completely transparent with analytics while iterating a search strategy for a large site, because the learning experience would be great whether you're a big-name, experienced SEO, looking to get further into SEO, or an anti-SEO hacker looking to just call us all out for being full of shit :) it'd be cool. too bad the economic reality of releasing complete analytics data to the entire internet makes this an impossible dream.

It's not easy getting links. More importantly, if you have a great product where everyone else has great products, the marginal benefit of having the best product with the most amount of authoritative links is huge, because the # of clicks that come from the top result over #2 is monumental. So even if you get thousands upon thousands of links, like those that rank for "cheap flights", it's the site that gets thousands and thousands +1 that will win the day, and make millions more on a long enough timeline.

Agreed. In my experience, people linking to you is 90% of the work. Everything else (canonical URLs, etc.) doesn't really matter. Case in point: I now rank close to #1 for the terms "startup conference". Why? Because since I launched the new name for the conference, enough people linked to it. 6 months ago, I barely made it to the first page of Google results.

And by the way, there is one repeatable way to get high-ranked sites to link to you: produce quality content. So karma and SEO are not that far off.

What's the point of clean URLs? Everybody parrots this "clean URLs" meme, but do we have a definitive proof that clean URLs help and that unclean URLs penalise a site?

SEO fundamentals: Google is smarter than you are, and has invested a lot of effort into discovering the sort of websites that their users want to visit. If you make the sort of website that users want to visit (relevant content with clear navigation), Google will send their users to it.

uh... in-bound links?

Don't do SEO.

Do accessibility.

Seriously, don't worry a single bit about SEO. Yet, if you make it so that your site's content, navigation and URLs are screen-reader friendly, then you've probably made a site that has solid SEO and is accessible to people with vision problems.

Yet, just doing SEO doesn't make for a very accessible site all the time.

Once this is done, make good content, be real and responsive humans to your customers and you're on your way to a good business model and website.

I read this, and all I can see is:

"Don't do marketing." "Just make a good product and people will find it."

I think this is the downfall of many great products. The current startup world has an idealistic belief that better products always win and if you build something people want, they'll find it, when in fact, this has never been true. The "best" politicians consistently lose to those better at marketing and branding. The "best" ideas in organizations often fall to those trumpeted with more style or by those with more clout. Many great startups shut down because they couldn't find a repeatable, low-cost way to acquire customers (and SEO is exactly this).

More here: http://www.seomoz.org/blog/i-disagree-with-fred-marketing-is...

And here: http://hackersandfounders.tv/RDmt/rand-fishkin-inbound-marke...

I believe parent was trying to express the sentiment that search engines inherently favor sites that are well-structured and present the relevant keywords in a format that is easily parsed by web crawlers.

Totally agree with you Rand. My sentiment was down-voted for being too hostile, so I'm glad you're weighing in on this.

Climbing search engine rankings has less to do with on-site SEO perfection and much more to do with backlinks. Look at what the porn and gambling kings are doing to compete with each other.

Too many posts about SEO get everyone obsessing over marginal website tweaks/details when a single backlink to their website doesn't even exist outside of their HN account.

I don't mean to say that you shouldn't be doing PR, Marketing and Social Media efforts; because you should. Yet, I don't think of press releases, making awesome products and participating on social networks to be SEO. But if you do them right, you'll get link backs as well.

I absolutely agree... that this is how it should be.

The only problem is that it isn't currently this way (despite what Google and Matt Cutts might say). It is naive to think that this approach is going to work.

Furthermore, your competitors are going to do accessibility + SEO, and they are going to trounce you. randfish said this better than I could, so I'll leave it at that.

You can get away with what you are suggesting if you have staggeringly awesome content, but even then - why risk it? If you integrate SEO into your culture from the start, and have the right process, then it doesn't have to be that hard.

This is a simplification. Yes, I agree that making your site accessible improves SEO, but there are lots of things that fall under SEO, and not accessibility that you should do. E.g. "What am I ranking for?" will tell you what the wider internet thinks your site is good at. "What search terms are bringing people to my site?" will tell you what your visitors really want. "How many people are searching for X and Y?" will tell you how popular X and Y are.

Another SEO non-accessibility advantage, you get to learn what your customers call your product/service.

e.g. I like going on motorbike holidays, like "Long Way Round" (but not as extreme) that's called "motorbike touring". Not motorbike holidays, or trips, or travelling, but touring. One step of SEO is to see what people are searching for, which will tell you what your customers call your service.

We've spent a lot of time doing whitehat SEO on our website, and it's paying off (2x search traffic in 6 months). I really would recommend most startups spend a few days reading up on it from good quality sources. Nearly 40% of all our traffic comes from search so it makes sense to pay it some attention.

It's not hard. What is hard is navigating your way through the crooks/spammers/quacks who will try every trick in the book to try and sell you something you don't need. Startups do not need to hire experts to do it for them. What they need to do is solve problems like they solve other problems, with no money.

People who sell SEO usually:

- Have the gift of the gab

- Prey on people's ignorance

- Prey on people's greed

There are good SEO people out there, but they are rare. Also I struggle to imagine a situation where smart people just can't read up on it themselves and execute it themselves.

As a startup you should be focusing on good quality content and sustainable growth. So play to your strengths and don't pay a lot of money for magic potions that offer short term benefits. Play the long term game.

SEOMoz is a good place to start (their free blogs etc). Executing good technical SEO on your own website is pretty easy.

Tibbon made a good point: 'don't do SEO, do accessibility'. I'm not sure I'd go to that extreme, but it's a good way of looking at it. Google is your most disabled user: it can't see very well, and it can't really understand things very well either. If you make your site highly accessible, you're well on your way to good SEO.

For me, spending money on SEO doesn't make much sense.

However, spending time does.

Most of the big wins in SEO are a combination of strategy and time. As in: have a strategy to get links, and spend time on that strategy.

Sure, you can pay someone to do this for you, but anyone who is smart enough to develop software is more than smart enough to do SEO.

The big mistake people make is thinking it exists outside of marketing. In fact your SEO strategy and your marketing strategy are just two dishes on the overall meal of gaining customers.

There's no question that whitehat SEO pays off. And referring to startups, you mention "I struggle to imagine a situation where smart people just can't read up on it themselves and execute it themselves."

It has everything to do with your opening line: "We've spent a lot of time doing whitehat SEO on our website".

It always comes down to TIME. If you're a new startup, you're trying to work on your business, not in it.

Then you need to find someone who can help you with whitehat organic SEO. And as you mentioned, the hard part is navigating your way through the 'quacks'.

Most startups as far as I know don't have much money, but if they do have money then fair enough they can spend it on SEO if they think it would be beneficial. As you say though it's going to be hard to find someone decent and honest.

2x search traffic over 6 months? That is not something to be proud of. A real SEO would have got 6x easy on a site with no existing SEO.

It all depends where you are measuring from, doesn't it? Where do you think I was measuring from when I said 2x? A site with 0 traffic? That would be quite easy to 6x, sure.

Also your post smells exactly like the sort of sales pitch people seeking SEO should try to avoid. Big grandiose speculative statements.

When I read that, I can't help but think

"Grow 6 inch...errr times, or your money back! Real SEO!"

where did you read that they had no SEO before?

The article mentions the beginner's SEO guide, but coming in at 2 pages, the Web Developer's SEO Cheat Sheet from SEOmoz is my go-to resource:


The post is not intended to be an SEO guide. It is Priceonomics doing their own SEO (getting inbound links, etc.).

Not that there is anything wrong with that.

Not arguing with the importance of some of those techniques, but just a word of caution that the document hasn't been updated since 2008.

I'm seeing only bad links to the PDF on that page.

Yep, same here. Google isn't helping either. Anyone want to upload the pdf for all?

seo pages can't be found by google.


Try it again. I was just able to access the pdf.

I did a quick SEO analysis on priceonomics.com. Most of these tips fall into the 20% category. Maybe they can use some of these tips to perfect their SEO strategies.

1. Blog on a subdomain.

Though sometimes easier to administer, subdomains might dilute your SEO efforts: links to a subdomain don't count 100% as links to your main domain. A /blog/ directory would help with generating incoming links and adding fresh content to the domain you actually want indexed.

2. Employ Canonical

www. redirects to non-www. Trailing slashes get added automatically. So far so good. But it is still easy to create duplicate URLs by adding random dynamic variables.

Without a canonical, a URL like /boats/?dupe=content will point to the same resource as /boats/. Here you might introduce a canonical problem. (http://googlewebmastercentral.blogspot.com/2009/02/specify-y...)
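A sketch of the fix: every URL variant declares the clean URL in its &lt;head&gt; (the domain here is a placeholder):

```html
<!-- served on /boats/?dupe=content and /boats/ alike -->
<link rel="canonical" href="http://example.com/boats/">
```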

3. Optimize your site for speed

Though not that many search queries are affected by the site-speed algo, site speed remains very important for your visitors, and so indirectly for your SEO/marketing efforts. The Google Site Speed plugin, YSlow, or these guidelines (http://developer.yahoo.com/performance/rules.html) might help you fix some of these issues and gain a few seconds.

Mostly loading javascript just before </body>, turning on caching, and compressing and combining resources.

4. Robots.txt vs meta robots

/search is disallowed in robots.txt. If you disallow it on a page basis, with meta robots, you can specify: "noindex, follow". That way if people link to your search results, link juice will keep flowing through your site.
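Side by side, the two approaches look roughly like this (paths as in the post):

```
# robots.txt — Googlebot never crawls /search at all, so any
# link juice pointing into search results dead-ends:
User-agent: *
Disallow: /search

# vs. a meta robots tag in the <head> of each search-result page —
# the page is crawled and its links followed; it just stays unindexed:
<meta name="robots" content="noindex, follow">
```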

5. Breadcrumbs

Add rich snippet markup - for product information and reviews, but the obvious contender is the breadcrumb. Link to your Twitter (and future Google+ profile) with rel="me" to signify ownership of your graph.

6. Images

Add an alt-attribute to the site logo. Specify the dimensions for faster rendering.
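A sketch, with made-up filename and dimensions:

```html
<img src="/static/logo.png" alt="Priceonomics" width="180" height="40">
```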

7. Don't critique ehow.com if you fill Google's index with 245,000+ automated results.

Or put less bluntly: Write more unique content to introduce bigger categories. Add more relevant content to your listings (reviews, search/trend data, price watch).

8. Make clear if an item is "already sold".

If I click on 10 entries and I get 10 times "item already sold" I start to doubt the usefulness of the application. I compare this to a job site, where the jobs are mostly filled: You happen upon such a site through Google, because Google still thinks these listings are relevant.

9. Quality

The site is mostly devoid of trust signals. Because some listings are in ALL-CAPS, some result pages can look a bit spammy. Add more trust factors, and try to repair spammy listings.

Agree entirely with your summary - lots of good advice.

The amount of actual content on the priceonomics product pages is super thin. Google is not going to be a big fan of throwing every page of search results into the index - you're pretty much ONLY creating duplicate content at this point. The taxonomy also has a ton of copies of what is actually the same item, depending on how it was written in the listing: http://priceonomics.com/headphones/sennheiser/hd-595/ vs. http://priceonomics.com/headphones/sennheiser/hd595/

Sorting the taxonomy into distinct products and then adding some sort of content to beef up at least the category pages, which right now are 100% navigation, would be a big step forward. There are lots of things you can do going forward; thinking about this vertical, it's an interesting market space for SEO, with lots of challenges and possible strategies.

I'm assuming eventually some sort of comparable product cross-linking will be built into the system?

On point 1 - if you do have your blog already stuck on a subdomain (blog.site.com), this is a neat solution to serve it through site.com/blog that might work for you: http://www.seomoz.org/blog/what-is-a-reverse-proxy-and-how-c...

Reverse proxying an external application isn't always as trivial as just proxy_pass in vhost.conf and go - we recently moved our company blog from blog.picklive.com to picklive.com/blog and it caused issues with WordPress relating to cookies for the admin control panel and invalid cookies set by WordPress caused us huge problems with our actual app.

These issues were fixed by doing slightly more complicated proxying rules and by filtering the cookies as part of the proxying.

Reverse proxying is a really powerful technique (you could sit things like mod_security in front of your relatively vulnerable WordPress install, for example), but unless you have an experienced sysadmin on hand to help if it all goes wrong it's probably not something you want to try based solely on an infographic.
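For reference, the core of such a setup in nginx looks roughly like this - domains are placeholders, and per the caveats above, this is only the easy part:

```nginx
# Serve the WordPress blog at example.com/blog without a subdomain
location /blog/ {
    proxy_pass http://blog.example.com/;
    proxy_set_header Host blog.example.com;
    # Rewrite cookies so the blog's cookies don't leak onto the main app
    proxy_cookie_domain blog.example.com example.com;
    proxy_cookie_path / /blog/;
}
```

WordPress itself also needs its site/home URLs updated to the proxied address, which is where most of the trouble tends to come from.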

Are you saying DO blog on a subdomain or DON'T blog on a subdomain? Sorry, it wasn't clear to me.

Do it in a subdirectory; the links to your blog posts will count as links for the various ranking indicators. Your blog's role is to feed traffic to your main site.

Onsite SEO is essentially good web design and content strategy, any developer worth his salt will produce a site that already ticks all the boxes.

All the work is in producing great, relevant content and then link building.

It's not what you say, it's what people say about you. Kinda like a popularity contest in high school. If you are quarterback screwing the hot cheerleader, most people are looking/following you (whether they "like" you or not). So, get links. Lots of 'em. source: http://www.seomoz.org/article/search-ranking-factors

Yes, setting up your site around broad or long-tail terms is a good foundation... but you can rank your site for words that aren't even on your site. Why do you think Disney ranks for 'exit' and 'leave'? http://blog.searchmetrics.com/us/2011/04/14/the-evolution-of...

A bit off topic, but relevant to the suggestions in this post: a few of my friends are finding it more effective with Google to use a flat file structure rather than the highly nested folder trees that are usually recommended. Anyone have any thoughts/experience with this?

It really depends on how many pages you have. If you just have 10 pages off the domain (/signup, /login, /about, /jobs, /how-it-works, etc) then there's really no benefit of having a hierarchy. If you have > 1k pages, then a flat structure will confuse google/users and do a poor job spreading your site's link equity amongst your pages.

Is that what you're asking?

From my experience it all depends on the size of the site. If you have fewer than 200 pages on your site, I'd recommend that you try to stick to no more than domain.com/folder/page whenever possible.

But, one other thing I would strongly recommend is to never create extra or redundant folders unless they really add a lot of value to your site's usability. And never, ever, create empty folders... it seems like a no-brainer, but it happens a lot more frequently than you might think, with people using them to force a specific hierarchy on their site, to stuff keywords into their URLs, or just because they think the final URL looks 'nicer' that way.

Last thought on why flat is normally better, homepages are likely to accumulate more links than any other page on your site and often more links than the rest of the site combined. So the closer an interior page is to the homepage the more benefit it will derive from the inbound links pointing to the homepage.

I don't think the actual URLs matter. In terms of site structure, I actually don't recommend having too many nested folders/directories if you don't have a lot of inner pages. I see a lot of sites with category and subcategory pages with just 5-10 links in each. Google won't crawl that deep if your site is new, and the deeper a page is, the less important Google deems it (based on the flow of internal PageRank).

who recommends highly nested folder trees? i've not heard this and have certainly seen it hurt sites. all things equal, flatter is better. my rule of thumb is go as flat as you can while still having an organized, navigable URL structure.

I sometimes recommend nested folder trees, but never deeper than 3 folders. 3 folders deep is enough for millions of pages.

I like nested folders for a couple of reasons, some of which are:

- Clarity. A good structure will instantly signal what the page will be about, before a user or searchbot follows that link.

- Bolded keywords in URL in the SERPS.

- Using pages to flow juice to categories.

- Benefit of targetted anchor text, when using the URL for anchor text.

Compare: big-ecommerce-site.com/yoga-pilatus/accessories/yoga-mat/



And search queries like: "pilatus mat" or "yoga accessories". Which will work better for these queries?

- Structure. Your URL's will reflect in your breadcrumbs, allowing users to navigate your site through URLs (By removing a folder and going to a higher level in the site).

- Silos. After the Panda update - and this is speculation on my part - Google might lower the importance of "parts" of your website. Say only your blog is of very low quality: Google could discount /blog/ when you have nested folders. With a flat structure you've removed "parts" or higher-level categories from your site, and don't allow for this.

Of course there are benefits to going flat too. But nested folder trees are not necessarily a bad thing.

I have to agree that most people ignore SEO because of a lack of understanding. Maybe they are intimidated by the very word. The three steps mentioned in the article - URL, navigation, page title - go a long way. This is where content management systems like WordPress really help, as these three things are taken care of by design. In addition, internal linking structure is important too. Again, a proper CMS makes it easy to get decent SEO going from the start.

In addition to the 3 minimal actions, I would also add: use relevant keywords in each page's content. Often I notice clients creating content without actually using the keywords by which they want the specific pages to be found. Also check the grammar, especially when creating multi-language content.

Great discussion, thanks. Now I'm off to the urology forum to ask about my sore elbow. They're all doctors, right? They all went to med school. A body part is a body part... skin and bones, and they're all connected anyway.

I have a small nit to pick in an otherwise great post: Codecademy is not the same as Code Academy.

It's also noteworthy that codecademy.com ranks higher than codeacademy.org in the Google search for "Code Academy".

Well played indeed.

It's practically impossible to reverse engineer the SERP, but if you look at the Domain Authority for both codecademy.com and codeacademy.org, you'll find that the former is far higher (65 vs 35). This might explain a lot.


You can also lookup how many inbound links the sites have using different anchor text like "Code Academy".

I think Codecademy needs to get codeacademy.com as soon as they can... otherwise they're going to rely mostly on search traffic to get visitors, since most people won't remember their URL.

It seems that they do already - see http://www.codeacademy.com

I first read this as Minimum Viable CEO and double face-palmed


If you working on an Internet startup and you're not thinking about SEO from day one: you suck.

Any decent marketing strategy is at least going to give a nod to SEO. Is your team so consumed by your awesome product that you think it's more important than marketing and distribution?

I really hope that isn't true of anyone here. If championing product over marketing and distribution does sound like your team, I have GOOD NEWS! — you'll be too busy jerking each other off to notice that you're not making any money.

There's no need to employ this tone; we're all adults here.
