1. Blog on a subdomain.
Though sometimes easier to administer, a subdomain can dilute your SEO efforts: links to a subdomain don't count 100% as links to your main domain. A /blog/ directory would help generate incoming links and add fresh content to the domain you actually want indexed heavily.
2. Employ rel=canonical
www. redirects to non-www. Trailing slashes get added automatically. So far so good. But it is still easy to create duplicate URLs by appending arbitrary query-string variables.
Without a canonical tag, a URL like /boats/?dupe=content points to the same resource as /boats/, which introduces a duplicate-content problem. (http://googlewebmastercentral.blogspot.com/2009/02/specify-y...)
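The fix is a one-line link element in the page head; a minimal sketch (the /boats/ URL is just this thread's example, and example.com is a placeholder):

```html
<!-- Served on both /boats/ and /boats/?dupe=content, this tells search
     engines to consolidate ranking signals onto the clean URL: -->
<link rel="canonical" href="http://example.com/boats/" />
```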
3. Optimize your site for speed
Though not that many search queries are affected by the site-speed algorithm, site speed remains very important for your visitors, and so indirectly for your SEO/marketing efforts. Google's Page Speed plugin, YSlow, or these guidelines (http://developer.yahoo.com/performance/rules.html) might help you fix some of these issues and shave off a few seconds.
4. Robots.txt vs meta robots
/search is disallowed in robots.txt. If you instead disallow it on a per-page basis with meta robots, you can specify "noindex, follow". That way, if people link to your search results, link juice keeps flowing through your site.
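The difference between the two approaches can be sketched in two fragments (the /search path is this thread's example):

```html
<!-- robots.txt approach: crawling stops entirely, so any link juice
     pointing at /search pages is lost.

       User-agent: *
       Disallow: /search
-->

<!-- meta robots approach: the page is still crawled and stays out of
     the index, but its outgoing links are followed: -->
<meta name="robots" content="noindex, follow" />
```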
5. Add rich snippets mark-up.
Useful for product information and reviews, but an obvious contender is the breadcrumb. Link to your Twitter (and future Google+) profile with rel="me" to signify ownership of your social graph.
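A minimal sketch of both ideas, using the data-vocabulary.org breadcrumb vocabulary that Google's rich snippets support (all URLs and handles below are placeholders, not the site's actual mark-up):

```html
<!-- Breadcrumb trail marked up as a rich snippet: -->
<div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://example.com/boats/" itemprop="url">
    <span itemprop="title">Boats</span>
  </a>
</div>

<!-- rel="me" link claiming a social profile as your own: -->
<a rel="me" href="http://twitter.com/yourhandle">@yourhandle on Twitter</a>
```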
6. Add an alt attribute to the site logo.
Also specify the image dimensions for faster rendering.
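Both fixes fit in the img tag itself; a sketch with placeholder paths and values:

```html
<!-- alt text gives the logo an indexable, accessible label; explicit
     width/height let the browser reserve the space before the image loads: -->
<img src="/img/logo.png" alt="Example Site logo" width="180" height="60" />
```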
7. Don't critique ehow.com if you fill Google's index with 245,000+ automated results.
Or, put less bluntly: write more unique content to introduce the bigger categories, and add more relevant content to your listings (reviews, search/trend data, a price watch).
8. Make clear if an item is "already sold".
If I click on 10 entries and get "item already sold" 10 times, I start to doubt the usefulness of the application. Compare this to a job site where most of the jobs are already filled: you land on such a site through Google because Google still thinks the listings are relevant.
The site is largely devoid of trust signals, and because some listings are in ALL CAPS, some result pages can look a bit spammy. Add more trust signals, and try to clean up the spammy listings.
The amount of actual content on the Priceonomics product pages is super thin. Google is not going to be a big fan of throwing every page of search results into the index; at that point you're pretty much only creating duplicate content. The taxonomy also has a ton of copies of what is actually the same item, depending on how it was written up in the listing.
Sorting the taxonomy into distinct products and then adding some sort of content to beef up at least the category pages, which right now are 100% navigation, would be a big step forward. There's a lot you can do going forward; this vertical is an interesting market space for SEO, with lots of challenges and possible strategies.
I'm assuming eventually some sort of comparable product cross-linking will be built into the system?
These issues were fixed by doing slightly more complicated proxying rules and by filtering the cookies as part of the proxying.
Reverse proxying is a really powerful technique (you could sit things like mod_security in front of your relatively vulnerable WordPress install, for example), but unless you have an experienced sysadmin on hand to help if it all goes wrong it's probably not something you want to try based solely on an infographic.
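For illustration, a minimal nginx reverse-proxy sketch (hostnames, ports, and the cookie filter here are assumptions, not the configuration the commenter actually used):

```nginx
# Proxy blog.example.com to a backend WordPress running on 127.0.0.1:8080.
server {
    listen 80;
    server_name blog.example.com;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        # Example of filtering cookies at the proxy layer: strip all
        # cookies before the request reaches the backend.
        # proxy_set_header Cookie "";
    }
}
```

Something like mod_security (or nginx access rules) would then sit in this layer, in front of the vulnerable application.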