Tell HN: The review system on most major sites is broken
7 points by DanielBMarkham on Dec 15, 2010 | 5 comments
I was going to blog about this, but I'm short on time, so I thought I'd just post here.

I was poking around my server logs, seeing what folks bought after visiting hn-books, and I found this author, Tim Ferriss. He has some book out about how to become superman, lose gads of weight, put on super muscle mass, all while picking your nose and watching Oprah. Or something very similar to that. (For all I know the man is a miracle worker. His pitch seems a bit over the top, though)

When I click over to Amazon, one of his books has 1062 reviews, all positive! http://bit.ly/fVCXNM

Geesh! Look -- I've been poking around Amazon, both as a new affiliate (the above link is an affiliate link) and as a customer. I like reading reviews for stuff I buy before I buy it. Both the good and bad reviews help me make an informed choice.

But something is really wrong here. As several reviewers noted on his most recent book, the number of positive reviews is just crazy. Books with tremendously great reputations don't have anywhere near as many reviews as some of these books do.

There may be an innocent reason for this. Perhaps just a big fan club? Or, more darkly, perhaps folks are starting to game the review systems. This isn't the first time I've suspected something is seriously wrong with the review systems on ecommerce sites. It's like a hidden form of spamming where you never realize you're being spammed. Hopefully some aspiring startup can help fix it. Maybe some kind of weighted meta-review site? Because as more gaming takes place, this pattern of selection is going to work less and less for the consumer.




Given the lack of barriers to entry to reviewing on the internet, the more positive reviews a service has, the more likely it is that someone had to actively incentivise people to review it. That tends towards an unwritten law of the internet: the quality of a product is almost inversely proportional to the quantity of unsubstantiated glowing testimonials it can point to. Don't believe me? Google "home business opportunities" and compare the Ponzi schemes to the franchises...

In Ferriss's case, he could probably have achieved his astroturf effect simply by sending hundreds of advance copies to bloggers in the "lifestyle guru" market who'd be salivating over the content even if they weren't affiliate-linking their shill blog entries to it. That isn't so different to what any other publisher does, save for the scale and the profile (though I certainly wouldn't rule out the alternate hypothesis of him paying someone in the Philippines to do it for him).

It's relatively easy to algorithmically penalise astroturfing in the overall rankings: remove duplicate IP entries altogether, normalise scores for reviewers that only give out top marks, and order reviews by some form of reviewer quality score rather than (primarily) date. They've probably also got enough data to offer a "people who buy and rate things similarly to you thought" selection of reviews to their regular customers and wishlist users. The real problem is that however much Amazon might feel inclined to penalise obvious astroturfing, it's manifestly not in their interests to do so given that the book is a guaranteed bestseller. Even if people really hate the book, it probably isn't going to put them off Amazon as a vendor.
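
Roughly, the normalisation could look like this. This is a minimal Python sketch: the (reviewer, product, stars) tuples and the variance-based weighting are assumptions for illustration, not anything Amazon actually does.

    from collections import defaultdict

    def normalised_scores(reviews):
        # reviews: list of (reviewer_id, product_id, stars) tuples --
        # a hypothetical schema, not Amazon's data model.
        by_reviewer = defaultdict(list)
        for reviewer, _, stars in reviews:
            by_reviewer[reviewer].append(stars)

        # A reviewer who only ever gives five stars has zero variance in
        # their history, so their reviews carry no information; weight
        # each review by the spread of its author's past ratings.
        def weight(reviewer):
            history = by_reviewer[reviewer]
            mean = sum(history) / len(history)
            variance = sum((s - mean) ** 2 for s in history) / len(history)
            return min(variance, 1.0)  # cap the weight of extreme raters

        totals, weights = defaultdict(float), defaultdict(float)
        for reviewer, product, stars in reviews:
            w = weight(reviewer)
            totals[product] += w * stars
            weights[product] += w
        return {p: totals[p] / weights[p] for p in totals if weights[p] > 0}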

Hunch might help because it's simply too big and vague to easily game, but it's non-specific by design about why you might be interested in a service. Peer recommendations via Facebook are a possibility, though I'm not sure I really want to give out or get unsolicited product recommendations from my friends. The best solution is probably that venerable dinosaur: traditional media and reviewers that actually have some sort of reputation to stake. I'd place rather more trust in reviews linked to by Rotten Tomatoes or Metacritic than I would in Amazon or IMDB.


There's absolutely a ton of room for improving review systems. Right now I have a fairly standard "how many stars + written comments" review setup on my startup (example link below), but I'm planning to test some ideas, such as giving people with different levels of karma different numbers of "points" to review with, to try to get the distribution of ratings closer to a standard bell curve.

example link: http://www.parkgrades.com/parks/rio-montana-park
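
For what it's worth, here's one way the weighting half of that karma idea might translate into code; review_weight, the log curve, and the (stars, karma) shape are all invented for illustration, and forcing a true bell curve would additionally mean capping each user's points budget.

    import math

    def review_weight(karma):
        # Hypothetical mapping from site karma to review "points" --
        # logarithmic so high-karma users can't drown everyone else out.
        return 1 + math.log10(max(karma, 1))

    def weighted_average(ratings):
        # ratings: list of (stars, karma) pairs; an assumed shape.
        total = sum(stars * review_weight(karma) for stars, karma in ratings)
        weight = sum(review_weight(karma) for _, karma in ratings)
        return total / weight if weight else None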


I think it's helpful to look at a reviewer's review history. Urbanspoon prominently displays the reviewer's number of reviews next to each review. I generally ignore reviews from people with fewer than ten. A similar system could be helpful on other sites.
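
As a sketch (the data shapes are assumptions, not Urbanspoon's actual API), that filter is only a few lines:

    MIN_HISTORY = 10  # the "fewer than ten" threshold above

    def established_reviews(reviews, review_counts):
        # reviews: list of (reviewer_id, text, stars) tuples;
        # review_counts maps reviewer_id -> lifetime review count.
        return [r for r in reviews
                if review_counts.get(r[0], 0) >= MIN_HISTORY]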

It can still be gamed, but at least it's a step in the right direction.


Here's another example.

I went looking for sci-fi books for my kindle for the Christmas holiday season.

It was really difficult: all I wanted was a good stand-alone book that was readable and thought-provoking. What I ended up getting by looking at the reviews was a lot of books that were part of some 20-book series, where the only reason they had so many good reviews was that the author or series had a huge fan base. It took an hour to untangle the fanboy reviews from the helpful ones, and that was while paying attention to Amazon's meta-review system.

That's not really very helpful.


1 item cited? Seriously?

It's not broken.

People are incentivized to review something either because they like it or because they hate it.

How can nobody hate his book? Maybe because the only people who bought it are the ones who had already drunk the Kool-Aid. They want to think it's great, so they post saying it's great. People who wouldn't like the book knew so before buying, so they didn't waste their money and don't feel cheated.



