The current rhetoric about the failings of online reviews is to point at fraudulent reviews. I posit that the bigger failing is the lack of weighting reviews by the expertise of the reviewers. Clearly the reviews of a hardware engineer at Google should carry more weight than an average consumer's (on products like this), because, as bkmartin points out, the average consumer has no idea of the true quality of a product. Perhaps the cable does work for them. Perhaps the cable mostly works, but the wire gauge is under spec and 1 out of 100 people end up with a fire. That won't be reflected in the overall rating of the product.

Case in point: the Juiced Systems product that Benson reviewed at 2 stars is currently listed at an overall 5/5 stars. In Amazon's defence, Benson's review is currently displayed at the top of the reviews, so any shopper who goes to the effort of reading the reviews will quickly see it.

But this product shouldn't be rated 5/5 overall, and because it is, many consumers who can't be bothered to check the content of the reviews will be burned.

This, to me, is a greater problem than fraudulent reviews. Fraudulent reviews can be solved by getting more regular customers to review the products they buy; Uber is a great example of UX design that gets customers to rate the quality of their service nearly every time. But even if Amazon or other retailers achieve a higher rate of reviews by actual customers, the non-expert bias will remain.

Of course the trick is: how do we determine which reviewers are experts? Most review systems have a helpfulness rating on each review, which could be used to weight reviews in the overall average. But that's only a proxy for expertise, it's easily cheated, and it's harder to get customers to rate reviews than it is to get them to review the product in the first place.
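
For what it's worth, the weighting itself is trivial once you have a signal. A minimal Python sketch of the helpfulness-as-weight proxy (the data layout here is made up):

    # Sketch: weight each star rating by the review's "helpful" votes.
    # The +1 prior keeps zero-vote reviews from vanishing entirely.
    def weighted_rating(reviews):
        # reviews: list of (stars, helpful_votes) pairs -- hypothetical schema
        weights = [(stars, votes + 1) for stars, votes in reviews]
        return sum(s * w for s, w in weights) / sum(w for _, w in weights)

    # One expert 2-star review with 500 helpful votes now outweighs a pile of
    # casual 5-star reviews with few or no votes: ~2.04 stars overall.
    print(weighted_rating([(2, 500), (5, 0), (5, 0), (5, 3), (4, 1)]))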




When buying something online, I make a habit of reading a couple of the negative reviews. Sometimes they're, "I meant to buy a toaster, and misclicked on this television instead. Makes poor toast. 2/5." But sometimes they're more like, "It worked great for two years, and then became sentient and strangled my pet fish. 2/5" Rating distributions only tell you part of the story.


> "It worked great for two years, and then became sentient and strangled my pet fish. 2/5"

Of course, for some that would be a 5/5... which is another reason to read the negative reviews: sometimes they confirm that the only downsides to a product are things you don't care about or don't see as negatives.


Another really common thing is people rating the shipping instead of the product in their negative reviews.


Though with some products this is helpful. I was just looking at air compressors on Amazon, and there were a few models that, according to the vast majority of reviews, arrived damaged. Given that feedback, I might pay a bit of a premium to buy one locally.


Perhaps, but it would still be far more helpful to tie that information to the distributor instead of the product. For all you know, those reviews were all due to a different distributor, and the one you were looking at would have been fine.


Very interesting--I noticed the same thing when shopping Amazon for air compressors specifically. Decided to roll the dice and received an undamaged Makita... and did not circle back to write a review.


Someone did an article a while back showing how you can use the reading comprehension level of reviewers to weight the Amazon star system.
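
I don't remember the details; a crude sketch of the general idea, with a simple readability proxy standing in for whatever metric the article actually used:

    import re

    # Crude reading-level proxy: longer sentences and longer words ~ higher
    # level. A stand-in for a real metric (e.g. Flesch-Kincaid), not
    # necessarily what the article did.
    def reading_level(text):
        words = text.split()
        if not words:
            return 0.0
        sentences = [s for s in re.split(r'[.!?]+', text) if s.strip()] or [text]
        avg_sentence_len = len(words) / len(sentences)
        avg_word_len = sum(len(w) for w in words) / len(words)
        return avg_sentence_len + 5 * avg_word_len

    # The score would then serve as each review's weight in a weighted star
    # average, the same shape as weighting by helpfulness votes.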


Interesting --- I can pretty much look at a review from across the room and discriminate good from bad based on paragraph structure and the distribution of capital letters. If it looks like paragraphs you would see on the page of a book, it's probably well-reasoned enough to agree or disagree with. If it's ragged, with irregular capitalization, it rarely contains meaningful content. There's nothing there to evaluate.
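
That test is easy enough to mechanize; a rough sketch (the thresholds are guesses on my part):

    import re

    # Rough mechanization of the "looks like a book page" test: several real
    # sentences, most starting with a capital, and no ALL-CAPS ranting.
    def looks_well_reasoned(review_text):
        sentences = [s.strip() for s in re.split(r'[.!?]+', review_text) if s.strip()]
        letters = [c for c in review_text if c.isalpha()]
        if len(sentences) < 2 or not letters:
            return False
        starts_capitalized = sum(s[0].isupper() for s in sentences) / len(sentences)
        upper_ratio = sum(c.isupper() for c in letters) / len(letters)
        return starts_capitalized > 0.8 and upper_ratio < 0.15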


Definitely; it was something I looked into trying a while back. A Chrome extension that re-weights Amazon reviews with a couple of models, inline on Amazon's site, could be useful. I was also trying to figure out a way to do text mining of book samples for better book recommendations based on my preferences, but getting the samples out of Amazon was a pain.


I usually go through a few of the reviews, more depending on the price and relative number of reviews...

Of course with technical goods, you see a lot of reviews from people who don't know/understand what they were buying... (1/5 stars, this network adapter [PCI Ethernet] wouldn't work in my laptop)

Other negative reviews are about dealing with Amazon for returns, or a wrong product being shipped, which really shouldn't be under the product review. Though it's worth seeing some of that.


Negative reviews are the first thing I always read.


One problem I have with Amazon reviews is this type:

"Just got the product, arrived really quick. It worked when I tried it a minute ago, 5 stars."


Or, "UPS driver threw package over fence, and it sat in the birdbath for 2 weeks before I found it. Didn't work when I plugged it in. One star."

People should have to take an IQ test before being allowed to post reviews.


These are better than Windows Store reviews. Somehow, MS has managed to get comment quality even lower than YouTube's. One-word reviews. Half sentences. Utterly worthless.


Are customers pestered to leave reviews? For example, every time you open the app, it asks you for a rating, etc. That might cause worthless reviews.


I have a feeling there's some kind of reward system in place for reviewing products and answering questions. There are so many answers to fairly simple questions along the lines of

> will this HDMI cable connect my laptop to my tv?

> i don't know but it sends a good signal from my xbox to tv

They help nobody and just clutter up the actually helpful answers.


As I understand it, when you send a question, random customers get emails from "Amazon Answers" titled "(First Name): Can you answer this question about (product)?" [1]. And even if they can't, they still answer to the best of their knowledge, which sometimes leads to half-helpful discussions.

[1] http://www.ecommercebytes.com/cab/abn/y14/m04/i15/s03


You get imaginary internet points and if you do it enough, and well enough, you can get into the top X reviewers which earns you a badge. The real lure is Vine Voice where Amazon sends you stuff for free in return for a review.


You used to have the ability to downvote useless answers, but that seems to have gone away.


That's why I clicked on each of his reviews and marked them as helpful. That way they will sort to the top.

Ideally this will happen on every product review page, as other readers recognize an educated review and mark it as helpful accordingly.


Your clicks probably got discounted as fake; perhaps if you had spent enough time on the page showing each review first, they wouldn't have been? I don't really know how carefully Amazon filters anomalous upvotes, though.


> Clearly the reviews of a hardware engineer at Google

Not sure if this is an oversight, but Benson identifies himself as a software engineer.

I feel divided on this. On the one hand, I'd like to give more weight to his reviews; on the other, why? Being at Google, being a software engineer, or working on the Chromebook doesn't actually make it more likely that his reviews are better than anyone else's.

I feel like most people would take his word because he claims to work at Google -- wouldn't Google be upset with that? It's not like they as a company endorse the product, but his call-out to them makes it sound like he's endorsing them as a Google representative.

It just makes me uncomfortable.


> Being at Google, being a software engineer, or working on the Chromebook doesn't actually make it more likely that his reviews are better than anyone else's.

Of course it does make it significantly more likely. "Being a software engineer at Google" restricts the space of reviewers to the >100 IQ subspace, and further to the "good with tech" subspace. "Working on Chromebook, et al." further cuts the space down to people who have to deal with USB-C devices at work. The opinion of such a selected person should be given much more weight than the average, because he most likely knows the shit he's talking about, while the average reviewer has no clue about anything.

> I feel like most people would take his word because he claims to work at Google

I think most people take his word because he cites the USB spec directly. No bullshit, just pointing out the exact problems and their implications.



