Amazon Reviews: How We Spot the Fakes (thewirecutter.com)
335 points by Osiris30 on May 13, 2016 | 228 comments



I actually know people whose job it is to write these things, and to create and maintain sock puppet personas on forums for the purpose of subtly promoting clients' products and disparaging those of competitors.

Then there are the professional reviews, which are all conflicted because most reviewers either get to keep the gear or the gear maker is a sponsor of the site/publication and the reviewer knows it. I say this as a person who's still involved in publishing reviews (of outdoor gear, mostly), and who most of the time gets to keep whatever some company sends me.

However rotten you think the adtech industry is, the state of product reviews online -- user generated, casual forum reviews, professional reviews, etc. -- is worse.

I have a standard rant about the current state of the web that I give all my family and friends, and it always ends with: "if you didn't pay to read it, you probably read an advertisement."


The FTC requires disclosure for reviews written by individuals who were given a product for the purpose of reviewing it. It is considered anticompetitive and illegal to forgo that disclosure.

As an engineer at perhaps the largest ratings and reviews company worldwide, I would love to hear your standard rant in long form. Please pm me if you feel up to that conversation. Ideally the result would be an improvement in the quality of the content of billions of reviews.


I wish they'd extend that to requiring journalists to disclose which PR agent convinced them to write a submarine for their company.


If the news agency accepted compensation in return for writing the piece, then they are required to disclose it. Relevant FTC guidance:

> A basic truth-in-advertising principle is that it’s deceptive to mislead consumers about the commercial nature of content. Advertisements or promotional messages are deceptive if they convey to consumers expressly or by implication that they’re independent, impartial, or from a source other than the sponsoring advertiser – in other words, that they’re something other than ads. (...) As the FTC Endorsement Guides establish, advertisers’ use of third-party endorsements may be deceptive if there is an undisclosed material connection between the advertiser and the endorser – one that might materially affect the weight or credibility of the endorsement

https://www.ftc.gov/tips-advice/business-center/guidance/nat...

https://www.ftc.gov/public-statements/2015/12/commission-enf...

These rules are enforced. For example, Machinima recently settled with the FTC over the fact that Machinima paid endorsers to include footage of the Xbox in reviews and commentary.

http://www.theverge.com/2016/3/17/11254260/machina-settles-f...


I have NEVER seen an attributed article in 30 years of reading such things. Typically you are not paid to write an article; you are encouraged to write an article featuring that company, and in exchange they agree to do advertising in your magazine or on your site.


I don't (just) care where they got material compensation. I care whether they're doing this because of influence from one of the subjects of the story -- e.g. if the PR dude wrote the story and the journalist just forwarded it on.


What would incentivize me to forward on such a review? Material compensation has to enter the picture somewhere... No one posts great reviews about a product because they were asked nicely. Unsolicited? That's fine; I consider that a review based on the merit of the product. Solicited? The first question is, "what's in it for me?"


This FTC regulation might have more teeth if the definition of compensation included, for example, the subject of the post influencing the hiring or employment of a relative of the author.


Or "writing the freaking article for you".


There are probably indirect ways to "pay" the advertiser


I'm not seeing any contact info for you in your profile, or a means to PM you. You can ping me via email with my first name at jonstokes.com. I'm in Austin, too, so can meet up.


Those reviews are still averaged into the star rating, which is all that most people look at.


I don't know of any studies (do you?). Do people only look at the star rating? We have done some statistics and found that star rating only tells a small part of the story. Anyone with a background in statistics knows this, though.


My personal strategy is to ignore all the good reviews. I already know why I'm interested in something. I want to know what other people think sucks about that something, and then I decide whether it is something that would bother me.

People are naturally more vocal about complaints, and if they have a serious gripe then they are far more likely to speak up in a negative review.

If they like something, that is basically meeting expectations and there is no urgency to seek resolution. So I expect a larger percentage of positive reviews to be shills, since nobody else will feel much need to post anything.

But if the negative reviews are things I don't care about (or are obvious douche reviews) that tells me a lot.


This is my base strategy as well, but thinking it through, it mostly means that negative smear reviews of competitors should be more "valuable" than fake positive reviews of a company's own products.


If you guys don't mind a third person in that conversation, I've been spending a lot of time thinking about this issue lately, and it would be enlightening to hear from others who have probably been thinking about it for much longer :) My email is in my profile and I always reply. I'm also located in SF if anyone wants to discuss over food/drinks.


Always interested in more folks joining this conversation. It's what I do full time. I think my contact info is in my profile. I'm located in Austin so I'd have to fly out for drinks. ;)


> the largest ratings and reviews company worldwide

So, Amazon? Or am I misunderstanding what you mean by this?


Amazon is not strictly a ratings and reviews company. I don't know in total how many reviews they have.


Do those regulations have any teeth? Because it seems like it would be very hard to enforce.


If the FTC contacts you directly, expect your corporate lawyers to ask you to prioritize resolution over pretty much anything else.


You can file many kinds of complaints on the FTC website, but AFAIK, this is not one of them. It's a shame; I've encountered several pay-for-play review websites, both as a consumer and as a startup founder. Hopefully they'll create a category so that people can file complaints about this.


And if you think normal product reviews are in a bad state... well, then there are web hosting reviews. In that industry, pretty much 100% of the recommendations are driven entirely by 'whose affiliate scheme brings the most money'. Or even worse, they're like a mafia protection racket; pay the money and we'll take down bad reviews, don't pay and we'll take down the good ones and leave only the bad ones instead. Parts of that industry make the Amazon system look entirely respectable by comparison.

Then there's the state of video game reviews, but that's a whole 'nother kettle of fish. Doubly so when mobile apps are involved.


My favorites are reviews on the manufacturer's own site. Oh wow! 480 5-star ratings out of 487?!


Look at that, the most expensive one is best rated!


Do you have an example? What makes it unbelievable? What if a product is legitimately that good?


It's not. No product is ever that good, simply because every product occasionally fails and some of your customers are awful.


True. The worst are VPN reviews, though. That's a whole other league of amateur companies being reviewed. With hosting companies, it's at least clearer who the big players are. VPNs are a shady world; who knows who is really logging what, etc.


That's why you create a mesh route of various VPN providers. VPN provider 1 knows that you are using it to talk to other VPNs. VPN provider 2 knows that some VPN customer at VPN provider 1 is talking to website X. All the red tape and technical debt should prevent any major problems, if you know what I mean.

I personally just use Usenet for whatever people use VPNs for, I think. If I were doing something that needed to be kept from the NSA, I guess I would use Tor. Or postcards covertly hidden under rocks in public parks or whatever ;)


>if you didn't pay to read it, you probably read an advertisement.

But your own comment (the one I'm replying to) isn't one, and we're not paying for it. You're speaking to us frankly and off the cuff (and, if anything, against your own interests on those other sites) because of the informal, club-like forum this is. In the end, I and the other readers here give you all due weight, and in exchange everyone else speaks frankly about the things they know. You can read things here that posters are probably explicitly prohibited by their employers from posting. Some guy from (insert whatever state agency here) can probably post from a throwaway, still waaaaaaaaaaay in contravention of whatever their policies are, with only a little bit of indirection.

So it's hardly as simple as you suggest. Yes, you have to read through the bullshit, but that has always been true everywhere. Caveat emptor.


Or jonstokes works for a paid, members-only site that provides owner-verified reviews, making his post a subtle advertisement.


That's not a bad business idea :)


That's why I qualify with "probably". Also, it's sort of a deliberate overstatement intended to encourage the hearer to err on the side of caution.


A system that has worked for me is:

1. Foremost: does the product meet my need?

2. I ONLY take the negative reviews seriously and look for commonality between them. If I can live with those issues, then I purchase the product.

3. I'm also cautious of products where users appear a bit overzealous to correct bad reviews.


Loads of products have fake bad reviews now, left by the competition. The next logical step is fake corrections, and fake silly reviews on one's own product. I wouldn't be surprised if at some point in the future every product has a 4.75-star average, and the shills, instead of bad reviews, start writing fake-sounding good ones.


This is close to my strategy, but I mostly read the 2-star, 3-star, and 4-star reviews. I sort by usefulness and read about 10 of each star rating by usefulness, then recency. I've been thinking about packaging that 'workflow' into a browser extension, but I'm not sure how often Amazon changes the structure of their review content.
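
For what it's worth, the fetching half of that workflow is only a few lines of Python; the fragile part is exactly the markup question. A rough sketch, where the CSS selector and query parameters are guesses rather than a stable Amazon API:

    import requests
    from bs4 import BeautifulSoup

    def mid_star_reviews(review_page_url, stars=(2, 3, 4), per_star=10):
        """Grab ~10 of the most helpful 2/3/4-star reviews for a product."""
        found = []
        for star in stars:
            # "filterByStar" / "sortBy" are assumptions about Amazon's review
            # pages; they have changed before and will change again.
            resp = requests.get(
                review_page_url,
                params={"filterByStar": "%d_star" % star, "sortBy": "helpful"},
                headers={"User-Agent": "review-workflow-sketch"},
                timeout=10,
            )
            soup = BeautifulSoup(resp.text, "html.parser")
            # ".review-text" is a guessed selector for the review body.
            for node in soup.select(".review-text")[:per_star]:
                found.append((star, node.get_text(strip=True)))
        return found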


> I'm not sure how often Amazon changes the structure of their review content.

All the time. I think they do a decent amount of A/B testing on product pages (which would make sense). I looked into scraping reviews and doing some analysis, and ran into this exact problem.


Indeed. I often get paid to write these types of reviews informally.


And what are you advertising?


One thing I've found useful is following specific reviewers. For example, Peter Norvig reviews many, many programming and CS texts.

He has both glowing reviews such as this one: http://www.amazon.com/gp/review/R30AQNQK2I1O7P?ref_=glimp_1r...

And scathing ones such as this one: http://www.amazon.com/gp/review/R2C7L5KHUVHOR2?ref_=glimp_1r...

By following several reviewers I respect in each domain I care about, it's not hard to find quality texts (or products) that I'd never have found by typing words into Amazon's search box and looking at average review scores.


A Googler reviewing every USB Type C cable (and leaving scathing reviews for ones that suck) comes to mind.

http://gizmodo.com/a-google-engineer-is-publicly-shaming-cra...

http://www.amazon.com/gp/pdp/profile/A25GROL6KJV3QG/ref=cm_c...


That's a great example too, because Amazon actually ended up following up and banning cables that were found to be non-compliant with the spec, largely based on his reviews: http://lifehacker.com/amazon-bans-non-compliant-usb-c-cables...


I like that they've said they'll start banning the cables, but I'm disappointed that - from my reading of the article - Amazon isn't doing any testing to detect these bad cables and are instead relying on customer reviews to report bad cables. That means for most cables someone's going to have to buy a cable and have it damage their device or have the equipment and know-how to detect it ahead of time.

Having Amazon test every cable obviously doesn't scale either but couldn't they require the vendors to provide test results showing that the cable is spec compliant? Of course the vendor could forge the results, but that's intentionally malicious and this would at least catch non-malicious accidents.


Are these bad cables UL listed? That seems like it would be an easy enough filter for someone like Amazon to employ.

Electricity goes through it. Check. UL listed? Check. Okay, goes on the suggestions page.


This sort of thing is very hard to catch because the products themselves can change without the SKU changing, which means you have to test them constantly. It's not Amazon's model, but from the customer's point of view it's probably better to severely limit the number of vendors for something commoditized like a USB cable.


He recently fried a Chromebook with one of those bad cables. Can't find the source right now but it was posted here on Hacker News.


Here's another one that comes to mind. He does GPS watches geared toward activity and fitness use cases: http://www.dcrainmaker.com/blog


Agreed, the best thing you can do is Google "_____ product review" and look for credible reviewers, someone with a reputation to protect. I worked with a product company where v1 had some product flaws and started receiving negative reviews, but it was very easy to overwhelm those with fake positive reviews across the web. Then when we came out with a much better v2, we got a scientist in our field with some following to do a short positive video review. That is difficult to get (I'm not talking about YouTube/Instagram/mommy blogger reviewers who give positive reviews to anyone who will send them free product) and there's no way he would have put his name on the line unless he had tried out the product on his own extensively. Any product company worth its salt should know this and get a few of these expert reviews posted to the web.


Great idea. This sounds like a job for an app, since finding and following reviewers you can trust manually sounds like a chore.


How do you know you can trust the app, though?


Maybe somebody can write up a review.


We could have an app to follow the ones we trust.


It's turtles all the way down.


The app would need to have its own reputation to protect (i.e. a brand). I think the best approach for this is an app with a large community behind it. For a while, that seemed to be Epinions. I'm not really sure why it didn't work out for them.


Transparency. The app would need to be open enough about how it found and vetted trusted reviewers for people to take it seriously.


No problem. They will find vendors to review the app to bump it up in the app store.


Funny thing, the site linked in the OP is sort of an attempt at that. They look at a type of product, look at reviews, narrow down to a few different models, and test them out, and basically say "get this one."

I actually use their sister site the Sweethome more than the Wirecutter. I tend to be an obsessive prepurchase researcher, and so far I've been happy with their recommendations.


The Wirecutter is great for finding the best product. It's less focused on deals, though.


This has been my approach to following the news as well. Bias can be subtle, and poor reporting isn't always easy to spot until you see another piece. If I follow a couple of specific people, though, I know how to place what they say in context. It helps.

What I find interesting is that the Internet has long been acclaimed for its ability to give anyone an equal voice and to remove focus from authority and context, putting the focus on just the here-and-now content. But here we are lamenting the fact that context and credibility of the source are actually incredibly useful.


Amusingly, there is an obviously-fake review for exactly the same product as your scathing example.

http://www.amazon.com/review/R1UCB6JLZ6TRTN/


I own Java How to Program by Deitel, which he references in the review of the Python one by the same company. It's terrible. I actually literally, not figuratively, used it to get to sleep on one occasion. It's huge, which is a bad thing. It only deals with language features, which is worse. It has lots of examples, which would be a good thing if they weren't so verbose and boring and could fucking get to the point.

If you read this, not only should you not buy a Deitel programming book, but you should tell all your friends not to buy one as well.


This is what I try to do. I review website builders at sitebuilderreport.com, but I actually do in-depth reviews (for example, I pay for each website builder with my own credit card to test their cancellation policies).

The hard part (I think) can be establishing that trust: getting the reader to understand that hey, I'm a real guy who's really spent a lot of time looking at website builders.


Does anyone know how to find good reviewers on Amazon?


This is essentially the idea behind The Wirecutter.


Half the time the fakes are so ludicrously easy to spot that I really wonder why they bother. Fifty glowing reviews of a book within a day of publication all raising the same 4 esoteric points in the review. No doubt 4 points suggested by the author/publisher when asking for the astroturfing.

Completely disregard all Amazon Vine reviews.

Now weight heavily in favour of reviews that have had update edits - "it failed after 2 months. The replacement failed 3 months after that" (common on all makes of $200 LED gaming keyboards) or "after a year it's still in perfect condition". Photos on the reviews are often a good sign too, but only when there are just a few; photos on reviews are rare, so lots with similar framing would be an instant strike.

Try finding a CAT6a cable on Amazon - seems they're all 4.5 - 5* wonderful, yet dig into the bad reviews a bit and most of them don't even meet wire or shield spec. Even those coming from Amazon themselves.

Aside from reviews from specific, trustworthy people, reading the mid-tier and bad reviews gives a much better flavour of what a product is about, and the odd review tearing the product to pieces can be worth its weight in gold.

If it's a marketplace seller my starting point is to mistrust all reviews and let them earn trust really slowly.

I sure won't be going near Fakespot, especially when they're so intentionally vague in describing their own product - it'll be a score to game like your Klout score and all the other meaningless metrics. The astroturfers will adjust what's needed in the fakes to pass.
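
Those rules of thumb are mechanical enough to write down as a weighting function. A rough sketch - every field name and multiplier below is invented for illustration, not tuned against anything:

    def review_weight(review):
        """Weight one review per the heuristics above; all fields hypothetical."""
        if review["is_vine"]:
            return 0.0                 # completely disregard Vine reviews
        w = 1.0
        if review["has_update_edit"]:
            w *= 3.0                   # "failed after 2 months" follow-ups are gold
        if review["has_photos"]:
            # A few photos are a good sign, but many reviews on one listing
            # with similarly framed photos would be an instant strike.
            w = 0.0 if review["photo_framing_matches_others"] else w * 1.5
        return w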


> Half the time the fakes are so ludicrously easy to spot that I really wonder why they bother.

They only need to be just difficult enough that a robot can't flag them as fake. Most people will just sort products by a combination of price and rating and buy the top-rated thing in their budget; it doesn't matter if the 5-star reviews are all Markov-chain-generated garbage, because all users see are the stars.


I know I'm probably atypical - a techie and old enough to be cynical of most things - but are most people really still that simplistic? Surely 3 or 4 crap purchases, or a 25%+ return rate, is enough to tell anyone that the star rating is worthless?


If it's got say 300+ reviews it's not like people are going to read all 300+ to figure out how many of them are fake.


I actually avoid the reviews with photos, as 99% of the time they end with "I got this product free for my fair and unbiased appraisal" or some such.

I've never quite got around to writing a browser plugin that excludes any review with such a statement, and/or builds up a database of reviewers on Amazon who are trustworthy.

The guys in the top X of reviewers, who receive all this crap to review - it's basically their living - aren't going to start bad-mouthing it.

Maybe take the star rating of each review within a month of a new listing and then compare the average 6 months later to see how shill those initial reviewers are.
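
That comparison is nearly a one-liner once you have dated ratings. A sketch, assuming a purely illustrative (days_since_listing, stars) input format:

    from statistics import mean

    def shill_gap(dated_ratings, early_days=30, later_days=180):
        """Early-window average minus the long-run average.

        dated_ratings: list of (days_since_listing, stars) tuples. A large
        positive result suggests the launch-window reviews were inflated.
        """
        early = [s for d, s in dated_ratings if d <= early_days]
        later = [s for d, s in dated_ratings if d >= later_days]
        if not early or not later:
            return None  # not enough history to compare yet
        return mean(early) - mean(later)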


> Maybe take the star rating of each review within a month of a new listing and then compare the average 6 months later to see how shill those initial reviewers are.

Steam just updated its user review system to do just this: it shows the overall rate of positive reviews as it did before, but also the positive rate in the last 30 days. Their prime motivation was probably more to do with things like patches (or the lack thereof), but it would probably do a pretty good job of counteracting paid/biased reviews from around launch as well.


A moving average might be a good idea there.

Giving Vine reviews a strike probably cuts all the "I received this item on spec..." garbage, with or without photos. What's left is photos from hopefully "normal" reviewers.
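
The moving average itself is trivial - the hard part is trusting the timestamps. A sketch over the last N ratings rather than a calendar window:

    def moving_average(ratings, window=30):
        """Trailing average over a chronologically ordered list of star ratings."""
        out = []
        for i in range(len(ratings)):
            recent = ratings[max(0, i - window + 1):i + 1]
            out.append(sum(recent) / len(recent))
        return out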


It might be interesting to compare a few products where you are knowledgeable and have a high degree of confidence in the review quality (one way or another) and see how you align with the Fakespot score.


The point of trusting Amazon reviews is to lower the search time for an item. You basically have to read the reviews of every 4-5 star rated item if you don't trust the Amazon ratings.


That's entirely the point, though: especially since free-for-alling the marketplace, Amazon have proved their rating is worse than useless. If you trust the star rating without reading the reviews of every item, you have more money than sense, or it's a £1 item.

I basically do have to read the reviews of every item I buy from Amazon, or buy somewhere more trustworthy - which is most places of enough size to be well known.

That's really sad, because in the early days of Amazon, before marketplace, Vine and unchecked shills, their rating was one of the better metrics going. They threw away every single ounce of trust they had earnt. Apparently intentionally.

IMDB ratings are the same - so untrustworthy as to be worthless noise. Early IMDB, before Amazon, I could trust the ratings blind.


With a lot of ratings, it feels like the rater could be coming from a completely different perspective.

If you give 5 stars to an erotic novel, you can reasonably assume the buyer understands the genre and won't compare it to Jane Austen. Technically Fifty Shades of Grey is a better erotic novel than Pride and Prejudice, but that is a silly comparison to make.

A £300 laptop can be very good value for money and deserve 5 stars. But it may not be as good as a £500 machine with 3 stars. The problem is that people don't understand the product and buy purely based on the number of stars.


I no longer trust Amazon's reviews. They're so obviously loaded with fake reviews. I try to be smart about it and sort by most recent, but I've been burned on a few products recently as well.

I recently met a shady guy who has personally paid for tons of fake reviews on Kindle books. After hearing from him how easy it is, I really lost trust.


It's not just the reviews that are fake. I recently tried to buy a plush Baymax doll for my daughter, and there were so many people complaining about getting bad Chinese knockoffs from the sellers that I gave up. They have not only lost control of their reviews, they've lost control of the vendors. Their whole storefront operation is circling the bowl, and I say this as a longtime Prime subscriber.


Ditto here. Amazon is almost as bad as circa-2004 eBay these days. It's impossible to find genuine products in many categories. I ordered a specific tool a few months ago, and a knockoff came in the mail via "ePacket" from China.

My assumption for most products these days is that if Amazon is cheaper than a retailer, it's grey market.

IMO, they did a Walmart -- at some point the mission shifted from serving the customer to total domination at any cost.


I don't think so. Anything shipped from and sold by Amazon I know will be legit. I don't buy anything third party if I can't verify it's authentic.

eBay is much worse in this regard, as there is nobody I can trust to have properly sourced their inventory.



Buying from Amazon is NOT a guarantee of authenticity. Popular items such as PlayStation controllers and Beats headphones have a bad reputation for being knockoffs.


Yes, it is. If it is shipped and SOLD by Amazon Inc., it is pretty much legit. There is, however, also merely "Fulfilled by Amazon", which is basically selling through a third party while Amazon handles the shipping.


I understand the difference between SOLD by Amazon and Amazon merely doing fulfillment. Products SOLD by Amazon are still not 100% guaranteed legit. That being said, I have not had any problem returning/getting credit for knockoff (or questionable) products.


The only way that can happen, I think, is when someone does a bait-and-switch with a return.


That is certainly one way. If Amazon is getting product from multiple suppliers, a single bad vendor can easily taint their inventory. Then there is always the human element. Amazon has a reputation as a somewhat 'crummy' employer. Warehouse workers, when pressed to meet order/put-away quotas, tend to put inventory in pick locations based on the product type, NOT where the warehouse management system directs them to. (This of course causes all kinds of stock-level issues, not to mention accounting issues for those third-party suppliers that Amazon is acting as a sell-through for.) I do not believe that Amazon is intentionally buying knock-off products with the intent to scam customers, but just based on their physical size and the variety of products that pass through their facilities, it is going to happen.

Back in the late '90s and early 2000s, before Amazon was as big as they are now, I worked in the commercial music distribution arm of Sony/BMG. We were the ONLY people manufacturing (in our own facilities) CDs for bands on our music labels, and yet bootleg copies of CDs were going out Amazon's doors to customers. How? Amazon doesn't keep an abundance of stock on hand. If they ran too low, they would buy product from One Stops (at a higher price than we sold to them) because the One Stops could get several thousand CDs to them in a few hours, while an order from our warehouse might take 36 hours to arrive. It was a stopgap to keep them from going out of stock. Those One Stops should only have been getting their product directly from us as well, but no one watched their inventory as closely, and they would at times buy product from unofficial manufacturers. Those One Stops were trying to scam people (mostly mom-and-pop CD retailers).

Again, I have not had any problem returning product to Amazon.


>> Warehouse workers, when pressed to meet order/put-away quotas, tend to put inventory in pick locations based on the product type, NOT where the warehouse management system directs them to

Have you got any source for this being a real problem? It seems such things would be easily detected at the packing station, by barcode. And if an employee does it often, he probably won't stay long.


I'm only speaking as an IT professional who has spent a couple of decades working in various distribution environments. If you are a pool manager in a warehouse, this is a real problem. Warehouse employees working the receiving docks seem to always find a way to empty a truck at the end of a shift, even if that means shoving pallets of product into the nearest open rack slot. Yes, the WMS will eventually catch the problem and track it back to the employee. In a warehouse working with food/consumables there are lot controls that are supposed to be adhered to (for tracking and recall purposes). The barcodes on a 'quality' knockoff are going to be the same as on the original product. Warehouse workers don't care if the product in a pick location is genuine or not. They are managed/measured on their picks per hour and accuracy (accuracy based on where they were directed in the warehouse to pick and the correct quantity).

Most warehouse employees come through temp agencies, and their work ethics can be, though certainly not always, somewhat lacking. Honestly, they typically aren't treated with respect from the start. This article is fairly representative of what I have seen in at least 2 of the 3 warehouse environments I have been associated with: http://www.motherjones.com/politics/2012/02/mac-mcclelland-f...


They commingle many commodities that are fulfilled by Amazon. Even when the packaging looks good, you're at risk.


Generally, I only buy from Amazon when:

1) Amazon itself is the seller

2) The manufacturer is the seller

3) It's a sufficiently uncommon item that the odds of getting a counterfeit are low

Following these rules, I generally haven't had any problems with Amazon purchases.


Following no rules at all, except sometimes "must be a Prime item," I've bought maybe 200 items from Amazon in the past year, and I don't really recall having a problem with any of them.

I'm sure I've had something that wasn't what I had hoped, but I truly don't remember. On the other hand, I can think of dozens of things that were great. Electronics, tools, consumables, toys.

Not to mention, Amazon pretty much always will refund you.

You guys sound a little worked up over next to nothing, honestly.


Trying to buy things like batteries or SD cards is just an invitation for fakes. The only possible way I might buy one is if it is sold by and fulfilled by Amazon.


Not even. Many have commented that "fulfilled by" means very little nowadays, too. The inventory for a product/SKU is just thrown together. Amazon doesn't care about selling you the specific item the vendor sent them, just 'that item'.

I bought a MacBook Pro charger from Amazon, sold by Amazon. I got a clearly counterfeit charger that came in, I kid you not, two Ziploc bags (one for the cord, one for the unit).


Sold by and fulfilled by? I'd imagine if Amazon sold it, not just fulfilled it, they'd be very interested in knowing that and looking into it. Unless Amazon's really totally given up...

(I dropped prime when they blacklisted competitors' products: Chromecast and Apple TV.)


They can come directly from Apple like that. We get our chargers straight from Apple's Carlisle, Pennsylvania warehouse and they come in a zip-lock bag. It could still be counterfeit, but just because it comes in a plastic bag does not mean that it is.


In this case the text on the wall wart didn't match the text on the one I got with my MBP, so I'd consider it unlikely to be genuine.


Doesn't matter terribly much, everything trends to 4.5 stars anyway.

I have come to trust that if an item has tens of thousands of reviews, few 1-star reviews, and some useful critique/criticism in the reviews, that while it may not be the best [vacuum|pot|canoe] I'll probably do just fine with it. Much like with Android vs iOS, there are real differences but both have a very high quality level and the non-professional vacuum user is really fine with either choice.

If I really, really, really care about the exact details (e.g. digital cameras, computers) I'll go elsewhere, simply because storefront reviews are never a good source of side-by-side tests & sophisticated, in-depth analysis.


Always read the 3-star reviews.

Those are usually people who have given some thought to the good and bad about a product.


This is what I do also. I go for the 3-star. If I read a 3-star that seems legit and indicates the person really evaluated the product, I even look at their other reviews, as their 4 and 5 star reviews might be useful.


> If I really, really, really care about the exact details (e.g. digital cameras, computers) I'll go elsewhere, simply because storefront reviews are never a good source of side-by-side tests & sophisticated, in-depth analysis.

Honest question: where do you go instead? YouTube? Review blogs? What makes you trust those reviews more than the in-depth reviews you find on storefronts?


All over, actually. I trawl forums, review blogs, & news articles. The more the better.

"Professional" reviewers in particular fill a gap- some of them have had their hands on just about every <vacuum> ever made, so they can provide something frequently lacking on storefront reviews, an opinion well informed by experience with many alternatives.

I also like sites like DxOMark, which provide raw test data that you can review yourself.

The main theme though is just, many sources. Any one person can be bought, but not the whole internet.


I agree about pro/niche reviewers. A few sites I've used: dpreview.com (cameras, though owned by Amazon now, I think), rtings.com (TVs, and they're adding headphones), and sleeplikethedead.com (mattresses). I am sure they have their biases, but they're good places to start. (And you learn a lot about what the various features/materials/buzzwords actually mean.)


I like that "sort by most recent" idea. I'm going to use that.


One potential solution is for Amazon to do a better job of encouraging users to rate every product they buy. That would quickly drown out the fake reviews and make such tactics uneconomical.

Off the top of my head I would suggest that on the post-checkout screen they should list the customer's most recent purchases asking "Please rate your current satisfaction with your recent purchases". Just a quick 5-star rating, a small button to write a textual review if the user wants, and a note clearly stating that their review can be changed later if their opinion changes.

I suggest putting that at post-checkout because it's a point where Amazon still has the user engaged, so asking is not annoying (like their emails currently are), but the user is done shopping so they aren't preoccupied with something else. In other words, it is the least annoying time to ask. They should, of course, also have a rating box on the home page (again, a list of recent purchases, possibly muxing in older purchases at random).

And then weighting reviews is super important. Reviews given by users later in their ownership of the product should be weighted higher than reviews given just after receiving the product. Changed reviews should be weighted higher, as should reviews left on returned items.
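
A minimal sketch of that weighting scheme; the field names and multipliers are invented for illustration, and real weights would need fitting against actual return/complaint data:

    def weighted_rating(reviews):
        """Weighted star average. Each review is a dict with hypothetical
        fields: 'stars' (1-5), 'days_owned', 'was_changed', 'on_return'."""
        total = weight_sum = 0.0
        for r in reviews:
            w = 1.0 + min(r["days_owned"], 180) / 30.0  # later-in-ownership boost
            if r["was_changed"]:
                w *= 1.5          # updated opinions reflect real experience
            if r["on_return"]:
                w *= 1.5          # reviews left on returns signal real failures
            total += w * r["stars"]
            weight_sum += w
        return total / weight_sum if weight_sum else None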


Actually, they do this already. I get e-mailed about once a week from Amazon asking me, "How would you rate <<Thing you bought a month ago>>?"

I frankly find it annoying. It's sad to say, but unless the item was either really good or really bad, I don't feel compelled to review it.

The other problem is that most people probably review their things as soon as they get them. Very few go back and edit their reviews if the thing disintegrates six months later, with the net result that the product has overwhelmingly positive reviews despite having the durability of the Dead Sea Scrolls.


The ones that drive me nuts are the 5-star "I bought it for my husband for his birthday next week" kind of reviews.

If the product hasn't been used for at least a few days/weeks (dependent on the item), then it's not a true review; it's more a reflection of how good Prime is at getting stuff to you on time and with no hassle.


LOL, exactly, but the more annoying ones are in the Questions section.

There is always someone who comments that they don't have the answer.

WTF, just don't post if you don't have the answer.


That's actually Amazon's fault. They send a personalized email to the customer, saying, e.g., "You purchased thing, and another customer has an important question about thing that you should answer by clicking this button." Sophisticates like the HN crowd know that clicking that button to say "n/a" isn't actually helping anyone. Amazon should know that most of its customers are not that sophisticated.


This and the questions that have three or four answers that are all identical.

The email from Amazon makes it appear as though the question is being posed to you directly, and you can't see other answers while replying.

If it just showed answers others had given it would cut down on that a lot.


My comment addressed that:

> I suggest putting that at post-checkout because it's a point where Amazon still has the user engaged, so asking is not annoying (like their emails currently are), but the user is done shopping so they aren't preoccupied with something else.

And I addressed your second point as well in my comment.


As I understand it, those emails aren't generated automatically.

Sellers can trigger them to remind you to post a rating.


The problem with this is that it's really hard to get a user to click a button they don't have to. Additionally, many people already give low weight to reviews that are strictly 5 star ratings, so encouraging more of them probably won't help people determine how good a product really is.


If http://fakespot.com/ can spot fake reviews, Amazon themselves should be able to do it. It's in their long term interest that product reviews are genuine so that buyers have confidence. If I can't trust Amazon reviews, I'll just shop at Target where at least I can handle the physical product before I buy it.


A friend recommended fakespot to me a few months back. I tested it by pointing it to a product that I'd recently reviewed. It came back saying that the reviews were questionable, giving as an example of a shady review the one I'd just written myself!

Needless to say, I wasn't impressed with their heuristics.


How do I know that you're not a shill for fakespot?

How does the person reading this comment know I'm not a fakespot competitor?

It's a tricky business, isn't it?


Good point. But, for pedantry's sake, your analogy is backwards. The person you're replying to is questioning the accuracy of Fakespot, which would make them a competitor. A shill would be endorsing the service, or placing doubt on the critical comment, as you have done.


You're totally right. I have no idea how I got that backwards!


Indeed. How do we know some fake-review-review site isn't corrupted and taking money to assassinate competitors?


There should be a site that reviews and rates the sites that review and rate reviews, so we could finally know which review-review site to go to to find out which reviews are authentic.


But then we'd need another site that reviews and rates sites that review and rate sites that review and rate reviews.


That's good: there will always be another site to build, so our jobs are secure.


It's in Amazon's interest to suppress fake reviews, but it's definitely against Amazon's interest to highlight the fact that their user reviews are often fake. The worst case for Amazon is that users don't trust their reviews, and putting a warning or score on reviews, like Fakespot does, doesn't improve the user's trust.

Amazon does police their reviews (as the article says, they've just sued another batch of companies for posting fake reviews) but they try to do it silently.


I've reported people posting hundreds of shill comments across a company's product line; as far as I can tell, it's nobody's job to care.


What is the rate of false positives with fakespot?

A 3rd party can afford to be over-zealous, but a first party solution can't.


It's not obvious it's in their best interest to block fake reviews. If product A has zero reviews and product B has ten glowing but fake reviews, which one makes more money for Amazon? Sure there's long-term reputation risk, but that's harder to quantify.


I mean, the primary purpose of this blog post isn't to educate the public but to show Amazon the sophistication of their process and convince Amazon M&A to acquire them.


Do you think that Amazon doesn't already filter fake reviews? Also, people are already unlikely to write reviews, how likely do you think they are to keep writing reviews if their reviews keep getting flagged as fake?

[Disclosure: I work at AWS, but I really have no idea how any of this review stuff works]


Is there any way to make sure it only looks at (English) Amazon sites? I tried it, but according to the description it demands access to all sites - which may not be the actual case, though.


Just don't go in the bathroom.


There are other related problems with Amazon's product pages:

Combining different items into one product page and review set. For example, take a look at the reviews for the "Dell UltraSharp U2415 24-Inch" display. There are 6 different-sized monitors on that page. Some have very different technical specs. Combining products makes sense when the difference is color preference or the size of a shirt. But the 24-inch U2415 is a very different display from the 34-inch curved-screen U3415W.

The Question and Answer section just before the actual reviews. I often see nonsense there, such as this exchange about a computer cable:

Q: Can I eat it?

A: I eat mine with ketchup.


Or the ever insightful:

Q: [Any question]

A: I don't know.


That boggled my mind for some time as well, but if you follow the rabbit, the answer is quite simple.

Whenever someone asks a new question, Amazon sends an email to previous buyers of the product. The email is worded in a way that entices a quick reply from the user. I forget the exact language they use, but it's very straightforward, and it seems to make some people think that someone asked them personally. It entices them so much that they actually reply to the email as they would to another person asking them directly... They are just being polite.

What's funny is the irony in all of this. Amazon, a company that famously split-tests nearly everything, manages to send people emails that look so genuine that people respond even when they don't know an answer. And yet the same company can't get the same people to add some genuine product reviews, which would help drown out the fakes.

Of course I don't have the data to know the response rates of these emails. I also don't claim to know that drowning out fakes with more legitimate reviews is the end-all solution. But it does make me wonder why we're even having this discussion, and why we even need sites like Fakespot, while Amazon, sitting on all that data about every one of their users, somehow seems to be unable to put it to use.


That's funny, I had no idea they sent those emails even though I've had Amazon for a long time. Thanks.


I love that Amazon tells me when a reviewer is part of the Vine program, so I can immediately ignore the review. It's funny when a niche product has overwhelmingly negative reviews and a string of 5-star Vine reviews.


The problem with Vine reviews is the users don't actually want or need the product they are reviewing.

When you get something for free that you don't need, it's much easier to like it. When you're actually buying something to use it daily, you become much more critical.


My problem with Vine is that it's good for general stuff like lamps or cords, but when it comes to more intricate things like, say, an embroidery machine or a vinyl cutter (guess what I'm doing as a hobby recently), the reviewers don't have enough domain knowledge to give accurate reviews. A lot of them also seem to feel they have to give good reviews to stay in the Vine program, which I know directly is not true.


The Vine program is a disaster for Amazon reviews' credibility, IMO. The reviewers are almost never invested in the product, and most reviews come down to "this product does the thing it says it will do!", then go on to some largely irrelevant surface details ("looks great, fits just right in my purse") and perhaps compare it to items in a completely different class instead of its actual competition ("these $90 headphones are much better than my iPhone ones"). Some do make a good effort and can be supplementarily useful, but they just don't have the same concerns as people looking to shell out full price.

I understand Amazon's (face-value) motivation for it: decent products get passed by because nobody reviewed them. But once you have one or two decent reviews, all Vine reviews do is add to the noise. Some sectors I've been looking at lately (e.g. USB chargers) are absolutely flooded with them, to the point where it can be hard to find regular reviews. I used to use sorting by average customer review as a quick hack to see some of the best products in a sector; now it is often synonymous with "most spammed". They are killing a key USP.

It's all very well playing the "we are transparent and non-coercive" card, but the reality is people are very unlikely to start rocking the boat when you are handing them free stuff, no matter Amazon's position. And call me sceptical, but don't forget the role of the selection process.

I'm sure it helps sales in the short term, but long term it's a massive own goal. IMO they could salvage it by enforcing a limit of two freebie reviews, or by allowing users to filter them out - especially in the overall star rating.


Are you seeing the actual green Vine Reviewer badge at the top of these comments, or the "I was given this product for free in return for an honest review" text at the bottom that a lot of astroturfers are using? I thought the number of Vine reviews was limited, but it may be up to the manufacturer how many they send out. Vine reviewers don't choose what they review, other than being able to turn items down, IIRC.


I'd guess topic knowledge would also be a big deal. If someone reviews a pocket knife who has never used one in their life, it will be a different review from someone who hunts often and sharpens their own knives.



The other major issue is in bait-and-switching products. A company can put out a great product that generates true 5-star ratings, and then swap out their product with a cheaper version but continue selling it at the 5-star-product-price.


I was looking at bathroom scales recently and noticed this myself. Lots of five-star reviews, many from people who received the product free in exchange for a review. If you look deeper, most of these reviews are superficial. Adjust the review filter a bit and a clearer picture emerges, with one- and two-star reviews from people who use it daily and have noticed a ton of flaws.

The product in question: https://www.amazon.com/gp/aw/reviews/B0113YL5V4/?sortBy=rece...

If you search for glass bathroom scales on Alibaba, you can find this exact model available at a price of $4/unit in volume so it's not surprising if it ends up being not the greatest quality.


I just realized I botched the link. Corrected: http://www.amazon.com/Etekcity-Digital-Weight-Bathroom-Scale...

Shows most recent verified purchases.


Here's just a few examples of listings with mostly fake reviews:

http://www.amazon.com/dp/B01DN9UKHC

http://www.amazon.com/dp/B01D0JDZFO

Most top listings are saturated with fake reviews. If you want to understand the depth of the fake listing problem, look at the thousands of people trying to create new listings in high-competition markets. Search Amazon for:

    "silicone oven mitts"
    "bamboo cutting board"
    "meal prep container"
Click any result, then scroll down to the "Sponsored Products" section for that item. Repeat this, really digging down into the "sponsored" lists. Notice the number of copycats.

Now look at the reviews. For new sponsored products in these categories, virtually all of the entries are from review-mill programs. There can be hundreds of fake reviews on a single listing. Many of the negative reviews you may see on these are fake as well - from competitors!

Review-mill programs work by having the reviewers buy the product through Amazon (they are reimbursed or discounted). Then they do write-ups. Many are short and organic-looking; the more easily spotted ones contain lots of text and multiple photos of the product.


Overall there seems to be an epidemic of "fakeness" in Internet ratings. Even looking at IMDb movie ratings, I see, e.g., a movie rated 7.7/10 whose first few pages of reviews are unanimously 1-4/10, from what seem like genuine reviewers complaining about being deceived by IMDb into watching the movie...


For movies, IMDb average scores sit exclusively between 5 and 8: a Four Point Scale. In contrast, Rotten Tomatoes and Metacritic scores reliably hit the full gamut. (I wrote a full statistical blog post on this with visualizations: http://minimaxir.com/2016/01/movie-revenue-ratings/ )

I did a similar thing for Amazon reviews many years ago but it was not very insightful. (http://minimaxir.com/2014/06/reviewing-reviews/)


Average scores should trend towards the center. I'm not sure if it's an example of the central limit theorem (I think that deals with different distributions tending towards normal), but you'd expect IMDB scores to trend toward the middle.
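
A ten-second simulation of that intuition (pure illustration: votes drawn uniformly from 1-10, which real voters certainly aren't):

    import random
    from statistics import mean

    random.seed(0)
    # Each "movie" gets 500 votes spanning the whole 1-10 scale, yet the
    # per-movie *averages* bunch up tightly around the middle of the scale.
    averages = [mean(random.choices(range(1, 11), k=500)) for _ in range(1000)]
    print(round(min(averages), 2), round(max(averages), 2))  # roughly 5.2 .. 5.8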


The central limit theorem assumes that events are independent and random, which is not the case. (For one, there is a selection bias: people tend to review/vote on a product only if they strongly like or strongly dislike it.)

The lack of a trend toward the middle implies that the scale is not great. But there is no incentive to fix it.


You also need to keep in mind that bandwagon-hopping and fan flame wars/arguments/raids play a large part in these ratings (especially for movies and games on sites like IMDb or Metacritic).

So if something has a large, extremely passionate fandom, ratings can easily be skewed way higher than expected. If it's got a large (or at least vocal) group that seriously hates a movie or game, its ratings will be rock bottom. For example, Call of Duty: Infinite Warfare and its really bad YouTube ratings, or the really, really critical coverage of Paper Mario: Color Splash and Metroid Prime: Federation Force.

I'm not saying fake reviews and ratings don't exist for commercial gain; they do. But an angry (or really fanboyish) internet mob can cause this phenomenon about as much.


Even worse are movie ratings on iTunes, though from reading the reviews I can't tell if they're fake or written by twelve-year-olds who will watch anything that moves (and I'm not entirely convinced it even needs to move). Regardless: bad slasher flick that went straight to the DVD sale bin? Four stars!!!!111! The plot was paper thin, and the acting was terrible, but I still loved it!!! I've watched it like eight times1111

Thankfully, iTunes displays Rotten Tomatoes ratings.


Which is why I always use Metacritic. The issue with Metacritic is that anything with a specific social message will be artificially reviewed higher since critics have a much higher incentive to preserve reputation than random reviewers on IMDB.


For me, the star ratings on IMDb often correlate with my own rating. But Metacritic and Rotten Tomatoes are often skewed in a very different direction, and I really don't care to analyze why they arrive at such a different rating. Also, with video games, Metacritic is pretty useless in my opinion; it's far better to read just a few sites I trust, which give me a better overview than a broad range of bought ("100") or clueless ("20") critic reviews or user reviews.


I completely agree.

I find the IMDb rating 98% accurate. No, I won't necessarily like an 8.1 movie better than an 8.0 movie, but I will like it more than a 7.5 movie. And a 6.9 movie will always be crap to me.

I can think of very few movies where the IMDb rating didn't agree with me at all, in either direction. And it's usually movies with a very social/philosophical/American message. But these are biased even more on the non-IMDb sites, like Rotten Tomatoes!

It might not work for everyone, but it sure works for me.


This reminds me of a recent HN thread about Amazon Echo. I brought up my skepticism that the undisclosed but touted Amazon sales of Echo were really a signal that the device is being adopted and well-liked by the consumer market that Amazon sought to target.

And, for saying anything even slightly negative about a big tech company, I of course got downvoted to oblivion by everyone adding their own anecdata.

Most loudly, at least one commenter linked to the reviews on Amazon itself, some tens of thousands with a 4-ish star average, as if this were definitive proof of something. Even just casually looking through the reviews, it was easy to see that almost all of the glowing, positive reviews were ghost-written, or came from "Top X reviewers" who clearly have a selection bias toward offering positive reviews to reinforce their status as highly ranked Amazon reviewers, or from people who got advance units of the device (again, a selection bias if they are the sort of people who actively seek out new tech to review), and on and on.

What an echo chamber this place has become (no pun intended).


I know people who sell stuff on Amazon, and they don't use fake reviews, but if a competitor does, it makes it hard not to want to use them too. It's a vicious cycle. Plus they get negative reviews out of nowhere; very likely the same competitor is paying to put negative reviews on your page. So it's a double whammy: add a bunch of 1-stars to a competitor's page and a bunch of 5-stars to your own. Basically, the whole system needs to be cleaned up somehow.


What would be ideal is if there were "real reviewers" who are verified and whose reviews carried more weight. Which obviously sucks for the seller if they give a bad review. Maybe there could be a mitigation system where the seller could try to fix something, mark the problem as fixed, and then the reviewer could agree and update their review.


A lot of the "fake" reviews are by real reviewers. It is a common tactic to reimburse reviewers for purchase of the item so they then show up as a verified purchaser.


The problem is: Why should I write a review? I have nothing to gain.

Maybe I have had a bad experience and want to warn others: that would explain mostly bad reviews for almost every product. (Given enough users, some will have bad experiences, even if only from misusing it.)

I can understand that mostly-bad reviews create an incentive to counter them with good reviews, but from the wrong side (the seller, not the consumer). This would explain the many fakes.

I guess reasonably trustworthy 3rd-party sites like Consumer Reports are still necessary.


> Why should I write a review? I have nothing to gain.

Unfortunately I'm now of that mind-set as well.

A few years ago I was in the top-500 of reviewers on Amazon UK, based on 'helpful votes'. I only reviewed things that I had bought for myself and focused on items that had no or very few existing reviews.

Then two things occurred quite quickly:

1. I learned that many top reviewers were receiving free samples as part of Amazon Vine. I asked myself: why am I spending my time doing this without recompense? They were churning out multiple reviews per week and I was doing perhaps three or four a month, but it still felt unfair.

2. People started aggressively scoring reviews negatively for subjective reasons. I remember an airline stewardess marking my review of a book as 'unhelpful' because she said I had insulted her airline. Amazon weren't interested in reverting down-votes like that.

I drifted away from Amazon reviewing after that, and in time I have also drifted away from Amazon. They never fostered their community of honest amateur reviewers (with, say, a gift card at the end of the year) and they're paying the price now.


What is the incentive to post commentary on discussion sites such as HN? For many, writing a review for an exceptional product triggers a type of internal emotional response that ties them to a community.


I write reviews all the time, and to be honest, Amazon makes it pretty easy. Basically if it shipped on time or earlier and was exactly what I expected, I will write a review and say exactly that.


What does shipping have to do with the product's quality? There are two things you buy at Amazon: 1. the physical product, 2. the Amazon service and shipping. The second one should IMHO not be relevant to the product review, yet many people rate it anyway and skew the results.


Yeah, fair, but if it's not a common product that Amazon keeps on its shelves, then it's an important note, I think. I usually only buy stuff on Amazon if I can't find it in a store, so it's usually a somewhat rare or very specific thing.


I have never ordered a product from Amazon that didn't ship on time or wasn't what I expected.

I'm not trying to be negative but I don't see what value the type of review you say you are writing adds.


I bought a smartphone case that looked robust in the picture, but was kinda cheap and didn't perfectly fit the phone, so I noted that in my review.

It's just a dinky review; you're not supposed to do a full product breakdown. If everyone thought of Amazon reviews this way and just gave 30 seconds to a review for every item, then there would be no automated-review problem.


That's fine. It's the other type that people are asking you not to do: didn't ship on time, packaging damaged, delivery person left it in the wrong place, the wrong item was sent, etc.


My thought on this was to develop a generic review system (i.e., one that would work for anything with a URL) that shows you subjective results based on your social graph / trust map and weeds out bad actors by crunching the Advogato trust metric.

The challenge is liquidity, i.e., how do you seed the network and content. But if you can solve that, you can operate a review service for a fraction of the resources currently being wasted on these Red Queen races.
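
Roughly the shape of it in Python. To be clear, this is not Advogato's actual network-flow metric, just a naive hop-decay propagation over a hypothetical "who trusts whom" graph, but it shows why sock puppets with no path from you would carry zero weight:

    from collections import deque

    def trust_scores(graph, me, decay=0.5, max_hops=3):
        # BFS trust propagation from `me`; trust halves at each hop.
        # `graph` maps each user to the set of users they trust.
        scores = {me: 1.0}
        frontier = deque([(me, 0)])
        while frontier:
            user, hops = frontier.popleft()
            if hops >= max_hops:
                continue
            for friend in graph.get(user, ()):
                propagated = scores[user] * decay
                if propagated > scores.get(friend, 0.0):
                    scores[friend] = propagated
                    frontier.append((friend, hops + 1))
        return scores

    def weighted_rating(reviews, scores):
        # Average star ratings weighted by trust. Strangers weigh zero,
        # so a flood of sock-puppet accounts contributes nothing.
        total = sum(scores.get(who, 0.0) * stars for who, stars in reviews)
        weight = sum(scores.get(who, 0.0) for who, _ in reviews)
        return total / weight if weight else None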


> The challenge is liquidity, i.e., how do you seed the network and content. But if you can solve that, you can operate a review service for a fraction of the resources currently being wasted on these Red Queen races.

I've been thinking about this problem a lot too. There's actually many services that have come and gone in the review space. The solution I've come to is that any review service needs to be more than just reviews. It really needs to be more of a community.


Seed with ratings from other review sites, but don't show the copyrighted reviews.


Another problem on Amazon is allowing third parties to "piggy-back" on original listings for items that aren't available directly from Amazon itself; the third party will then either substitute the item or not have the original at all and just ship whatever.

So the original item gets legit great reviews, but the reviewers never mention which seller they bought from. Third parties glom onto the high ratings, and a lot of people end up disappointed with either the fulfillment or the product itself.

This is why I no longer buy unless it is sold by Amazon or Amazon-fulfilled (so I can at least return it). But you pay a premium for that.

Amazon could solve this by just noting in the subject line which vendor the reviewer bought from.


Amazon needs quality control. They sell more and more crap, and you have to spend so much time perusing the reviews to find out whether a product is actually good. Sometimes I prefer to go to Costco, where yes, I deal with long lines, but at least I know what I'm buying is of decent quality, the prices are low, and they have a very generous return policy (most electronics can be returned within 90 days, and anything else you can return whenever you want, with or without the receipt).


Sellers also game the system by offering a refund, without requiring a return, in exchange for reverting a bad review. I've reported this to Amazon but they weren't receptive.


This isn't really 'gaming' the system; it's good customer service. Even one bad Amazon review can be death to a business on Amazon.

I think it's a little ridiculous that so much power is given to buyers who only want to take out their anger on someone, even when it isn't really the fault of the seller.

I used to be an Amazon seller and quit because so many people would leave negative feedback over unrealistic expectations.

It seems to be the M.O. of the millennial generation.


Maybe you could game their gaming of the system by ordering the product on multiple accounts, writing bad reviews of it, and getting all of what they sent you for free.


So change the review, get your refund, then change the review again to write about what happened.


I am more afraid of fake bad reviews.

In the case of electronics, photography gear, and the like, you find people get more consistent results with some products. For instance, charging cables: hypothetically you could buy a $5 third-party cable instead of a $20 OEM part, but any worry that you might have a fire leans you toward the $20 cable.


I liked Amazon better before it became what eBay became. It used to be a warehouse; now it's thousands of merchants competing for exposure, which is what prompts all this.

When I search for a product, I now often get tens of matches, all almost identical, all with different prices and descriptions and equally bad photographs.


It certainly makes it more difficult to compare products. These third-party sellers aren't doing themselves any favours. I recently wanted to purchase a simple phone case (Samsung S7 Edge, so relatively new). ALL of the cases had 4-5 stars, which makes reviews useless during research mode. It was pretty easy to spot patterns in reviews that were essentially paid-for but very annoying.

In the end, I ordered a case (after checking YouTube video reviews) and was contacted by the seller to review their product on Amazon in exchange for 10% off my next purchase with the same seller. BUT not if I was going to give them less than 5 stars; in _that_ case they wanted me to contact them directly.

/end anecdote


There are portals that connect reviewer shills with sellers who want to boost their product ranking. Google "buy amazon reviews" and you'll see it. Some do it individually, on fiverr for example.

The portals themselves are a very lucrative business, and they are not going away anytime soon. There is an arms race between sellers to get that sweet sweet real estate (their products on page 1 or page 2), and business can be a cutthroat matter. Of course, the system will eventually come to a steady state when too many buyers get burned buying crap. Review and feedback systems are a topic I'm very interested in.


I am glad they added verified reviews for people who actually bought the product.

I wish they would take it one step further and add an option to only show verified reviews and the associated stats for those verified reviews.


Verified doesn't help much with spam reviews, the seller gets most of the money back from buying their own product.

My personal filter is to ignore any review that uses any variation of the phrase "found this little gem!".
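
The filter itself is about five lines of Python; the phrase list below is just my own tells, nothing scientific:

    import re

    # Personal red flags, not an exhaustive or validated list.
    SHILL_PATTERNS = [
        re.compile(r"found this little gem", re.I),
        re.compile(r"in exchange for (my|an) honest review", re.I),
    ]

    def looks_shilly(review_text):
        return any(p.search(review_text) for p in SHILL_PATTERNS)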


Clustered timing of reviews is unfortunately not a great metric for determining whether or not they are legitimate. Many - probably most - companies send out emails to buyers asking them to review, no compensation provided. This is pretty effective, and a big driver of legitimate reviews. The timing of those emails is determined by response rates, which tend to be better on specific days - hence the mails are batched. So for some large entities, you'll see reviews clustering on Thursday to Saturday for example.
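
To make the false-positive risk concrete, here is the kind of naive detector that would misfire, sketched in Python (the threshold is invented for illustration):

    from collections import Counter

    def weekday_skew(review_dates):
        # Fraction of reviews landing on the single busiest weekday.
        # A uniform spread would be about 1/7 (~0.14); batched
        # review-request emails push perfectly legitimate products
        # well above that, so skew alone proves nothing.
        counts = Counter(d.weekday() for d in review_dates)
        return max(counts.values()) / len(review_dates)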


Amazon manually checks reviews, which may also cause them to cluster.


And now we get to have a recursive "but what's _your_ bias" discussion. For instance, would a writer for a site that posts professional reviews have a bias (conscious or otherwise) against reviews written by amateurs? Add in the "if that person was good I'd be out of work" bias, and here we are.

Note that I'm talking specifically about the author's critique of the Vine program; obviously the review spam is a real problem.

That aside, I do find Wirecutter to be a good starting place.


Yeah after being tricked by a product or two I've now developed a keen eye for products with fake reviews.

Funny story: a co-worker of mine fell for the fake-review trick and bought a product, talking about its rave reviews. I knew from looking at the reviews that they were fake and he was going to be disappointed. He said "No, look, it has great reviews". I said okay, you're going to be disappointed. And he was disappointed. It was the "Amazon featured" lock laces.


Reviews in general are not a great indicator because the average person writes a terrible review. Either the review is made based on irrelevant details or they lack the perspective necessary to review the item appropriately.

Two recent examples:

There is a cheesesteak restaurant near me that has 4 stars on Yelp. People rave about how they have the best cheesesteaks ever! Yet if you look more closely, reviews from people that have eaten a cheesesteak on the east coast are much lower. After trying it myself, the food is terribly bland.

I was recently looking at coffee makers. The "Ninja" coffee maker has glowing reviews (though I suspect a chunk of them are fake). But the reviewers are comparing this $200 Ninja machine to their $30 Mr. Coffee. Again people that actually like good coffee are rating it much lower. Then looking at the reviews for a different coffee maker, many reviewers are comparing the result to their Aeropress or lower end Espresso machines.


> There is a cheesesteak restaurant near me that has 4 stars on Yelp. People rave about how they have the best cheesesteaks ever! Yet if you look more closely, reviews from people that have eaten a cheesesteak on the east coast are much lower. After trying it myself, the food is terribly bland.

Food is tough in some regions. In California, you get open warfare about the authenticity of Mexican restaurants. In the Midwest, people think Subway is a New York Deli.


Yeah, Yelp is a whole other beast. I've also developed a quick filter for reading those. Plus, food reviews are a lot more subjective. Whereas with Amazon reviews, sometimes all you really want to know is whether or not a product works as advertised. Pair that w/ their questions system and I find Amazon reviews are pretty useful, even with the occasional product fluffed w/ positive ratings.


When it comes to Amazon reviews, let's not forget some of the older examples of them being gamed. Before the internet 'services' and spammers offering to make fake reviews, the same thing was being done by notorious dodgy authors and creators on a more manual level. Like Robert Stanek, a notorious self published fantasy author who posted thousands of fake reviews of his books (and probably still does):

http://conjugalfelicity.com/robert-stanek/gaming-the-amazon-...

For example, his books sold maybe ten copies a week, yet somehow got hundreds of glowing and suspiciously worded reviews on Amazon. Got to the point other sites (and Wikipedia) basically put up warning signs.

So yeah, even before the current crop of review sellers started popping up, this system was being gamed by the authors and creators themselves, using sockpuppets to flood it with fake reviews for their work.


Yes, there are fake reviews, but there is also a bunch of discount-for-review activity. Most of these reviewers leave a disclaimer, and in my experience they often leave 1-3 star reviews.

Now the reason for that is that it's super hard to get going with your product without sending discounted samples to reviewers. Even if you spend tons of money on ads and somehow convince a customer to buy your product without reviews, that customer will never leave a review 99.9% of the time. And if you dare to email the customer to ask for feedback, they will curse you, complain, and report you to Amazon.

That's where things currently are; it's a two-way street. Without any traffic on it, you will end up with very few, shitty products from big companies that have no reason to improve quality or innovate.

Just my 2c from the other side of the fence.


Why is the wirecutter even reading Amazon reviews? Isn't their job to give independent analysis?


How do you start your independent analysis if not by looking at what's popular? Doing your own analysis and reading Amazon reviews are not mutually exclusive.

I find Wirecutter usually accurate on the facts, but I rarely agree with their recommendation of what is good for me. That said, they are very explicit about why they recommend something; I just wish there were other reviewers who shared my set of interests and values in things.


Bear in mind that they also filter out anything they can't get a referral fee for.


If FakeSpot can detect fakes, Amazon should be able to do it even better. The fact that FakeSpot even exists shows that Amazon has decided they've won the online retail competition and there's no need to spend any more resources on improving it.


Does FakeSpot exist? The progress bar hangs on whatever product I try it on.

Meanwhile the "Recommended with honest reviews" products have acquired some one star reviews of their own.

Useful or not? No idea. Doesn't look like it so far.

But this is probably an unsolvable problem.

You could solve it by hiring and paying 100% independent mystery shopper reviewers with unimpeachable integrity, and only reviewing products actually purchased for real money.

I don't think many online merchants would support that.

This is a fundamental problem with scarcity transactions - it's always more profitable to game the system than to play honestly.


Try to buy an in-ear or forehead thermometer on Amazon based on reviews. I've found it impossible, because every one they sell seems to have its reviews tainted by people who received free merchandise in exchange for their "honest review".


This is off-topic, but I feel your pain and have an (unpaid and unaffiliated) recommendation. My wife and I went through about 10 thermometers of all types and price points the first year after our first child was born. They were all of dubious accuracy and, worse, inconsistent.

Then, at the pediatrician's office we noticed them using an inner-ear thermometer. I asked the nurse to see it. It was a Braun Thermoscan. They aren't cheap at ~$70, but we bought one 5 years ago and that was that. It's fast, accurate and consistent. It paid for itself long ago and still works like it was new.

And one more time, because of the context of this comment: I have no affiliation with Braun.


The reviewing system on Amazon gives WAY too much space to the hordes of morons who give products one star because the package was damaged on arrival. As long as that's the case, the average rating won't mean anything.


The other problem is that companies provide free or heavily discounted products in exchange for a review.

So they are using real users who then with a wink and a nudge provide 4 & 5 star reviews to keep the free products coming.


The only thing the author doesn't seem very familiar with is Amazon's Vine program, which isn't surprising since there aren't a lot of members.

> For example, returns and long-term use aren’t part of the evaluation. When you get something for free, you’re less likely to follow up on breakage concerns or customer service issues. Additionally, if the reviewer didn’t actually buy the product, that person doesn’t take the purchase and shipping processes into consideration.

I don't feel that this is true. People who are selected for Vine aren't random Amazon customers; they are Amazon's top-ranked reviewers. A lot of these reviewers had reviewed hundreds of items before Vine or any paid reviewing programs even came into existence, i.e., they do it for the 'love', so they will discuss price and shipping considerations.

Yes, given Vine's 30-day review deadline, you're not going to see anything initially about long-term use, but a lot of these reviews do get updated if anything about the product changes from the original review.

> You might notice how few of the reviews through Vine and similar programs are negative or even critical. This isn’t a case of reviewers intentionally being dishonest, but rather the result of unconscious positive bias.

I don't disagree here.

> Not paying for an item can make difficulties with that item seem less irritating.

Vine items are now treated as taxable income starting this year, so Vine reviewers are now paying for the items that they select.

> Additionally, reviewers may give their opinions on items for which they have no expertise or real experience and therefore have no frame of reference about how well something works by comparison.

I could be wrong but I feel that available Vine items are based on the customer's past purchase and review history. There's no guarantee that the customer is an expert, but there's some proof that the reviewer is at least familiar with the product category.

I'm sure that there are some bad Vine reviews and bad Vine reviewers. However discrediting the entire program without any data just doesn't seem fair.

(In case it wasn't already obvious, I'm part of the Vine program)


Part of the problem with the Vine program is that Vine reviewers don't have the real need or want that other reviewers have. If I'm looking for a backup battery because it's something I need every day, and I get shipped one and it takes two weeks to arrive and then doesn't work consistently (something that happened to me), then I'm going to write a review that says that. If I'm a Vine reviewer and a backup battery shows up, and I use it a few times and it works fine, I'll probably leave a positive review and then set the item aside. The lack of need or want completely taints the review process.

The other issue is that psychology studies show that being given something ingratiates a person to the giver, which also greatly taints the Vine program.


I think that problem is mitigated by Amazon's algorithm matching you with things that are more likely for you to either really like or use on a regular basis, which leads to updated reviews. I've had some 5 star Vine reviews turn into 1 or 3 star ones after 2-3 months... and even some reverting back to 4-5 stars after a year or so.

> The other issue is that psychology studies show that being given something ingratiates a person to the giver, which also greatly taints the Vine program.

I agree with this. Another problem which the author mentioned is the lack of a review standard. One person's 5 star rating is not equivalent to another person's.


Maybe if Amazon simply posted a score showing how many people bought vs. returned a particular product, and the frequency over time, that information alone would be helpful. However, I'm not sure they would ever share such competitively valuable information directly. They do sort of share it indirectly through the best-sellers lists on the Amazon site, but they don't share details of how they decide what "best-selling" means.
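
The metric itself would be trivial to compute; a sketch in Python over a made-up (month, was_returned) order schema, since Amazon's real order data obviously isn't public:

    from collections import defaultdict

    def monthly_return_rate(orders):
        # orders: iterable of (month_key, was_returned) pairs.
        bought = defaultdict(int)
        returned = defaultdict(int)
        for month, was_returned in orders:
            bought[month] += 1
            returned[month] += int(was_returned)
        # Per-month share of orders that came back.
        return {m: returned[m] / bought[m] for m in sorted(bought)}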


This is a widespread problem across the board, not just on the web. Look at some of the reviews of apps on iOS: very detailed, making a pitch to future users to buy the app. I almost never leave a review unless an app is great or I got really mad at it. I've never felt the need to leave a review if the app simply did the job it's supposed to do.


My idea: instead of having reviews, why not show the percentage of orders that were returned? That would give you a sense of the quality.

Another idea is to randomly select customers who actually ordered the product and invite them to leave a review. Since it's random, you wouldn't have the self-selection problem of people getting paid to leave reviews.
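
A sketch of the sampling idea in Python; the names are illustrative, not any real Amazon API:

    import random

    def invite_reviewers(verified_buyers, k, seed=None):
        # Draw k buyers uniformly at random and invite only them.
        # Sellers can't steer paid shills into a sample they don't
        # control, which is the whole point of randomizing.
        rng = random.Random(seed)
        pool = list(verified_buyers)
        return rng.sample(pool, min(k, len(pool)))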


The sad part about all this is that the "social networks" will drive their bus through this hole.

It will be "Get recommendations from people you trust!!! (sortof)".

And, in short order, we'll be back to the same situation, only more tied-up, gagged and corrupted. ("Hey, pssst, make money fooling you friends!")


Actually, I think that's probably the only workable solution to the whole fake-review problem. The only reviews you should trust are the ones written either by people you already know and trust, or people who are trusted by people you trust, and so on.

Currently, it's really easy to bribe a random person to write an anonymous review. If the maker of some product has to bribe a friend of mine in order to get me to buy that product, it's going to be a lot more work for a lot less gain. Eventually it starts to look a lot like a company bribing all of its customers at once to buy its products, which if they can pull that off and make money, more power to them I guess.

The social media companies have the data and infrastructure to implement some sort of "product reviews weighted by the reviewer's proximity to you in the friend graph" feature, but I don't see users trusting Facebook or LinkedIn to be an impartial source of trustworthy reviews. (Also, a lot of Facebook friends aren't necessarily people you'd use for product recommendations, but that's a separate problem.)
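
For what it's worth, the mechanical part is simple; a minimal Python sketch assuming the friend graph were available (hypothetical data, not anyone's real API), using a hard two-hop cutoff instead of graded proximity weights for brevity:

    def reviews_within_hops(reviews, graph, me, max_hops=2):
        # Keep only reviews written inside your extended friend graph;
        # anonymous strangers, bribed or not, simply don't appear.
        seen = {me}
        frontier = {me}
        for _ in range(max_hops):
            frontier = {f for u in frontier for f in graph.get(u, ())} - seen
            seen |= frontier
        return [(who, stars) for who, stars in reviews if who in seen]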


Oh pleeeeeeze bring Fakespot to App store reviews! I'm not going to whine any more... but pleeeeeeeeze!

I've got a competitor that has made millions making fake reviews to boost up his lousy little app. He's been at it for about 5 years!


As always, it comes down to economics. As long as it's economical for people to run fake-review companies and for companies to buy sets of fake reviews, it is going to happen. So how do we change that?


It's not economical for people to buy them if they get deleted for fraud. Amazon is in a position to use confirmed purchases of a product as a filter. They can also score accounts based on how many legitimate reviews they have; once an account crosses a threshold, all of its reviews get nuked. If an account has 1000+ reviews and rarely purchases anything, you just shadowban it.
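
As a heuristic it's only a few lines; the 1000-review threshold is from above, and the purchase ratio is my own invented number:

    def should_shadowban(account):
        # account: dict with hypothetical 'review_count' and
        # 'purchase_count' fields.
        reviews = account["review_count"]
        purchases = account["purchase_count"]
        # 1000+ reviews while "rarely purchasing anything".
        return reviews >= 1000 and purchases < 0.1 * reviews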


I've seen tons of fake-sounding reviews tagged with "Verified Purchase" lately. I think this is only a small impediment, since the seller gets most of the money back from buying their own product. They just have to have fake reviewers actually buy the product and then reimburse them.


This is the most obvious and lowest-cost way to do it, to the point where you almost start to think Amazon is in on the conspiracy with these fake reviewers because they have something to gain that we are not seeing.


That's the thing I don't get. Amazon wants to have reliable reviews, right? If people have to go elsewhere for reliable reviews, those better-run sites will start affiliate programs with other sellers. That's a real risk for Amazon. I don't really care where I buy dog food or computer accessories or whatever. I live and breathe AWS, so they've got me there, but buying goods from them or target.com or wherever else is ultimately all the same.


TL;DR: Recommendations systems cannot be indifferent to truth.

This article highlights pretty much precisely the same problem that the "social media promote (bogus) conspiracy theories" item yesterday did. Slightly recycled comment.

Some related concepts:

1. Donella Meadows, in Thinking in Systems, notes that an absolute requirement of an effective and healthy system is accurate feedback and information. Media which are indifferent to truth value, or which actively promote distortion (see Robert Proctor's term, agnotology), will actively harm the system.

2. Celine's 2nd law, and its inversion. In Robert Anton Wilson's Illuminatus! trilogy, a character notes that "accurate information is possible only in a non-punishing situation". Its inverse is also true: accurate information is possible only when it is accuracy itself, and ONLY accuracy, which is rewarded. Academic publishing, in which paper output and journal selection is a gateway determinant of professional careers, would be an instance of this. Or the long skew of The Learning Channel from NASA-PBS educational co-production to Honey Boo-Boo broadcaster.

3. Paperclip Maximizer. "Don't be evil" isn't good enough. You've got to actively seek out good. Even a benign maximisation goal will, if not tempered by requirements to provide net benefit, lead to catastrophic results.

4. Mancur Olson's "The Logic of Collective Action" explains how and why small (but motivated) fringe groups can achieve goals directly opposed to the interests of far larger groups. This explains a great deal of market and political dysfunction.

5. A generalisation of Gresham's Law leads to the realisation that understanding of complex truths is itself expensive. It's also (Dunning-Kruger) beyond the capability of much of the population. This has some rather dismal implications, though as William Ophuls notes, political theory based on the assumption that all the children can be above average ("the Lake Wobegon effect") is doomed. You dance with the dunce what brung ya.

Amazon here will probably have to create and fund their own review staff, if only to vet and validate user-supplied reviews. As one HN poster here notes, "what is my incentive to provide reviews?" A good review takes time and experience, is likely to draw flak, and, frankly, often comes from someone with better things to do with their time.

Crowdsourcing is useful where data are thin and the crowd is likely to be unbiased. Where specific expertise is required (Peter Norvig on programming texts, or the Google employee who has uncovered numerous fraudulent USB-C cables), one qualified reviewer trumps millions who know nothing. (But you've got to ensure that the qualified reviewer stays both honest and engaged.)

More generally: the Amazon experience seems to be worsening. Trust is falling to eBay-like levels, if not worse. Search and discovery are poor, and product quality is all over the map. Books are one matter: a highly uniform product with a simple function. Expensive kit of various descriptions is another. Among my recent purchases was an LED cabinet lighting system, for which local, major-chain, and Amazon sources (brick-and-mortar and online) all failed me. Ikea turned out to have a well-thought-out product line, helpful (though ultimately not quite what I was looking for) information online, and, most importantly, an in-store display where I could assess, mix, match, and ultimately assemble the system I'm using and enjoying now. Twice the price of what I'd initially specced out, but that initial spec didn't work in the least.

https://plus.google.com/104092656004159577193/posts/chnJ1Mzc...


I wish there were something similar for the App Store to weed out fake reviews.


Sounds like an opportunity for a fakespot browser extension...


You can filter Amazon reviews by "Verified Purchase" which, unless someone tells me differently, at least means the product was purchased. It's annoying but those reviews feel a bit more honest.


Did you read the article? It specifically mentioned that firms/fake reviewers use Verified Purchases as a method of building authenticity and believability. Specifically:

"The compensated-review process is simple: Businesses paid to create dummy accounts purchase products from Amazon and write four- and five-star reviews. Buying the product makes it tougher for Amazon to police the reviews, because the reviews are in fact based on verified purchases."


It means Amazon facilitated sending the product, which AFAIK also covers the case where companies "give" reviewers free product, so I don't really trust those reviews a whole lot more. Maybe a little more, but it's not perfect.


AFAIK companies can provide discount codes for reviewers to "buy" the product but there is a limit to the discount amount.


Please don't rewrite titles unless they are misleading or linkbait: https://news.ycombinator.com/newsguidelines.html

(Submitted title was "Amazon Reviews Are Gamed by Compensated Reviews by Professional Review Companies".)


I like the original title with the secondary clause, "Let’s Talk About Amazon Reviews: How We Spot the Fakes"...I mean, when I saw the title "Let's Talk About Amazon Reviews" coming from a domain that is not amazon.com -- i.e. it's not a product/dev announcement from Amazon -- I knew it'd be about review fakery, but maybe not everyone is so cynical.


Ah yes, it's clearly better with the subtitle, so we've added that and taken out the superfluous bit ("Let's talk about").


You really think "Let’s Talk About Amazon Reviews" is a better summary than "Amazon Reviews Are Gamed by Compensated Reviews by Professional Review Companies"? At least in the latter, 4/6 of the words in it are not meaningless fluff.


It isn't a better summary in this case (though the article's title is ok once the subtitle is put in), but the rule has to work in general, and in general we aren't able to read all articles closely enough to vet rewritten titles.


That is a much better title than the one on the article.

"Lets Talk About Amazon Reviews" is linkbait IMHO, it fails to cover the topic of the article in the title, only the subject.


Amazon likes to make it look like they are "cleaning up" the fake reviews industry, by suing companies out of existence. But in reality they're just trying to drive sellers to use their own, official "paid review" product. [0]

Granted the Amazon program is more transparent (as the review is affixed with a label that says "user was compensated for this review"), but it's disingenuous at best for Amazon to act like they're taking the moral high ground by shutting down these review sites.

[0] https://www.amazon.com/gp/vine/help


> it's disingenuous at best for Amazon to act like they're taking the moral high ground by shutting down these review sites.

Has Amazon come out and said they're taking the moral high ground? I just assumed they were suing them for breaking their terms of service. Many companies do this if you break them frequently enough.


I think it's safe to assume that the many fluff pieces in the tech media are the result of a PR campaign to paint Amazon as the good guy cleaning up those bad people selling fake reviews ("the moral high ground"). But of course there's no mention of the fact that they've turned a blind eye to them for years, and they're only now shutting them down to eliminate competition for their new paid review product.



