Then there are the professional reviews, which are all conflicted because most reviewers either get to keep the gear or the gear maker is a sponsor of the site/publication and the reviewer knows it. I say this as a person who's still involved in publishing reviews (of outdoor gear, mostly), and who most of the time gets to keep whatever some company sends me.
However rotten you think the adtech industry is, the state of product reviews online -- user generated, casual forum reviews, professional reviews, etc. -- is worse.
I have a standard rant about the current state of the web that I give all my family and friends, and it always ends with: "if you didn't pay to read it, you probably read an advertisement."
As an engineer at perhaps the largest ratings and reviews company worldwide, I would love to hear your standard rant in long form. Please pm me if you feel up to that conversation. Ideally the result would be an improvement in the quality of the content of billions of reviews.
> A basic truth-in-advertising principle is that it’s deceptive to mislead consumers about the commercial nature of content. Advertisements or promotional messages are deceptive if they convey to consumers expressly or by implication that they’re independent, impartial, or from a source other than the sponsoring advertiser – in other words, that they’re something other than ads. (...) As the FTC Endorsement Guides establish, advertisers’ use of third-party endorsements may be deceptive if there is an undisclosed material connection between the advertiser and the endorser – one that might materially affect the weight or credibility of the endorsement
These rules are enforced. For example, Machinima recently settled with the FTC after paying endorsers to include Xbox footage in reviews and commentary.
People are naturally more vocal about complaints, and if they have a serious gripe then they are far more likely to speak up in a negative review.
If they like something, that is basically meeting expectations and there is no urgency to seek resolution. So I expect a larger percentage of positive reviews to be shills, since nobody else will feel much need to post anything.
But if the negative reviews are things I don't care about (or are obvious douche reviews) that tells me a lot.
So, Amazon? Or am I misunderstanding what you mean by this?
Then there's the state of video game reviews, but that's a whole 'nother kettle of fish. Doubly so when mobile apps are involved.
I personally just use Usenet for whatever people use VPNs for, I think. If I were doing something that needed to be kept from the NSA, I guess I would use Tor. Or postcards covertly hidden under rocks in public parks or whatever ;)
But your own comment (that I'm replying to) isn't one, and we're not paying for it - you're speaking to us frankly and off the cuff (and if anything, against your own interests on those other sites) because of the informal club-like forum that this is. In the end I and any other readers here are giving you all due weight. In exchange, everyone else speaks frankly about stuff they know about. You can read stuff here that posters are probably explicitly prohibited by their employers from posting. Some guy from (insert whatever state agency here) can probably post from a throwaway but still waaaaaaaaaaay in contravention of whatever their policies are, and with only a little bit of indirection.
So it's hardly as simple as you suggest. Yeah, you have to read through the bullshit, but this has always been true everywhere. Caveat emptor.
1. Foremost: does the product meet my need?
2. I ONLY take the negative reviews seriously and look for commonality between them. If I can live with those issues, then I purchase the product.
3. I'm also cautious of products where users appear a bit overzealous to correct bad reviews.
All the time, I think they do a decent amount of A/B testing on product pages (which would make sense). I looked into scraping reviews and doing some analysis for this exact reason.
He has both glowing reviews such as this one: http://www.amazon.com/gp/review/R30AQNQK2I1O7P?ref_=glimp_1r...
And scathing ones such as this one: http://www.amazon.com/gp/review/R2C7L5KHUVHOR2?ref_=glimp_1r...
By following several reviewers I respect in each domain I care about, it's not hard to find quality texts (or products) that I'd never have found by typing words into Amazon's search box and looking at average review scores.
Having Amazon test every cable obviously doesn't scale either but couldn't they require the vendors to provide test results showing that the cable is spec compliant? Of course the vendor could forge the results, but that's intentionally malicious and this would at least catch non-malicious accidents.
Electricity goes through it. Check. UL listed? Check. Okay, goes on the suggestions page.
I actually use their sister site the Sweethome more than the Wirecutter. I tend to be an obsessive prepurchase researcher, and so far I've been happy with their recommendations.
What I find interesting is that the Internet has long been acclaimed for its ability to give anyone an equal voice and to shift focus away from authority and context onto just the here-and-now content. But here we are, lamenting the fact that context and credibility of the source are actually incredibly useful.
If you read this, not only should you not buy a Deitel programming book, but you should tell all your friends not to buy one as well.
The hard part (I think) can be establishing that trust: getting the reader to understand that hey, I'm a real guy who's really spent a lot of time looking at website builders.
Completely disregard all Amazon Vine reviews.
Now weight heavily in favour of reviews that have had update edits - "it failed after 2 months. The replacement failed 3 months after that" (common on all makes of $200 LED gaming keyboards) or "after a year it's still in perfect condition". Photos on the reviews are often a good sign too - but only when there's just a few - photos on reviews are rare. So lots with similar framing would be an instant strike.
Try finding a CAT6a cable on Amazon - seems they're all 4.5 - 5* wonderful, yet dig into the bad reviews a bit and most of them don't even meet wire or shield spec. Even those coming from Amazon themselves.
Aside from reviews from specific, trustworthy people, reading the mid-tier and bad reviews gives a much better flavour of what a product is about, and the odd review tearing the product to pieces can be worth its weight in gold.
If it's a marketplace seller my starting point is to mistrust all reviews and let them earn trust really slowly.
I sure won't be going near Fakespot, especially when they're so intentionally vague in describing their own product - it'll be a score to game like your Klout score and all the other meaningless metrics. The astroturfers will adjust what's needed in the fakes to pass.
They only need to be slightly difficult for a robot to flag as a fake. Most people will just sort products by a combination of price-and-rating, and buy the top-rated thing in their budget; it doesn't matter if the 5-star reviews are all Markov-chain generated garbage, all users see are the stars.
I've never quite got around to writing a browser plugin that excludes any review that has such a statement, and/or builds up a database of reviewers who are trustworthy on Amazon.
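The core of such a plugin would be simple. A minimal sketch, assuming reviews arrive as dicts with a `text` field (all names and disclosure phrases here are hypothetical examples, not an actual plugin API):

```python
import re

# Common "free product" disclosure phrasings (illustrative, not exhaustive).
DISCLOSURE_PATTERNS = [
    r"in exchange for (my|an) honest review",
    r"received (this|the) (product|item) (for free|at a discount)",
    r"free of charge for (my|an) (honest|unbiased) review",
]
DISCLOSURE_RE = re.compile("|".join(DISCLOSURE_PATTERNS), re.IGNORECASE)

def filter_reviews(reviews):
    """Keep only reviews that contain no disclosure statement."""
    return [r for r in reviews if not DISCLOSURE_RE.search(r["text"])]

reviews = [
    {"stars": 5, "text": "I received this product for free in exchange for my honest review. Love it!"},
    {"stars": 2, "text": "Failed after two months. The replacement failed three months later."},
]
print(len(filter_reviews(reviews)))  # 1
```

The trustworthy-reviewer database half is the harder part; the phrase filter alone would already strip most incentivized reviews.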
The guys in the top X of reviewers, who receive all this crap to review - it's basically their living - aren't going to start badmouthing it.
Maybe take the star rating of each review within a month of a new listing and then compare the average 6 months later to see how shill those initial reviewers are.
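That heuristic could be sketched as follows - a minimal Python sketch, assuming reviews come as dicts with `stars` and `date` fields (all names and thresholds are hypothetical):

```python
from datetime import date, timedelta
from statistics import mean

def shill_score(reviews, listing_date, early_window_days=30, later_after_days=180):
    """Average star rating of reviews from the listing's first month minus
    the average of reviews posted six months or more after listing.
    A large positive gap suggests the early reviews were inflated."""
    early = [r["stars"] for r in reviews
             if (r["date"] - listing_date).days <= early_window_days]
    later = [r["stars"] for r in reviews
             if (r["date"] - listing_date).days >= later_after_days]
    if not early or not later:
        return None  # not enough data to compare
    return mean(early) - mean(later)

listed = date(2016, 1, 1)
reviews = [
    {"stars": 5, "date": listed + timedelta(days=3)},
    {"stars": 5, "date": listed + timedelta(days=10)},
    {"stars": 2, "date": listed + timedelta(days=200)},
    {"stars": 3, "date": listed + timedelta(days=220)},
]
print(shill_score(reviews, listed))  # 2.5
```

A score near zero would mean the launch-window reviews held up; a big positive score flags listings whose glowing early reviews never matched later experience.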
Steam just updated its user review system to do just this: it shows both the overall rate of positive reviews as it did before, but also the positive rate in the last 30 days. Now, their prime motivation was probably more to do with things like patches (or the lack thereof), but it would probably do a pretty good job of counteracting paid/biased reviews from around launch as well.
Giving Vine reviews a strike probably cuts all the "I received this item on spec..." garbage, with or without photo. What's left is photos from hopefully "normal" reviewers.
I basically do have to read the reviews of every item I buy from Amazon or buy somewhere more trustworthy. Which is most places of enough size to be well known.
That's really sad because in the early days of Amazon, before marketplace, Vine and unchecked shills, their rating was one of the better metrics going. They threw away every single ounce of trust they had earnt. Apparently intentionally.
IMDB ratings are the same - so untrustworthy as to be worthless noise. Early IMDB, before Amazon, I could trust the ratings blind.
If you give 5 stars to an erotic novel, you can reasonably assume that the buyer will understand the genre and not compare it to Jane Austen. Technically Fifty Shades of Grey is a better erotic novel than Pride and Prejudice, but that is a silly comparison to make.
A £300 laptop computer can be very good value for money and deserve 5 stars. But it may not be as good as a £500 machine with 3 stars. The problem is that people don't understand the product and buy purely based on the number of stars.
I met a shady guy recently who has personally paid for tons of fake reviews on Kindle books. After hearing from him how easy it is, I really lost trust.
My assumption for most products these days is that if Amazon is cheaper than a retailer, it's grey market.
IMO, they did a Walmart -- at some point the mission shifted from serving the customer to total domination at any cost.
EBay is much worse in this regard as there is nobody I can trust to have properly sourced their inventory.
Back in the late 90's, early 2000's, before Amazon was as big as they are now, I worked in the commercial music distribution arm of Sony/BMG. We were the ONLY people manufacturing (in our own facilities) CDs for bands on our music labels, and yet bootleg copies of CDs were going out Amazon's doors to customers. How? Amazon doesn't keep an abundance of stock on hand. If they ran too low, they would buy product from One Stops (at a higher price than we sold it to them) because the One Stops could get several thousand CDs to them in a few hours, while an order from our warehouse might take 36 hours to get there. It was a stop gap to keep them from going Out Of Stock. Those One Stops should only have been getting their product directly from us too, but no one watched their inventory as closely, and they would at times buy product from unofficial manufacturers. Those One Stops were trying to scam people. (Mostly Mom and Pop CD retailers)
Again, I have not had any problem returning product to Amazon.
Have you got any source for this being a real problem? Because it seems such things are easily detected at the packing station, by barcode. And if an employee does so often, he probably won't stay long.
1) Amazon itself is the seller
2) The manufacturer is the seller
3) It's a sufficiently uncommon item that the odds of getting a counterfeit is low
Following these rules, I generally haven't had any problems with Amazon purchases.
I'm sure I've had something that wasn't what I had hoped, but I truly don't remember. On the other hand, I can think of dozens of things that were great. Electronics, tools, consumables, toys.
Not to mention, Amazon pretty much always will refund you.
You guys sound a little worked up over next-to nothing, honestly.
I bought a Macbook Pro charger from Amazon, sold by Amazon. I got a clearly counterfeit charger that came in, I kid you not, two Ziplock bags (one for the cord, one for the unit).
(I dropped prime when they blacklisted competitors' products: Chromecast and Apple TV.)
I have come to trust that if an item has tens of thousands of reviews, few 1-star reviews, and some useful critique/criticism in the reviews, that while it may not be the best [vacuum|pot|canoe] I'll probably do just fine with it. Much like with Android vs iOS, there are real differences but both have a very high quality level and the non-professional vacuum user is really fine with either choice.
If I really, really, really care about the exact details (e.g. digital cameras, computers) I'll go elsewhere, simply because storefront reviews are never a good source of side-by-side tests & sophisticated, in-depth analysis.
Those are usually people who have given some thought to the good and bad about a product.
Honest question: where do you go instead? YouTube? Review blogs? What makes you trust those reviews more than the in-depth reviews you find on storefronts?
"Professional" reviewers in particular fill a gap- some of them have had their hands on just about every <vacuum> ever made, so they can provide something frequently lacking on storefront reviews, an opinion well informed by experience with many alternatives.
I also like sites like DxOMark, which provide raw test data that you can review yourself.
The main theme though is just, many sources. Any one person can be bought, but not the whole internet.
Off the top of my head I would suggest that on the post-checkout screen they should list the customer's most recent purchases asking "Please rate your current satisfaction with your recent purchases". Just a quick 5-star rating, a small button to write a textual review if the user wants, and a note clearly stating that their review can be changed later if their opinion changes.
I suggest putting that at post-checkout because it's a point where Amazon still has the user engaged, so asking is not annoying (like their emails currently are), but the user is done shopping so they aren't preoccupied with something else. In other words, it is the least annoying time to ask. They should, of course, also have a rating box on the home page (again, a list of recent purchases, possibly muxing in older purchases at random).
And then weighting reviews is super important. Reviews given by users later in their ownership of the product should be weighted higher than reviews given just after receiving the product. Changed reviews should be weighted higher, as well as reviews on returns.
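A minimal sketch of that weighting scheme, with entirely arbitrary weight values chosen for illustration (Amazon's actual data model is assumed, not known):

```python
def weighted_rating(reviews):
    """Weighted average star rating. Weights are illustrative:
    long-term ownership, edited reviews, and reviews tied to a
    return all count for more than a day-one first impression."""
    total = weight_sum = 0.0
    for r in reviews:
        w = 1.0
        if r["days_owned"] >= 90:
            w *= 2.0       # long-term experience
        if r["was_edited"]:
            w *= 1.5       # opinion revisited over time
        if r["returned"]:
            w *= 2.0       # strong signal about real-world failure
        total += w * r["stars"]
        weight_sum += w
    return total / weight_sum

reviews = [
    {"stars": 5, "days_owned": 2,   "was_edited": False, "returned": False},
    {"stars": 1, "days_owned": 120, "was_edited": True,  "returned": True},
]
print(round(weighted_rating(reviews), 2))  # 1.57
```

Here the day-one 5-star review gets weight 1.0 while the long-term, edited, returned 1-star review gets weight 6.0, dragging the displayed rating well below the naive average of 3.0.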
I frankly find it annoying. It's sad to say, but unless the item was either really good or really bad, I don't feel compelled to review it.
The other problem is most people probably review their things as soon as they get it. Very few go back and edit their reviews if the thing disintegrates six months later, with the net result that the product has overwhelmingly positive reviews despite having the durability of the Dead Sea Scrolls.
If the product hasn't been used for at least a few days / weeks (dependent on the item), then it's not a true review; more a reflection on how good Prime is at getting stuff to you on time and with no hassle.
There is always someone who comments that they don't have the answer.
WTF, just don't post if you don't have the answer.
The email from amazon makes it appear as though the question is being posed to you directly, and you can't see other answers while replying.
If it just showed answers others had given it would cut down on that a lot.
> I suggest putting that at post-checkout because it's a point where Amazon still has the user engaged, so asking is not annoying (like their emails currently are), but the user is done shopping so they aren't preoccupied with something else.
And I addressed your second point as well in my comment.
Sellers can trigger them to remind you to post a rating.
Needless to say, I wasn't impressed with their heuristics.
How does the person reading this comment know I'm not a fakespot competitor?
It's a tricky business, isn't it?
Amazon does police their reviews (as the article says, they've just sued another batch of companies for posting fake reviews) but they try to do it silently.
A 3rd party can afford to be over-zealous, but a first party solution can't.
[Disclosure: I work at AWS, but I really have no idea how any of this review stuff works]
Combining different items into one product page and review. For example, take a look at the reviews for the "Dell Ultrasharp U2415 24-Inch" display. There are six differently sized monitors on that page. Some have very different technical specs. Combining products makes sense when the difference is color preference or the size of a shirt. But the 24-inch U2415 is a very different display from the 34-inch curved-screen U3415W.
Question and Answer section just before the actual reviews. I often see nonsense such as a question about a computer cable:
Q: Can I eat it?
A: I eat mine with ketchup.
Q: [Any question]
A: I don't know.
Whenever someone asks a new question, Amazon sends an email to previous buyers of the product. The email is worded in a way that entices a quick reply. I forget the exact language they use, but it's very straightforward, and it seems to make some people think that someone asked them personally. It entices them so much that they actually reply to the email as they would to an email from a person asking them directly... They are just being polite.
What's funny is the irony in all of this. Amazon, a company that famously split tests nearly everything, manages to send people emails that look so genuine they get them to respond, even if they don't know an answer. And yet, the same company can't get the same people to add some genuine product reviews, which would help drown out the fakes.
Of course I don't have the data to know the response rates of these emails. I also don't claim to know that drowning out fakes with more legitimate reviews is the end-all solution. But it does make me wonder why we're even having this discussion, and why we even need sites like Fakespot, while Amazon, sitting on all that data about every one of their users, somehow seems to be unable to put it to use.
When you get something for free that you don't need, it's much easier to like it. When you're actually buying something to use it daily, you become much more critical.
I understand Amazon's (face value) motivation for it: decent products that get passed by because nobody reviewed them. But once you have one or two decent reviews, all they do is add to the noise. Some sectors I've been looking at lately (e.g. USB chargers) are absolutely flooded with them, to the point where it can be hard to find regular reviews. I used to use sorting by average customer review as a quick hack to see some of the best products in a sector; now it is often synonymous with "most spammed" - they are killing a key USP.
It's all very well playing the "we are transparent and non-coercive" card but the reality is people are very unlikely to start rocking the boat when you are handing them free stuff, no matter Amazon's position. And call me sceptical but don't forget the role of the selection process.
I'm sure it helps sales in the short term, but long term it's a massive own goal. IMO they could salvage it by enforcing a limit of two freebie reviews per product, or by allowing users to filter them out -- especially in the overall star rating.
The product in question: https://www.amazon.com/gp/aw/reviews/B0113YL5V4/?sortBy=rece...
If you search for glass bathroom scales on Alibaba, you can find this exact model available at a price of $4/unit in volume so it's not surprising if it ends up being not the greatest quality.
Shows most recent verified purchases.
Most top listings are saturated with fake reviews. If you want to understand the depth of the fake listing problem, look at the thousands of people trying to create new listings in high-competition markets. Search Amazon for:
"silicone oven mitts"
"bamboo cutting board"
"meal prep container"
Now look at the reviews. For new sponsored products in these categories virtually all of the entries are from review-mill programs. There can be hundreds of fake reviews on a single listing. Many of the negative reviews you may see on these are fake as well, from competitors!
Review-mill programs work by having the reviewers buy the product through Amazon (they are reimbursed or discounted). Then they do write-ups. Many are short and organic-looking; the more easily spotted ones contain lots of text and multiple photos of the product.
I did a similar thing for Amazon reviews many years ago but it was not very insightful. (http://minimaxir.com/2014/06/reviewing-reviews/)
The lack of trend toward the middle implies that the scale is not great. But there is no incentive to fix it.
So if something has a large, extremely passionate fandom, ratings can easily be skewed way higher than expected. If it's got a large (or at least vocal) group that seriously hates a movie or game, its ratings will be rock bottom. For example, Call of Duty: Infinite Warfare and its really bad YouTube ratings. Or the really, really critical coverage of Paper Mario: Color Splash and Metroid Prime: Federation Force.
Not saying fake reviews and ratings don't exist for commercial gain, they do. But an angry (or really fanboyish) internet mob can cause this phenomenon about as much.
Thankfully, iTunes displays Rotten Tomatoes ratings.
I find the IMDb rating 98% accurate. No, I won't necessarily like an 8.1 movie better than an 8.0 movie, but I will like it more than a 7.5 movie. And a 6.9 movie will always be crap to me.
I can think of very few movies where the IMDb rating didn't agree with me at all, in either direction. And it's usually with movies that have a very social/philosophical/American message. But these are biased even more on the non-IMDb sites, like Rotten Tomatoes!
It might not work for everyone, but it sure works for me.
And, for saying anything even slightly negative about a big tech company, I of course got downvoted to oblivion by everyone adding their own anecdata.
Most loudly, at least one commenter linked to the reviews on Amazon itself, some tens of thousands with a 4-ish star average, as if this was definitive proof of something. Even just casually looking through the reviews, it was easy to see that almost all of the glowing, positive reviews were ghost-written, or came from "Top X reviewers" who clearly have a selection bias for offering positive reviews to reinforce their status as highly ranked Amazon reviewers, or from people who got advance copies of the device (again, selection bias if they are the sort of people who actively seek out new tech to review), and on and on.
What an echo chamber this place has become (no pun intended).
Maybe I have had a bad experience and want to warn others: that would explain mostly bad reviews for almost every product. (Given enough users, some will have bad experiences, even if only by misusing it.)
I can understand that mostly bad reviews creates an incentive to counter these with good reviews, but from the wrong side (seller, not consumer). This would explain the many fakes.
I guess reasonable trustworthy 3rd party sites like consumer reports are still necessary.
Unfortunately I'm now of that mind-set as well.
A few years ago I was in the top-500 of reviewers on Amazon UK, based on 'helpful votes'. I only reviewed things that I had bought for myself and focused on items that had no or very few existing reviews.
Then two things occurred quite quickly:
1. I learned that many top reviewers were receiving free samples as part of Amazon Vine. I asked myself: why am I spending my time doing this without recompense? They were churning out multiple reviews per week and I was doing perhaps three or four a month, but it still felt unfair.
2. People started aggressively scoring reviews negatively for subjective reasons. I remember an airline stewardess marking my review of a book as 'unhelpful' because she said I had insulted her airline. Amazon weren't interested in reverting down-votes like that.
I drifted away from Amazon reviewing after that and in time I have also drifted away from Amazon. They never fostered their community of honest-amateur-reviewers ( with, say, a Gift Card at the end of the year ) and they're paying the price now.
I'm not trying to be negative but I don't see what value the type of review you say you are writing adds.
It's just a dinky review; you're not supposed to do a full product breakdown. If everyone thought of Amazon reviews this way and just spent 30 seconds on a review for every item, then there would be no automated review problem.
The challenge is liquidity, i.e. how you seed the network and content, but if you can solve that you can operate a review service for a fraction of the resources currently being wasted on these Red Queen races.
I've been thinking about this problem a lot too. There's actually many services that have come and gone in the review space. The solution I've come to is that any review service needs to be more than just reviews. It really needs to be more of a community.
So the original item gets legit great reviews. But the reviewers never mention which seller they bought from. 3rd parties glom onto the high ratings, and a lot of people end up disappointed with either the fulfillment or the product itself.
This is why I no longer buy unless it is amazon or amazon fulfilled (so I can at least return it). But you pay a premium for that.
Amazon could solve this by just noting which vendor the reviewer bought from on the subject line.
I think it's a little ridiculous that so much power is given to many buyers that only want to take out their anger on someone, even if it isn't really the fault of the seller.
I used to be an Amazon seller and quit because so many people would leave negative feedback over unrealistic expectations.
It seems to be the M.O. of the millennial generation.
In the case of electronics, photography gear and the like, you find people get more consistent results with some products. For instance, charging cables: hypothetically you could buy a $5 third-party cable vs a $20 OEM part, but any thought that the cheap one might start a fire leans you toward the $20 cable.
When I search for a product, I now often get tens of matches, all almost identical, all having different prices and description and equally bad photographs.
In the end, I ordered a case (after checking YouTube video reviews) and was contacted by the seller to review their product on Amazon in exchange for 10% off my next purchase with the same seller. BUT not if I was going to give them less than 5 stars; in _that_ case they wanted me to contact them directly.
The portals themselves are a very lucrative business, and are not going away anytime soon. There is an arms race between sellers to get that sweet sweet real estate (get their products on page 1 or page 2), and business can be a cutthroat matter. Of course, the system will eventually come to a steady state when too many buyers get burned buying crap. Review and feedback systems is a topic I'm very interested in.
I wish they would take it one step farther and add an option to only show verified reviews and the associated stats for those verified reviews.
My personal filter is to ignore any review that uses any variation of the phrase "found this little gem!".
Note that I'm talking specifically about the author's critique of the Vine program; obviously the review spam is a real problem.
That aside, I do find Wirecutter to be a good starting place.
Funny story: a co-worker of mine fell for the fake-review trick and bought a product, talking about its rave reviews. I knew from looking at the reviews that they were fake and he was going to be disappointed. He said "No, look, it has great reviews". I said okay, you're going to be disappointed. And he was disappointed. It was the "Amazon featured" lock laces.
Two recent examples:
There is a cheesesteak restaurant near me that has 4 stars on Yelp. People rave about how they have the best cheesesteaks ever! Yet if you look more closely, reviews from people that have eaten a cheesesteak on the east coast are much lower. After trying it myself, the food is terribly bland.
I was recently looking at coffee makers. The "Ninja" coffee maker has glowing reviews (though I suspect a chunk of them are fake). But the reviewers are comparing this $200 Ninja machine to their $30 Mr. Coffee. Again people that actually like good coffee are rating it much lower. Then looking at the reviews for a different coffee maker, many reviewers are comparing the result to their Aeropress or lower end Espresso machines.
Food is tough in some regions. In California, you get open warfare about the authenticity of Mexican restaurants. In the Midwest, people think Subway is a New York Deli.
For example, his books sold maybe ten copies a week, yet somehow got hundreds of glowing and suspiciously worded reviews on Amazon. Got to the point other sites (and Wikipedia) basically put up warning signs.
So yeah, even before the current crop of reviewer sellers started popping up, this system was being gamed by the authors and creators themselves using sockpuppets to flood it with fake reviews for their work.
Now the reason for that is that it's super hard to get going with your product without sending discounted samples to reviewers. Even if you spend tons of money on ads and somehow convince a customer to buy your product without reviews, that customer will, 99.9% of the time, never leave a review. And if you dare to email that customer to ask for feedback, they will curse you, complain, and report you to Amazon.
That's where things currently are - it's a two-way street. Without any traffic on it, you will end up with only a few shitty products from big companies, who will have no reason to improve quality or innovate.
Just 2c from the other side of the fence.
I find Wirecutter usually accurate on the facts, but I rarely agree with their recommendation of what is good for me. That being said, they are very explicit about why they recommend something; I just wish there were other reviewers who shared my set of interests and values in things.
Meanwhile the "Recommended with honest reviews" products have acquired some one star reviews of their own.
Useful or not? No idea. Doesn't look like it so far.
But this is probably an unsolvable problem.
You could solve it by hiring and paying 100% independent mystery shopper reviewers with unimpeachable integrity, and only reviewing products actually purchased for real money.
I don't think many online merchants would support that.
This is a fundamental problem with scarcity transactions - it's always more profitable to game the system than to play honestly.
Then, at the pediatrician's office we noticed them using an inner-ear thermometer. I asked the nurse to see it. It was a Braun Thermoscan. They aren't cheap at ~$70, but we bought one 5 years ago and that was that. It's fast, accurate and consistent. It paid for itself long ago and still works like it was new.
And one more time, because of the context of this comment: I have no affiliation with Braun.
So they are using real users who then with a wink and a nudge provide 4 & 5 star reviews to keep the free products coming.
> For example, returns and long-term use aren’t part of the evaluation. When you get something for free, you’re less likely to follow up on breakage concerns or customer service issues. Additionally, if the reviewer didn’t actually buy the product, that person doesn’t take the purchase and shipping processes into consideration.
I don't feel that this is true. People who are selected for Vine aren't random Amazon customers. They are Amazon's top-ranked reviewers. A lot of these reviewers had started reviewing hundreds of items before Vine or any paid reviewing programs even came into existence. They do it for the 'love', so they will discuss price and shipping considerations.
Yes given Vine's 30 day review deadline, you're not going to see anything initially about long term use; but... a lot of these reviews do get updated if anything about the product changes from the original review.
> You might notice how few of the reviews through Vine and similar programs are negative or even critical. This isn’t a case of reviewers intentionally being dishonest, but rather the result of unconscious positive bias.
I don't disagree here.
> Not paying for an item can make difficulties with that item seem less irritating.
Vine items are treated as taxable income starting this year, so Vine reviewers are now effectively paying for the items they select.
> Additionally, reviewers may give their opinions on items for which they have no expertise or real experience and therefore have no frame of reference about how well something works by comparison.
I could be wrong but I feel that available Vine items are based on the customer's past purchase and review history. There's no guarantee that the customer is an expert, but there's some proof that the reviewer is at least familiar with the product category.
I'm sure that there are some bad Vine reviews and bad Vine reviewers. However discrediting the entire program without any data just doesn't seem fair.
(In case it wasn't already obvious, I'm part of the Vine program)
The other issue is that psychology studies show that being given something ingratiates a person to the giver, which also greatly taints the Vine program.
> The other issue is that psychology studies show that being given something ingratiates a person to the giver, which also greatly taints the Vine program.
I agree with this. Another problem which the author mentioned is the lack of a review standard. One person's 5 star rating is not equivalent to another person's.
Another idea is to randomly select customers who actually ordered the product to leave a review. Since it's random you wouldn't have the self selection problem of people getting paid to leave reviews.
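A minimal sketch of that selection step, assuming purchase records are available as a simple list of customer IDs (the function name and the `seed` parameter are illustrative, not any retailer's actual API):

```python
import random

def invite_reviewers(purchasers, k, seed=None):
    """Randomly pick k verified purchasers to invite for a review.

    Because invitees are drawn at random from people who actually bought
    the product, there is no self-selection and no payment involved.
    """
    rng = random.Random(seed)  # seed is only for reproducible demos
    if k >= len(purchasers):
        return list(purchasers)
    return rng.sample(purchasers, k)  # sample without replacement
```

The key property is sampling without replacement from the verified-purchase pool, so no one can volunteer (or be paid) their way into the invite list.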
It will be "Get recommendations from people you trust!!! (sortof)".
And, in short order, we'll be back to the same situation, only more tied up, gagged, and corrupted. ("Hey, pssst, make money fooling your friends!")
Currently, it's really easy to bribe a random person to write an anonymous review. If the maker of some product has to bribe a friend of mine in order to get me to buy that product, it's going to be a lot more work for a lot less gain. Eventually it starts to look like a company bribing all of its customers at once to buy its products; if they can pull that off and still make money, more power to them, I guess.
The social media companies have the data and infrastructure to implement some sort of "product reviews weighted by the reviewer's proximity to you in the friend graph" feature, but I don't see users trusting Facebook or LinkedIn to be an impartial source of trustworthy reviews. (Also, a lot of Facebook friends aren't necessarily people you'd use for product recommendations, but that's a separate problem.)
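As a rough illustration of what "weighted by proximity in the friend graph" could mean, here is a minimal sketch assuming an adjacency-list graph; the `1/(1 + hops)` weighting, the 0.1 baseline for strangers, and all names are illustrative assumptions, not how any platform actually works:

```python
from collections import deque

def friend_distance(graph, start, target, max_depth=3):
    """BFS hop count from start to target in an undirected friend graph.
    Returns None if target is unreachable within max_depth hops."""
    if start == target:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth >= max_depth:
            continue
        for friend in graph.get(node, ()):
            if friend == target:
                return depth + 1
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, depth + 1))
    return None

def weighted_rating(graph, me, reviews):
    """Average star ratings, weighting each reviewer by 1/(1 + hops) from 'me'.
    Reviewers outside the social neighborhood get a small baseline weight."""
    total, weight_sum = 0.0, 0.0
    for reviewer, stars in reviews:
        hops = friend_distance(graph, me, reviewer)
        w = 1.0 / (1 + hops) if hops is not None else 0.1
        total += w * stars
        weight_sum += w
    return total / weight_sum if weight_sum else None
```

With this scheme a direct friend's 5-star review counts five times as much as a stranger's, which is exactly why the bribery economics in the comment above change: shills far from you in the graph barely move the score.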
I've got a competitor that has made millions making fake reviews to boost up his lousy little app. He's been at it for about 5 years!
This article highlights pretty much precisely the same problem that the "social media promote (bogus) conspiracy theories" item yesterday did. Slightly recycled comment.
Some related concepts:
1. Donella Meadows, in Thinking in Systems, notes that an absolute requirement of an effective and healthy system is accurate feedback and information. Media which are indifferent to truth value, or which actively promote distortion (see Robert Proctor's term, agnotology), will actively harm the system.
2. Celine's 2nd law, and inversion. In Robert Anton Wilson's Illuminatus! trilogy, a character notes that "accurate information is possible only in a non-punishing situation". Its inverse is also true: accurate information is only possible when it is accuracy itself and ONLY accuracy which is rewarded. Academic publishing, in which paper output and journal selection is a gateway determinant of professional careers, would be an instance of this. Or the long slide of The Learning Channel from NASA-PBS educational co-production to Honey Boo-Boo broadcaster.
3. Paperclip Maximizer. "Don't be evil" isn't good enough. You've got to actively seek out good. Even a benign maximisation goal will, if not tempered by requirements to provide net benefit, lead to catastrophic results.
4. Mancur Olson's "The Logic of Collective Action" explains how and why small (but motivated) fringe groups can achieve goals directly opposed to the interests of far larger groups. This explains a great deal of market and political dysfunction.
5. A generalisation of Gresham's Law leads to the realisation that understanding of complex truths is itself expensive. It's also (per Dunning-Kruger) beyond the capability of much of the population. This has some rather dismal implications, too; as William Ophuls notes, political theories based on the assumption that all the children can be above average ("the Lake Wobegon effect") are doomed. You dance with the dunce what brung ya.
Amazon here will probably have to create and fund their own review staff if only to vet and validate user-supplied reviews. As one HN poster here notes, "what is my incentive to provide reviews?" A good review takes time and experience, is likely to draw flak, and, frankly, often comes from someone with better things to do with their time.
Crowdsourcing is useful where data are thin and the crowd is likely to be unbiased. Where specific expertise is required (Peter Norvig on programming texts, the Google employee who's uncovered numerous fraudulent USB-C cables), one qualified reviewer trumps millions who know nothing. (But you've got to ensure that the qualified reviewer stays both honest and engaged.)
More generally: the Amazon experience seems to be worsening. Trust is falling to eBay-like levels, if not worse. Search and discovery are poor, and product quality is all over the map. Books are one matter: highly uniform product, simple function. Expensive kit of various description is another. Among my recent purchases was an LED cabinet lighting system for which local, major-chain, and Amazon sources (brick-and-mortar and online) all failed me. Ikea turned out to have a well-thought-out product line, helpful (though ultimately not quite what I was looking for) information online, and, most importantly, an in-store display in which I could assess, mix, match, and ultimately assemble the system I'm using and enjoying now. Twice the price of what I'd initially specced out, but that system didn't work in the least.
"The compensated-review process is simple: Businesses paid to create dummy accounts purchase products from Amazon and write four- and five-star reviews. Buying the product makes it tougher for Amazon to police the reviews, because the reviews are in fact based on verified purchases."
(Submitted title was "Amazon Reviews Are Gamed by Compensated Reviews by Professional Review Companies".)
"Lets Talk About Amazon Reviews" is linkbait IMHO; the title fails to convey what the article actually covers, only its general subject.
Granted the Amazon program is more transparent (as the review is affixed with a label that says "user was compensated for this review"), but it's disingenuous at best for Amazon to act like they're taking the moral high ground by shutting down these review sites.
Has Amazon come out and said they're taking the moral high ground? I just assumed they were suing them for breaking their terms of service. Many companies do this if you break them frequently enough.