

With 1-5 star ratings, most things average out to about 4.3 - c3o
http://www.onthemedia.org/transcripts/2009/10/09/08

======
imgabe
I've often wondered about this. I always suspected online reviews would tend
to skew high. I think there are a couple of factors at work.

First, people are pretty good at predicting whether they'll like something. If
it looks like they're not going to like a movie, they don't watch it. Hence,
the majority of people who review something are people who expected to like it.

Second, 5 star reviews are kind of pointless for something like cotton balls
or artificial sweetener. Either you got the product and it's the right thing
in the quantity you wanted, or it's not. There's not a lot of room for
subjective discrimination between one box of Splenda and another. Since the
product lived up to your expectations in every way, there's really no reason
to give it less than full marks.

~~~
ugh
To your second point I would just like to add that such a skewed distribution
of reviews is actually not much of a problem. Most dog foods will get five
stars because they just do their job, and only a few don’t, hence the skewed
distribution. But the customer has no problem at all avoiding bad products. If
anything, such a clear dichotomy should make it rather easy.

(Just one little theory of mine: I think star reviews don’t work anymore when
it comes to more complex [say, than dog food] things. For those you need to
read the written reviews and ignore stars. [And maybe look at the
distribution. Finding out that a product is polarising and the reasons for
that can be helpful.] If someone rates a camera with one star because it
doesn’t film 1080p [a perfectly valid complaint], I don’t care much for his
opinion, because our preferences differ. His written information is
nevertheless helpful [I could, after all, care about 1080p], but the
information conveyed in his one-star rating is close to zero.)

~~~
warp
I wouldn't say ignore the stars, but use them as a filter: read only the
negative reviews. Any 5 star review is just a rabid fan; you can safely ignore
those. 4 star reviews might be more interesting, but the real content is in
the 1, 2 and 3 star reviews.

Whatever issues those people have with the product might be genuine issues
which you would also not be happy about, or the issue might be something you
wouldn't mind. I've bought products based on negative reviews purely because
whatever the reviewer didn't like about it, it was something I know I _would_
like.

Usually, though, the reviews won't tell you whether a product is any good; but
assuming a product has enough reviews, they will let you determine that
certain products really are crap.

~~~
dkarl
_Read only the negative reviews.... the real content is in the 1, 2 and 3 star
reviews._

I always feel sad rating a movie three stars for "Liked it" on Netflix,
because I know that Netflix's rating system is incompatible with the way
people assign ratings. Any rating system has to assume that less than half the
rating scale will be used for neutral-to-positive reviews, and the rest of
the scale is a big wasteland of varying degrees of "This sucks!" I know the
recommendation system doesn't care whether I mean three stars to be positive
or not, but it still bothers me.

Is this just an American thing related to the hundred-point scale used in
schools, where 69/100 is a failing grade? Or is it universal?

~~~
imgabe
I think the great thing about Netflix is that its recommendations try to
predict what _you_ would rate a movie.

So, if you consistently use three stars to indicate "liked it, reasonably
good" and I use it to indicate "waste of time" then our respective
recommendations from netflix will be different.

The movies it predicts as 3 stars for me will be movies I think are a waste
of time, and for you they would be movies you might find reasonably enjoyable.
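
As a sketch of how that per-user calibration can be absorbed (toy data of my
own invention, not a claim about Netflix's actual method): subtract each
rater's mean, and a generous 3 and a harsh 3 land in different places on a
shared scale.

```python
def centered(user_ratings):
    """Re-express a user's star ratings relative to their own average."""
    mean = sum(user_ratings.values()) / len(user_ratings)
    return {movie: stars - mean for movie, stars in user_ratings.items()}

# Two raters with the same relative taste but different calibrations.
generous = {"A": 3, "B": 4, "C": 5}   # 3 stars = "liked it"
harsh    = {"A": 1, "B": 2, "C": 3}   # 3 stars = their favorite movie

print(centered(generous))  # {'A': -1.0, 'B': 0.0, 'C': 1.0}
print(centered(harsh))     # {'A': -1.0, 'B': 0.0, 'C': 1.0}
```

After centering, the two raters look identical, which is exactly what lets one
model hand them different absolute predictions from the same relative signal.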

------
thinksketch
The funny thing about the 5 star rating system is that it's completely the
wrong model for many situations in which it's used. It originated from sites
like slashdot where, because everyone had to read the same front page of
content, it made sense to pick the content democratically using a five star
system.

But, systems like netflix that use the five star rating system have it all
backwards! Their goal should be to establish the niche genres that attract
each user. This goal has nothing to do with brute popularity. Sure you want to
assess how much someone likes a movie in order to steer them into the correct
genre, but it doesn't make sense to attach an overall popularity rating to the
movie.

Given a hugely diverse database such as the netflix movie library, it's
ridiculous to assume that individuals will like things in proportion to the
average popularity. That's not how taste works. And the weird thing is, there
is no reason for these databases to constrain themselves to an averaged
popularity index - they're just accustomed to the five star model is all. They
should be using a micro-genre mapping scheme that steers you towards clusters
of movies that have received attention from users with similar taste.
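
The micro-genre idea can be roughed out in a few lines (invented data, and one
of many possible similarity measures; no claim about Netflix's internals):
treat each movie as the vector of ratings its viewers gave it, and call two
movies close when those vectors point the same way.

```python
import math

# Toy ratings: movie -> {user: stars}. Hypothetical data.
ratings = {
    "Alien":  {"u1": 5, "u2": 4, "u4": 1},
    "Aliens": {"u1": 4, "u2": 5, "u4": 2},
    "Amelie": {"u1": 1, "u3": 5, "u4": 5},
}

def cosine(a, b):
    """Cosine similarity between two movies' rating vectors,
    computed over the users they have in common (0 if none)."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[u] * b[u] for u in common)
    na = math.sqrt(sum(a[u] ** 2 for u in common))
    nb = math.sqrt(sum(b[u] ** 2 for u in common))
    return dot / (na * nb)

print(round(cosine(ratings["Alien"], ratings["Aliens"]), 3))  # 0.966
print(round(cosine(ratings["Alien"], ratings["Amelie"]), 3))
```

In this toy data the two sequels come out nearly identical while the cross-pair
sits far apart, so a neighborhood walk from one movie stays inside its cluster
of similarly rated titles - a crude "micro-genre" with no global popularity
score anywhere in it.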

Meanwhile, by constraining your rating to a discrete number of stars (1, 2, 3,
4, or 5) they are killing the quality of their sample. (See the jellybean
guessing experiment in the link below.)

These algorithms should ditch their discrete-value database of how much users
say they like something, and instead use some continuous measure of how much
attention is spent on each item. Hell, some people love to watch crappy movies
and write bad reviews for them. Anyway, for more on this rant, see the link
below.. cheers-

[http://www.thinksketchdesign.com/2008/05/03/design/algorithm...](http://www.thinksketchdesign.com/2008/05/03/design/algorithm-design/does-the-netflix-challenge-have-it-backwards)

~~~
shpxnvz
Granted, it's been a year or two since I ditched Netflix, but as I recall
their recommendation system already ranked the importance of ratings based on
how similar the other reviewer's tastes were to yours (I imagine by finding
other people who tended to have ranked movies the same as you).
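
That similarity weighting can be sketched in a few lines (invented data and an
invented agreement formula; not Netflix's actual algorithm): predict a rating
for an unseen movie as an average of other users' ratings, weighted by how
closely each of them has agreed with you in the past.

```python
def agreement(mine, theirs):
    """Similarity in [0, 1]: 1 when two users gave identical stars to
    every movie they both rated, falling off as they disagree."""
    common = set(mine) & set(theirs)
    if not common:
        return 0.0
    gap = sum(abs(mine[m] - theirs[m]) for m in common) / len(common)
    return 1.0 - gap / 4.0   # star ratings differ by at most 4

def predict(target, others, movie):
    """Agreement-weighted average of other users' stars for `movie`."""
    num = den = 0.0
    for theirs in others:
        if movie in theirs:
            w = agreement(target, theirs)
            num += w * theirs[movie]
            den += w
    return num / den if den else None

me = {"A": 5, "B": 1}
twin = {"A": 5, "B": 1, "C": 4}      # agrees with me everywhere
opposite = {"A": 1, "B": 5, "C": 1}  # disagrees with me everywhere

print(predict(me, [twin, opposite], "C"))  # 4.0
```

The twin's opinion of C carries all the weight and the contrarian's carries
none, so the prediction tracks people who rank movies the way you do.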

My impression was that they were far more concerned with doing well with
recommendations than with the plain rankings.

~~~
thinksketch
That's true, but their entire model is still built on a database of
5-discrete-value ratings. They do seem to place an emphasis on reviewers'
similar tastes, but they are still crippled by the poor resolution of their
data.

They're spending so much energy tweaking their 5-star algorithm by tiny
amounts, when it seems like they'd be much better off investing in a richer
data source - like attention spent browsing various genres on their website
while looking for the next movie...

I don't know, it just seems like the five star system is so crude for a
company willing to spend millions to improve their recommendation system by
even a tiny amount.

"So you like this movie? Would you say you '4 stars' like it, or '5 stars'
like it?" Really? That's what your database is made of? Know what I mean?

~~~
kakooljay
That is a great point. Imagine a prof issuing grades that way. I'd love to
see someone with a 100-point system do some A/B testing to see how much
recommendations degrade with a 5-point system.
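
Short of a real A/B test, the pure quantization cost is easy to eyeball with
synthetic data (uniform preferences on [0, 1] - an illustrative assumption,
not real ratings): snap each preference onto a k-point scale and measure the
error the rounding alone introduces.

```python
import random

def quantization_error(levels, trials=100_000, seed=0):
    """Mean absolute error from snapping a preference in [0, 1]
    onto a scale of `levels` evenly spaced points."""
    rng = random.Random(seed)
    step = levels - 1
    total = 0.0
    for _ in range(trials):
        x = rng.random()
        nearest = round(x * step) / step
        total += abs(x - nearest)
    return total / trials

for k in (5, 10, 100):
    print(k, round(quantization_error(k), 4))
```

With five levels the rounding error alone comes out roughly 25 times larger
than with a hundred, though real ratings carry their own noise - which is the
part only an actual experiment would settle.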

------
eli
Well, sure. Who is going to take the time to write a review for something that
is just "ok"?

------
andrewljohnson
Not on the App Store they don't. Not when money is involved.

