
The Story Behind the Worst Movie on IMDb - thampiman
http://fivethirtyeight.com/features/the-story-behind-the-worst-movie-on-imdb/
======
cgtyoder
538 is using the wrong metrics. For a movie to be truly, truly, _sublimely_
bad, that badness needs to be recognized - people in some cases will want to
watch it for its incredible badness. This means that there are going to be a
disproportionate number of both 1 and 10 votes. And the winner based on this
is: The Room.
[http://www.imdb.com/title/tt0368226/ratings?ref_=tt_ov_rt](http://www.imdb.com/title/tt0368226/ratings?ref_=tt_ov_rt)
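
The "sublimely bad" criterion above can be sketched as a polarization score over a rating histogram. The vote counts below are illustrative, not IMDb's real numbers:

```python
def polarization(hist):
    """hist maps each star rating (1-10) to a vote count.
    Returns the fraction of votes at the extremes (1 or 10)."""
    total = sum(hist.values())
    return (hist.get(1, 0) + hist.get(10, 0)) / total

# Hypothetical histograms: a love-it-or-hate-it cult film vs. a forgettable one.
cult_film = {1: 4000, 2: 600, 3: 400, 4: 300, 5: 400,
             6: 400, 7: 500, 8: 700, 9: 900, 10: 3800}
ordinary = {1: 200, 2: 300, 3: 500, 4: 900, 5: 1500,
            6: 1800, 7: 1500, 8: 900, 9: 400, 10: 200}

print(polarization(cult_film))  # 0.65 -- votes pile up at 1 and 10
print(polarization(ordinary))   # ~0.05 -- votes cluster in the middle
```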

~~~
gabemart
A clear conversation is difficult in the absence of a clear, canonical
definition of "bad". It would be very easy for me to get a few friends
together and make a 90-minute improvised movie filmed on my phone. That would
be an absolutely terrible movie, but it wouldn't be a very satisfying answer
to the question "What's the worst movie?" because it would be seen by very few
people and it would be made with no ambition of quality. I could go further
and make a movie that was 90 minutes of a black screen. That would be a worse
movie but an even less satisfying answer.

Even if you limit the field to movies that have seen a theatrical release, the
same principle applies. Some movies are shockingly bad, but are made with
little effort by people who know they are making a bad movie. These seem less
interesting, less sad and less funny than very bad movies made by people
trying very hard and spending lots of money.

Personally, when rating the badness of work, I judge it against my own
assessment of the potential of the work. The potential isn't directly
measurable or quantifiable, but it consists of things like the budget, the
vision of the creator, the skill and effort of the artists involved in the
creation, etc. etc.

So, for example, I think of Star Wars: Episode 1 as being a worse film than a
lot of 50s low-budget sci-fi B-movies, even though I might choose to watch it
more readily. The B-movies were pumped out without much effort. They're boring
and poorly made on a technical level, but they never really had a chance to be
any good. Episode 1 was made on a vast budget, drew on extremely rich source
material, and was the culmination of unimaginable amounts of time and effort
by a large number of hard-working people. For me, that makes the failure of
Episode 1 more profound and more egregious.

~~~
NAFV_P
> _I could go further and make a movie that was 90 minutes of a black screen.
> That would be a worse movie but an even less satisfying answer._

What about a blue screen?

[http://www.imdb.com/title/tt0106438/](http://www.imdb.com/title/tt0106438/)

~~~
CamperBob2
I like drysart's criterion above (the delta-Q factor) as a way of
discriminating between lowbrow but honestly-crafted crap like the usual MST3K
fodder, and expensive yet incompetent crap like SW Episode 1.

But you raise an interesting point, in that there should also be a penalty for
novel, provocative work that, while perhaps well-executed, rests on principles
that can never be used again, so they don't advance the state of the art or
otherwise leave us with any enduring influence. Works like _Blue_ and John
Cage's _4'33"_ would fall into that category, I think.

~~~
NAFV_P
> _But you raise an interesting point, in that there should also be a penalty
> for novel, provocative work that, while perhaps well-executed, rests on
> principles that can never be used again, so they don't advance the state of
> the art or otherwise leave us with any enduring influence. Works like Blue
> and John Cage's 4'33 would fall into that category, I think._

I've seen the first ten minutes of Blue, it's actually really good. After ten
minutes you start seeing stuff, but you can't work out if it is your own
imagination or the film itself. The concept of Blue could be advanced: I was
thinking of "Gamma" - viewers are exposed to intense gamma radiation for 90
minutes.

~~~
shabble
Whilst I suspect you're taking the piss, I wonder if you could induce
[https://en.wikipedia.org/wiki/Cosmic_ray_visual_phenomena](https://en.wikipedia.org/wiki/Cosmic_ray_visual_phenomena)
without receiving a medically serious dose of radiation?

Those buzzkills at the FDA have already banned any artistically worthwhile
amount of X-rays from CRT televisions though[1] :(

[1]
[http://www.gpo.gov/fdsys/pkg/CFR-2013-title21-vol8/xml/CFR-2...](http://www.gpo.gov/fdsys/pkg/CFR-2013-title21-vol8/xml/CFR-2013-title21-vol8-sec1020-10.xml)

~~~
NAFV_P
> _Whilst I suspect you're taking the piss, I wonder if you could induce
> [https://en.wikipedia.org/wiki/Cosmic_ray_visual_phenomena](https://en.wikipedia.org/wiki/Cosmic_ray_visual_phenomena)
> without receiving a medically serious dose of radiation?_

It's a joke; apologies, I couldn't resist. Some art forms and certain works of
art are genuinely dangerous [0].

Regarding cosmic rays, I thought the effect was caused by massive particles.
Being able to produce them artificially would be a considerable scientific
feat; its artistic relevance would pale in comparison.

> _Those buzzkills at the FDA have already banned any artistically worthwhile
> amount of X-rays from CRT televisions though[1] :(_

An old chemistry teacher of mine used to use an old TV screen as a cover for
potassium-water reactions; it was about an inch thick.

[0]
[http://en.wikipedia.org/wiki/Richard_Serra](http://en.wikipedia.org/wiki/Richard_Serra)

------
dubfan
The opposite effect, a bad movie being rated highly, also occurs. One example
of this is with the film "The Oogieloves in the Big Balloon Adventure"
([http://www.imdb.com/title/tt1520498](http://www.imdb.com/title/tt1520498)).
This film was panned by critics on release and was a complete flop at the box
office. Its IMDB rating languished until sometime last year when it began to
rise. Around the same time, a bunch of poorly-worded unequivocally positive
reviews showed up on the page, all following the same pattern (including one
that appears to have been run through a translator and back). The rating made
it up to 8.0, although it's since fallen to 7.6. If it's not the movie's
production or marketing team involved in this clear manipulation, I don't know
who it would be, as I can't see that this movie has any kind of cult
following.

~~~
pawn
It could be the same sorts of people that write raving reviews for the 3 wolf
moon shirt on Amazon.

~~~
johnward
so 4chan?

~~~
pawn
Plausible

------
nkozyra
I feel like there's an assumption that this data is _erroneous_ for this
reason. I think that any reason for which a person gives a negative rating
to a film is valid, be it politically motivated or otherwise.

FiveThirtyEight has had a rough start so far, so I may be more inclined to
seek more nuance than I normally would, as they've tended to be
extraordinarily simplistic (Silver himself notwithstanding) to date.

~~~
what_ever
But most of the people who have given the rating have not seen the movie. They
are giving the lowest rating just to put their point forward. This is not a
documentary meant to put the truth on the screen (I am not sure how much of
the movie is fiction).

I think the author calls the data erroneous because the rating does not
represent the actual quality of the movie, a thing which IMDB wants to
achieve. He doesn't call it erroneous for actual errors in the process of
collecting ratings.

I think, as you said, you are just inclined to seek more nuance.

~~~
nkozyra
> But most of the people who has given the rating have not seen the movie

I'm sure it's possible, but _no_ rating represents the "actual quality of the
movie." A rating is a subjective amalgam that encompasses a myriad of biases,
even when the movie is watched with total scrutiny.

~~~
jorleif
While this is true, I would argue that there is a social expectation that an
IMDB review is written by someone who has seen the movie. In particular, it is
expected to hold on average, with the usual few bad apples thrown in the
mix. In this case, this expectation does not hold, and in this sense these
reviews are truly anomalous (they are produced by a different process than the
other ones). From the perspective of the expected usage of the reviews, this
anomaly can indeed be seen as an error, rather than just slightly
different behavior.

------
eightofdiamonds
I found it odd that the article makes note that 36,000 ratings come from
outside the U.S. Maybe I am missing something but wouldn't it be expected that
many of a Bollywood movie's ratings would come from outside the U.S.?

~~~
jebus989
I'd guess on Indian review-aggregator websites written in Hindi the trend is
reversed ;)

------
rcavezza
The irony is that they wanted to bury the movie on imdb, but they did such a
good job that it is becoming a famous case study.

~~~
meepmorp
I don't think they wanted to bury the movie. I think they wanted a mass
showing of anger to demonstrate their power, to themselves and others. They
want people to know who's doing it and why.

------
calibwam
I would never have heard about this movie if it hadn't been so low rated.
Perhaps they should rather have given it 5 stars, burying it in mediocrity?

~~~
regoldste
If their goal was to bury it in mediocrity, that would be a good
strategy. But if their goal is to bring attention to a movie that
distorted history in a meaningful way, then down-voting it is a pretty
effective one. If you can down-vote it to the level where it has the
worst score on IMDb, or even among the ten lowest, it attracts attention
and goads curiosity about what makes the movie so awful, which leads people to
Google it and quickly find information about this social movement.

------
bluedino
Luckily, IMDB also shows Metacritic ratings next to their own user ratings. I
take those into account before I make my decisions on what I'm going to
rent/stream.

Far too often you find a movie (usually a somewhat recent release) that's
just flat-out terrible despite having a >7 score on IMDB (and usually a
Metacritic score in the 30s or 40s).

~~~
edanm
Do you have any interesting examples that you've found?

~~~
jebus989
Excuse the plugging of my own wares, but by coincidence I did some analysis
the other day of the disparity between User/Critic ratings of movies on
RottenTomatoes: [http://benjaminlmoore.wordpress.com/2014/05/05/what-are-
the-...](http://benjaminlmoore.wordpress.com/2014/05/05/what-are-the-most-
overrated-films/)

Example: "Spy Kids" got a 93% fresh rating from critics, but 45% from audience
ratings.

Code attached should you want to investigate for yourself.

~~~
icebraining
If you look at the Average Ratings, the difference is smaller, though: the
critics gave it 7.2/10 while the audience gave it 5.2/10.
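
The gap comes from how the Tomatometer is computed: it only counts whether each review clears the "fresh" bar, so a pile of lukewarm scores can read as near-100% fresh while the average stays modest. A toy illustration (made-up scores, not the real Spy Kids reviews):

```python
# Each critic score out of 10; "fresh" is conventionally a score of 6 or above.
critic_scores = [6, 6, 7, 6, 8, 6, 7, 6, 6, 7, 6, 6, 9, 3]

fresh_pct = 100 * sum(s >= 6 for s in critic_scores) / len(critic_scores)
mean = sum(critic_scores) / len(critic_scores)

print(round(fresh_pct), round(mean, 1))  # 93 6.4 -- "93% fresh" but a modest mean
```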

------
fuzionmonkey
Troll 2 used to be the worst-rated movie on IMDb until it became a cult
favorite and garnered tons of 5-star ratings.

If you're interested, there's a really interesting documentary about Troll 2
called Best Worst Movie, which covers the film's production and cult
following. After watching the documentary, Troll 2 becomes an enjoyable movie
because you know all the crazy backstories to the actors and scenes.

------
sillysaurus3
_He was sentenced to life in prison for his crimes by the Bangladeshi
International Crimes Tribunal. But many Bangladeshis found that sentence too
lenient, and more than 100,000 of them gathered in Shahbag Square in the
capital city of Dhaka to challenge it._

It's interesting that so many people gathered solely to express "we want this
guy dead, not imprisoned." I wonder if there's more context?

~~~
kchoudhu
There is.

Given the dynamics of Bangladeshi politics, a life sentence from the current
government is actually a jailing until the opposition comes to power and
pardons everyone convicted by its predecessor.

I'm against capital punishment, but I understand where the protesters are
coming from: they want real justice for the atrocities in 1971, and see the
death penalty as the only way to obtain lasting closure.

Politics in a democracy of 180MM people is really, really messy.

~~~
seabee
It seems messier than politics in a democracy of 300MM people or 70MM...

~~~
kchoudhu
You refer (I assume) to the UK and the US? One invented modern parliamentary
democracy, and the other is a global superpower. Both have been at the
business of democracy for 200+ years, and have never had to deal with the
problem set Bangladesh is facing today.

Snark serves no one well.

~~~
6cxs2hd6
>>> Politics in a democracy of 180MM people is really, really messy.

>> It seems messier than politics in a democracy of 300MM people or 70MM...

> Snark serves no one well.

FWIW I didn't see it as snark. You seemed to imply the size of the democracy
mattered. His/her response was a couple of counterexamples. Then you seemed to
acknowledge it's not the size, it's the age of the democracy (which is a great
point).

In any case I found the rest of your original post really helpful context,
explaining why a "life" sentence probably won't turn out to be that. Thanks!

------
BitMastro
Maybe a solution could be weighting the vote according to the user's history: a
user leaving a single vote on a single movie shouldn't be as influential as a
user who has voted on a wider range of movies over time.

~~~
TheAnimus
Also the range of their votes.

If someone only gives 10s or 1s to everything, their vote of 10 should
probably have less weight than that of someone who distributes their votes more
evenly.
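
Both ideas (history size and rating spread) could be combined into a single per-voter weight. A minimal sketch, with the weighting function and all thresholds entirely made up:

```python
import statistics

def vote_weight(history):
    """history: list of this user's past ratings (1-10).
    More weight for broad histories; less for all-1s-and-10s voters."""
    if len(history) < 2:
        return 0.05  # drive-by single-vote accounts count very little
    breadth = min(len(history) / 50, 1.0)          # saturates at 50 ratings
    spread = min(statistics.stdev(history) / 3.0, 1.0)
    return max(breadth * spread, 0.05)

def weighted_mean(votes):
    """votes: list of (rating, voter_history) pairs."""
    total = sum(vote_weight(h) for _, h in votes)
    return sum(r * vote_weight(h) for r, h in votes) / total

regular = [7, 5, 8, 3, 6, 9, 4, 7, 6, 8]   # varied taste across ten films
brigader = [1] * 40                         # only ever gives 1s
print(vote_weight(regular) > vote_weight(brigader))  # True
```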

~~~
radisb
This is not correct. Many people don't bother to vote or review unless the
movie is, for them, extremely good or bad, so the hassle is worth it. I've
watched a lot of so-so or simply good movies, but I have only voted 5-6 times,
and those got either a 10 or a 1.

------
brudgers
_Data gone Wrong_

Did they start hanging out with the bad kids, take up cigarettes, drinking,
gambling only to progress to crack and burglaries one of which ended with our
Data shooting a home owner who returned unexpectedly?

I guess I don't understand what data is. I always thought it was a set of
values. And I always thought that the problem when using data was in the
interpretation, and that a prudent consumer of data would always be careful to
distinguish between a random sample and a self-selecting sample when drawing
conclusions, and then would only state conclusions couched in the language of
statistical inference.

Leaving aside the question of why I should give a fuck about this supposed
outrage, why does the author expect there to be a strong correlation between
movie quality and the ratings on a website devoted to providing entertainment
by having users rate movies?

When _The Matrix_ is purported to be a better movie than _Lawrence of
Arabia_, the problems of interpretation are systemic.

~~~
mturmon
Everything you say is right, of course. Yet, I upvoted the story.

I thought it was indicative of a larger trend where crowdsourced data are used
to illustrate a point. Like the Google flu trends articles, which have gone
around HN at least twice, once when they were successful
([https://news.ycombinator.com/item?id=5040204](https://news.ycombinator.com/item?id=5040204))
and once when they were critiqued (e.g.,
[https://news.ycombinator.com/item?id=7455307](https://news.ycombinator.com/item?id=7455307)).

I work a lot with sampled data, and I have found that sampling issues can be
some of the most difficult to appreciate and to quantify -- even for experts.

I guess it comes down to sampling from one distribution, P(x), when the
situation you really care about samples according to a different distribution
P'(x). If P is far from P', your conclusions from P can be arbitrarily bad. If
you have an adversary moving P around deliberately, as here, it's even worse.
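
A toy simulation of that point, with all numbers invented: the population P rates the movie around 6/10, but the sample P' that actually votes is dominated by an organized campaign, so the estimate lands arbitrarily far from the truth:

```python
import random

random.seed(0)

# P: how all viewers would rate the movie (roughly 6/10), clipped to 1-10.
population = [min(10, max(1, random.gauss(6.0, 1.5))) for _ in range(100_000)]

# P': the self-selected voters -- 90% campaign votes of 1, 10% ordinary viewers.
sample = [1.0] * 9000 + random.sample(population, 1000)

true_mean = sum(population) / len(population)
sample_mean = sum(sample) / len(sample)
print(round(true_mean, 1), round(sample_mean, 1))  # the two diverge wildly
```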

~~~
brudgers
Statistics experts are fewer and further between than experts in other fields
who use statistics to justify their decisions, and the article shows how far
off base most people are. After all, the author conducted a numerical analysis
of the database, presents their findings as facts about the data, and includes
a rough statistical comparison of the voting patterns of the lowest-rated
[called 'worst'] and second-lowest-rated movies.

If there is an interesting statistical result, it's that the movie's rating is
entirely consistent with crowd-sourced prediction. The theory is that the
'wisdom of crowds' results directly from diversity among those making
predictions.[1] In the case of the lowest-rated movie, those making
predictions were unusually homogeneous, and therefore an inaccurate prediction
as to its quality is unsurprising.

Again, it's all in the interpretation; e.g., there's statistical evidence that
a lot of morons rated _The Matrix_.

[1] Diversity Prediction Theorem:
[http://vserver1.cscs.lsa.umich.edu/~spage/ONLINECOURSE/predi...](http://vserver1.cscs.lsa.umich.edu/~spage/ONLINECOURSE/prediction.pdf)

------
DanBC
Years ago, when you could download roms from the mame website, I gave a strong
up score to the Tron romset.

That tipped the romset into the spotlight - there were some leaderboards for
highest recent activity and so on. Other people started downloading the romset
and voting on it.

Suddenly this obscure romset was catapulted into most of the lists for "most
active" and "best" etc.

I had rated the game honestly. I had fond memories playing the cab for a week
on holiday in my youth. But I was surprised that so many other people felt the
same, especially on the Mame platform where the game's controls made it
tricky.

All my deliberate attempts at voting shenanigans failed miserably. (Although
I haven't investigated MTurk or similar yet.)

I wish there were a site like Meatball wiki where people could share their
vote-weighting methods.

~~~
__david__
> Suddenly this obscure romset…

Obscure? :-(

This was my favorite arcade game as a kid. I don't think it was very obscure.
Though I have an actual Tron arcade machine not 10 feet from me, so perhaps
I'm the wrong person to judge that…

------
z3phyr
For more information on the war, read the Wikipedia article:
[http://en.wikipedia.org/wiki/Indo-
Pakistani_War_of_1971](http://en.wikipedia.org/wiki/Indo-
Pakistani_War_of_1971)

~~~
meepmorp
This is what I was thinking when reading the article. IIRC, East Pakistan was
getting massacred (literally) until India stepped in and essentially beat West
Pakistan.

------
NAFV_P
The archetypal 'bad' movie is "Plan 9 from Outer Space" [0], by Ed Wood. It
holds that title mainly because of its production values.

This one has caused a stir on IMDB over moral integrity, or the bending of
the truth. Rather odd, since this sort of thing has been going on in films and
similar media for a very long time.

Then there's "International Gorillay", a film from Pakistan depicting Salman
Rushdie [1]. Coincidentally, it was released a year or so after Rushdie's "The
Satanic Verses" [2]. Ayatollah Ruhollah Khomeini [3] wanted Rushdie dead
because of this novel. BTW, Iranian films can be very good, like "Where is the
Friend's Home?" [4].

[0]
[http://www.imdb.com/title/tt0052077/](http://www.imdb.com/title/tt0052077/)
[1]
[http://www.imdb.com/title/tt0251144/](http://www.imdb.com/title/tt0251144/)
[2]
[http://en.wikipedia.org/wiki/The_Satanic_Verses](http://en.wikipedia.org/wiki/The_Satanic_Verses)
[3]
[http://en.wikipedia.org/wiki/Ruhollah_Khomeini](http://en.wikipedia.org/wiki/Ruhollah_Khomeini)
[4]
[http://www.imdb.com/title/tt0093342/](http://www.imdb.com/title/tt0093342/)

------
bowlofpetunias
IMDB ratings are bull. Lots of great movies, often even classics, rate around
6 on IMDB because the general public considers them "boring".

The only movies that escape from the IMDB average are a) decent movies that
are loved by the masses, b) great movies the masses don't watch (being in
black and white or not in English alone pretty much guarantees 2 bonus points)
and c) movies everyone agrees are total crap.

~~~
tveita
If the general public considers a movie boring, giving it a mediocre rating
seems fair. Other members of the general public will then know to avoid it.
They'll want to watch movies they like, not movies you like.

What you may want is a personalized rating based on people with the same
tastes as you.

~~~
_delirium
> What you may want is a personalized rating based on people with the same
> tastes as you.

I haven't used it in a while, but I used to find MovieLens, from the
University of Minnesota, useful for that:
[http://movielens.umn.edu](http://movielens.umn.edu)

It even has a feature where you can get joint personalized recommendations for
you and a friend (assuming the friend also has a MovieLens account, of
course), which is useful for brainstorming mutually enjoyable movies to rent
with someone.

------
dfine
Contains the best author disclaimer I've seen:

 _Data analysis by Eugene Bialczak. Also, a disclaimer: the author wrote much
of the IMDb Trivia App._

------
mkmk
It's not that the rating is inaccurate... it's that a huge population of
people that don't usually rate movies on IMDB has suddenly entered the rating
pool. Maybe the solution is to have country-specific ratings, so that my
ratings are averaged with the ratings of my peers, rather than the ratings of
people in entirely different cultures.

~~~
drzaiusapelord
The rating is completely inaccurate. The ratings have nothing to do with the
movie or its merits; they stem from a perceived slight that a bunch of
bloggers and nationalists took offense at.

------
toconnor
Any crowdsourced data can be easily manipulated by a well organized or large
enough group. See the results of the naming poll for the Megyeri Bridge in
Budapest in 2006.
[http://en.wikipedia.org/wiki/Megyeri_Bridge#Naming_poll](http://en.wikipedia.org/wiki/Megyeri_Bridge#Naming_poll)

------
danbmil99
A film I had the pleasure of working on, "Car 54: Where Are You?", has a
rating of 2.4, and that was not due to any crowdsourced manipulation; people really hate
the movie. I thought it was marginally worse than Police Academy 4, but most
people apparently think it really sucks.

------
grlhgr420
if you use imdb as a general indicator of a film's quality you're much better
off entirely excluding the scores of films produced in India and Turkey

------
anilgulecha
Very interesting -- a lesson for IMDB to filter votes that fall outside normal
distribution ranges.
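
One crude reading of that suggestion (all thresholds invented): compare each day's vote volume for a film against its own history, and flag days that sit many standard deviations out as likely brigading:

```python
import statistics

def is_brigade_day(daily_counts, today, z_cutoff=3.0):
    """daily_counts: historical votes-per-day for one film.
    Flags today's volume if its z-score exceeds the cutoff."""
    mu = statistics.mean(daily_counts)
    sigma = statistics.pstdev(daily_counts)
    if sigma == 0:
        return today > mu
    return (today - mu) / sigma > z_cutoff

history = [12, 9, 15, 11, 8, 14, 10, 13, 9, 12]  # hypothetical quiet weeks
print(is_brigade_day(history, 11))     # False -- an ordinary day
print(is_brigade_day(history, 4000))   # True -- a campaign-sized spike
```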

~~~
Shivetya
A lesson for any site which allows voting. Participatory sites, regardless of
size, are subject to astroturfing or the like. Some handle it better than
others; some don't. The worst don't when the results favor their own ideology.

~~~
thaumasiotes
Well, interestingly, the problem here isn't quite that people are giving the
movie _false_ reviews. They might genuinely hate when this sort of thing
happens -- bitter fights over terminology aren't particularly rare. The
"problem" here is that they care more about the movie than do the general
population of movie-watchers; if everyone on earth had to watch Gunday and
rate it on imdb, all of these people might honestly rate it 1, but they would
all be swamped by reviews from people who thought it was decent (and that
group would likely be swamped by other 1-star reviews from people who didn't
speak the language, but try not to focus on them).

It's similar to the problem of an author's "best" or most notable works
getting lower crowd-sourced reviews than their average work. What happens
there is that generally people only bother to read, and subsequently review,
books that they think there's a good chance they'll like. If you're a mediocre
author putting out filler in your genre, your 6th, average, book will get
pretty good reviews because everyone who bought it knew what they were
getting. If your 7th book happens to be excellent, hype will induce a lot of
people who don't like your genre to buy it and see what everyone's talking
about -- and they'll take it out on you in their reviews afterwards.

------
mcguire
"hanged to death"? "to apply any new learnings"?

~~~
phaemon
If you think either of those is incorrect for some reason, please _state_ the
reason, so I can mock you for your ignorance.

It's difficult to mock properly when all you've given is a question mark. :)

------
bcRIPster
...and 4Chan starts pumping 5 stars into the votes in 4... 3... 2...

------
gphilip
"There are currently more than 235,000 films on IMDb, and ... not a single
qualified movie besides “Gunday” rates worse than 1.8."

"The next lowest-rated movie on IMDb — 1.8 stars overall ..."

I am not sure what the writer means by a "qualified" movie, but this one does
rate less than 1.8:
[http://www.imdb.com/title/tt2094870/](http://www.imdb.com/title/tt2094870/)

It has votes from only 195 users as of this writing, though.

~~~
_delirium
To qualify for the "bottom 100" list, a film needs at least 1500 ratings:
[http://www.imdb.com/chart/bottom](http://www.imdb.com/chart/bottom)
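
For its charts IMDb also doesn't use the raw mean: the formula it has published for the Top 250 is a Bayesian average that pulls low-vote films toward the site-wide mean until they accumulate enough votes. A sketch (m taken from the 1,500-vote threshold above; C, the site-wide mean, is a guess):

```python
def weighted_rating(R, v, m=1500, C=6.9):
    """R: the film's raw mean rating; v: its number of votes.
    With few votes the result is dominated by the prior C."""
    return (v / (v + m)) * R + (m / (v + m)) * C

print(round(weighted_rating(R=1.4, v=195), 1))    # 6.3 -- few votes, pulled toward C
print(round(weighted_rating(R=1.4, v=50000), 1))  # 1.6 -- many votes, stays near R
```

This is why a film with 195 votes can "rate less than 1.8" on its own page yet never appear on the bottom 100 chart.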

------
david927
It just needs to be bicameral: one rating from critics and one from the
public. Rotten Tomatoes does this, and I think it makes for a better overall
result.

------
xfour
Justin Bieber's Believe is now at 1.5, just saying, and I believe that
happened more organically.
[http://www.imdb.com/title/tt3165608/](http://www.imdb.com/title/tt3165608/)

------
ulisesrmzroche
I don't see how crowdsourcing is hurting here. In fact, I learned something
new.

I'm thinking this movie otherwise gets forgotten in the trash bin of bad
movies and the data would never tell you anything because it wouldn't exist.

------
elij
Currently working on an approach that would avoid this type of gaming of
rankings by changing the way we collect ratings:
[https://aeolipyle.co/](https://aeolipyle.co/)

------
thetrb
I don't understand this part: "91 percent of all reviewers gave it one star.
The next lowest-rated movie on IMDb — 1.8 stars overall — has a more even
distribution of ratings, with only 71 percent of reviewers giving it one star.
The evidence suggests the push to down-vote “Gunday” was successful".

To me that's just stating the obvious. Of course if there is such a thing as a
worst movie then it will have a higher percentage of 1 star votes than other
movies. So I don't know how that's evidence for anything except that the movie
seems to be bad.

------
shubb
Any idea where this guy got the data? He mentions rating distributions.

I thought the majority of IMDB data was not downloadable?

~~~
dagw
Check out [http://www.imdb.com/interfaces](http://www.imdb.com/interfaces)

Also you can buy a license for access to more complete datasets.

------
bakhy
hey, let the data speak for itself! ;)

seriously, i do take IMDB ratings into account, but i consider them unreliable
at best. Inception, when it came out, was the best movie of all time for a
while, according to IMDB users. enough said.

------
gjjhjjjhhh
Who would pay attention to ratings from a single unreliable source?

For me the best places are meta sites that gather from many related sites.

~~~
ronaldx
I prefer the IMdB's ratings system over, say, meta-site Rotten Tomatoes.

At Rotten Tomatoes they rated Seth Rogen-vehicle "Neighbors" as 100%, last I
checked. Meaning only: no reviewer at that point had said it was awful. And
yet, I saw it and it was awful.

At IMdB, I can see that it rates 7.6/10.0, and also that teenage voters
loved it (9.0/10.0), whereas older viewers hated it (5.0/10.0). Far more
useful information.

Gunday (the film of the parent article) gets fairer treatment from the IMdB's
top 1000 users, who rate it 4.9/10.0.

~~~
peterwwillis
IMHO no single website should be used as a definitive source for the quality
of peer-reviewed content. Instead, averages should be taken from different
sites, compared to the types of movies usually reviewed as high or low on
those respective sites.

Each site has its own user base, and those users have their own biases. A
review from Rotten Tomatoes will vary greatly from those of sites that include
only noteworthy critics or only crowdsourced opinion without the "community"
aspect. Some communities will be more critical, while some will be less. Like
most online reviews, the criteria for rating are completely subjective; one
user gives it a 10 because it had their favorite actress in it, while another
user gives it a 5 because there was a scene they didn't like. What's awful to
you may not be awful to me, and crowd-sourced data or aggregate generalized
polling isn't a great way to distinguish that.

~~~
ronaldx
I am interested to explore this idea more.

In general, I don't trust online reviews for anything because I don't know
what 'normal' is or the motivations that people have for voting, as you say
here. If I read any restaurant reviews or ratings, it's under the assumption
that they're astroturfed.

But, IMdB provides the online reviews I trust the most. I know that the
dataset is large enough to be reliable. I feel that people are motivated to
vote for their own reference, with minimal outside agenda ( _this exception is
newsworthy because it's rare_ ). As an encyclopedic reference, the site is
politically and culturally neutral, compared in particular to national
newspaper reviews. I know from experience how the IMdB normally and
consistently rates films/genres I might be interested to watch (I know its
biases, compared to my own). The stats are openly broken down demographically,
with the statistically crucial number of responses, which I feel confident in
interpreting. I also know, for example, that new releases will be over-rated
according to how close they are to their release date.

By averaging additional information from other sites, I believe it would be
difficult to retain those subtle points, which are important to me. I
certainly don't believe that Rotten Tomatoes contains the same nuance of
information.

~~~
peterwwillis
Rotten Tomatoes does contain different information, and different statistics.
For example, take Robin Hood: Prince of Thieves.

IMDb gives a 6.9/10 from 119,115 users. That's the only real statistic it
gives us; there is no 'Metascore' for this movie. (Personally I think the
Oscar nomination should be mentioned next to this score, but it's sort of
buried further down the page.)

Rotten Tomatoes, on the other hand, shows us several numbers. The Tomatometer
is at 50% for "All Critics", with an average rating of 5.7/10 and 52 reviews.
The "Top Critics" Tomatometer is at 36%, with an average rating of 5.9/10 and
14 reviews. The Audience rating, however, is 73%, with an average rating of
3.4/5 from 333,273 users.

--

The above example shows how you can sometimes use IMDb's stats of general
ratings as a median reference between the audience numbers and critics
numbers. But this doesn't always paint the best picture. Let's take another
example: the recent (and generally accepted as a flop) 47 Ronin.

Here, IMDb gives us a 6.3/10 rating from 60,849 users. But it also gives a
Metascore of 29/100. Yet the number in a gold star in bigger font is just the
'6.3'. So even though this movie has a dramatically lower Metascore, they only
feature the general user rating from IMDb.

The story is much more dramatic on Rotten Tomatoes. It has a 13% 'All Critics'
Tomatometer rating from 72 reviews, with an average rating of 4.1/10. The
'Top Critics' Tomatometer shows 0% with 13 reviews and an average rating of
2.9/10; not one single top critic liked this film, out of 13 critics! Yet the
audience rating shows 51%, with an average of 3.3/5 from 53,921 ratings.

We can see how IMDb got its '6.3' rating here: both seem to show a middle-of-
the-road rating from general audience ratings. But Rotten Tomatoes' critics
stats show this movie isn't worth wasting two hours of your life on. The
gamble of whether you'll like the movie looks decidedly worse once you take
the critics' reviews into consideration.

--

As we can see above, even within each website, they have not only greatly
varying statistics but sometimes missing or hidden information (Robin Hood not
having a Metacritic rating and not prominently displaying the Oscar nod).
People weigh their options based on the data they have at hand, so the
information you give people - along with its context - will change their minds
greatly, regardless of the source.

This is why I think it's much more realistic to look at multiple different
sites. You need as much information as you can get, and no one site gives you
all the relevant data, as it varies from film to film and user community to
user community. It would seem you can't just depend on 'the crowd' to give
someone an accurate idea of whether they will like a film; a survey would
probably be better, but nobody's going to survey every film they watch.

------
Theodores
Barbra Streisand is in it.

------
snake_plissken
Your powers of observation are simply startling!!!

------
anuraj
Bollywood movies are kitsch. Generally they don't deserve any rating.

~~~
timdiggerm
So glad we have cultural judges like you to dictate what is or isn't
deserving.

