The Story Behind the Worst Movie on IMDb (fivethirtyeight.com)
245 points by thampiman on May 7, 2014 | 140 comments



538 is using the wrong metrics. For a movie to be truly, truly, sublimely bad, that badness needs to be recognized - people in some cases will want to watch it for its incredible badness. This means there will be a disproportionate number of both 1 and 10 votes. And the winner by this metric is The Room. http://www.imdb.com/title/tt0368226/ratings?ref_=tt_ov_rt
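
A minimal sketch of what such a polarization metric might look like, in Python with made-up vote counts rather than IMDb's actual histogram:

    def polarization(histogram):
        """Fraction of all votes landing on the extremes (1 or 10)."""
        total = sum(histogram.values())
        return (histogram.get(1, 0) + histogram.get(10, 0)) / total

    # Hypothetical vote counts per star rating for a love-it-or-hate-it film
    the_room = {1: 14000, 2: 2500, 3: 1800, 4: 1500, 5: 1900,
                6: 1700, 7: 2000, 8: 2600, 9: 3200, 10: 12000}

    print("{:.0%} of votes are 1s or 10s".format(polarization(the_room)))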


A clear conversation is difficult in the absence of a clear, canonical definition of "bad". It would be very easy for me to get a few friends together and make a 90-minute improvised movie filmed on my phone. That would be an absolutely terrible movie, but it wouldn't be a very satisfying answer to the question "What's the worst movie?" because it would be seen by very few people and it would be made with no ambition of quality. I could go further and make a movie that was 90 minutes of a black screen. That would be a worse movie but an even less satisfying answer.

Even if you limit the field to movies that have seen a theatrical release, the same principle applies. Some movies are shockingly bad, but are made with little effort by people who know they are making a bad movie. These seem less interesting, less sad and less funny than very bad movies made by people trying very hard and spending lots of money.

Personally, when rating the badness of work, I judge it against my own assessment of the potential of the work. The potential isn't directly measurable or quantifiable, but it consists of things like the budget, the vision of the creator, the skill and effort of the artists involved in the creation, etc. etc.

So, for example, I think of Star Wars: Episode 1 as being a worse film than a lot of 50s low-budget sci-fi B-movies, even though I might choose to watch it more readily. The B-movies were pumped out without much effort. They're boring and poorly made on a technical level, but they never really had a chance to be any good. Episode 1 was made on a vast budget, drew on extremely rich source material, and was the culmination of unimaginable amounts of time and effort by a large number of hard-working people. For me, that makes the failure of Episode 1 more profound and more egregious.


This was really insightful. It's the same in any arena: if I try to think of a "bad" novel, my mind won't even go to the acres of paperback romances or teen vampire stories: I'll think of books published in hard cover and mentioned as being in the running for the Booker prize, that happen to be trite, inept, and derivative.


In order for it to be meaningful that something is "bad", it has to have had intentions of being "good".

Anyone can intentionally make a bad movie that's worse than a stinker like Battlefield Earth. That's nothing worth recognizing because the difference between the actual quality and the intended quality, the ΔQ, is so tiny.

"The worst movie ever" designation should go to the film that has the highest ΔQ.


Exactly. And that's why none of the films made by The Asylum* are eligible for the "so bad it's good" category. They never had any ambition of being good in the first place, so why bother?

* http://en.wikipedia.org/wiki/The_Asylum


Hmm, this means we need a measure of intended quality. Perhaps budget would be a good place to start? More invested = higher expectations. Of course, this might be thrown off by high-minded indie flicks and hugely expensive summer sequels, so perhaps include a polynomial term to capture the outliers.

Really, profitability is the best "general goodness" measure ever created.


> I could go further and make a movie that was 90 minutes of a black screen. That would be a worse movie but an even less satisfying answer.

What about a blue screen?

http://www.imdb.com/title/tt0106438/


I like drysart's criterion above (the delta-Q factor) as a way of discriminating between lowbrow but honestly-crafted crap like the usual MST3K fodder, and expensive yet incompetent crap like SW Episode 1.

But you raise an interesting point, in that there should also be a penalty for novel, provocative work that, while perhaps well-executed, rests on principles that can never be used again, so they don't advance the state of the art or otherwise leave us with any enduring influence. Works like Blue and John Cage's 4'33 would fall into that category, I think.


> But you raise an interesting point, in that there should also be a penalty for novel, provocative work that, while perhaps well-executed, rests on principles that can never be used again, so they don't advance the state of the art or otherwise leave us with any enduring influence. Works like Blue and John Cage's 4'33 would fall into that category, I think.

I've seen the first ten minutes of Blue; it's actually really good. After ten minutes you start seeing stuff, but you can't work out if it is your own imagination or the film itself. The concept of Blue could be advanced: I was thinking of "Gamma" - viewers are exposed to intense gamma radiation for 90 minutes.


Whilst I suspect you're taking the piss, I wonder if you could induce https://en.wikipedia.org/wiki/Cosmic_ray_visual_phenomena without receiving a medically serious dose of radiation?

Those buzzkills at the FDA have already banned any artistically worthwhile amount of X-rays from CRT televisions though[1] :(

[1] http://www.gpo.gov/fdsys/pkg/CFR-2013-title21-vol8/xml/CFR-2...


> Whilst I suspect you're taking the piss, I wonder if you could induce https://en.wikipedia.org/wiki/Cosmic_ray_visual_phenomena without receiving a medically serious dose of radiation?

It's a joke; apologies, I couldn't resist. Some art forms and certain works of art are genuinely dangerous [0].

Regarding cosmic rays, I thought they were caused by massive particles, and being able to produce those artificially would be a considerable scientific feat - its artistic relevance would pale in comparison.

> Those buzzkills at the FDA have already banned any artistically worthwhile amount of X-rays from CRT televisions though[1] :(

An old chemistry teacher of mine used to use an old TV screen as a cover for potassium+water reactions, it was about an inch in thickness.

[0] http://en.wikipedia.org/wiki/Richard_Serra


Penalize the avant-garde? Madness. And saying that they did not contribute to cinema just baffles me.


To some extent, the market already penalizes avant-garde art created for its own sake. I'm just not sure how critics should respond. If the critics bubble effusively about how great Blue and 4'33 are, what exactly are we supposed to do with that criticism, as either artists or patrons?

To me, works like these are symptoms of art forms that are well on their way to exhausting their own possibilities. What next? Green? 4'34?


Last time someone created a "work" inspired by 4'33", John Cage's publisher demanded royalties. http://classicalconvert.com/2007/07/the-stupidest-music-laws... So it may be difficult to build on those ideas even if someone did figure out an interesting way to do so.


4'33 was like an MVP. (It wasn't exactly the first of its kind.) Artists had dealt with room noise before, but usually treated it as a distraction instead of recognizing it as part of the experience of a piece.

As an artist, you could treat room noise as part of the performance of your work instead of as something getting in the way of it. As a patron, you could consciously choose a place that has noise to enhance your experience of a work.


Or do the opposite and listen to it in an anechoic chamber - which apparently is psychologically difficult to handle:

http://www.ted.com/conversations/14056/why_is_absolute_silen...


The what? John Cage has a collection at NYU's college for performing arts, and there's tons of other stuff too: Sonic Youth paid homage to him, and Philip Glass, and many others. I had to write essays about essays about him, so it's not like he's a nobody or whatever.

Also, you can't just say "the market" as if it were all one bucket, or some god-like thing that punishes.


It's probably worth mentioning that Blue is more than a curiosity. It really is a profoundly moving film - one of Jarman's finest. If the synopsis of it on IMDB appeals at all, I'd encourage you to track it down.


For me, "badness" is the difference between what (I think) the director thought of the movie, and what I thought of it.


To that extent, a comparison of box office numbers, budget, and imdb rating would be interesting. What movies have been able to buy quality at the best price? Which ones swung hardest and missed? It would be fascinating to look for common qualities within groups.


I tried watching Episode 1 last weekend, I really did. I lasted about 2 minutes into Jar Jar before turning it off.


A triumph only equaled by its monumental failure.


They're not actually interested in talking about "the worst movie in history." That's just a hook for what they really want to talk about: crowdsourced ratings are subject to manipulation by organized groups with an agenda.


This is true. Just look at how EA was voted the Worst Company in America twice in a row. They have problems, sure, but it's ridiculous for EA to beat out companies such as Monsanto, Bank of America, Comcast, Halliburton, etc. - companies that have seriously harmed the country.

But no, a game company is the worst because a bunch of fanboys rallied to say so.


Oh, hi Mark...

It's really quite fascinating to me how a movie like that, so incredibly terrible with seemingly no irony in its production, has somehow become a fantastic work of art. It's like that piece in the modern museum that looks absolutely terrible, but something about it appeals to you.

I've probably seen The Room a half dozen times in various settings (movie theater showing it in jest to a huge audience, group of friends over a few beers, etc), and it's always overwhelmingly entertaining because it's just THAT bad. It almost impresses me more that they pulled this off (unintentionally, of course) when compared to some of my actual favorite films.


That is fascinating. I've never seen The Room, but I sure want to now.

It's fascinating because of some similarities to the truly worst movie in history, The Tango Lesson[0]: each movie has one person as writer, director, and star, and (apparently) focuses self-indulgently on the abortive romantic entanglements of the author, who is utterly unaware that nobody else could possibly care.

The Tango Lesson falls through the bottom of the badness scale into a realm where it can't even be enjoyed for its unredeemable awfulness. Everyone I was with thought it was dreadful, and what was left of the audience shuffled out shaking their heads.

http://www.imdb.com/title/tt0120275/


Everything Wrong With The Room: https://www.youtube.com/watch?v=mvuwldnG7c0

The Nostalgia Critic reviews The Room: http://thatguywiththeglasses.com/videolinks/thatguywiththegl...


Everything Wrong With The Room:

Paradoxically, this video manages to be more irritating than the terrible film it talks about. It's particularly ironic that the reviewer complains about Tommy Wiseau saying 'hi' too often, and then peppers the video with metallic pings every few seconds while making nonsensical, unfunny asides to viewers.


After you've watched The Room, read "The Disaster Artist: My Life Inside the Room, the Greatest Bad Movie Ever Made". It is a fascinating tale! (And it is being turned into a movie of its own, starring James Franco.)


You ever seen Annie Hall though? That's a great movie.


The exceptions to the rule are where the writer/star/director is aware of their own ridiculousness - see also Curb Your Enthusiasm.

I haven't seen The Room, but the ne plus ultra of that kind of self-indulgence for me is Garden State. 90 minutes of Zach Braff's daddy issues and romantic wish fulfilment - I've had better afternoons. (It amazes me that it picked up an audience more or less on the basis of having The Shins on the soundtrack.)

The Star Wars prequels fall into that category, even if Lucas isn't in them. Proves very well that he should be shot on sight if ever he approaches a typewriter again. Not a fanboy of the first films by any measure, but at least Lawrence Kasdan or whoever can string a couple of lines together. Left to his own devices, Lucas is capable only of bloated, soulless tech demos.


As an aficionado of bad movies, I agree that there is more to a bad movie than just low review scores. Entertaining bad movies need a certain something to them. The worst kind of movie is a dull and boring one.

The 'best' bad movies are hard to quantify. I certainly can't think of any 'good' bad movies that planned or aimed to be regarded as bad. As another commenter said, you need the creators to believe that they were making something worthwhile...


Try a movie like Bad Taste, which for years I recommended as so bad it's good: "I'm a Derek and Dereks don't run". You can imagine my surprise years later when they announced the director for LotR.

I would be interested to see how its rating has changed over time.


I don't think that's always true - there are certainly movies that are "so bad they're good" (and thus become the fodder of MST3K/RiffTrax). But there are also movies that are just straight-up bad and can never be redeemed, even by the likes of MST3K. For example, the Paris Hilton movie the author briefly discusses.


That is nothing compared to "Manos: The Hands of Fate" (http://www.imdb.com/title/tt0060666/ratings?ref_=tt_ov_rt)

Unexpectedly, however, "Plan 9 from Outer Space" is rated much more evenly for a film that has long been considered the worst of all time.


Manos is bad in the I-can't-even-watch-this-for-fun-without-MST3K kind of way. And even then it's a slog.

I'm a die-hard MST3K fan and I've only seen Manos two or three times.

Even so, I Heart Huckabees tops my list of movies that I just can't watch. That one's actively grating rather than just incompetent.


Interesting. My favorite bad movie has a similar graph, but not quite so many 10s.

R.O.T.O.R. - http://www.imdb.com/title/tt0098156/ratings?ref_=tt_ov_rt


That's pretty low. Netflix doesn't even have this movie listed. How does one go about finding an obscure movie like that?

My personal favorite bad movie has always been "Cool As Ice" [1], but even that seems to be rated higher than R.O.T.O.R.…

[1] http://www.imdb.com/title/tt0101615/ratings?ref_=tt_ov_rt


The opposite effect, a bad movie being rated highly, also occurs. One example of this is with the film "The Oogieloves in the Big Balloon Adventure" (http://www.imdb.com/title/tt1520498). This film was panned by critics on release and was a complete flop at the box office. Its IMDB rating languished until sometime last year when it began to rise. Around the same time, a bunch of poorly-worded unequivocally positive reviews showed up on the page, all following the same pattern (including one that appears to have been run through a translator and back). The rating made it up to 8.0, although it's since fallen to 7.6. If it's not the movie's production or marketing team involved in this clear manipulation, I don't know who it would be, as I can't see that this movie has any kind of cult following.


I suspect it's someone testing out their automated review-submission software: try it on something so obscure it won't get noticed for a while, if at all.


It could be the same sorts of people that write raving reviews for the 3 wolf moon shirt on Amazon.


Unlikely; the positive reviews aren't humorous, and this film is basically unknown except to people like me who are obsessed with marketing flops.


so 4chan?


Plausible


I feel like there's an assumption that this data is erroneous for this reason. I think that any reason for which a person gives a negative rating to a film is valid, be it politically motivated or otherwise.

FiveThirtyEight has had a rough start so far, so I may be more inclined to seek more nuance than I normally would, as they've tended to be extraordinarily simplistic (Silver himself notwithstanding) to date.


The problem is not that the opinions of the members of Gonojagoron Moncho are invalid, but that they are not representative of general opinion on the film. The online activism has skewed the film's ratings to the point that, unless you are a member of Gonojagoron Moncho, it becomes a less than useful metric. This is a case of a motivated minority overwhelming an apathetic majority. Think of the many times Stephen Colbert had his fans write in his name on online polls, such as naming a bridge in an Eastern European country or a NASA space mission.


But most of the people who have given the rating have not seen the movie. They are giving the lowest rating just to put their point forward. This is not a documentary meant to put the truth on the screen (I am not sure how much of the movie is fiction).

I think the author calls the data erroneous because the rating does not represent the actual quality of the movie - the thing IMDB wants it to achieve. He doesn't call it erroneous because of actual errors in the process of collecting ratings.

I think, as you said, you are just inclined to seek more nuances.


> But most of the people who have given the rating have not seen the movie.

In the (paraphrased) words of Andrew Gelman:

"I can't knock this guy for slamming my book without reading it. For example, I have never read the autobiography of Uri Geller, but I'm confident it's full of crap."

If you don't think Gonojagoron Moncho is lying to these people about the point they're taking offense over, how is it relevant that they haven't seen the movie?


> But most of the people who have given the rating have not seen the movie

I'm sure that's possible, but no rating represents the "actual quality of the movie." A rating is a subjective amalgam that encompasses a myriad of biases, even when the movie is watched with total scrutiny.


While this is true, I would argue that there is a social expectation that an IMDB review is written by someone who has seen the movie. In particular, it is expected to be true on average, with the usual few bad apples thrown into the mix. In this case, this expectation does not hold, and in this sense these reviews are truly anomalous (they are produced by a different process than the other ones). From the perspective of the expected usage of the reviews, it is indeed possible to see this anomaly as an error, rather than just slightly different behavior.


While true, the ratings can be useful if there is some shared understanding of what people are rating the movie for. Most ratings convey the simple "I saw the movie and this is how I feel about it" kind of data, and in that case they are useful.

But a group with an agenda coming in and rating the movie on a different set of criteria, without letting the typical reader know, renders the data useless.


I found it odd that the article makes note that 36,000 ratings come from outside the U.S. Maybe I am missing something but wouldn't it be expected that many of a Bollywood movie's ratings would come from outside the U.S.?


I'd guess on Indian review-aggregator websites written in Hindi the trend is reversed ;)


The irony is that they wanted to bury the movie on imdb, but they did such a good job that it is becoming a famous case study.


I don't think they wanted to bury the movie. I think they wanted a mass showing of anger to demonstrate their power, to themselves and others. They want people to know who's doing it and why.


Not really. This is just fluff; you could write the same article on how Time magazine's '100 most influential people' polls are regularly gamed to oblivion. This year it featured two Indian politicians in the top 10 (because India is having a general election), and some Indian people saw this as an example of their country's growing significance on the world stage. Unfortunately the two politicians were mostly in the company of singers and other entertainers, reflecting the 'soft news' focus of the American edition.


The Streisand effect.


I would never have heard about this movie if it hadn't been so low-rated. Perhaps they should rather have given it 5 stars, burying it in mediocrity?


If their goal were to bury it in mediocrity, that would be a good strategy. But if their goal is to bring attention to a movie that distorted history in a meaningful way, then down-voting it is a pretty effective strategy. If you can down-vote it to the level where it has the worst score on IMDb, or even among the ten lowest, it attracts attention and goads curiosity about what makes the movie so awful, which leads people to Google it and quickly find information about this social movement.


Luckily, IMDB also shows Metacritic ratings next to their own user ratings. I take those into account before I make my decisions on what I'm going to rent/stream.

Far too often you find a movie (usually a somewhat recent release) that's just flat-out terrible despite having a > 7 score on IMDB (and usually a Metacritic score in the 30s or 40s).


Do you have any interesting examples that you've found?


Excuse the plugging of my own wares, but by coincidence I did some analysis the other day of the disparity between User/Critic ratings of movies on RottenTomatoes: http://benjaminlmoore.wordpress.com/2014/05/05/what-are-the-...

Example: "Spy Kids" got a 93% fresh rating from critics, but 45% from audience ratings.

Code attached should you want to investigate for yourself.
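
As a rough Python illustration of that kind of critic/audience comparison (the post's own code and data live at the link above; every number below except Spy Kids is invented):

    # Sort films by the gap between critic and audience scores
    movies = {
        # title: (critic %, audience %)
        "Spy Kids":            (93, 45),
        "Example blockbuster": (40, 85),
        "Example drama":       (75, 70),
    }

    by_gap = sorted(movies.items(),
                    key=lambda kv: abs(kv[1][0] - kv[1][1]), reverse=True)
    for title, (critic, audience) in by_gap:
        print("{}: critics {}%, audience {}%, gap {:+d}".format(
            title, critic, audience, critic - audience))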


If you look at the Average Ratings, the difference is smaller, though: the critics gave it 7.2/10 while the audience gave it 5.2/10.


That's really really awesome, thank you for sharing.


The Avengers was particularly egregious. During its opening weekend it was in the top 250 movies of all time. A huge percentage of ticket sales occurs in the opening weekend. There is not a doubt in my mind that these ratings are thoroughly manipulated by the corrupt studios. At least IMDB now shows the Metacritic score, which is more accurate. A lot of Disney/Marvel movies seem to follow this trend.

If I had unlimited time and money I would create a new movie review site that fights corruption. I would have a phone app for doing the reviews, and it would require taking a photograph of your ticket stub at the theater and uploading it for proof (with a GPS tag). It is far too easy and anonymous to participate in these online movie rating sites that have a large impact on people's decision making. I have more ideas for the review site if anyone wants to discuss it further.


I believe your observation is correct, that movies are over-rated at first, but I doubt this is wholly - or even mainly - corruption.

People who are motivated to see a movie early are the same people who are likely to enjoy it. That makes it natural for movies to get rated very highly when they are first released.

If you forced people to photograph their ticket stub, I believe this could make the problem worse: mainly people who particularly loved the movie (or thought it was significantly worthy of comment) would be motivated enough to do that. You'll just collect IMDb's 10s and 1s.

I know many people use IMDb for their own record-keeping. These people would stop doing that if there were an administrative barrier that made it less convenient than a paper equivalent, e.g. writing in an address book. I strongly believe this makes their votes neutral/genuine compared to other survey methods.


But some 10s and 1s come from lazy, indignant people. Putting an obstacle in the way of registering a rating could filter out lots of noise, leaving people who deliberately, intentionally review movies. Those ratings would be of an entirely different class.


But surely they don't use the mean to find the average score?

Even a basic trimmed statistic would find a much more accurate average. Lazy people leaving 10s and 1s would be removed or given a lower weighting.
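
A minimal sketch of such a trimmed mean (all votes below are invented):

    def trimmed_mean(votes, trim=0.1):
        """Mean after discarding the lowest and highest `trim` fraction."""
        ordered = sorted(votes)
        k = int(len(ordered) * trim)
        kept = ordered[k:len(ordered) - k] if k else ordered
        return sum(kept) / len(kept)

    votes = [1] * 30 + [7] * 50 + [8] * 40 + [10] * 30  # brigaded tails
    print("plain mean:   {:.2f}".format(sum(votes) / len(votes)))
    print("trimmed mean: {:.2f}".format(trimmed_mean(votes, 0.15)))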


As long as it's consistent, such ratings average out, since an 8.2 is only meaningful in comparison to other scores.


Anchorman 2 and The Internship were both > 7 near their DVD releases but have settled back down into the 6s, and both have low Metacritic scores.


This mirrors my own experience with IMDB -- it is of dubious merit for big budget recent releases, and I've always suspected (based upon nothing but a sense of "what else could it be?") that Hollywood has long understood the value of crowdsourced metrics like that.


Troll 2 used to be the worst-rated movie on IMDb until it became a cult favorite and garnered tons of 5-star ratings.

If you're interested, there's a really interesting documentary about Troll 2 called Best Worst Movie, which covers the film's production and cult following. Once you've seen the documentary, Troll 2 becomes an enjoyable movie because you know all the crazy backstories behind the actors and scenes.


> He was sentenced to life in prison for his crimes by the Bangladeshi International Crimes Tribunal. But many Bangladeshis found that sentence too lenient, and more than 100,000 of them gathered in Shahbag Square in the capital city of Dhaka to challenge it.

It's interesting that so many people gathered solely to express "we want this guy dead, not imprisoned." I wonder if there's more context?


There is.

Given the dynamics of Bangladeshi politics, a life sentence from the current government is actually a jailing until the opposition comes to power and pardons everyone convicted by its predecessor.

I'm against capital punishment, but I understand where the protesters are coming from: they want real justice for the atrocities in 1971, and see the death penalty as the only way to obtain lasting closure.

Politics in a democracy of 180MM people is really, really messy.


It seems messier than politics in a democracy of 300MM people or 70MM...


You refer (I assume) to the UK and the US? One invented modern parliamentary democracy, and the other is a global superpower. Both have been at the business of democracy for 200+ years, and have never had to deal with the problem set Bangladesh is facing today.

Snark serves no one well.


>>> Politics in a democracy of 180MM people is really, really messy.

>> It seems messier than politics in a democracy of 300MM people or 70MM...

> Snark serves no one well.

FWIW I didn't see it as snark. You seemed to imply the size of the democracy mattered. His/her response was a couple of counterexamples. Then you seemed to acknowledge it's not the size, it's the age of the democracy (which is a great point).

In any case I found the rest of your original post really helpful context, explaining why a "life" sentence probably won't turn out to be that. Thanks!


"...a Bangladeshi nationalist movement called Gonojagoron Moncho, or National Awakening Stage. Gonojagoron Moncho was founded in response to the trial of Abdul Quader Molla, a Bangladeshi Islamist leader who last year was found guilty of killing hundreds of civilians as part of a paramilitary wing during Bangladesh’s liberation war from Pakistan in 1971. "

That, uh, seems like plenty of context to me? Throughout history, seriously bad dude plus nationalistic fervor equals lynch mob pretty darn often. The seriously bad dude part is usually even optional!


The next paragraph goes on to say that as a result of the protests, his political party was banned from participating in future elections. I'm sure they wanted him dead as well (and they did get that wish), but the banning of his political party is huge.


That's really, really sad if it's true.


Maybe a solution could be weighting the vote according to the user's history: a user leaving a single vote on a single movie shouldn't be as influential as a user who has voted on a wider range of movies over time.


This kinda happens already, or at least it used to many years ago; you have to be an active user for your vote to count.

The most obvious effect is movies that drop in the rankings 30 days after release; people who just saw a movie give it 10/10 and then stop participating, so 30 days later they are no longer considered 'active' and their vote stops contributing to the overall ranking.


Also the range of their votes.

If someone only gives 10s or 1s to everything, their vote of 10 should probably have less weight than that of someone who distributes their votes more evenly.
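
A sketch combining this with the parent's history-size idea; the weighting formula here is invented for illustration and is certainly not what IMDb actually does:

    def vote_weight(history):
        """More past votes -> more weight; all-extreme voters -> less."""
        if not history:
            return 0.05
        experience = min(len(history), 50) / 50.0
        extreme = sum(1 for v in history if v in (1, 10)) / len(history)
        return max(experience * (1.0 - 0.8 * extreme), 0.05)

    def weighted_rating(votes):
        """votes: list of (this_vote, that user's past voting history)."""
        total = sum(vote_weight(h) * v for v, h in votes)
        return total / sum(vote_weight(h) for _, h in votes)

    # Hypothetical voters: two drive-by 1s, two accounts with real history
    votes = [(1, []), (1, [1, 1, 10]),
             (7, [3, 5, 6, 7, 8, 8, 9]), (8, [2, 4, 6, 7, 8, 9])]
    print("weighted rating: {:.2f}".format(weighted_rating(votes)))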


This is not correct. Many people don't bother to vote or review unless the movie is, for them, extremely good or extremely bad, making the hassle worth it. I've watched a lot of so-so or simply good movies, but I have only voted 5-6 times, and those votes were either a 10 or a 1.


The solution is recognizing that an 'honest' or 'good' metric is not possible. The thing is, to produce such a list you need some function f that maps the complex opinions of each and every user of the site into an orderable set, usually integers up to 5 or 10.


IMDB only counts 'regular voters' in the top 250 movie list.


Data gone Wrong

Did they start hanging out with the bad kids, take up cigarettes, drinking, and gambling, only to progress to crack and burglaries, one of which ended with our Data shooting a homeowner who returned unexpectedly?

I guess I don't understand what data is. I always thought it was a set of values. And I always thought that the problem when using data was in the interpretation, and that a prudent consumer of data would always be careful to distinguish between a random sample and self-selecting sample when drawing conclusions, and then would only state conclusions couched in the language of statistical inference.

Leaving aside the question of why I should give a fuck about this supposed outrage, why does the author expect there to be a strong correlation between movie quality and the ratings on a website devoted to providing entertainment by having users rate movies?

When The Matrix is purported to be a better movie than Lawrence of Arabia, the problems of interpretation are systemic.


Everything you say is right, of course. Yet, I upvoted the story.

I thought it was indicative of a larger trend where crowdsourced data are used to illustrate a point. Like the Google flu trends articles, which have gone around HN at least twice, once when they were successful (https://news.ycombinator.com/item?id=5040204) and once when they were critiqued (e.g., https://news.ycombinator.com/item?id=7455307).

I work a lot with sampled data, and I have found that sampling issues can be some of the most difficult to appreciate and to quantify -- even for experts.

I guess it comes down to sampling from one distribution, P(x), when the situation you really care about samples according to a different distribution P'(x). If P is far from P', your conclusions from P can be arbitrarily bad. If you have an adversary moving P around deliberately, as here, it's even worse.
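
A toy simulation of that last point, with an adversary shifting P' (all numbers invented):

    import random

    random.seed(42)
    # P: what the general movie-watching population actually thinks
    population = [random.choice([4, 5, 5, 6, 6, 6, 7, 7, 8])
                  for _ in range(100000)]

    # P': who actually votes - a small organic sample plus a brigade of 1s
    organic = random.sample(population, 2000)
    observed = organic + [1] * 8000

    mean = lambda xs: sum(xs) / len(xs)
    print("true mean under P : {:.2f}".format(mean(population)))
    print("observed under P': {:.2f}".format(mean(observed)))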


Statistics experts are fewer and further between than the experts in other fields who use statistics to justify their decisions, and the article shows how far off base most people are. After all, the author conducted a numerical analysis of the database and presents the findings as facts about data, including a rough statistical comparison of the voting patterns of the lowest-rated [called 'worst'] and second-lowest-rated movies.

If there is an interesting statistical result, it's that the movie's rating is entirely consistent with crowdsourced predictions. The theory is that the 'wisdom of crowds' results directly from diversity among those making predictions.[1] In the case of the lowest-rated movie, those making predictions were unusually homogeneous, so an inaccurate prediction of its quality is unsurprising.

Again, it's all in the interpretation, e.g. there's statistical evidence that a lot of morons rated The Matrix.

[1] Diversity Prediction Theorem: http://vserver1.cscs.lsa.umich.edu/~spage/ONLINECOURSE/predi...
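
The theorem itself is a short algebraic identity - the crowd's squared error equals the average individual squared error minus the variance ('diversity') of the predictions - which is easy to check numerically (example numbers invented):

    predictions = [6.0, 7.5, 3.0, 8.0, 5.5, 9.0, 4.5]  # individual guesses
    truth = 6.5                                         # the actual value

    n = len(predictions)
    crowd = sum(predictions) / n
    crowd_error = (crowd - truth) ** 2
    avg_error = sum((p - truth) ** 2 for p in predictions) / n
    diversity = sum((p - crowd) ** 2 for p in predictions) / n

    assert abs(crowd_error - (avg_error - diversity)) < 1e-9
    print("{:.3f} = {:.3f} - {:.3f}".format(crowd_error, avg_error, diversity))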


When Data Bites Back. Good points, dude.


Years ago, when you could download ROMs from the MAME website, I gave a high rating to the Tron romset.

That tipped the romset into the spotlight - there were some leaderboards for highest recent activity and so on. Other people started downloading the romset and voting on it.

Suddenly this obscure romset was catapulted into most of the lists for "most active" and "best" etc.

I had rated the game honestly. I had fond memories of playing the cab for a week on holiday in my youth. But I was surprised that so many other people felt the same, especially on the MAME platform, where the game's controls made it tricky.

All my deliberate attempts at voting shenanigans failed miserably. (Although I haven't investigated MTurk or similar yet.)

I wish there were a site like Meatball Wiki where people could share their vote-weighting methods.


> Suddenly this obscure romset…

Obscure? :-(

This was my favorite arcade game as a kid. I don't think it was very obscure. Though I have an actual Tron arcade machine not 10 feet from me, so perhaps I'm the wrong person to judge that…


For more information on the war, read the wikipedia article: http://en.wikipedia.org/wiki/Indo-Pakistani_War_of_1971


This is what I was thinking when reading the article. IIRC, East Pakistan was getting massacred (literally) until India stepped in and essentially beat West Pakistan.


The archetypal 'bad' movie is "Plan 9 from Outer Space" [0], by Ed Wood. It holds that title mainly because of its production values.

This one has caused a stir on IMDB for reasons of moral integrity, or bending the truth. Rather odd, since this has been going on in films and similar media for a very long time.

"International Gorillay", a film from Pakistan depicting Salman Rushdie [1]. Coincidentally it was released a year or so after Rushdie's "The Satanic Verses" [2]. Ayatollah Ruhollah Khomeini [3] wanted Rushdie dead because of this novel. BTW Iranian films can be very good, like "Where is the Friend's Home?" [3].

[0] http://www.imdb.com/title/tt0052077/
[1] http://www.imdb.com/title/tt0251144/
[2] http://en.wikipedia.org/wiki/The_Satanic_Verses
[3] http://en.wikipedia.org/wiki/Ruhollah_Khomeini
[4] http://www.imdb.com/title/tt0093342/


IMDB ratings are bull. Lots of great movies, often even classics, rate around 6 on IMDB because the general public considers them "boring".

The only movies that escape from the IMDB average are a) decent movies that are loved by the masses, b) great movies the masses don't watch (being in black and white or not in English alone pretty much guarantees 2 bonus points) and c) movies everyone agrees on are total crap.


If the general public considers a movie boring, giving it a mediocre rating seems fair. Other members of the general public will then know to avoid it. They'll want to watch movies they like, not movies you like.

What you may want is a personalized rating based on people with the same tastes as you.


> What you may want is a personalized rating based on people with the same tastes as you.

I haven't used it in a while, but I used to find MovieLens, from the University of Minnesota, useful for that: http://movielens.umn.edu

It even has a feature where you can get joint personalized recommendations for you and a friend (assuming the friend also has a MovieLens account, of course), which is useful for brainstorming movies to rent with someone that might be mutually enjoyed.


I agree, and usually go to Rotten Tomatoes instead. One need look no further than the Lord of the Rings movies at #9, #11, and #17 to get a sense of the crowd doing most of the voting on IMDB. Sure, they were good, but all three in the top twenty movies ever? Come on.


Contains the best author disclaimer I've seen:

Data analysis by Eugene Bialczak. Also, a disclaimer: the author wrote much of the IMDb Trivia App.


It's not that the rating is inaccurate... it's that a huge population of people that don't usually rate movies on IMDB has suddenly entered the rating pool. Maybe the solution is to have country-specific ratings, so that my ratings are averaged with the ratings of my peers, rather than the ratings of people in entirely different cultures.


The rating is completely inaccurate. The ratings have nothing to do with the movie or its merits; they stem from a perceived slight that a bunch of bloggers and nationalists were offended by.


While there's certainly a culture common to my compatriots and me, it's hardly a great indicator vis-à-vis preferences in films. At least in that area, I probably have more in common with many people here than with the average citizen of my country.

Besides, for those of us in small countries, there would probably be almost no ratings on any film besides blockbusters.


Any crowdsourced data can be easily manipulated by a well organized or large enough group. See the results of the naming poll for the Megyeri Bridge in Budapest in 2006. http://en.wikipedia.org/wiki/Megyeri_Bridge#Naming_poll


A film I had the pleasure to work on, "Car 54: Where Are You?" has a rating of 2.4, and that was not due to any crowdsourced manipulation; people really hate the movie. I thought it was marginally worse than Police Academy 4, but most people apparently think it really sucks.


If you use IMDb as a general indicator of a film's quality, you're much better off entirely excluding the scores of films produced in India and Turkey.


Very interesting -- a lesson for IMDB to filter votes that fall outside normal distribution ranges.
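
One sketch of such a filter: compare each film's rating histogram to the site-wide shape and flag large divergences. The distance measure is standard, but the distributions and threshold below are invented:

    def tv_distance(p, q):
        """Total variation distance between two vote histograms."""
        np_, nq = float(sum(p.values())), float(sum(q.values()))
        return 0.5 * sum(abs(p.get(s, 0) / np_ - q.get(s, 0) / nq)
                         for s in range(1, 11))

    # Percentage of votes at each star level, 1..10 (invented shapes)
    site_wide = dict(zip(range(1, 11), [4, 3, 5, 8, 12, 18, 22, 15, 8, 5]))
    suspect   = dict(zip(range(1, 11), [91, 1, 1, 1, 1, 1, 1, 1, 1, 1]))

    if tv_distance(suspect, site_wide) > 0.5:  # invented threshold
        print("flag for review: rating pattern is anomalous")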


A lesson for any site which allows voting. Participatory sites, regardless of size, are subject to astroturfing and the like. Some handle it better than others; the worst don't handle it at all when the results favor their own ideology.


Well, interestingly, the problem here isn't quite that people are giving the movie false reviews. They might genuinely hate when this sort of thing happens -- bitter fights over terminology aren't particularly rare. The "problem" here is that they care more about the movie than do the general population of movie-watchers; if everyone on earth had to watch Gunday and rate it on imdb, all of these people might honestly rate it 1, but they would all be swamped by reviews from people who thought it was decent (and that group would likely be swamped by other 1-star reviews from people who didn't speak the language, but try not to focus on them).

It's similar to the problem of an author's "best" or most notable works getting lower crowd-sourced reviews than their average work. What happens there is that generally people only bother to read, and subsequently review, books that they think there's a good chance they'll like. If you're a mediocre author putting out filler in your genre, your 6th, average, book will get pretty good reviews because everyone who bought it knew what they were getting. If your 7th book happens to be excellent, hype will induce a lot of people who don't like your genre to buy it and see what everyone's talking about -- and they'll take it out on you in their reviews afterwards.


"hanged to death"? "to apply any new learnings"?


If you think either of those is incorrect for some reason, please state the reason, so I can mock you for your ignorance.

It's difficult to mock properly when all you've given is a question mark. :)


...and 4Chan starts pumping 5 stars into the votes in 4... 3... 2...


"There are currently more than 235,000 films on IMDb, and ... not a single qualified movie besides “Gunday” rates worse than 1.8."

"The next lowest-rated movie on IMDb — 1.8 stars overall ..."

I am not sure what the writer means by a "qualified" movie, but this one does rate less than 1.8: http://www.imdb.com/title/tt2094870/

It has votes from only 195 users as of this writing, though.


To qualify for the "bottom 100" list, a film needs at least 1500 ratings: http://www.imdb.com/chart/bottom


It just needs to be bicameral: one rating from critics and one rating from the public. Rotten Tomatoes does this, and I think it makes for a better overall result.


Justin Bieber's Believe is now at 1.5, just saying, and I believe that happened more organically. http://www.imdb.com/title/tt3165608/


I don't see how crowdsourcing is hurting here. In fact, I learned something new.

I'm thinking this movie otherwise gets forgotten in the trash bin of bad movies and the data would never tell you anything because it wouldn't exist.


Currently working on an approach that would avoid this type of gaming of rankings by changing the way we collect ratings: https://aeolipyle.co/


I don't understand this part: "91 percent of all reviewers gave it one star. The next lowest-rated movie on IMDb — 1.8 stars overall — has a more even distribution of ratings, with only 71 percent of reviewers giving it one star. The evidence suggests the push to down-vote “Gunday” was successful".

To me that's just stating the obvious. Of course if there is such a thing as a worst movie then it will have a higher percentage of 1 star votes than other movies. So I don't know how that's evidence for anything except that the movie seems to be bad.


Any idea where this guy got the data? He mentions rating distributions.

I thought the majority of IMDB data was not downloadable?


Check out http://www.imdb.com/interfaces

Also you can buy a license for access to more complete datasets.


If it's on the Internet it's downloadable. I'm not sure what the IMDB TOS says about it, but I remember watching a video about sqlite that used IMDB as its example.


hey, let the data speak for itself! ;)

Seriously, I do take IMDB ratings into account, but I consider them unreliable at best. Inception, when it came out, was the best movie of all time for a while, according to IMDB users. Enough said.


Who would pay attention to ratings from a single unreliable source?

For me the best places are meta sites that aggregate from many related sites.


I prefer IMDb's ratings system over, say, the meta-site Rotten Tomatoes.

At Rotten Tomatoes they rated Seth Rogen-vehicle "Neighbors" as 100%, last I checked. Meaning only: no reviewer at that point had said it was awful. And yet, I saw it and it was awful.

At IMDb, I can see that it rates 7.6/10.0, and also that teenage voters loved it (9.0/10.0) whereas older viewers hated it (5.0/10.0). Far more useful information.

Gunday (the film in the parent article) gets fairer treatment from IMDb's top 1000 users, who rate it 4.9/10.0.


IMHO no single website should be used as a definitive source for the quality of peer-reviewed content. Instead, averages should be taken from different sites, compared to the types of movies usually reviewed as high or low on those respective sites.

Each site has its own user base, and those users have their own biases. A review from Rotten Tomatoes will vary greatly from those of sites that include only noteworthy critics or only crowdsourced opinion without the "community" aspect. Some communities will be more critical, while some will be less. Like most online reviews, the criteria for rating are completely subjective; one user gives it a 10 because it had their favorite actress in it, while another user gives it a 5 because there was a scene they didn't like. What's awful to you may not be awful to me, and crowd-sourced data or aggregate generalized polling isn't a great way to distinguish that.


I am interested to explore this idea more.

In general, I don't trust online reviews for anything because I don't know what 'normal' is or the motivations that people have for voting, as you say here. If I read any restaurant reviews or ratings, it's under the assumption that they're astroturfed.

But IMDb provides the online reviews I trust the most. I know that the dataset is large enough to be reliable. I feel that people are motivated to vote for their own reference, with minimal outside agenda (this exception is newsworthy because it's rare). As an encyclopedic reference, the site is politically and culturally neutral, compared in particular to national newspaper reviews. I know from experience how IMDb normally and consistently rates films/genres I might be interested to watch (I know its biases, compared to my own). The stats are openly broken down demographically, with the statistically crucial number of responses, which I feel confident in interpreting. I also know, for example, that new releases will be over-rated according to how close they are to their release date.

By averaging additional information from other sites, I believe it would be difficult to retain those subtle points, which are important to me. I certainly don't believe that Rotten Tomatoes contains the same nuance of information.


Rotten Tomatoes does contain different information, and different statistics. For example, take Robin Hood: Prince of Thieves.

IMDb gives a 6.9/10 from 119,115 users. That's the only real statistic it gives us; there is no 'Metascore' for this movie. (Personally I think the Oscar nomination should be mentioned next to this score, but it's sort of buried further down the page)

Rotten Tomatoes, on the other hand, shows us several numbers. The Tomatometer is at 50% for "All Critics", with an average rating of 5.7/10 and 52 reviews. The "Top Critics" Tomatometer is at 36%, with an average rating of 5.9/10 and 14 reviews. The Audience rating, however, is 73%, with an average rating of 3.4/5 from 333,273 users.

--

The above example shows how you can sometimes use IMDb's stats of general ratings as a median reference between the audience numbers and critics numbers. But this doesn't always paint the best picture. Let's take another example: the recent (and generally accepted as a flop) 47 Ronin.

Here, IMDb gives us a 6.3/10 rating from 60,849 users. But it also gives a Metascore of 29/100. Yet the number in a gold star in bigger font is just the '6.3'. So even though this movie has a dramatically lower Metascore, they only feature the general user rating from IMDb.

The story is much more dramatic on Rotten Tomatoes. It has a 13% 'All Critics' Tomatometer rating from 72 reviews, with an average rating of 4.1/10. The 'Top Critics' Tomatometer shows 0% with 13 reviews and an average rating of 2.9/10; not one single top critic liked this film, out of 13 critics! Yet the audience rating shows 51% with an average of 3.3/5 with 53,921 ratings.

We can see how IMDb got its '6.3' rating here: both sites seem to show a middle-of-the-road rating from general audiences. But Rotten Tomatoes' critic stats show this movie isn't worth wasting two hours of your life on. The gamble of whether you'll like the movie becomes much less favorable when you take the critics' reviews into consideration.

--

As we can see above, even within each website, sometimes they have not only greatly varying statistics but also missing or hidden information (Robin Hood not having a Metacritic rating and not prominently displaying the Oscar nod). People weigh their options based on the data they have at hand, so the information you give people - along with its context - will change their minds greatly, regardless of the source.

This is why I think it's much more realistic to look at multiple different sites. You need as much information as you can get, and no one site gives you all the relevant data, as it varies from film to film and user community to user community. It would seem you can't just depend on 'the crowd' to give someone an accurate idea of whether they will like a film; a survey would probably be better, but nobody's going to survey every film they watch.


You should note that Rotten Tomatoes' main score is just the percentage of yay vs. nay, meaning the "percentage of critics who have given the movie a positive score".

If you look below that, you can see the actual "Average Score", which in the case of "Neighbors" is actually lower than IMDb's, at 7.1/10.


A lot of movies don't get a pre-release screening for critics (or they do for a limited number of critics, usually small-timers who can be influenced with gifts or free tickets); that's why you'll sometimes see RT give a skewed rating until release day.

The release for that movie is the 9th, so it's not out yet outside of limited distribution. Wait until all the reviews come in this weekend for a more complete score.


If we're talking single data points, The Shawshank Redemption is a good movie, but it's not the best of all time (as rated by IMDB), and I've rarely heard people talk about it in any context. Pretty much all the other members of the top 20 get attention in the public mindspace, frequently referred to outside of conversations on film.


Isn't that movie not even out yet? I always wait for the reviews once the movie is actually out.


It depends. When I'm at the cinema deciding what to watch, googling the title with 'imdb' is quick and easy enough as a general indicator.


The next campaign could encourage everyone to give a low rating at all the different sites.


Barbra Streisand is in it.


Your powers of observation are simply startling!!!


Bollywood movies are kitsch. Generally they don't deserve any rating.


So glad we have cultural judges like you to dictate what is or isn't deserving.


Can't blame him for that statement. The number of watchable movies released in Bollywood is very low.


Stereotypical guesses are the last thing I would like to see on Hacker News. There are good as well as bad movies all over the world.



