

Duke Nukem Forever exposes crap reviewers - 100 reviews compared - Dinoguy1000
http://www.xentax.com/?p=303

======
Steko
His methodology?

"We all know that we are right, as the true value of DNF is around 65-70%.
Thus, I’d say the 60-90% group of reviewers are The Good."

Srsly?

~~~
dirkgadsden
That's only the beginning of the flaws; I saw no mention of the author taking
into account the individual scoring systems the various reviewers used. Some
reviewers are quite harsh and use most of the 0-100 spectrum even for AAA
titles; others (e.g. the ones getting games for free from the publishers) tend
to use only the 65-100 range (65 being an arbitrary cutoff), so they'd give a
65 to a game that a reviewer using the whole scale would give a 0. It's
especially ironic that the author bashes Metacritic and other aggregation
sites for being "shallow," then churns out similarly shallow,
infographic/graph-laden content. With something as subjective and contextual
as a video game review, distilling a game to a single number is already
dubious; aggregating those numbers into a context-free mass is pointless, as
the result carries almost no meaning at all.
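To make the scale-compression point concrete, here's a minimal sketch of how a compressed 65-100 band could be renormalized onto the full scale. The linear mapping and the 65 floor are assumptions for illustration; no reviewer publishes their effective internal scale.

```python
def renormalize(score, lo=65.0, hi=100.0):
    """Map a score from a compressed [lo, hi] band onto the full 0-100 scale.

    `lo` is the reviewer's assumed effective floor -- a guess, since
    reviewers don't disclose their internal scales.
    """
    if score <= lo:
        return 0.0
    return (score - lo) / (hi - lo) * 100.0

# A "65" from a compressed-scale reviewer maps to a 0 on the full scale,
# and an 82.5 maps to a 50.
print(renormalize(65.0), renormalize(82.5), renormalize(100.0))
```

Without some normalization like this, averaging a harsh reviewer's 40 with a compressed-scale reviewer's 70 mixes two incompatible units.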

~~~
MrMouse
The data is what it is. I collected as many scores as I could, without the
preselection bias you find on Metacritic. It's true that I don't take into
account the "individual scoring systems" the various reviewers used, although
I did note whether they used a 0-100% scale or 5 stars. The bottom line is
that many readers will look at the scores to decide, even if you feel that
shouldn't be the case. Also, reviewers who only use 65-100 are basically
incompetent. Use another system if you're not going to use the rest of the
scale anyway, or, if you think only the actual text of a review matters, just
don't use any scoring system at all.

I present the raw scoring data, and the data suggest there may be two bell
curves, which is the interesting point. Projecting whatever I feel about the
game in question onto the validity of the data is nonsense: the data was
collected without selection bias, just gathering as much as possible within
the timeline. What matters is that we felt something was amiss with certain
reviews, based on our own experience with this particular game. From there,
the comparison became a tool to expose certain reviewers for incompetent
behaviour. You can rest assured that I don't care whether DNF sells or not. I
do care when reviewers at major sites show questionable conduct. DNF is just
the occasion; from now on we will watch them, in retrospect as well. Bad
reviews can make or break a game in certain cases, and that kind of power in
the wrong hands is not to be tolerated.

~~~
Steko
Your problem is entirely in the line I quoted:

"We all know that we are right, as the true value of DNF is around 65-70%. "

This makes the whole exercise total bullshit. Obviously the data is irrelevant
since you assumed your conclusion.

And another thing: if you had actually clicked thru to Metacritic you'd
realize why there are two peaks in your data. The console versions are
consistently rated a full star lower than the PC version. It took me less than
a minute to verify that at least 2 of the 40-rating reviews were based on
console versions.
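The two-populations explanation is easy to check with a toy simulation: mixing scores from two groups with different means (PC reviews centered higher, console reviews centered lower) produces exactly the kind of two-peaked histogram described above. The means and spreads here are made up for illustration, not taken from the actual review data.

```python
import random

random.seed(42)

# Hypothetical score distributions -- illustrative only, not real DNF data.
pc_scores = [random.gauss(70, 5) for _ in range(50)]       # PC reviews
console_scores = [random.gauss(45, 5) for _ in range(50)]  # console reviews
all_scores = pc_scores + console_scores

# Crude bimodality check: the valley between the two peaks (55-60) should
# contain very few scores compared to the regions around 45 and 70.
valley = [s for s in all_scores if 55 <= s <= 60]
print(f"{len(valley)} of {len(all_scores)} scores fall between the peaks")
```

If the "incompetent reviewer" hypothesis were the whole story, the low cluster wouldn't line up so cleanly with the platform split.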

