

Bayesian Truth Serum - compbio
http://nel.mit.edu/bayesian-truth-serum

======
crusso
It looks like an algorithmic leveraging of the Dunning-Kruger effect.

~~~
compbio
Yes, it looks similar.

Problem: Getting honest answers on questionnaires is difficult.

Setup: Respondents supply not only their own answers, but also percentage
estimates of others' answers.

Algorithm: The BTS formula assigns higher scores to answers that are
"surprisingly common", i.e., more common in the actual responses than the
respondents collectively predicted (see the sketch below).

Premise: If people truly hold a particular belief, they are more likely to
think that others agree or have had similar experiences.

[http://www.slideshare.net/micmickimo/introduction-to-bayesian-truth-serum](http://www.slideshare.net/micmickimo/introduction-to-bayesian-truth-serum)
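
For concreteness, here is a minimal Python sketch of the scoring rule from
Prelec's paper (the slides above walk through the same formula). Each
respondent's score is an information term, log(x_bar_k / y_bar_k) for their
chosen answer k, where x_bar_k is the empirical frequency of answer k and
y_bar_k is the geometric mean of everyone's predicted frequencies, plus
alpha times a prediction-accuracy term. The function name, the epsilon
guard, and the toy numbers below are my own illustrative choices:

    import numpy as np

    def bts_scores(answers, predictions, alpha=1.0):
        """Prelec-style BTS score for each respondent.

        answers:     (n,) ints in {0..k-1}, each respondent's own answer
        predictions: (n, k) rows summing to 1, each respondent's estimate
                     of the population's answer frequencies
        alpha:       weight on the prediction-accuracy term
        """
        n, k = predictions.shape
        eps = 1e-9  # guard against log(0)

        # Empirical answer frequencies x_bar_k.
        x_bar = np.bincount(answers, minlength=k) / n

        # Log of the geometric mean of predictions, log(y_bar_k).
        log_y_bar = np.log(predictions + eps).mean(axis=0)

        # Information score: log(x_bar / y_bar) for the answer given.
        # Positive exactly when the answer is "surprisingly common",
        # i.e. more common than the crowd predicted.
        info = np.log(x_bar[answers] + eps) - log_y_bar[answers]

        # Prediction score: rewards predictions close to the actual
        # frequencies (a KL-divergence-style penalty, always <= 0).
        pred = (x_bar * (np.log(predictions + eps)
                         - np.log(x_bar + eps))).sum(axis=1)

        return info + alpha * pred

    # Toy example: 5 respondents, a yes/no question (1 = yes).
    answers = np.array([1, 1, 0, 1, 1])
    predictions = np.array([[0.6, 0.4],
                            [0.5, 0.5],
                            [0.3, 0.7],
                            [0.7, 0.3],
                            [0.4, 0.6]])
    print(bts_scores(answers, predictions))

The information term is what does the work: under the premise above, an
honestly held minority view tends to be endorsed more often than the crowd
predicts, so it scores positively even though it is not the majority answer.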

~~~
gweinberg
I don't see why that would be. It seems to me that people are vastly more
likely to purport to hold an orthodox opinion when they actually hold a
heretical one than vice versa. But obviously they will also predict that other
people will purport to hold the orthodox opinion.

~~~
Retra
That should only be the case when they are poor analysts. If they are good
analysts, they should be more likely to claim a heretical opinion and predict
orthodox opinions.

After all, those who have the most information are not in a position to accept
the most common opinions, and they have enough information to know what the
common opinions are.

I would say it's not really about who is telling the truth, but who you should
listen to -- who is _relevant_ in addition to truthful.

~~~
proveanegative
I think gweinberg had in mind the sort of situation in which saying you hold a
heretical opinion brings a punishment that would not be offset by your being
recognized as a superior expert later. This creates an incentive to lie.

