

How to confuse a moral compass: Survey 'magic trick' causes attitude reversal - tokenadult
http://www.nature.com/news/how-to-confuse-a-moral-compass-1.11447

======
richardjordan
I am not sure the conclusions are particularly revelatory. The political
system in the US appears to rely on using tribal association -
Democrat/Republican, Conservative/Liberal - to get people to group themselves
(by branding each tribe) then the political class just inserts whatever
policies suit their goals and the masses fill in the blanks of justification.
From the last decade alone there are countless examples of Bush policies
opposed by D voters and supported by R voters, yet when the same policies are
enacted by the Obama administration people take the opposite positions. We all
know it happens and see it every day.

~~~
hindsightbias
> countless examples

name one.

> We all know

I think you see what you want to see, and not any reality.

~~~
pfedor
Here's one example: some twenty years ago, "science" was a dirty word among
liberals, Newton's Principia was a "rape manual", quarks were "constructed",
and "scientism" was but another tool in our evil western civilization's
arsenal for oppressing minorities and other cultures.

Nowadays, liberals are _all about_ science. Anyone who'd dare to question the
scientific method is corrupt or crazy or both. All it took was climate
change and the controversy around teaching evolution in schools.

~~~
hindsightbias
I was around 20 years ago and I have no idea what you're talking about.
Mainstream liberals were anti-science?

~~~
richardjordan
The Republicans owned NASA and space science in the 80s. Yes there was
definitely an anti-science bent in mainstream liberalism for quite some time.

~~~
hindsightbias
I worked for NASA then; you are making stuff up.

------
pyre
1\. How many people might notice, but think that the mistake was theirs, and
even argue for the opposite of their position in order to save face?

2\. How many of these people hold staunch views on the questions asked? E.g.
would this have the same effect on an extremist skinhead as it would on the
average college student?

~~~
praptak
> 1\. How many people might notice, but think that the mistake was theirs, and
> even argue for the opposite of their position in order to save face?

Probably most of them. This mechanism has been described by Cialdini as
"Commitment and Consistency". Marketers abuse this in a clever way by ramping
up commitment. If you answer a bunch of seemingly innocent questions that make
you look good ("I often go see opera"), you have made a small commitment. Then
it can be leveraged to sell you an expensive opera discount card ("For a
person who visits the opera so often, it would be irrational to turn the offer
down.")

------
thowar2
I think this experiment does more to show the ineffectiveness of surveys than
it does to imply that the subjects are fools.

~~~
srean
In what way does it show the ineffectiveness of surveys? Not challenging the
assertion, just trying to understand why you think so.

To me it seemed to be an indication of what was stated in the article, that
people are less committed to their views than they imagine.

I would go one step further and conjecture that we have a tendency to defend a
position, when confronted with "evidence" that we have endorsed that view
before. I expect this to work on moral positions on which our convictions are
not solid.

@pyre My confusion was not about whether surveys can be manipulated. I can
well imagine that they can be. My question was more along the lines of: where
in the linked post is it demonstrated? Sometimes I don't read between the
lines enough and thought I must have missed something. And yes, who in their
right mind would want to leave any children behind! It's mostly semantic
gymnastics.

~~~
AnthonyMouse
> In what way does it show ineffectiveness of surveys? not challenging the
> assertion, trying to understand why you think so.
>
> To me it seemed to be an indication of what was stated in the article, that
> people are less committed to their views than they imagine.

The problem is that a survey result is a pretty meaningless number if it can
be changed by anyone you put on television with a fancy suit and a pie chart.
What you really want to know from a survey is a) what the people who feel
strongly think about it, i.e. the people who can't be swayed by an ad buy or a
newscast, and b) what portion of the population are those people and what
portion are the ones who can be easily swayed. But if people are
overestimating how strongly they're committed to their answers then they're
contaminating both results: The second directly and the first indirectly by
inserting the uninformed answers of the disinterested into the sample that
ostensibly measures those with strong feelings.

~~~
prof_hobart
I agree with everything you're saying, except for the assumption that the
purpose of a survey is to genuinely understand people's opinions on a subject,
rather than to claim support for your own position or even to influence the
views of the people you are surveying.

This may be a semantics thing, but surveys are extremely effective - just not
as a tool for understanding opinions.

------
phreeza
Slightly off topic, but when I scrolled to the bottom of the article, I was
really excited to see they had _proper_ citations for the research the article
was talking about.

I actually thought this was a BBC article, hadn't been paying much attention
to the design and URL. Then I realized that this was on Nature, so I suppose
it is to be expected here.

I really wish more science journalism did this, even on mainstream outlets.

------
azakai
Did attitudes reverse, or did students who had no real opinion on some topics
make up answers, and then, when they re-read the questions and found them
different, accept the change because they didn't feel strongly either way?

Or, did students have some opinion on the topic but, when faced with a
reworded question, feel embarrassed that they must have misread it initially -
which would mean they had not done what they were asked to do, namely answer
the survey properly? There can be a lot of pressure to participate correctly in
psychological experiments, I know that in my undergrad you had a requirement
to do some amount of experiments to graduate (and non-psych students would be
paid, which also puts pressure on you to do your role correctly). To say
"sorry, I wasted your time - I misread these questions" would not be easy,
much simpler to just make up new answers and leave.

I find it hard to accept their interpretation that attitudes actually
reversed. Yes, some experiments - like Milgram's - show that people can do
surprising things, that we would not expect in advance, and would even say "I
would never do that", but likely we would. But this isn't the same - in
Milgram's study, behavior was all that mattered. In this one, it is the
interpretation of changing attitudes. I agree it is likely people will _voice_
different attitudes, but I doubt it is because they actually changed them;
rather, it is either social pressure not to admit you failed at your task to
read the questions and answer them honestly, or that the student doesn't care
either way about the topic.

------
bonyt
<http://www.youtube.com/watch?v=G0ZZJXw4MTA> Good take on surveys and leading
questions

------
aeontech
Our minds are much less logical, consistent, or rational than we like to
imagine. The concept of cognitive shortcuts is not new: analytical thinking
requires effort and energy, so our minds tend to filter out the majority of
input and process it subconsciously instead of expending energy on consciously
analyzing and processing every little decision. Unfortunately the filtering
mechanisms are automatic and kick in even for things that _should_ be
considered thoughtfully.

Some further reading on cognitive biases and shortcuts:

\- Motivated Tactician model tries to explain why people use stereotyping,
biases, and categorization in some situations and more analytical thinking in
others [<http://en.wikipedia.org/wiki/Motivated_tactician>]

\- Framing of problem/question affects how we process it, and even the answer
we arrive at. [<http://en.wikipedia.org/wiki/Framing_(social_sciences)>]

\- Affect heuristic is "going with your gut", or deciding based on emotion
evoked by the question [<http://en.wikipedia.org/wiki/Affect_heuristic>]

\- Availability heuristic is the "if you can think of it, it must be
important" heuristic, which leads people to fear flying more than driving and
terrorism more than flying, even though the chance of dying in a car accident
is far higher than that of ever being involved in a plane crash or a terrorist
attack [<http://en.wikipedia.org/wiki/Availability_heuristic>]

------
stfu
Of course it could also be the case that survey participants just do not care
that much about the responses they give, i.e. they are not very interested in
arguing about their answers but instead want to get out of the survey
situation.

------
peteretep
[http://en.wikipedia.org/wiki/Robert_Cialdini#6_key_principle...](http://en.wikipedia.org/wiki/Robert_Cialdini#6_key_principles_of_persuasion_by_Robert_Cialdini)

See: "Commitment and Consistency"

~~~
conanite
Scott Adams (of Dilbert), also inspired by Cialdini, found an excellent use
for the consistency principle. See [0] (text possibly nsfw)

[0]
[http://dilbertblog.typepad.com/the_dilbert_blog/2007/03/toda...](http://dilbertblog.typepad.com/the_dilbert_blog/2007/03/today_i_will_im.html)

------
naich
These must have been questions on which the participants did not feel
particularly strongly one way or the other. I'm pretty sure I would notice a
100% reversal in my stance on something I have a prior opinion on.

~~~
orangethirty
Having a prior opinion is about having some previous understanding of the
subject being discussed. If more data were to be provided to you then you
might reason differently. Not because you are changing your mind or flip-
flopping (like politicians say), but because you are a thinker and coherent
individual. You will analyse the new data with fresh eyes and challenge the
old data with it. It's all part of the scientific approach. Don't you agree?

~~~
naich
Being given new information on a subject I know about which contradicts my
understanding might make me change my mind. Just showing me that I apparently
answered the question differently wouldn't, for the obvious reason that I know
about the subject.

If I don't have any previous understanding of the subject then I wouldn't be
able to give an immediate opinion on it. If pressed, I might choose to answer
a question on the spur of the moment, but my choice would be mainly random. If
it was mainly random then the magic trick described here would probably work,
but it wouldn't really mean anything as it was a mainly random choice in the
first place.

~~~
orangethirty
:)

------
jacques_chester
This is different from the classic phenomenon explored in _Yes, Prime
Minister_ [1].

It's not that they manipulated the questions in advance -- it's that they
_reversed the meaning of the questions after the respondent had answered
them_. Even under these conditions, the majority of respondents were then
prepared to support the position which they had, ostensibly, only a few
minutes ago opposed and _vice versa_.

Remarkable.

[1] <http://www.youtube.com/watch?v=G0ZZJXw4MTA>

~~~
its_so_on
nothing remarkable. if any one of us were presented with a fabricated
"forwarded" quote attributed to us,

> in this format...

and we didn't really care, we would probably defend the misquotation (unless
we explicitly remember what we actually said.)

when was the last time you said: "If I said that, I was completely wrong: I
feel the opposite on this matter."

that would make someone look completely unreliable, like someone whose
opinion shouldn't matter in the slightest. no shit people don't want to be
that person and will protect what they think people believe they have said.

you could show this with a second study, in which people are asked to defend a
statement others heard them make, but where the defender KNOWS that what the
other HEARD is not what they SAID. It's easier, and probably in many cases will
happen, that you defend what you know the other person thought you had said.

THIS IS WHY "DISRUPTIVE LISTENING" (sorry, I don't remember the correct word
at the moment, the meaning is "actively pretending to hear something else")
WORKS SO WELL.

In other words. If you are told, "There is no way we can ever be interested",
and you repeat to the caller, "so if I understand you you are saying that
there is no way you can put it in this month's budget"

then the person will quite likely agree with that statement. You just changed
their meaning and yet they will not contradict you.

I wouldn't either.

~~~
jrajav
I say "I was wrong" quite a lot in fact, and I think it's an important
ability. Of course, it's easier when you've been called out on some trivia or
an educated guess, but I'm not often confronted with something _I_ said that
is _diametrically opposed_ to what I currently believe. Is anyone?

(Cue political jokes)

~~~
its_so_on
but we're talking about a few moments ago. it is a conversational thing - it
doesn't make much sense to argue against what you've just been misquoted as
saying a moment ago.

to justify the title, "how to confuse a moral compass" the methodology would
have to remove the justification or public discourse aspect, and show that
someone's internal or private belief actually changed due to the
conversational obligation to defend what they are misquoted as saying. i.e.
you would have to find out what they thought to themselves privately before
and after.

maybe a way to do this would be to have an "outside friend" who is more
trusted than any of the people involved in this fake context, and see if they
would report a different opinion to this friend that they think has no
connection or knowledge of the immediate social context, due to the
misquotation in the immediate social context.

As it is, the methodology does not justify the conclusion. (as summarized for
us, I didn't click through to the paper.)

------
drcube
In addition to "anecdotes are not data", everybody should learn that
"self-reported data is useless". Don't rely on it when making any kind of
important
decision. Surveys are just not a reliable way of learning anything about
anything.

------
espeed
_These findings suggest that if I'm fooled into thinking that I endorse a
view, I'll do the work myself to come up with my own reasons [for endorsing
it]._

Isn't this what cable news is all about?

------
pi18n
This doesn't show the ineffectiveness of surveys but rather the effectiveness
of surveys as a brainwashing tool.

------
Vivtek
In the last paragraph, is the critic asking me to believe that the ethics of
meat consumption are somehow _more important to me_ than policy in Palestine?
Really?

~~~
randallsquared
Why does that seem surprising? The vast majority of people will perceive that
policy in Palestine neither directly affects them nor can be directly affected
by them, while the ethics of meat consumption affects everyone who eats meat
and everyone who chooses not to eat meat for ethical reasons.

~~~
Vivtek
So animal rights are honestly more important to you than human ones?
Seriously?

~~~
randallsquared
I didn't say they were, or which side of either issue I come down on; those
aren't really germane to the point of my comment.

------
gailees
I didn't really get the glue part, but the results of this study point to some
really interesting conclusions that could be huge for copy in the future.

------
001sky
_“I don't feel we have exposed people or fooled them,” says Hall. “Rather this
shows something otherwise very difficult to show, [which is] how open and
flexible people can actually be.”_

\--There is likely a "social" reversal stigma at play here.[1] Checking the
result in isolation, without time constraints, would confirm it.

_______________________

[1] _"Participants were then asked to [read aloud] three of the statements,
including the two that had been altered, and discuss their responses."_

