
Is it OK for psychologists to deceive? - pepys
http://aeon.co/magazine/psychology/is-it-ok-for-psychologists-to-deceive/
======
rubbingalcohol
When I was a kid, I had a psychologist who ratted out many of the things I
told him, supposedly "in confidence," to my parents. It was a deceptive abuse
of doctor-patient confidentiality, and I promptly stopped talking to him when
I found out he had deceived me.

In general, I would say "no." It's never okay for a psychologist to deceive
their patients, or to experiment on people without their knowledge. Talking to
a professional about mental health issues puts a person in a vulnerable
position. When a psychologist abuses that vulnerability, it is an extreme
betrayal that would make it less likely for the patient to be able to open up
to anyone else about their issues, thus obstructing their ability to receive
mental health treatment in the future.

Performing psychological experiments on people without their knowledge could
similarly aggravate underlying psychological issues, with potentially serious
consequences. This is some Nazi doctor-level shit.

Mental health issues are in some ways more problematic than physical health
issues, because culturally we don't take them as seriously as "real" diseases.
Too often our medical system treats mental health issues as a moral
inadequacy, not a health issue. The fact that we even think twice about the
question posed in the OP shows how seriously messed up our priorities are.

~~~
vacri
> _It's never okay for a psychologist to deceive their patients_

This is not true, as there are certain cases where doctor/patient
confidentiality is explicitly allowed to be breached - if you credibly
threaten the practitioner or other people, then the duty of care changes. If
you _credibly_ threaten to murder someone, then a mental health staffer is
entirely within their rights to breach your confidence.

The situation you describe, of course, does not appear to be one of those
cases.

~~~
wfo
He's not talking about the current status of the law. The law has absolutely
zero to do with right and wrong. He's talking about morality. He seems to
believe that if you agree to help someone with their mental problems, that you
shouldn't turn around and betray their trust for ANY reason -- that's the only
way any real trust can be developed. I tend to agree.

If someone said "you can trust me, except for this laundry list of things you
can say in which case I'll have you locked in a cage and force you to take
medication against your will" would you trust them? Because that's the reality
of every conversation with a psychologist currently -- it's just not made
explicit.

~~~
vacri
I said 'duty of care', not 'law'. If you _credibly_ threaten the practitioner
or third parties, then _morality_ most definitely says you should protect
those people from harm. I mean seriously, make the moral comparison: break
someone's trust or let someone get maimed or die.

To put it bluntly, it's stupid to demand a practitioner _ignore credible
threats_ (stress _credible_) against their own life, in favour of some
weirdly puritan moral theory. Practitioners are people too, with their own
lives, loves, and foibles... and, frequently, their own mental illnesses.
They're not any more disposable than the patients they treat. Just because
you're interested in helping someone doesn't mean that you commit to do so
even unto your own death.

> _If someone said "you can trust me, except for this laundry list of things
> you can say in which case I'll have you locked in a cage and force you to
> take medication against your will" would you trust them?_

Psychologists can't prescribe medicine or force detention - you're tilting at
a windmill here. Psychiatrists can do both, though with some limitations on
how they can detain. Similarly, credible threats are hardly a 'laundry list'.

------
KingMob
Former cognitive neuroscientist here.

This article is supposed to be talking about psychological research, but it
starts with an anecdote about a bank teller's UFO prank? Psych research has
had a checkered history, for sure, but this straw man doesn't help the
author's case. How does this reflect on academic research standards?

For those who don't know, modern research standards (i.e., the last 40 years)
require all research to be approved by an Institutional Review Board, which is
mostly staffed by non-psychologists. Researchers have to justify the possible
knowledge gained and the methodology used, and if any deception is involved,
all participants get a mandatory debriefing, where the purpose is explained to
them. The research shows that debriefings are very effective, and the vast
majority of participants do not leave the lab feeling violated; most are happy
to contribute to scientific knowledge (or just collect their cash/course
credit).

I have personally witnessed labs temporarily shut down for violating IRB
rules. And without IRB approval, no legitimate journal will publish your
research.

While Milgram and Asch's experiments were deceptive, I think their results
reveal harsh, necessary truths about ourselves. Zimbardo is a different case,
in that the Prison experiment was methodologically unsound. It's highly
unlikely he'd get IRB approval today. CIA collaboration (like Gottlieb's
dosing unsuspecting civilians with LSD) existed entirely outside academic
influence, and while shameful, doesn't reflect academic standards. (Gottlieb
wasn't even a psychologist, he was a chemist.)

As for the basic assumption in question, the difficulty is that many
people will not act normally when they know what the researcher is looking
for. Most people are decent, and will want to help the researcher. Some will
react in the opposite way and seek to mess with the research. The most ethical
way to compensate for temporary deception is a complete "fessing up" right
after, and most participants agree.

------
vacri
Subtitle: _Psychologists used to manipulate and deceive their subjects with
impunity. Did the end justify the means?_

And the answer is an easy, explicit "no". This is why we now have ethics
committees. If you do actually need to deceive your subjects for your study,
then your study has to pass an ethics committee who will weigh up the benefits
of the research against the required deception.

But yet again we see another article that basically treats the entire field of
psychology as starting, ending, and only encompassing Milgram's infamous
experiment from 54 years ago...

------
jamesrom
I remember in secondary school our psych teacher told us he was going to
conduct an experiment. Without going into the full setup, it was a simple
experiment where 3 of the 4 participants were "in" on it and had canned
responses to see if there was any influence on the behavior of the actual
single subject.

We were all brought into the classroom one by one so that each person would be
deceived, then debriefed, and then would take part in deceiving the next
person brought in.

The absolute raw feeling of being deceived is quite an intense emotion... Even
when you know everyone prior to you was also deceived.

So imagine that a doctor who maintains a confidentiality agreement with their
patient, someone the patient trusts completely, might deceive them. Ethical
considerations aside: it is just a shitty thing to do.

~~~
wodenokoto
Test subjects are not patients.

------
Zigurd
Valid market research often requires deception. If the test subjects know what
the test is, their behavior isn't indicative of real-world behavior. For
example, a market test for magazines might leave the magazines in the "waiting
room" before a supposed taste test of cookies. There's no ethical problem in
that.

Whatever the problems with Milgram's experiments as experiments producing
quantifiable results, they are a compelling exercise in investigators'
prejudices and an exploration of how easy it is to put someone in a position
of petty authority and elicit cruelty to the point of torture in them.

We ought to be refining Milgram's experiment designs, and designing the
working environment of police to counter those effects that are measured.

~~~
Animats
_" Valid market research often requires deception."_

Willing to go to jail for fraud, deceit, or unjust enrichment for that?

------
kijin
> _If your subjects half-suspect that you are deceiving them, what are you
> really measuring?_

You are measuring how people react when they suspect that they are being
deceived by an authority figure.

And the result is that many of them obey you anyway.

As people did, and continue to do, in countless dictatorial regimes throughout
history. Many North Koreans probably know that the Kims have been lying to
them about all sorts of things. Kim also knows that they know. And yet the
regime continues to spew lies, and ordinary people continue to pretend to buy
those lies.

I'm pretty sure that Milgram's experiment failed to account for all sorts of
confounding factors. Scientifically, it might or might not tell us anything
useful about human nature. But when it comes to the dark side of social
psychology, those very factors can help make the conclusion even more relevant
to the real world.

In a sense, Milgram's team was experimenting not only on their subjects but
also on themselves. They showed that a certain combination of authority,
deception, and context is highly effective at producing obedience. Hopefully,
their result will help us design future political institutions in such a way
that it becomes very difficult for any single person or group to obtain that
kind of combination.

------
danharaj
I felt the most emphatic "no" when I read the headline. It is a form of power
to be granted the ability to deceive without social repercussions. It is
unethical to turn science into an institution of power; it is already tangled
up with many such institutions, and many of them control medicine and many
have done atrocious things to human beings. Science without ethics degrades
science and turns it into an apparatus of violence.

~~~
partomniscient
Another emphatic "no" here.

Science without ethics has already degraded into an apparatus of profiteering.

The worst thing about the medical industry is the failure to collect ongoing,
informative feedback. From a game-theory/power perspective it's actually in
their interests to avoid collecting data that may reveal their mistakes, but
the clients/patients are all worse off as a result. This in turn creates a
positive feedback loop: the medical sector has to keep growing to deal with
its own uncaught mistakes, and so on ad infinitum.

The subjectiveness on the mental health side of things only amplifies this
effect. Can anyone else see a problem here...?

------
drdeca
If one consents to being deceived, that seems fine?

(To the degree to which the consent can reasonably be considered to apply.)

------
adultSwim
No

------
u23KDd23
After their complicity in supporting the use of torture, it's surprising the
APA and psychology as a profession haven't been completely dissolved yet.
Clearly their research (which isn't even reproducible...) has mostly been put
to nefarious purposes rather than used to help people.

~~~
pvaldes
The big problem with psychology, especially earlier on, is that it sometimes
seems like a kind of "religion of manipulation," based on non-scientific
methods and non-reproducible studies that didn't prove anything (or even
contradicted each other).

It was a big success because "let's flip this coin and let God decide whether
you go to jail or not" still gave a huge 50% chance of going free, and it was
replaced by "if you please us and do what we expect of you, you could be
free." There are no clear rules to the game now, so you can basically
manipulate the percentage who pass it at will.

