Is it OK for psychologists to deceive? (aeon.co)
22 points by pepys on Sept 9, 2015 | 21 comments



When I was a kid, I had a psychologist who ratted out many of the things I told him, supposedly "in confidence," to my parents. It was a deceptive abuse of doctor-patient confidentiality, and I promptly stopped talking to him when I found out he had deceived me.

In general, I would say "no." It's never okay for a psychologist to deceive their patients, or to experiment on people without their knowledge. Talking to a professional about mental health issues puts a person in a vulnerable position. When a psychologist abuses that vulnerability, it is an extreme betrayal that makes it less likely the patient will be able to open up to anyone else about their issues, obstructing their ability to receive mental health treatment in the future.

Performing psychological experiments on people without their knowledge could similarly aggravate underlying psychological issues, with potentially serious consequences. This is some Nazi doctor-level shit.

Mental health issues are in some ways more problematic than physical health issues, because culturally we don't take them as seriously as "real" diseases. Too often our medical system treats mental illness as a moral failing, not a health issue. The fact that we even think twice about the question posed in the OP shows how seriously messed up our priorities are.


> It's never okay for a psychologist to deceive their patients

This is not true, as there are certain cases where doctor/patient confidentiality is explicitly allowed to be breached: if you credibly threaten the practitioner or other people, the duty of care changes. If you credibly threaten to murder someone, a mental health staffer is entirely within their rights to breach your confidence.

The situation you describe, of course, does not appear to be one of those cases.


He's not talking about the current status of the law. The law has absolutely zero to do with right and wrong. He's talking about morality. He seems to believe that if you agree to help someone with their mental problems, you shouldn't turn around and betray their trust for ANY reason -- that's the only way any real trust can be developed. I tend to agree.

If someone said "you can trust me, except for this laundry list of things you can say in which case I'll have you locked in a cage and force you to take medication against your will" would you trust them? Because that's the reality of every conversation with a psychologist currently -- it's just not made explicit.


I said 'duty of care', not 'law'. If you credibly threaten the practitioner or third parties, then morality most definitely says the practitioner should protect those people from harm. I mean seriously, make the moral comparison: break someone's trust, or let someone get maimed or die.

To put it bluntly, it's stupid to demand that a practitioner ignore credible threats (stress credible) against their own life in favour of some weirdly puritan moral theory. Practitioners are people too, with their own lives, loves, and foibles... and, frequently, their own mental illnesses. They're not any more disposable than the patients they treat. Just because you're interested in helping someone doesn't mean you've committed to do so even unto your own death.

> If someone said "you can trust me, except for this laundry list of things you can say in which case I'll have you locked in a cage and force you to take medication against your will" would you trust them?

Psychologists can't prescribe medicine or force detention - you're tilting at a windmill here. Psychiatrists can do both, though with some limitations on how they can detain. Similarly, credible threats are hardly a 'laundry list'.


When you see a psychiatrist or psychologist, they must not say "anything you tell me is confidential". They must say (something like) "I'll treat everything you say as strictly confidential, although there are some important exceptions. I have a legal duty to report child abuse, and if I think you pose a risk of harm to another person I have to divulge that".

This allows confidentiality within the law and does not deceive the patient.


You are now just arguing to argue. Your response is irrelevant to the discussion.


It's not clear to me that your experience was unethical. Since parents are the legal guardians of kids, where does the responsibility lie -- with the adult, or with the child?

Your third paragraph is very hyperbolic and taints the otherwise reasonable points I believe you made.


I would say that the responsibility of the psychologist (and of all doctors) is to their patient, which in this case is the child. That does not necessarily mean it is unethical to disclose what was said to the parents, as doing so may be in the best interest of the child.

However, there is a significant second order concern with violating confidentiality. In future instances, the child may be less willing to speak "in confidence" with a psychologist.


Former cognitive neuroscientist here.

This article is supposed to be about psychological research, but it starts with an anecdote about a bank teller's UFO prank? Psych research has had a checkered history, for sure, but this straw man doesn't help the author's case. How does a prank reflect on academic research standards?

For those who don't know, modern research standards (i.e., the last 40 years) require all research to be approved by an Institutional Review Board, which is mostly staffed by non-psychologists. Researchers have to justify the possible knowledge gained and the methodology used, and if there is any deception involved, all participants get a mandatory debriefing in which the purpose is explained to them. The research shows that debriefings are very effective: the vast majority of participants do not leave the lab feeling violated, and most are happy to contribute to scientific knowledge (or just collect their cash/course credit).

I have personally witnessed labs temporarily shut down for violating IRB rules. And without IRB approval, no legitimate journal will publish your research.

While Milgram's and Asch's experiments were deceptive, I think their results reveal harsh, necessary truths about ourselves. Zimbardo is a different case, in that the Stanford Prison Experiment was methodologically unsound; it's highly unlikely he'd get IRB approval today. CIA collaboration (like Gottlieb dosing unsuspecting civilians with LSD) existed entirely outside academic influence and, while shameful, doesn't reflect academic standards. (Gottlieb wasn't even a psychologist; he was a chemist.)

As for the basic assumption in question: the difficulty is that many people will not act normally when they know what the researcher is looking for. Most people are decent and will want to help the researcher; some will react in the opposite way and seek to mess with the research. The most ethical way to compensate for temporary deception is a complete "fessing up" right after, and most participants agree.


Subtitle: Psychologists used to manipulate and deceive their subjects with impunity. Did the end justify the means?

And the answer is an easy, explicit "no". This is why we now have ethics committees. If you do actually need to deceive your subjects for your study, then your study has to pass an ethics committee who will weigh up the benefits of the research against the required deception.

But yet again we see another article that treats the entire field of psychology as beginning and ending with Milgram's infamous experiment from 54 years ago...


I remember in secondary school our psych teacher told us he was going to conduct an experiment. Without going into the full setup, it was a simple experiment where 3 of the 4 participants were "in" on it and gave canned responses, to see if there was any influence on the behavior of the single actual subject.

We were all brought into the classroom one by one so that each person would be deceived, then debriefed, and then would take part in deceiving the next person brought in.

The raw feeling of being deceived is quite an intense emotion... even when you know everyone before you was also deceived.

So think of a doctor who maintains a confidentiality agreement with their patient, someone who is trusted completely by that patient, deceiving them. Ethical considerations aside, it is just a shitty thing to do.


Test subjects are not patients.


Valid market research often requires deception. If the test subjects know what the test is, their behavior isn't indicative of real-world behavior. For example, a market test for magazines might leave them in the "waiting room" while subjects wait for a supposed cookie taste test. There's no ethical problem in that.

Whatever the problems with Milgram's experiments as experiments that show quantifiable results, they are a compelling exercise in investigators' prejudices and an exploration of how easy it is to put someone in a position of petty authority and elicit cruelty to the point of torture in them.

We ought to be refining Milgram's experimental designs, and designing the working environment of police to counter the effects that are measured.


"Valid market research often requires deception."

Willing to go to jail for fraud, deceit, or unjust enrichment for that?


> If your subjects half-suspect that you are deceiving them, what are you really measuring?

You are measuring how people react when they suspect that they are being deceived by an authority figure.

And the result is that many of them obey you anyway.

As people did, and continue to do, in countless dictatorial regimes throughout history. Many North Koreans probably know that the Kims have been lying to them about all sorts of things. Kim also knows that they know. And yet the regime continues to spew lies, and ordinary people continue to pretend to buy those lies.

I'm pretty sure that Milgram's experiment failed to account for all sorts of confounding factors. Scientifically, it might or might not tell us anything useful about human nature. But when it comes to the dark side of social psychology, those very factors can help make the conclusion even more relevant to the real world.

In a sense, Milgram's team was experimenting not only on their subjects but also on themselves. They showed that a certain combination of authority, deception, and context is highly effective at producing obedience. Hopefully, their result will help us design future political institutions in such a way that it becomes very difficult for any single person or group to obtain that kind of combination.


I felt the most emphatic "no" when I read the headline. It is a form of power to be granted the ability to deceive without social repercussions. It is unethical to turn science into an institution of power; it is already tangled up with many such institutions, some of which control medicine and have done atrocious things to human beings. Science without ethics degrades science and turns it into an apparatus of violence.


Another emphatic "no" here.

Science without ethics has already degraded into an apparatus of profiteering.

The worst thing about the medical industry is its failure to collect ongoing, informative feedback. From a game-theory/power perspective, it's actually in the industry's interest to avoid collecting data that may reveal its mistakes, but the clients/patients are all worse off as a result. This in turn creates a positive feedback loop: uncaught mistakes generate demand for an ever-larger medical sector to deal with them, and so on ad infinitum.

The subjectivity on the mental health side of things only amplifies this effect. Can anyone else see a problem here...?


If one consents to being deceived, that seems fine?

(To the degree that the consent can reasonably be considered to apply.)


No


After their complicity in supporting the use of torture, it's surprising the APA and psychology as a profession haven't been completely dissolved yet. Clearly their research (which isn't even reproducible ...) has mostly found uses for nefarious purposes rather than helping people.


The big problem with psychology, especially earlier on, is that it sometimes seems like just a species of "religion of manipulation", based on unscientific methods and non-reproducible studies that didn't prove anything (or even contradict one another).

It was a big success because the old "let's flip this coin and let God decide whether you go to jail or not" still left a huge 50% chance of going free, so it was replaced with "if you please us and do what we expect you to do, you could go free". There are no clear rules of the game now, so you can basically manipulate at will the percentage of people who pass.



