
“Power pose” research is the latest example of scientific overreach - NN88
http://www.slate.com/articles/health_and_science/science/2016/01/amy_cuddy_s_power_pose_research_is_the_latest_example_of_scientific_overreach.single.html
======
J-dawg
I was told the "7%-38%-55% rule" during an awful management training session
on giving presentations. The instructor said the rule is based on research
showing that people get only 7% of the meaning of what a speaker says from
the actual content, while the remaining 93% comes from body
language and tone of voice. I looked around the room and couldn't believe that
everyone was nodding along with this nonsense. It doesn't stand up to the
slightest bit of logical scrutiny. How do you break down the "meaning" of
something like a presentation into percentages? Does this mean I can stand in
front of a crowd and babble complete gibberish, and provided that my tone of
voice and body language are good I'll still get 93% of my point across?
Anyway, I finally got around to googling where this "research" comes from. It
turns out it was a (flawed) study where someone would say a _single word_ and
the listener would be asked to describe the inferred meaning [0]. Extending
this idea to an entire presentation is clearly ridiculous.

I guess my point is, many people seem happy to believe utter nonsense without
a moment's thought, provided it gives them a clever-sounding, apparently
counter-intuitive anecdote to impress their friends with.

[0]
[https://en.m.wikipedia.org/wiki/Albert_Mehrabian](https://en.m.wikipedia.org/wiki/Albert_Mehrabian)

~~~
derefr
> Does this mean I can stand in front of a crowd and babble complete
> gibberish, and providing that my tone of voice and body language are good
> I'll still get 93% of my point across?

Presuming a military or political speech, that sounds about right. Rousing
speeches are still rousing even if you don't speak the language they're
written in!

The point of a lot of presentations is effectively to dump a bunch of words
that move the audience's emotional state closer to the presenter's emotional
state. In "emotional rhetoric", the body language and tone is primary; the
words themselves are secondary, much like slides—they exist mostly to serve as
a substrate for social-alliance signalling. (Imagine interactions between
candidates during a presidential debate.)

The words in the speech _can_ also aid the one-in-ten audience members who are
too stubborn and analytical to be swayed by anything other than the facts.
Even then, though, most emotional-rhetoric speeches usually have an
accompanying leaflet/programme/manifesto/whitepaper specifically for these
people. A speech is almost never the proper place for the data to convince
people of the soundness of the plan; instead, a speech's proper role is in
making clear what _emotional affect_ the speaker holds toward the ideas
they're presenting. Which is, in a political environment, usually the most
important thing to know about an idea: it's the metadata that lets you know
whether the idea is really going to be tried or not, caught up in bureaucracy
or not, etc.

~~~
cafard
[http://teachers.sduhsd.net/mgaughen/docs/Mencken.Gamalielese...](http://teachers.sduhsd.net/mgaughen/docs/Mencken.Gamalielese.pdf)

------
0xcde4c3db
The article hints at the pattern, but I think it's worth mentioning directly:
it's fairly common for a psychologist with a pet theory to publish a self-help
book based on it. This creates an incentive to exaggerate the scope, size, and
certainty of the effect they claim to have discovered, because their status as
a scientific expert on the phenomenon becomes a marketing tool. Even if they
don't engage in any deliberate misconduct, this could still create bias that
leaks into study design and analysis/interpretation.

~~~
cJ0th
Ironically, these days you find more useful self-help advice by finding
similarities between different religions than by reading up on brand new
scientific findings.

~~~
pjscott
Because the former approach finds things that are obvious enough to have been
noted by multiple cultures independently, while the latter approach finds
things that seem unlikely enough to be novel -- right?

A social scientist will not become famous as a deep, original thinker by
telling people to work hard, be patient, keep a reputation for honesty, and
eat their vegetables.

~~~
cJ0th
As for many popular social scientists I can't even tell whether they aim at
becoming famous or whether they're just very ignorant fellows who really think
they've "cracked it".

------
shritesh
We were shown the TED talk on our first "Professional Communication" class
this week. It sounded like placebo from the get-go (like the majority of the
_most-viewed_ TED talks). But hey, who am I to question a Harvard professor?
Thankfully, someone else did.

~~~
randycupertino
The thing is, placebo is actually extremely effective... so if you believe
power poses are really helping, they probably are! Same with the power of
prayer, or voodoo dolls, or whatever...

------
mettamage
Thanks HN for making a graduated psychology student more critical. When I was
studying psychology at uni I had no help from peers in being critical with
regards to psychology. It's a lot harder being critical when no one really
challenges your thoughts on the subject.

Now on to a method that does invoke "neuroendocrine and behavioral changes":
the Wim Hof Method. It takes about 10 minutes to do and you feel a strong
effect, IMO. Don't want to hijack the topic, but I thought it was a fitting
counterexample :)

The results are also not just barely statistically significant; it's more
like 5 standard deviations away from the norm with regard to their main RQ.

Paper is here:
[http://www.pnas.org/content/111/20/7379.full](http://www.pnas.org/content/111/20/7379.full)

Claims about behavioral changes are mine (and I'm just a guy on the web who
takes cold showers every day), the research team focused on immune response.
But as you can see in their charts about the adrenaline boost one gets,
behavioral change occurs in my experience at least.

------
Fede_V
The problem with this is that a sexy splashy finding gets a completely
unwarranted level of attention. A study with 21 patients should never be
published in the first place (unless it's something like a medical case study
- that's different).

Peter Thiel quipped that "The eccentric university professor is going extinct
fast". He is completely right - and what's replacing them are media-savvy
extroverts who are incredibly adept at marketing their own studies.

------
striking
I've wanted to put together a website that highlights bad science and
resulting journalism (both when journalism exposes bad science, and when
journalism furthers bad science).

Would anyone be interested in seeing something like that?

~~~
tokenadult
_I've wanted to put together a website that highlights bad science_

Have you taken a look at PubPeer?[1] I guess I don't have the right touch in
submitting articles to Hacker News. I've submitted a few that mention this
problem of checking and exposing bad science publications, linking to PubPeer,
but the articles I submit[2] don't usually enjoy as much discussion from you
and other Hacker News participants as I would expect, based on the frequent
statements I see here that people would like to help clean up scientific
research. There is already a site for that, and it's called PubPeer.

[1] [https://pubpeer.com/](https://pubpeer.com/)

[http://retractionwatch.com/2015/08/31/pubpeer-founders-reveal-themselves-create-foundation/](http://retractionwatch.com/2015/08/31/pubpeer-founders-reveal-themselves-create-foundation/)

[http://www.sciencemag.org/news/2015/08/pubpeer-s-secret-out-founder-controversial-website-reveals-himself](http://www.sciencemag.org/news/2015/08/pubpeer-s-secret-out-founder-controversial-website-reveals-himself)

[2]
[https://news.ycombinator.com/submitted?id=tokenadult](https://news.ycombinator.com/submitted?id=tokenadult)

~~~
striking
PubPeer looks like it's very focused towards professionals who want to engage
in peer review. I'm curious to see if it's possible to leverage a wider group
of users.

I like PubPeer because it can help fix science. But what's also broken is
journalism/popular media, and how it encourages "pop science" attitudes. It
sensationalizes unproven and dubious content, actually making the public
dumber.

Would you agree that there's room for a site that wants to fix science
journalism and how the public interacts with science? Heck, it could even link
to PubPeer, without directly replacing it.

~~~
tokenadult
_Would you agree that there's room for a site that wants to fix science
journalism and how the public interacts with science?_

Problematic publications about science are certainly a big problem, worthy of
your attention and mine, and my posting history over 2624 days here on Hacker
News may suggest that it is one of my pet issues. How much interest this issue
gains here on Hacker News is one thing I look at as I ponder what to do about
the problem--it will take a lot of work with a lot of collaborators (some of
whose work is cited in my various submissions and comments here) to tackle
that problem.

As far as I know about who can participate on PubPeer, absolutely anyone can
participate, as long as they have something to say about a particular
scientific paper.

------
seibelj
My drug of choice is placebo. The more I use, the better it gets. I recommend
it to everyone

~~~
Gravityloss
Turns out the study which verified the placebo effect was actually seriously
flawed.

It's because, on average, sick people tend to get better even if nothing is
done. The placebo study had no control group. So they can't say giving a
placebo is better than doing nothing.
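The missing-control-arm point can be sketched with a toy simulation (all
numbers here are made up for illustration): if most patients improve on their
own, a placebo arm shows improvement even though the pill does nothing, and
only a no-treatment arm reveals that the placebo deserves no credit.

```python
import random

random.seed(0)  # reproducible toy data

def mean_improvement(group_size):
    # symptom improvement driven purely by natural recovery; the same
    # distribution whether or not a placebo was given
    return sum(random.uniform(0, 10) for _ in range(group_size)) / group_size

placebo = mean_improvement(100)       # arm that got a sugar pill
no_treatment = mean_improvement(100)  # arm that got nothing at all

# Both arms "improve" by a similar amount; a study lacking the no-treatment
# arm would wrongly credit the placebo with the natural recovery.
print(round(placebo, 1), round(no_treatment, 1))
```

The two arms come out statistically indistinguishable, which is exactly why a
"sick people got better after the placebo" result proves nothing by itself.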

~~~
SquareWheel
Not an expert myself, but I somehow doubt there was only a single study
showing the effectiveness of placebos. It's a very common subject and a key
component of current scientific testing (control groups).

------
xefer
As I was reading this I had a bleak vision of being in a meeting with a bunch
of alpha types all trying to out-"power pose" each other.

~~~
drxzcl
It sounds like it would be straight out of a "Silicon Valley" scene.

------
DanielBMarkham
Is it just me, or does "scientific overreach" sound like a terrible euphemism?

If the same people were on late-night TV peddling this, I doubt we'd be
calling it "overreach".

For those of us who truly love science, it's important to treat these things
exactly the same, whether it's a newspaper article, TED talk, or "Incredible
Mysteries" TV show. Being mealy-mouthed isn't doing anybody any favors.

~~~
bphogan
I agree.

One scientist does a study, finds a thing.

Another group attempts to repeat the study (as one does in science) and finds
different results.

Sounds like science to me.

~~~
semi-extrinsic
No. It's bad science. I'd go as far as to say it's fucking crappy science.
Which is what TFA is saying.

I mean, look at the figure in TFA where they show the original study effect
size plus/minus two standard deviations, and the _huge_ error bars (at 95%
confidence) are just _barely_ excluding the null hypothesis. With a sample
size of 21 people. That is so pitiful even the Mythbusters would've said "We
need to do a bigger experiment." But this Harvard professor instead said
"Hooray, a significant result!" and went and did TED talks and books and all
sorts of publicity stuff.
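To put a rough number on how wide that interval is, here is a minimal
back-of-the-envelope sketch (only the sample size of 21 is taken from the
article; measuring the effect in unit standard deviations is an assumption):

```python
from math import sqrt
from statistics import NormalDist

n = 21                             # sample size of the original study
sd = 1.0                           # assumed: effect measured in SD units
z = NormalDist().inv_cdf(0.975)    # ~1.96; the proper t critical value
                                   # for n = 21 would be slightly wider
half_width = z * sd / sqrt(n)
print(round(half_width, 2))        # ~0.43 SDs on either side of the mean
```

So the 95% interval spans nearly a full standard deviation end to end: at
n = 21, any true effect smaller than roughly 0.43 SD is indistinguishable
from zero, which is why "barely excluding the null" is such weak evidence.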

Scott Aaronson recently quipped "(...) there was much discussion around the
discovery that most psychology studies fail to replicate. I'd long assumed as
much, but apparently this was big news in psychology!"

The observation in the hard sciences that psychology has a big problem goes at
least back to Feynman's "cargo cult" speech in 1974. The fact that it's taken
the field forty years to catch up speaks volumes.

~~~
bphogan
Presenting things that are not peer reviewed is generally considered bad
science, is it not?

This kind of "science" happens here on HN every day and nobody bats an eye.

"We used XYZ framework and it's the best thing ever!"

Then tons of people use XYZ framework.

Blog posts.

Videos.

Fad happens.

Fad is over because "Why I'm switching to ABC framework."

With tons of benchmarks of course.

My point is that this happens all over the place. Someone makes a "discovery",
markets the hell out of it with great speeches, bravado, etc. It gets
coverage. It "grows legs" if you will. People repeat it and spread it.

The "science" part I was referring to was the part where someone else attempts
and fails to repeat the initial findings.

~~~
semi-extrinsic
> Presenting things that are not peer reviewed is generally considered bad
> science, is it not?

Not really. Most conference presentations present stuff that's not (yet) peer
reviewed. A Master's thesis is not peer reviewed. Etc.

> This kind of "science" happens here on HN every day and nobody bats an eye.

Sure. And bad cooking happens every day in the kitchens of many homes in the
US. But when bad cooking happens in the kitchen of a Michelin star restaurant,
and the cook still serves it up as gourmet food, it's a really big problem.

------
johnchristopher
FWIW: Jordi Quoidbach is another researcher (from Harvard; he worked with Dan
Gilbert, now that I think about it) who heavily relies on the power pose and
other extraordinary claims from positive psychology experiments to promote...
positive psychology (and some self-help books).

