
Bias Against Novelty in Science - wyndham
http://www.nber.org/digest/jun16/w22180.html
======
jrapdx3
Having experienced bias against publishing truly novel findings, I can testify
that this form of bias is a potential hazard to scientific progress.

Back in the early 2000s, in the course of clinical practice I encountered
some unexpected outcomes of treatment which correlated with a particular, if
unexpected, diagnosis. I tried to find info about this, but to my surprise a
thorough search of the literature came up empty.

I felt a duty to report my observations. Putting them into a formal paper was
difficult for me to accomplish, but I did get it organized. Believing in the
idea of open access, I decided to submit to a newly established open-access
publisher.

Peer review did not go smoothly. Because the subject had never been reported
before, at least in a peer-reviewed journal, the reviewers expressed doubt
about the reality of the data. One "peer" was determined to disparage the
report to the extent of making glaringly inaccurate, distorted comments about
the nature of the condition in the subjects of my report. After a rigorous
defense of data and procedure the editors decided to ignore the negative
review and the report was published.

Novelty is not only apparent in terms of methodology or cross-discipline
application of ideas, but also arises when attempting to share previously
unreported observations. The closed-mindedness of many in the academic
community (who are the predominant peer reviewers) is apparently a pervasive
problem in the sciences, and possibly endemic in academic culture in general.

~~~
Hondor
It sounds like your review process went very well. Those hardened
anti-novelists have probably rejected piles of genuinely crank material. They're
needed to keep the literature free of true rubbish. Of course they're not
perfect so sometimes rubbish gets through and sometimes good work gets
blocked, but if the researcher really believes in their work, they'll go to
extra lengths to break through, as you did.

Maybe the world would be a slightly better place if your overly skeptical
reviewer had reviewed this [http://briandeer.com/mmr/lancet-
paper.htm](http://briandeer.com/mmr/lancet-paper.htm)

~~~
jrapdx3
Keeping "rubbish" out of the journals is a reasonable goal, though judging by
what's in the journals I read, you're right that rubbish still leaks through
from time to time.

I believe I was the beneficiary of good fortune in overcoming the barriers,
but luck is a very thin and unreliable conduit to success. Indeed I know
authors not as lucky, and their good work was blocked.

Fields of research are often highly specialized and a small community of
interest can become insulated from voices outside that world. It's my
impression that was the basis of the difficulty I encountered.

Not that I have a solution to the problem, but I've wondered how many
worthwhile novel contributions by "outsiders" not affiliated with a field's
main research centers are being lost under current peer review regimes.

------
untilHellbanned
Yep. Professor in molecular biology here. I've been at numerous universities
you'd know. I can confirm that very unscientific groupthink is the main driver
of scientific progress nowadays. Being a lone wolf or rogue isn't really a
viable option given the funding climate. Perfect-sheep behavior extends all the
way from college to the professor level. I think this would surprise most lay
people, given the conception of scientists as off in their own world.

~~~
biomcgary
I think a number of things contribute to the rise(?) of groupthink in the
sciences (particularly in biology). One is the increasing age [0] of the peers
in the peer review systems. Younger people are typically willing (and able) to
perform more risky novel research. However, the biggest single factor is the
drop in science funding, which leads to fear and an inherently conservative
response (on the part of both grant applicants and reviewers).

[0] [http://scienceblogs.com/transcript/2009/12/03/nih-grants-
by-...](http://scienceblogs.com/transcript/2009/12/03/nih-grants-by-age/)

~~~
digi_owl
There is also the whole NPM-ish attempts at using things like publication
frequency and number of references to gauge the ROI on research, and therefore
the performance of the scientists involved.

All this runs smack into Campbell's law.

[https://en.wikipedia.org/wiki/Campbell's_law](https://en.wikipedia.org/wiki/Campbell's_law)

~~~
dnautics
wow, thanks, I had never heard of Campbell's law, and it really puts into a
concise form what I believe about a lot of things (from science publishing to
interest rate manipulation).

~~~
digi_owl
I think I originally ran into a British variant, called Goodhart's Law, while
reading about reforms to their National Health Service.

[https://en.wikipedia.org/wiki/Goodhart's_law](https://en.wikipedia.org/wiki/Goodhart's_law)

The particular example was how hospitals were given a negative rating based on
how many patients were left waiting on stretchers (the rating system was
introduced in an attempt to produce market-like forces within the NHS). To get
around this, the staff started removing the wheels from the stretchers while
the patient was waiting. That way it was a bed, not a
stretcher...

------
rtpg
I feel like if you break down what's being said here, it's just a consequence
of having to communicate ideas to people.

I publish "Gravitational constant is actually (some slightly more specific
thing)", with an extension of existing methods. Someone sees that and will
already be convinced just by the title!

I publish "Gravity is actually caused by micro-gnomes pushing things around
with walkie-talkies", and it's pretty novel. I don't really have existing work
to base it on.

Why should anyone believe my work has any basis in reality then? The burden of
proof is on me to convince people. Fool-proof methodology, simple
explanations. Bringing it back to current understanding helps. But I'm the
person saying everything is different, the world owes nothing to me just yet.

Not to mention that the header graph is pure confirmation bias. Yes, you're
going to cite the original paper describing the effect you're studying. You
probably won't cite most incremental research. By definition, novel papers
that get any traction will collect references.

---

I actually have an example of my own novel research! I'm the first person to
post about a bug with Ubuntu's (at the time) new IME manager[0] on Ask Ubuntu.
Turns out it was a real bug (and not me being silly). So now I have
550 points on Ask Ubuntu because of this question, and get notifications about
it all the time.

I have cornered the "Ubuntu 13.10 keyboard bug" citation space. Mainly for
being first. And I have gotten extra rewarded for it. Novel papers enjoy the
same perks when they get any traction.

[0]: [http://askubuntu.com/questions/356357/how-to-use-altshift-
to...](http://askubuntu.com/questions/356357/how-to-use-altshift-to-switch-
keyboard-layouts)

------
0xcde4c3db
It's likely not the same phenomenon, but I'm reminded of the story of
Millikan's oil drop experiment and how the accepted value slowly changed from
Millikan's reported value to the current accepted value.

[https://en.wikipedia.org/wiki/Oil_drop_experiment#Millikan.2...](https://en.wikipedia.org/wiki/Oil_drop_experiment#Millikan.27s_experiment_as_an_example_of_psychological_effects_in_scientific_methodology)

------
rcpt
Their full paper is here:
[http://cepr.org/sites/default/files/news/CEPR_FreeDP_180416....](http://cepr.org/sites/default/files/news/CEPR_FreeDP_180416.pdf)
If I followed it correctly, they measure "novelty" by:

- bucketing citations into journals

- building a network of journals with edge weights defined by the number of
papers that cite both journals

- using cosine similarity between journals (cf.
[https://en.wikipedia.org/wiki/Similarity_(network_science)#C...](https://en.wikipedia.org/wiki/Similarity_\(network_science\)#Cosine_similarity)
) in this network to define how banal it is to cite them together

- for a new paper, recomputing this matrix and defining novelty by summing
1 - banality over the new edges

I don't know if their method reduces to a standard thing from network science
(eg. betweenness, centrality, ?) but I would not be surprised if it did.
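The steps above can be sketched in a few lines. This is only a toy
reconstruction of my reading of their method, with made-up co-citation counts
and hypothetical function names, not the paper's actual implementation:

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two journals' co-citation vectors.
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

def novelty(paper_journals, cocite):
    """Sum (1 - banality) over each pair of journals the paper cites together.

    cocite is a square matrix where cocite[i][j] counts prior papers
    that cite both journal i and journal j.
    """
    score = 0.0
    for idx, i in enumerate(paper_journals):
        for j in paper_journals[idx + 1:]:
            score += 1.0 - cosine_sim(cocite[i], cocite[j])
    return score

# Toy co-citation counts among 3 journals: journals 0 and 1 are often
# cited together; journal 2 rarely appears with either of them.
cocite = np.array([[10.0, 8.0, 0.0],
                   [8.0, 10.0, 0.0],
                   [0.0, 0.0, 10.0]])

print(novelty([0, 1], cocite))  # banal pairing: low novelty
print(novelty([0, 2], cocite))  # unusual pairing: high novelty
```

Under this toy matrix, a paper citing the two frequently co-cited journals
scores near 0, while one pairing journal 0 with the isolated journal 2 scores
near the maximum of 1 per pair.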

A closely-related paper with a totally-different writing style did something
like this a few years back:

"Citing for High Impact" [https://cs.stanford.edu/people/jure/pubs/citations-
jcdl10.pd...](https://cs.stanford.edu/people/jure/pubs/citations-jcdl10.pdf)

------
Jozrael
My preconceptions here are the reverse - that mundane studies seeking to
replicate existing studies to confirm their findings are boring, and as such
are not pursued or funded, despite their importance. I understand that 'novel'
is being used in a slightly different context here, though.

~~~
untilHellbanned
In the biological sciences it's a well-known secret that most of what you
write your grants on is work that you've already done. This is not about being
"boring" vs. interesting. It's pragmatism, to survive and get funding.

~~~
jonlucc
Is it not that way in other sciences? It's funny that I just assumed all
sciences did the same thing. In fact, my wife works in a museum, and I was
stunned to find out that they write grants for work that isn't even _started_
yet.

~~~
arcanus
> Is it not that way in other sciences?

I can speak to the fact that this is _absolutely_ the case in the physical
sciences.

------
yabby
Status-quo bias is a phenomenon that is well documented in the psychology
literature:
[https://en.wikipedia.org/wiki/Status_quo_bias](https://en.wikipedia.org/wiki/Status_quo_bias)

Bias for the established ways of thinking seems especially common (to me) in
the physical sciences. I do not think that should be a surprise, since the
fundamental view of natural processes does not advance quickly. Perhaps this
is why Einstein remarked that if you really believed in your ideas then you
should quit being an academic and go get a job like a lighthouse keeper. That
maybe works in mathematical physics, where all you need is a pen and paper,
but in the other sciences it does not seem like a great plan.

What I have observed many do when they find their ideas fall foul of the times
is go get a line-of-business job that does not require a lot of mental energy.
That is a plan I have seen work for many. Finance is particularly good for
that: it is a field where knowing how to think in contrarian fashion and
knowing how to run with the herd are both valuable. I see a lot of
mathematical physics folks do that as a way to reconcile the glacial pace of
scientific development with the urge to think and do something new and
innovative. There are quite a few famous mathematicians who paid the bills
with finance work. For instance, Gauss wrote insurance contracts and used his
distribution to help price the risk. Of course, in modern times Jim Simons
runs a hedge fund.

------
robotresearcher
These notions sound familiar.

[https://en.wikipedia.org/wiki/The_Structure_of_Scientific_Re...](https://en.wikipedia.org/wiki/The_Structure_of_Scientific_Revolutions)

------
powera
Good! There _should_ be a bias against new things until the new things are
proven to be correct.

Optimizing science based on "what do people who want to get tenure want" seems
like a great way to be bad at science.

~~~
MollyR
I agree. I have some experience from doing medical research, and this seems
like a given to me. My friends who work in the softer sciences like psychology
have different opinions, though, since they often work in degrees of
correctness.

------
ploplop
My aunt's been studying the sun's magnetic field for decades. She has measured
(yes, measured) parts of the sun's inner magnetic field (quite complex to do,
actually). Div(B) is not null there. She can't get it published because of
Maxwell's equations and because it's a "new" discovery. Maxwell starts with
"in the void", and I think we can agree that the centre of the sun is not the
void. It is incredible that a scientist MEASURING stuff can't get her
measurements published in recognized science papers.

~~~
pdonis
_> Maxwell starts with "in the void", I think we can agreed that centre of the
sun is not the void._

Maxwell's Equations apply in the presence of sources, i.e., not just "in the
void". That includes the equation Div B = 0. Here "sources" means electric
charges and currents; there are no magnetic charges or currents. So, as
GFK_of_xmaspast commented, most likely there is an error in her measurements
somewhere, and she should be trying to find it.
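To make that concrete: even the macroscopic Maxwell equations, the form used
inside matter rather than in vacuum, leave the no-magnetic-monopole law
untouched; only the two source equations pick up charge and current terms:

```latex
\begin{align}
  \nabla \cdot \mathbf{D} &= \rho_{\text{free}}
    & \nabla \cdot \mathbf{B} &= 0 \\
  \nabla \times \mathbf{H} &= \mathbf{J}_{\text{free}}
      + \frac{\partial \mathbf{D}}{\partial t}
    & \nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t}
\end{align}
```

So Div B = 0 holds in a stellar plasma just as in vacuum; the "in the void"
caveat only simplifies the other equations.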

------
beloch
Is this bias, or does it just take time for other researchers to go into areas
opened up by novel papers and then publish other papers that cite those novel
papers?

------
yk
Intuitively this sounds like confirmation of the current system. The thing is
that it takes time to understand truly novel research, while a measurement
like the discovery of the Higgs has an immediate impact, because the Higgs
mechanism had been studied for 40 years.

(To be clear, I believe that the focus on citations is stupid for other
reasons, but this does not seem to be a failure of the system.)

------
mathattack
If anything I've heard the opposite: that it's hard to get published unless
something is novel enough. There's not much interest in studies that just
confirm or slightly extend prior work, which is why reproducibility becomes an
issue.

~~~
biomcgary
Somewhat novel research is fine, just as long as it conforms to what is
expected. Novel AND counter to the expectations of the field is much harder to
publish.

~~~
untilHellbanned
sad but very true

~~~
skosuri
Can you think of a low impact paper that later turned out to be too novel to
get published in a high impact journal at the time? How often do you think
this happens? I for one don't see this a lot. What I see more often are people
who think their work is super novel, and either it's not, or they don't have
enough evidence and don't know the field well enough to know that they are
wrong.

------
return0
What if it is an education issue? Maybe by training young scientists for way
too long before they start doing science, we "overtrain" them, and their
mental models are over-fitted to current science. Naivete can sometimes be a
major source of creativity (that's why people who cross disciplines sometimes
have great impact).

------
byroniczero
Obligatory mention of Rupert Sheldrake: [http://www.sheldrake.org/books-by-
rupert-sheldrake/the-scien...](http://www.sheldrake.org/books-by-rupert-
sheldrake/the-science-delusion-science-set-free)

~~~
akiselev
Obligatory mention of pseudoscience in a topic about real science: check.
Might as well hail Deepak Chopra as Einstein reincarnated.

[https://en.wikipedia.org/wiki/Rupert_Sheldrake](https://en.wikipedia.org/wiki/Rupert_Sheldrake)

~~~
byroniczero
Obligatory knee-jerk accusation of “pseudoscience” at the mention of
Sheldrake: check. It’s a topic about overly conservative attitudes in the
scientific community. Thanks for demonstrating that tendency with your
insulting comment.

~~~
mr_overalls
Sheldrake's ideas of "morphic resonance" are the worst kind of
pseudoscientific garbage. The man literally believes that insulin molecules
inherit a collective memory from all insulin molecules that have existed in
the past.

He has constructed a New Age world of fantasy based on a fundamentally new
type of physical phenomenon that is completely unsupported by physicists.

Extraordinary claims require extraordinary evidence, and he has not delivered
such evidence.

