
Half the Facts You Know Are Probably Wrong - tokenadult
http://reason.com/archives/2012/12/24/half-the-facts-you-know-are-probably-wro
======
DanielBMarkham
I'd peg it at over 90%, depending on how you define terms.

One of the difficult parts of being a teenager and young adult is that you
learn the rhythm and gist of how things operate without actually learning how
they operate. This is one of the reasons "being cool" is such a big deal at
that age -- the idea is to learn to mimic and fit in with the rest of how
society works, not actually to be able to manipulate it. Faking it is much
more important than actually doing it. That's what everybody else is doing
anyway.

Somewhere in your 30s or 40s (perhaps sooner or later) you realize that much
of life is like that: you don't really understand how a car works, only that
most of the time you can get in it and go places. When it doesn't work, you
take it to another person. She may also not understand what's wrong. But most
of the time she does. If she doesn't, she takes it to another person, and
so on. You keep getting closer and closer to total understanding but you never
reach 100%. That same pattern applies to medicine, advanced physics, and most
everything else. When there's a billion areas of knowledge, even 99% knowledge
leaves a helluva lot of holes. There's this surface layer of generalities and
half-truths that work so often that it's just not worth diving down into the
details on everything else.

On top of that, other people are very happy to share their own generalizations
and heuristics, so we're kind of operating in a rumor-of-a-rumor mode. The
vast majority of the time it doesn't matter, but sometimes, like when you ask
questions about how startups work and most of your surrounding culture gives
you bad answers, it does. Most of life isn't a geometric proof built on the
linear progress of science; it's a bunch of heuristics strung together to make
something practical that you can use.

What we "know" is a product of generalizations and communications about half-
truths from the environment we're in. This is one reason why it's so critical
to spend a lot of time challenging the things you think you know, especially
in areas that make a big difference to you personally. It also means that you
must make a deep and abiding peace with not knowing very much at all about
most of the stuff you actually work with in your day-to-day life.

~~~
boboblong
>You don't really understand how a car works, only that most of the time you
can get in it and go places. When it doesn't work, you take it to another
person. She may also not understand what's wrong.

Oh, please! "She"? Let's all ruin the flow of our prose in an attempt at
social engineering that probably isn't even worthwhile, shall we? I suppose
you didn't bat an eyelash at Denzel Washington as savant pilot Whip Whitaker
either.

~~~
oftenwrong
If the parent had used "he", would you be complaining?

~~~
burke
No, because like it or not, "he" is the de facto pronoun for a person of
irrelevant gender.

------
greenyoda
The article says:

 _In 2005, the physician and statistician John Ioannides published “Why Most
Published Research Findings Are False” in the journal PLoS Medicine. Ioannides
cataloged the flaws of much biomedical research, pointing out that reported
studies are less likely to be true when they are small, the postulated effect
is likely to be weak, research designs and endpoints are flexible, financial
and nonfinancial conflicts of interest are common, and competition in the
field is fierce. Ioannides concluded that “for many current scientific fields,
claimed research findings may often be simply accurate measures of the
prevailing bias.”_

If Ioannidis' findings apply to areas of science other than biomedical
research (where there is also competition, pressure to publish, conflict of
interest, etc.), then most scientific knowledge is already false at the time
that it's published; you don't have to wait 45 years for it to become false
like this article claims.

Here are some references to Ioannidis' work:

\- The cited PLoS paper (2005):
[http://www.plosmedicine.org/article/info:doi/10.1371/journal...](http://www.plosmedicine.org/article/info:doi/10.1371/journal.pmed.0020124)

\- A less technical article in The Atlantic: "Lies, Damned Lies, and Medical
Science" (2010): [http://www.theatlantic.com/magazine/archive/2010/11/lies-
dam...](http://www.theatlantic.com/magazine/archive/2010/11/lies-damned-lies-
and-medical-science/308269)

Edit: Corrected spelling of Ioannidis' name; it's misspelled in the article.
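
The heart of Ioannidis' argument is plain Bayesian arithmetic: if most
hypotheses tested in a field are false to begin with, even a 5% false-positive
rate means most published positives are wrong. A minimal sketch of that
calculation in Python (the numbers are illustrative, not taken from the
paper):

    # Sketch of the Bayesian arithmetic behind Ioannidis' claim. The share
    # of "significant" findings that are actually true depends heavily on
    # the prior odds that a tested hypothesis is correct.
    def positive_predictive_value(prior, power=0.8, alpha=0.05):
        """Probability that a statistically significant finding is true."""
        true_positives = power * prior
        false_positives = alpha * (1 - prior)
        return true_positives / (true_positives + false_positives)

    # A field where only 1 in 20 tested hypotheses is actually true:
    print(positive_predictive_value(prior=0.05))  # ~0.46: most findings false
    # A field with much better priors, 1 in 2:
    print(positive_predictive_value(prior=0.5))   # ~0.94: most findings true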

------
reasonattlm
The scientific method and the community of science that surrounds it is truly
a powerful machine - able to take the worst aspects of human nature, sailing
atop a river of garbage specked with half-wrong answers, and spin that mix
into the gold of technology. It doesn't matter what your right to wrong to
nonsense ratio is when it comes to deciphering the world; so long as you have
the will to progress and your sifting mechanism is good enough, accumulating a
whole pile of right is just a matter of time.

The front line of science is a messy place; a mostly wrong messy place, as any
of us who have spent time there know. A recent study claimed massive error
rates across all scientific papers - which is not a surprise to scientists.
The closer to the edge of knowledge you come, the more wrong you'll find - a
great frothing sea of wrong, enthusiastically generated by scientists in
search of nuggets of right. It's all part of the process, and you have to step
back from the details in order to see where the process is taking you. In any
complex field, and biotechnology and medicine are about as complex as it gets
outside astrophysics, validating truth takes time. Scratch any unanswered
question and it'll bleed papers and reviews, a dozen for any given position on
the topic.

~~~
maratd
> The front line of science is a messy place; a mostly wrong messy place

Reading the article, the most interesting aspect was the concept of a "half-
life" of knowledge. This, of course, applies not just to the front lines, but
also to the rear guard.

By definition, a new idea that proves to work well obsoletes an old one that
didn't work quite as well.

------
lotharbot
> _"In the past half-century, all of the foregoing facts have turned out to be
> wrong.... As facts are made and remade with increasing speed, Arbesman is
> worried that most of us don’t keep up to date."_

It can take a lot longer than a half-century to catch up.

For example, it's surprising how often I hear people talk about "exponential"
population growth, which was the going theory in 1798 [0] but known to be
inaccurate by 1838 [1]. I hear similarly out-of-date comments on topics
ranging from evolution to Bible manuscripts with shocking regularity.

People sometimes learn "facts" from trusted friends or preferred authority
figures, who learned them the same way, going back generations without anybody
thinking to double-check. I suspect the emotional cost of finding out a friend
was wrong provides just enough friction to keep people from looking things up,
even in a world with wikipedia and snopes at our fingertips.

[0]
[http://en.wikipedia.org/wiki/An_Essay_on_the_Principle_of_Po...](http://en.wikipedia.org/wiki/An_Essay_on_the_Principle_of_Population)

[1]
[http://en.wikipedia.org/wiki/Logistic_function#In_ecology:_m...](http://en.wikipedia.org/wiki/Logistic_function#In_ecology:_modeling_population_growth)

~~~
baddox
To be fair, a lot of laypeople use "exponential" to mean superlinear. Not to
mention that the "logistic function" you cited is essentially exponential
until it hits saturation.
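
For the curious, that's easy to see numerically: with the same growth rate and
a starting population far below the carrying capacity, the logistic curve
tracks the exponential almost exactly and only bends away near saturation. A
quick sketch with made-up parameters:

    import math

    r, K, p0 = 0.03, 1000.0, 10.0  # growth rate, carrying capacity, initial pop

    def exponential(t):
        return p0 * math.exp(r * t)  # Malthus (1798): unbounded growth

    def logistic(t):
        # Verhulst (1838): same early behaviour, but saturates at K
        return K / (1 + ((K - p0) / p0) * math.exp(-r * t))

    for t in (0, 50, 100, 200, 300):
        print(t, round(exponential(t), 1), round(logistic(t), 1))
    # Early on the two are nearly identical; the logistic curve flattens
    # toward K while the exponential keeps compounding.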

------
ryguytilidie
My first foray into science was working in a lab at NASA on an internship. The
scientist I was working with came up with a hypothesis and instead of testing
it in a scientific way, simply disregarded the information that didn't agree
with her thesis and used the information that did. I'm obviously not saying
all scientists do this, but the way she did it, and her complete lack of
self-awareness about what she was doing, make me feel like it happens fairly
often.

I also feel like it's a pretty good analogue for modern life. People seem less
willing to discover the truth than they are to use facts that support the
point they want to make while ignoring the inconvenient parts of the facts.

~~~
splat
I think the way the scientific method is generally presented does not
accurately represent how science is really done. It's true that there are many
scientists who more or less follow the scientific method to the letter. But I
think it's more common that scientists individually don't follow the
scientific method terribly closely. It's only scientists _collectively_ that
follow the scientific method.

One scientist might come up with a hypothesis, never test it, but nevertheless
believe it devoutly. But science progresses because _other_ scientists are
sceptical and test the hypothesis.

It's in the first scientist's self-interest that the hypothesis is proven
true, but it's generally more in other scientists' self-interest to prove the
hypothesis false.

Some of the greatest scientists of the 20th century closed their eyes to
overwhelming evidence that their pet hypothesis was wrong. One of the classic
examples was Geoffrey Burbidge (known best for showing that the heavy elements
were created in stars and released into the universe in supernovae) who
believed until the day he died that quasars were galactic objects.

------
ealloc
> Most of the DNA in the human genome is junk

This topic is still very controversial and I don't think the article should be
citing it in this context.

It was only a few months ago that the ENCODE project released its first
results claiming that 80% of the genome is "functional", and many scientists
have noted that their definition of "functional" is much too broad. It
includes any piece of DNA that becomes bound to protein at some time, while it
is known that many proteins bind nonspecifically.

~~~
InclinedPlane
Yup. Most of the DNA is not directly involved in coding proteins, but we've
been finding out more and more about how DNA works. Significant parts of DNA are
there merely to help maintain chemical and solubility properties so that the
DNA coils correctly. And far more of it is there for gene regulation.

------
sold
A counterpoint: "The Relativity of Wrong", a famous essay by Asimov.
<http://chem.tufts.edu/answersinscience/relativityofwrong.htm>

------
DanBC
An interesting BBC Radio Four programme about this kind of stuff specifically
about the mind / brain: (<http://www.bbc.co.uk/programmes/b016wzs9>)

I'm cautious about how this applies to children. I've seen plenty of boys learn a long list of
(for example) dinosaur names. Is this something I should try to avoid with my
son, or should I encourage it, or should I just remain neutral? I have no
idea.

I do know that I want to encourage him to think about relationships between
things, rather than just learning a list of names. (For example: "That tree is
evergreen, which means it keeps its leaves all year round. It does that
because [...]" rather than "That's a Leylandii".)

~~~
aes256
> I do know that I want to encourage him to think about relationships between
> things, rather than just learning a list of names. (For example: "That tree
> is evergreen, which means it keeps its leaves all year round. It does that
> because [...]" rather than "That's a Leylandii".)

I don't suppose there's any harm in loading your children up with trivial
knowledge, but I would certainly put more effort into teaching them thinking
skills than specific knowledge.

These days, if you have the necessary thinking skills, the sum of human
knowledge is within reach. If you know how to research a subject, and how to
corroborate and verify the claims and theories presented to you, you can find
out pretty much anything about any topic in a matter of minutes.

I suppose it's the old "give a man a fish" adage applied to knowledge.

~~~
jonhendry
"I don't suppose there's any harm in loading your children up with trivial
knowledge, but I would certainly put more effort into teaching them thinking
skills than specific knowledge."

I find that kids are like sponges for details like names. They'll soak them up
whether you're teaching them intentionally or not. Or if you're trying to get
your kid to learn animals or stars or whatever, they'll be memorizing scores
of baseball statistics, or Pokemon, or airplanes.

Teach thinking skills; the trivia will take care of itself.

------
nunb
I was telling somebody my theory on this the other day, and I thought I might
share it here.

Basically information is power. Due to informational asymmetry[1] being
exploitable via economic, political and other forms of power, it is almost
always in somebody else's interest that you be misinformed.

Therefore the person with the most to lose (and the person with the greatest
interest in being informed) is you. And instead of seeking out the forms of
information that empower us, we fritter away our attention on trivialities
and pop culture (yes, including programming pop culture). A good example of a
prophet who tries to steer us programmers the right way is Kalzumeus, imho.

Part of the reason we fritter away our time is that it is simply easier to
consume the mass-delusion, because the search for truth can be fairly lonesome
(since you are the only interested party).

Under the circumstances outlined above, it is no wonder that most of what we
believe may be false, and part of a larger (self-bootstrapping) conspiracy to
keep us in the dark (cue ominous music, break out the tin-foil hats).

[1] <http://en.wikipedia.org/wiki/Information_asymmetry>

------
hypersoar
I try to keep the question "What do you know, and how do you think you know
it?" in my head as much as possible (I think I picked it up from Harry Potter
and the Methods of Rationality). I mentally recite it a lot, often without
really thinking about it. Every once in a while, though, the thought catches
me as I'm recalling some long-known fact, and I'll realize that I don't have a
good reason to believe it. Many of these are things I was told long ago by
friends/parents/family/teachers and stored away, their sources forgotten. The
maxim has proved useful in unlearning things I was too young to disbelieve
when I learned them.

~~~
31reasons
I know it's good to have healthy self-doubt, but this makes dumb people (who
don't revise their facts) look more confident than smart people who are too
cautious with their knowledge. This is one reason dumb people so often end up
being the boss of smart people.

------
javajosh
Obligatory xkcd(s):

<http://xkcd.com/843/> (Misconceptions)

<http://xkcd.com/1053/> (Ten Thousand)

<http://en.wikipedia.org/wiki/List_of_common_misconceptions> (The link
referred to in 'Misconceptions')

EDIT: Darnit.

~~~
Centigonal
Your links are swapped!

Is that a typo or humor about facts being wrong?

------
mikeash
Now I know that half the facts I know are probably wrong. But which half is
_that_ fact in?

~~~
betelnut
That reminds me of the old joke about medical school: half the things you
learn are wrong, and half you forget - you just have to hope it's the same
half.

------
edtechdev
This is a naive view, with inaccuracies in at least one of the "facts" it
claims to be wrong. It's what I would call a level 2 grad student view. Level
1 - you believe every research article you read and everything your advisors
tell you, and you glom onto one area or viewpoint in particular without
considering alternatives. Level 2 - you reject everything that isn't in line
with your viewpoint. I've seen professors stuck at level 1 and 2. Level 3 -
you start to see what's 'right' and 'wrong' (and more useful or less useful)
about everything you learn.

Consider this "fact" from the article, which has no citation whatsoever:

"Increased K-12 spending and lower pupil/teacher ratios boost public school
student outcomes."

Which the author claims is "wrong."

Actually the evidence suggests that it is right, although the effect of class
size alone may be small. And it is not just a matter of how much an effect
reducing class size has on learning, but also in what circumstances and to
what extent. And even with these "facts", it doesn't mean that big classes are
inherently always "bad" and useless - MOOCs are still quite useful even though
90% of students don't finish them. And large class lectures actually could be
made more effective if given after (not before) a lab or other interactive
learning activity. But a wealth of research has shown that one-on-one tutoring
is generally the most effective method of teaching (which also directly
contradicts the author's assertion that reducing class size has no effect).

[http://www.districtadministration.com/article/does-class-
siz...](http://www.districtadministration.com/article/does-class-size-really-
matter) [http://www.ppta.org.nz/index.php/-issues-in-
education/class-...](http://www.ppta.org.nz/index.php/-issues-in-
education/class-size?start=4)

------
rustynails
"Half of the facts are wrong" may be a little steep. Do we breathe air? Does
it contain oxygen? I'm sure some enthusiast will give me the exact composition
of air (including a stat on nitrogen :). There are many trivial facts that we
know with confidence.

However, in a recent conversation with my mother, I asked her about a talking
donkey in the Bible. She confirmed the reference, saying "the lord works in
mysterious ways" and "donkeys could talk back then". Maybe there is some truth
to the idea that half of our facts are wrong... Religion aside, I view my
mother as a very clever woman who is very astute with business. She was
indoctrinated at a young age.

~~~
georgemcbay
Religion is everywhere, even where it is specifically disclaimed (see also:
The Singularity).

We're hardwired for it.

~~~
Peaker
Are you claiming that the idea that AI will eventually be smarter than humans,
and will thus render human work mostly useless, is a religious idea?

~~~
ars
I'm not the OP, but: Yes.

And on top of that I'm claiming that the singularity will never happen.

We have incredibly fast computers with enormous amounts of memory, yet are no
closer to AI.

And even given infinite computing power, we still wouldn't have AI, so the
race for computing power is a red herring. (It's not like we know how to do
it, but the computers are not powerful enough - we don't have the slightest
clue how to make AI.)

(I define AI as the ability to learn any topic at all with unstructured
instruction - like reading a book, or being told verbally. Additionally after
learning the material an AI must be able to invent and synthesize new material
from the old.)

And on top of that, computers aren't getting any faster; that's basically
stopped. (Although if AI is parallelizable, it might not matter.)

~~~
nikcub
> We have incredibly fast computers with enormous amounts of memory, yet are
> no closer to AI.

Supercomputers today are the equivalent of a tractor: they do simple things
that people can do, but at a greater scale. Nobody ever looked at the first
tractor and claimed it was a step towards AI, and nor should they when they
look at supercomputers.

------
tgb
Please, please don't listen to this article! The author is far worse than the
processes he is demeaning. This article gives no reason whatsoever to believe
the title. The main argument seems to be "we learn lots of new things,
therefore most of what you previously learned is wrong." This has numerous and
obvious problems. We can easily be learning new facts about things that were
previously not considered or not known. Most facts that humans know are not
scientific facts but rather personal. The large majority of people don't know
very much about recent experimental results, and only hear about them after
they have been established for so long that there is little chance they are
significantly wrong. A fact can be 'wrong' while still being nearly right.
That the list of ten largest cities in the US has changed just means that the
person only knew the list of ten largest cities at the time that they learned
it.

If 'half of our facts were wrong' is taken at face value, then our truth-
determining process is no better than flipping a coin. Do you think we could
engineer bridges, build computers, get to the moon, sequence the human genome,
or write software if every time we didn't know something we just asked it as a
yes-no question and flipped a coin?

So as much as I want to support skepticism and free-thinking or whatever this
article is aiming at, this article is wrong and bad. I have the means to go
out and determine true facts and I do this routinely.

------
pinaceae
The problem is the word "fact". Somehow this is now perceived as an eternal
truth, rather than a statement which is true right now, but might change soon.

Some of the examples point this out nicely, let's take the 10 biggest cities
in the US. Isn't it obvious that this "fact" only works with a timestamp
attached to it? The 10 biggest cities in 1990?

Maybe this is more about the language in science, textbooks and science
journalism. A little more precision would make the real facts stand out
better.

------
chernevik
Re: The title: Not for me, baby. I haven't many "facts", and most of those are
math or avowed articles of faith. I learned a long time ago to replace "is"
with "I have read that" or "I have heard". My opinions are wrong all the time
-- but not for long.

------
jonhendry
Several of these are things that were the good-faith best effort at the time
based on available knowledge and/or technology. (DNA, cold-blooded dinosaurs,
star size limits, etc).

I don't much like the depiction of scientific knowledge as "manufactured
facts".

------
trustfundbaby
Complete fantasy tangent here but ...

It's amazing to me how much effort we as human beings put into memorizing and
retaining 'factual' knowledge like this. Can you imagine how far we could go
if all this knowledge could literally be loaded into our brains at an early
age, so that instead of basic arithmetic (for example) kids in elementary
school were already doing quantum physics and things of that nature?

I know it's far, far away, but it really fascinates me to think of the
implications of doing away with having to relearn "learned" things so that we
could devote our energies to far more advanced topics.

~~~
itcmcgrath
That assumes the act of learning doesn't 'grow' the brain in a meaningful way
as part of the process. I think that's probably an assumption too far, though.

~~~
ijl
When you learn, the myelin between the relevant neurons increases. The Talent
Code discusses this at length, and argues that skill in something is a
function of myelin strength.

------
smsm42
>> Half the Facts You Know Are Probably Wrong

Now to which half does this fact belong?

------
samspot
The article speaks about the half-life of truth, but that is really just how
long it takes us to find out something was never true in the first place. And
even that assumes that in 100 more years we don't find out it was actually
correct after all. We need to distinguish between truths that become outdated
and facts that were never true in the first place. So what percentage of
current scientific conclusions are likely to be wrong?

------
jonhendry
Given the source, I can't help but suspect there's an underlying desire to
willfully misunderstand how science works in order to dismiss all research
that could be used to support or justify regulation.

Hey, you never know, maybe in ten years we'll find out tobacco use _doesn't_
cause cancer. Better remove all the restrictions that have been placed on
tobacco until we're _certain_.

~~~
splat
I would agree that having a healthy scepticism of those in authority is a very
fundamental part of libertarian philosophy. But if you find the author's
political inclinations objectionable you can read the original book by Samuel
Arbesman that Ronald Bailey based his article on.

------
charonn0
We are wrong more often than we are right, but the scientific method has
proven itself a powerful error-correction tool.

------
monochromatic
> Arbesman notes that “the number of neurons that can be recorded
> simultaneously has been growing exponentially, with a doubling time of about
> seven and a half years.” This suggests that brain/computer linkages will one
> day be possible.

What?

~~~
jonhendry
Current brain-computer interfaces have 96 needle-like electrodes, so can
interface with 96 neurons. That produces an enormous amount of data at the
sampling rates used, but is already being used for things like controlling
robot arms.

In 7.5 years, perhaps it'll be up to 192 neurons, if the connected computers
and processors can keep up with the data flow.

7.5 years after that, maybe it'll be up to 384. Etc.
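
Taking the quoted doubling time at face value, the pessimism is easy to
quantify with a back-of-the-envelope sketch (assuming today's ~96-channel
arrays as the starting point):

    # Projection assuming the article's 7.5-year doubling time and a
    # 96-electrode starting point (both figures from the thread above).
    def channels(years_from_now, start=96, doubling_time=7.5):
        return start * 2 ** (years_from_now / doubling_time)

    for years in (0, 7.5, 15, 30, 60):
        print(f"{years:5} years: ~{channels(years):,.0f} neurons")
    # Even after 60 years of steady doubling that's only ~24,576 neurons,
    # a tiny fraction of the ~86 billion in a human brain.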

------
MrJagil
These days, QI is my favourite TV show for exactly these reasons.

[http://www.youtube.com/results?search_query=qi&page=&...](http://www.youtube.com/results?search_query=qi&page=&utm_source=opensearch)

------
ams6110
I guess I have a small problem with this use of the word "fact" which to me is
something provably true. What this piece is actually saying is that half of
scientific theory is probably wrong.

~~~
greenyoda
Only in mathematics are things _provably_ true. In science, things are
considered true based on the best theories and experimental evidence we have
so far, and our observational abilities increase as technology gets better
over time. For example, Newtonian mechanics was considered "true" for
centuries, until it was replaced by relativistic mechanics. In the 18th
century, we just didn't have the equipment to measure phenomena at quantum
scales; now we do. As we observe the world in greater detail, we find that
things that seemed to be true before are now false. That's the best that
science can do.

~~~
sea6ear
Actually, I'd say that in mathematics, things are provably true given that the
initial assumptions are true. However, mathematics does not, so far as I know,
deal with the truth of those initial assumptions (axioms); it simply accepts
them as true.

Where this gets interesting is that you can say, what if this axiom was not
true? Let's assume it's not. And then possibly come up with a whole new system
of mathematics (like non-Euclidean geometry) which may turn out to have actual
use in previously unsolvable problems.
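
That point is easy to make concrete in a proof assistant: a theorem is only
"provably true" relative to whatever axioms you postulate, and the system
never asks whether the axioms themselves hold. A toy sketch in Lean, with
purely hypothetical propositions:

    -- Postulate two propositions and two assumptions. Lean never asks
    -- whether these axioms are "really" true; it only tracks what follows.
    axiom P : Prop
    axiom Q : Prop
    axiom p_holds : P
    axiom p_implies_q : P -> Q

    -- Provably true *given the axioms above*. Change the axioms and the
    -- same statement may become unprovable, as with Euclid's fifth postulate.
    theorem q_holds : Q := p_implies_q p_holds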

~~~
lotharbot
I think it's better to conceive of mathematical axioms as being part of
mathematical system X or not, rather than as true or false. That is, they're
not self-evident or universal or true according to some external standard,
they're merely postulated as properties of a given mathematical structure.

See <http://en.wikipedia.org/wiki/Axiom#Non-logical_axioms>

~~~
sea6ear
That seems like a useful understanding.

------
yarrel
Including this one.

------
wissler
It's not that widely held bogus views turn out to be wrong by accident or from
some congenital defect in human reasoning.

Isaac Newton's physics, while not applying to all possible contexts, still
holds as correct over a vast range, and that's because it's properly
identifying an aspect of Nature's behavior, and it did so due to Newton's
extreme methodological diligence. Some of these other "facts" that get
overturned likely did not adhere to Newton's Rules of Reasoning, or had some
other flaw that led people to jump to conclusions prematurely.

Of course, it's exceedingly unpopular to point out that we should learn why we
make mistakes and then strive to not make them by scrutinizing the source of
the error and adopting better methods. Most people are amenable to having
their mistakes pointed out, but are extremely offended if you try to point out
that the way they are arriving at conclusions is mistaken. It's as if most
people don't want to believe that yes, there is a difference between wisdom
and foolishness, and instead want to believe that the highest form of
foolishness is in fact to believe in wisdom.

