
Graphics That Seem Clear Can Easily Be Misread - adunk
https://www.scientificamerican.com/article/graphics-that-seem-clear-can-easily-be-misread/
======
wodenokoto
This effect has a name: Simpson's paradox [0].

The graph in [1] explains it much better than any words could.

[0] [https://en.wikipedia.org/wiki/Simpson%27s_paradox](https://en.wikipedia.org/wiki/Simpson%27s_paradox)
[1] [https://www.analyticsindiamag.com/understanding-simpsons-par...](https://www.analyticsindiamag.com/understanding-simpsons-paradox-and-its-impact-on-data-analytics/amp/)
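A minimal arithmetic sketch of the reversal, using the oft-cited kidney-stone treatment figures (illustrative numbers from the standard textbook example, not from the article above):

```python
# Classic kidney-stone illustration of Simpson's paradox:
# treatment A beats B within *each* subgroup, yet B looks better
# overall, because A was given mostly to the harder large-stone cases.
small = {"A": (81, 87),  "B": (234, 270)}   # (successes, patients)
large = {"A": (192, 263), "B": (55, 80)}

def rate(successes, total):
    return successes / total

for group in (small, large):
    assert rate(*group["A"]) > rate(*group["B"])   # A wins each subgroup

# Pool the subgroups and the ranking flips.
overall = {t: (small[t][0] + large[t][0],
               small[t][1] + large[t][1]) for t in "AB"}
print(f"A overall: {rate(*overall['A']):.0%}")   # 78%
print(f"B overall: {rate(*overall['B']):.0%}")   # 83%
```

The flip happens because the subgroup sizes act as hidden weights: A's overall rate is dragged down by its heavy share of the hard cases.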

~~~
marzell
Seems to be the statistics equivalent of gerrymandering.

~~~
abtinf
Gerrymandering is an intentional act to achieve a specific outcome.

Simpson's Paradox can strike even the most committed truth-seeker trying to
understand and interpret the available data.

Edit to add: Simpson's Paradox could conceivably thwart a gerrymanderer if
they have an erroneous model relating demographics to voting patterns.

~~~
marzell
This is all true. However, what I meant is that the way gerrymandering can
produce a winner who is not the majority choice is quasi-analogous to how
Simpson's paradox exemplifies a weighted average that disagrees with the
subtotals of the data.

~~~
abtinf
Oh, I see—you mean from the perspective of the voter/pundit/analyst/politician
trying to understand what an election means and their mandate for action. Sort
of like the debate over electoral vs popular vote.

I hadn’t thought about it that way or seen the word gerrymandering used in
that way, but it makes sense.

------
everyone
The title alone made me think of a different phenomenon, one that happens to me.

It's to do with websites. Let's say there's a website where one downloads some
free software, for example. If there's a really big, bright, clear download
link, I've been taught not to click on it, as it's probably going to an ad or
some scam site. I've learned to always ignore it and search for the tiny little
text hyperlink that actually leads to the real download file.

I've noticed this trained behavior of mine now applies to all websites, even
when the big link is actually the real one. I will often miss it and waste
time searching for the 'real' one, as my brain now automatically filters the
big clear ones out.

~~~
pure-awesome
The name for this effect is Banner Blindness:

[https://en.wikipedia.org/wiki/Banner_blindness](https://en.wikipedia.org/wiki/Banner_blindness)

------
zubi
If you take a statistics class, one of the very first things taught is to be
careful not to infer a wrong cause-effect relation when presented with a
"correlation".

A typical example is "given any city, there is a strong correlation between
the number of churches and the number of crimes committed." This is pretty much
true everywhere in the world, but it does not imply that one is causing the
other. The correlation can simply be the natural outcome of more populous
cities having both in higher numbers than less populated ones.
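A toy simulation of this confounding (all numbers invented): both counts are driven by population, so the raw counts correlate strongly, but the correlation collapses once you compare per-capita rates.

```python
import random

random.seed(0)

# Invented toy model: both counts scale with city population, plus
# independent multiplicative noise -- neither causes the other.
pops     = [random.randint(10_000, 5_000_000) for _ in range(200)]
churches = [p / 2_000 * (1 + random.gauss(0, 0.10)) for p in pops]
crimes   = [p / 500   * (1 + random.gauss(0, 0.15)) for p in pops]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx  = sum((x - mx) ** 2 for x in xs)
    vy  = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Raw counts correlate strongly -- but only via the population confounder.
print(f"raw counts: {corr(churches, crimes):+.2f}")

# Dividing out population removes the confounder; the correlation vanishes.
per_cap = corr([c / p for c, p in zip(churches, pops)],
               [c / p for c, p in zip(crimes, pops)])
print(f"per capita: {per_cap:+.2f}")
```

The raw-count correlation is near 1 even though the two noise terms are statistically independent; normalizing by the confounding variable is the standard first check.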

~~~
austincheney
I have found it is pretty common for people to use simplified justifications,
such as cause and effect, to support their conclusions on a subject. If the
relationship between the cause and the effect is not clearly stated, it is
very common for the two to not be properly reversed during backwards analysis
from an end point to a start point. In that case the qualifying behavior is a
form of cognitive conservatism demonstrated through selection bias.

While that form of thinking may sound incredibly stupid (how could a person
confuse a cause for its effect?), it is exceedingly common. I have seen
incredibly smart people make this mistake. The mistake is the non-cognitive
behavior at play that unduly influences what is otherwise a very logical and
straightforward conclusion. Objectivity is a practicable personality trait,
not one aligned to logic or math skills.

[https://en.wikipedia.org/wiki/Cart_before_the_horse](https://en.wikipedia.org/wiki/Cart_before_the_horse)

------
pierrebai
The article is very narrow and focuses on a single example.

What's more, the real takeaway is that you can put two graphs side by side,
and it doesn't mean there is any causal relationship between them. The example
only seems convincing because both graphs are health-related. If they were
Pac-Man high scores vs. milk production, it would show how hollow the article is.

~~~
catacombs
> The article is very narrow and focuses on a single example.

Alberto Cairo is quite the polarizing figure in the data visualization world.

------
yboris
A closely related classic read on this is the short-and-sweet

"How to Lie With Statistics"

[https://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/03...](https://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/0393310728)

------
shujito
the page appears blank to me

~~~
robbyking
At first I thought you were making a joke, but it's blank for me, too, even
with adblockers/https redirect disabled.

~~~
shujito
Sorry, I wasn't joking (:

The page now loads correctly

------
neves
Alberto Cairo, who produced the charts in the paper, is about to publish the
book "How Charts Lie: Getting Smarter about Visual Information". His other
books are excellent; I bet this one will be too.

------
queercode
In other words, some people don't know how to read graphs.

~~~
chc
No, that's not at all what this is about. A better one-sentence summary would
be "Accurate data can be very misleading if, for example, it's viewed at the
wrong level of granularity."

------
abtinf
XKCD on "Curve-Fitting Methods and the Messages They Send":
[https://www.xkcd.com/2048/](https://www.xkcd.com/2048/)

