
Why We Remember So Many Things Wrong - clarkm
http://www.newyorker.com/science/maria-konnikova/idea-happened-memory-recollection
======
kijin
The weird way that our memory works, and how it differs from the way computer
memory works, is also a fundamental reason why it's so difficult to get
privacy and data retention right in the digital age.

Ever had someone with "photographic memory" pay attention to your daily life?
It can get creepy very quickly, because they notice and vividly remember all
sorts of things that you never imagined anyone would notice. But
nowadays, with Facebook and Instagram and sophisticated tracking everywhere,
all sorts of strangers have _literally_ photographic memory of what you said
and did last summer. I've seen lots of people obsessively delete stuff from
their timelines after only a short while, not because they're paranoid, but
because feeling uneasy is the normal response to any violation of our
intuitions about how memory is supposed to work. (Unfortunately, our normal
response doesn't have normal consequences anymore; the data still exists
somewhere.)

We're meant to forget. We're made to rewrite our memories as time goes on,
just as any living thing constantly rebuilds itself over time. _Homo sapiens_
never evolved to be reliable preservers of pixel-perfect information. No
matter how much our technology and legal frameworks try to promote
99.999999999% durability as the gold standard of memory (Amazon S3 actually
promises eleven 9's), we need to remember that this concept of memory is a
very modern invention that most people still have trouble coping with. Maybe
we never will. Maybe we never should. Maybe computers will learn to forget
like us, rather than us learning to remember like them.
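
(For a sense of scale, here's a back-of-the-envelope sketch of what eleven
9's implies, assuming - purely for illustration - that objects are lost
independently at that annual rate, which is a simplification and not S3's
actual failure model:)

    # Back-of-the-envelope: what "eleven 9's" of annual durability implies.
    # Assumes independent per-object loss at this rate - an illustration,
    # not Amazon's actual failure model.
    durability = 0.99999999999         # 99.999999999% per object, per year
    p_loss = 1 - durability            # ~1e-11 chance of losing one object
    objects = 10_000_000               # say, ten million stored objects
    print(objects * p_loss)            # ~0.0001 expected losses per year,
                                       # i.e. one object per ~10,000 years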

~~~
Swizec
And yet in a way, computers are ALREADY learning to forget like us. Show me
one search engine that doesn't promote recency. All the big internet-wide
search engines do, and so do the per-site social media searches.
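
(A toy sketch of what "promoting recency" can look like in a ranking
function; the exponential half-life form and the numbers here are
assumptions for illustration, not any real engine's formula:)

    def score(relevance, age_days, half_life_days=30.0):
        # Recency-weighted ranking: the score halves every half_life_days.
        # A toy model of recency boosting, not any real engine's formula.
        return relevance * 0.5 ** (age_days / half_life_days)

    # A very relevant five-year-old page loses to a mediocre week-old one:
    print(score(0.9, age_days=5 * 365))   # ~4e-19
    print(score(0.4, age_days=7))         # ~0.34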

And it seems that creeping deep through someone's social timeline is fast
becoming a faux pas. You already avoid doing it, and if you do end up doing
it, leaving a trace is mortifying.

To the point that people will never mention to you in person that they saw
something from your profile that happened a long time ago. As someone with a
strong-ish internet presence, it's funny to see how flustered people get when
they accidentally reveal themselves to know more than they feel is polite.

It's the same way we've always had the norm that you let people tell their
stories even though you've heard the same story ten times already. It's just
polite.

~~~
TeMPOraL
> _Show me one search engine that doesn't promote recency. All the big
> internet-wide search engines do, and so do the per-site social media
> searches._

And this seems like an annoying bug, not a feature. The internet is becoming
more and more of an ephemeral place. That interesting news article you read a
few years ago? Its URL has probably changed twice, then it was deleted, and
even if not, unless you remember the exact wording of the headline, you
probably won't find it on Google, as it will be buried under three pages of
recent tweets that match your query - tweets no one will care about or
remember next week.

People do want to remember some things perfectly - that's why we invented
writing, textbooks, diaries and libraries.

~~~
scholia
It's annoying as hell because Google promotes "freshness" over quality. You
can get a very high position in Google by knocking off someone else's tech
news story while removing half the facts and adding nothing of any
consequence, as long as you do it quickly. This kind of blogspam also
vanishes quickly, but only after the damage has been done.

Meanwhile anything that might actually be worth reading is somewhere around
page 10...

~~~
slfnflctd
> Meanwhile anything that might actually be worth reading is somewhere around
> page 10...

...much like the way things were in 1997 with Google's predecessors. Funny how
despite exponentially more layers of code running (with eons of iterative
advances), it all still somehow comes full circle.

~~~
mreiland
It's an interesting observation that agrees with my experience as well.

~~~
TeMPOraL
I feel like I keep seeing this all around. It seems there's a point in a
product's life after which any further optimization to "extract more value"
actually degrades things. Companies need to learn when to stop messing with
their product and focus on something else.

------
stephengillie
In gaming, the concept is called a "replay", where instead of recording the
pixels on the screen in every frame, they record all inputs processed on
every frame and just replay them through the same engine. The action is
technically idempotent in the game world.

Where this breaks down is when features get updated between revisions. If your
game patched the "jump" function to increase upward velocity from 1.1 m/s to
1.13 m/s, the replay would be incorrect. You would be jumping onto platforms
you couldn't get up to before, moving faster, maybe even dodging enemy attacks
that hit you when you played that match.
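
To make that concrete, here's a toy sketch of both halves (deliberately
simplified 1D physics; everything except the 1.1/1.13 m/s numbers from above
is invented for illustration):

    # Toy replay: record inputs, not frames, then re-run them through the
    # same deterministic engine. Simplified physics, illustrative only.
    GRAVITY = 9.8
    JUMP_VELOCITY = 1.1   # patch bumps this to 1.13 and breaks old replays

    def step(y, vy, jump_pressed, dt=1.0 / 60):
        # One deterministic tick: same inputs + same code => same state.
        if jump_pressed and y == 0.0:
            vy = JUMP_VELOCITY
        vy -= GRAVITY * dt
        y = max(y + vy * dt, 0.0)
        if y == 0.0:
            vy = 0.0
        return y, vy

    def replay(inputs):
        # Re-running the input log reproduces the match frame by frame,
        # as long as the engine code hasn't changed underneath it.
        y, vy = 0.0, 0.0
        for jump_pressed in inputs:
            y, vy = step(y, vy, jump_pressed)
        return y

    recording = [True] + [False] * 5   # jump on frame 0, coast for 5 frames
    print(replay(recording))           # height after six ticks at 1.1 m/s;
                                       # rebuild with 1.13 and it differs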

The human neuroprocessor is always changing and growing, always revising
itself. Thus memories replay incorrectly. You apply old feelings to new mental
patterns, and sometimes they lead to weird places. Or sometimes you mistake
something easy for being difficult, because your memory data is out-of-date
for your current processes.

~~~
theoh
Are you sure you mean "idempotent"? I don't see how it makes sense in your
example.

~~~
Chinjut
I think they're using "idempotent" to mean "If you do it again, you'll get the
exact same result as the first time".

(The usage isn't quite the same as in the jargon sense of "f is idempotent in
case f(f(x)) = f(x)", and I thus found it jarring as well, but it's possible
to see the connection in the gloss above.)
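
(A concrete instance of the jargon sense, for anyone following along:)

    # Idempotence in the jargon sense: f(f(x)) == f(x).
    f = sorted
    x = [3, 1, 2]
    assert f(f(x)) == f(x)   # sorting an already-sorted list is a no-op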

~~~
theoh
If so, they are just abusing the term. Just "identical" would have been
better than "technically idempotent", which is a pretentious mess.

------
peterwwillis
The amygdala doesn't have a direct connection to the visual cortex, contrary
to what this article suggests. The amygdala receives stimuli from the
thalamus, just as the neocortex does, and (iirc) the amygdala's outputs can
affect the neocortex, but that's not how the neocortex gets its signals. The
signals go to both the amygdala and the neocortex independently, like a fiber
optic splitter.

Because it takes so long to think, and probably because the amygdala is so
small and simple compared to the neocortex, visual stimuli hit the amygdala
_well before_ they hit the visual cortex. We experience pain and pleasure
memory responses from our senses before we even know what those senses are.

You may have two memories of a horse, one just a plain-old horse, and one a
horse while you were getting shocked. Since the amygdala senses the horse
first, the emotional memory is returned first, which is why any attempt to
bring up "horse" will dig up one of the emotional memory horse records. You
then have to wait for the neocortex to catch up and say, hold on a second,
this is just a regular horse. When your neocortex is completely swamped by the
amygdala, it's called an "amygdala hijack".

The quoted paper is basically saying "the hippocampus and amygdala can affect
each other", which is broadly true: usually one does not rule the other, and
they work in concert to give you a full picture of things. And just like the
amygdala can provide an emotional veneer over logical memories, the
hippocampus can temper the emotional response, for example with mindfulness
practice.

------
superobserver
Hyperthymesiacs generally don't have this problem/solution (however you look
at it). Personally, I'd rather not remember things wrongly. The only problem
with hyperthymesia is that you need the right personality for it not to be a
burden (very high emotional stability, for instance), so it wouldn't be great
for everyone.

Since human beings are still evolving, I don't think there's any reason to
believe we're meant to do things one way rather than another.

Edit: I should note that hyperthymesiacs are not immune to false memories,
despite their superior memory:
[http://www.pnas.org/content/110/52/20947.abstract](http://www.pnas.org/content/110/52/20947.abstract)

------
marincounty
The only thing I remember vividly is the loss of my virginity! Everything
else is a fuzzy haze.

------
digital-rubber
Negative memories are simply stronger: you want to avoid whatever caused them
in the future.

Another interesting read on it: [https://www2.bc.edu/elizabeth-kensinger/Kensinger_CD07.pdf](https://www2.bc.edu/elizabeth-kensinger/Kensinger_CD07.pdf)

