
The Brain Is Not a Computer (2016) - cjauvin
https://aeon.co/essays/your-brain-does-not-process-information-and-it-is-not-a-computer
======
baddox
Oh dear. This isn’t my field at all, and maybe I’m just doing a bad job of
understanding their point, but this sounds completely bogus.

Really...the brain doesn’t create representations of visual stimuli or store
memories? Under what possible definitions of those words can this statement be
sensical?

Surely the author believes that visual stimuli cause measurable changes in
brain state, and that people can indeed remember past visual stimuli. Then how
is it true that brains don’t create representations of visual stimuli and
store and retrieve them? I’m at a loss here.

Perhaps the author means that the brain doesn’t do these things _in the same
way as digital electronic computers_ we’re familiar with. That’s certainly the
case at the most basic level.

~~~
TheOtherHobbes
I'm not sure I understand the piece either, but I _think_ it's trying to say
that human memories are associative, sequential, and distributed rather than
localised.

So there isn't "a representation" in the discrete sense. It's more like the
entire system changes, and it's impossible to physically scan specific
elements of it to retrieve selected content.

You can trigger selective recall, but you're triggering a complex and noisy
process which generates an experience that may include remembered elements -
not pulling out a predictable bit pattern.

There isn't an exact equivalent in CS. Traditional binary memory is obviously
nothing like human memory. Neural nets have some superficial similarities, but
they lack generality.

I'm not completely convinced by the argument, but I'm glad someone is making
it.

The problem with it is that we can remember specific discrete facts quite
easily. If you ask me how many flats the key of F major has, I can tell you
without being distracted by other memories.

What we don't know is how that fact is represented, how exactly my brain
changed after I learned it, how similar those changes would be to changes in
other brains learning the same fact, whether everyone has similar subjective
experiences on recall, or how to scan someone's brain to check whether or not
the fact is known.

~~~
behringer
I think researchers are working on this very thing, and it seems like the
brain does store "visual representations" or at least it can't be ruled out at
this point.

[https://www.theregister.co.uk/2013/08/20/mindreading_mri_spo...](https://www.theregister.co.uk/2013/08/20/mindreading_mri_spots_letters_in_the_brain/)

And of course, why wouldn't the brain store a visual representation of what
you remembered? That would be the easiest way to store and retrieve it, which
is why we do it on computers as well.

~~~
slowmovintarget
> Why wouldn't the brain...

Because it isn't how brains work. Recollection is re-experience.

The article you link says researchers taught a model how to match patterns to
letters, given the presumption that they are letters, for a single subject's
MRIs taken while they were experiencing the sight of words and letters. Not at
all the same as saying a brain stores data.

~~~
behringer
At some point the data must be stored physically in the material that makes up
our brain. How that is stored could very well be a 1 to 1 mapping, and
shouldn't be ruled out.

~~~
slowmovintarget
And what if it is instead stored as I felt x, which led to y, which made me
think z? It is conditioning of the pattern into neuronal potentials meaning
that we can experience the same thing again. This is not the storage of data,
it is the conditioning to react the same way at the start of a similar
cascade.

We don't record the memory, we make it easier to feel and think the memory
again. Like muscles adapting to exercise, our brains adapt to experience. Keep
it up long enough and we get good at it.

~~~
behringer
If that's the case, how can anybody recall 1000 digits of pi or whatever
insane number they're up to these days? People don't recall it based on their
feelings. It's memory storage. How it's done is still being worked out.

~~~
tnzn
Because you don't recall the digits; you recall yourself learning those
digits. While doing so, you have experiences and feelings, however weak.

------
leftyted
The dollar bill thing seems silly. The fact that you can draw anything without
looking at a dollar bill means something is being stored, right? That means
the brain stores information. There's no way out of that. And the fact that
you can draw the dollar bill on cue means something is being retrieved. No way
around that either. It doesn't matter how the information is represented. The
brain-as-computer analogy doesn't specify that "neurons are bits" or
whatever.

I don't expect the brain to work like any computer we've ever built (which
seems to be the point of view this writer is attacking), but I do expect that
it has the capacity to store, retrieve, and process information and so the
computer analogy seems useful.

~~~
imh
Yeah, you could make the same complaints about jpeg compression.

~~~
zzzeek
Jpeg compression was my thought exactly. This author knows extremely little
about computers.
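The JPEG analogy can be made concrete with a toy quantiser, a crude stand-in for JPEG's quantisation step (the function names and sample values below are invented for illustration). The stored form is not a verbatim copy, and decoding returns only an approximation, yet nobody would say the file "stores nothing":

```python
# Toy lossy "codec" in the spirit of JPEG quantisation: the stored
# form is not a verbatim copy of the input, and decoding recovers
# only an approximation of it.
def encode(samples, step=16):
    return [round(s / step) for s in samples]   # coarse quantisation

def decode(codes, step=16):
    return [c * step for c in codes]

original = [3, 18, 40, 200, 255, 127]
stored = encode(original)
recovered = decode(stored)
print(stored)      # compact, lossy representation
print(recovered)   # close to, but not identical with, the original
```

The recovered values differ from the originals, much like the students' dollar-bill drawings differ from a real bill, without that implying nothing was stored.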

------
mannykannot
It is trivially true that the brain is not a digital electronic computer. You
cannot, however, use that simple fact to show that the brain is not some sort
of information-processing device, and as for the notion that brains do not
store information, I wonder what he thinks memories are.

The author concludes by asking "Given this reality, why do so many scientists
talk about our mental life as if we were computers?" He offers no support for
the proposition that this view is common, and I suspect he is often taking, as
literal, speech that was intended to be metaphorical.

------
seiferteric
The author seems to have far too narrow an idea of what a computer is.

~~~
naasking
This seems like the appropriate response to the article. In fact, the article
is factually wrong on the matter of what we're born with, and even on what it
considers "information".

I suspect this misunderstanding of "information" is the core of the confusion.
He needs to revisit physics and learn some computer science, because
information and physics are inextricably intertwined, so the brain very much
operates on information using rules.

Edit: and further, the brain is a finite state automaton due to the Bekenstein
Bound, a physics theorem.
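For reference, the Bekenstein bound referred to here limits the entropy $S$ (and hence the number of distinguishable states) of any physical system that fits inside a sphere of radius $R$ with total energy $E$:

```latex
S \le \frac{2 \pi k R E}{\hbar c}
```

Since a brain has finite size and energy, the bound implies it can occupy only finitely many distinguishable states, which is the sense in which the commenter models it as a finite state automaton. (The bound itself is standard physics; applying it to brains this way is the commenter's argument, not established neuroscience.)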

------
presscast
In a former life I was a cognitive neuroscience researcher.

This reads like a piece written by someone who heard a neuroscientist take
issue with the "brain as computer" metaphor, but didn't quite grasp what it
was all about.

The "brain is not a computer" meme has to do with the fact that the brain does
not process information in the same way as a _digital_ computer. It is not
saying that the brain is not a symbol-processing/computational system.

------
charleshmorse
I think we commenters are all on the same page here :)

The author almost makes it seem like models are reality and that people
think that. They're not, and I don't think anyone has ever thought they were...

Further, and as other comments have already mentioned, the brain is thought of
and treated as a Turing machine, not a digital computer. It's done this way
because the brain can be mapped to the definition of a Turing machine.

And I have to defend Von Neumann. In his book, he explored Turing
equivalencies between the brain and the computer concepts of the time used to
implement the digital Turing machine; he didn't actually think that the brain
was a one-to-one mapping to a digital computer... He knew the difference
between models and reality.

Even for the history of models the author mentions (hydraulics, automata,
etc.), these all contain some Turing equivalencies if implemented correctly;
they were simply using the language and examples of their time to express
this.

The author also continues to mangle any and all ideas of modeling,
abstraction, and equivalence throughout the whole article. With regard to his
'uniqueness problem': 'information loss' is modeled digitally for a
reason... just because humans are lossy doesn't mean we can't model them that
way. Think of a compressed image file.

I don't think there's a single researcher worth their salt who thinks the 'IP
Metaphor' is gospel. That is just a grossly unscientific thing to assume.

We're all free to choose any model or collection of models we wish to
approximate reality, but some of them work better than others and the brain is
a complicated thing to model.

The author is trying to dramatize a triviality.

~~~
sykic
How can the brain be mapped to the definition of a Turing machine? It doesn’t
have an equivalent to an infinite tape and it doesn’t work according to
anything like the table of rules for a Turing machine. Can you point me to a
reference for this claim?

Most of the comments I’ve read don’t like the article but almost all of the
commenters I’ve read don’t seem to have studied this issue. It gives me the
impression that these are visceral reactions. The article is not an article
for experts. It’s expository in nature.

One thing that stood out for me was this quote:

 _...as pointed out in The Future of the Brain (2005), a snapshot of the
brain’s current state might also be meaningless unless we knew the entire life
history of that brain’s owner – perhaps even about the social context in which
he or she was raised._

If true this seems to me (very much a non-expert) to give serious doubt to the
notion that the brain is a computer.

~~~
mannykannot
The 'infinite tape' issue is a red herring that sometimes appears in
discussions about these issues. Real computers (such as the one I am typing
this on) are informally described as 'Turing equivalent' because they can
implement a Universal Turing Machine up to the limitation imposed by their
finite memory. An alternative way of looking at it is that their model of
computation, augmented with unbounded memory, could implement or simulate a
Universal Turing Machine.

This is not the equivocation that it may appear to be, as it establishes a
sort of asymptotic boundary between what is possible and what is not (the more
memory we have, the closer we can get to it.) It also means, for example, that
we don't have to wonder if there is one computer instruction set or
architecture that can perform computations that are impossible by another
(again, up to having sufficient memory to complete it.)
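That "Turing equivalent up to finite memory" idea can be sketched concretely. Below is a hypothetical minimal simulator in Python (the function names and the example machine are invented for illustration): the tape grows only on demand, so the machine computes like a Turing machine exactly up to whatever memory the host provides.

```python
# Minimal Turing-machine simulator with a finite tape that grows on
# demand -- Turing-equivalent "up to available memory".
def run(rules, tape, state="start", blank="_", max_steps=1_000):
    tape = list(tape)
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        if pos == len(tape):
            tape.append(blank)      # grow tape to the right on demand
        elif pos < 0:
            tape.insert(0, blank)   # grow tape to the left on demand
            pos = 0
        write, move, state = rules[(state, tape[pos])]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape).strip(blank)

# Rules: (state, read) -> (write, move, next_state).
# This machine flips every bit until it reaches a blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run(flip, "0110"))  # -> 1001
```

The `max_steps` cap and the growable list play the role of the machine's finite memory: remove them and you have the unbounded idealisation.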

The author of the claim you are questioning has not, so far, returned to
explain what he means, but I think he is saying that the brain is Turing-
equivalent in the informal sense given above: we can compute like a Turing
machine, up to the available tape/memory (though with a very limited tape, if
we are not writing things down...)

If that is so then I (one of the people here criticizing the article) must say
that I don't think it is relevant. An alternative interpretation of the
statement, that it says it has been shown that there is a Turing machine
equivalent to the human brain, would seem to depend on believing (as I happen
to) that the brain's functioning is a matter of electro-biochemistry that
could, in principle, be simulated by a computer, but no-one, so far, has given
a demonstration, or even a convincing explanation, of how that works at a
Turing-machine level of abstraction.

With regard to the quote you offer: I think it is a simple case of rhetorical
overreach -- one might need to know the entire history of that brain to fully
understand everything there is to know about its current state, but that does
not mean that, absent that full history, the state is meaningless. In
understanding what a person is thinking, what they remember (which is an
aspect of their brain's state) is more important than what actually happened.

~~~
sykic
I’m a mathematician and tend to take things literally. I should not have
mentioned the infinite tape part. What I should have said is that, according
to the article, we don’t store memories in the way that a Turing machine does.
There is no tape as such, and there is no set of rules that the brain abides
by in determining the next step, so to speak.

I gathered that the quote I referenced means that the state of a brain at time
t is not sufficient to reconstruct memories or other meaningful information.
The fundamental point of contention between you and others criticizing the
article appears to be that you all believe that there is a storage mechanism
in the brain in a similar (analogous?) fashion as a computer. I gather the
author claims this is not so. Information is not stored in neurons in such a
way that one “retrieves” it by accessing a storage location.

I don’t know enough about this stuff to intelligently comment on the veracity
of it. I just know that someone far more knowledgeable than me and just about
everyone else commenting says that our intuition about how this stuff works is
wrong. That alone is worth causing me to reconsider my intuition on this
stuff.

~~~
mannykannot
Speaking for myself (I don't necessarily agree with everything that has been
said in opposition to this article), I think you are missing my point about
memory.

The author is saying our brains do not function like our digital computers,
something I think we do all agree on. It is not so clear how the author thinks
our brains do work, but he apparently wants us to stop using computer
metaphors when discussing their function.

He would have this prohibition extend to the notion that our brains store and
retrieve information, which is absurd; one might as well argue that a computer
is not a Turing-equivalent device because RAM is not a tape. The author says
that scientists will never find copies of words or grammatical rules in the
brain, and if, by copies, he means coded in something like UTF-8, then that
is, of course, true, but beside the point: if his brain did not have some
mechanism that supports the storage and retrieval of this information in some
manner, how was he able to write the article in the first place? He claims you
won't find copies of Beethoven's 5th symphony in a brain, but I suspect that
at least Beethoven himself, and many conductors of the piece, have had just
that - and the soloists who play his piano concertos are not reading from a
score, so where does that come from?

I think the author may have ended up making these absurd claims because he is
trying to use the trivial brain-does-not-function-like-a-computer argument to
prove something that is just an unargued-for intuition: he doesn't seem to
think RAM (or perhaps any form of physical information store) could possibly
be the foundation for something that works like human memory. He is apparently
unaware of the extent that software such as neural networks (or even
relational databases) have already extended the concept of information storage
and retrieval beyond the simple model of randomly-addressable bytes (which
does not, of course, make the point that human memory _is_ like a computer's;
what it does show is that the author's low-level comparisons are insufficient
to make the larger point he is trying to squeeze out of them.)
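One standard example of storage without addressable locations is a Hopfield-style associative memory: a pattern lives distributed across pairwise connection weights rather than at any address, and a corrupted cue settles back into the stored pattern. A minimal sketch (illustrative only, not a claim about how brains actually work):

```python
# Toy Hopfield-style associative memory: the pattern is stored as
# pairwise weights, not at any single address, yet a corrupted cue
# recovers the whole pattern.
def train(patterns, n):
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, cue, steps=5):
    s = list(cue)
    n = len(s)
    for _ in range(steps):
        for i in range(n):
            h = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, 1, -1, -1, 1, -1, 1, -1]
w = train([pattern], len(pattern))
noisy = list(pattern)
noisy[0] = -noisy[0]   # corrupt two "pixels" of the cue
noisy[3] = -noisy[3]
print(recall(w, noisy) == pattern)  # True
```

Nothing in `w` is a "copy" of the pattern in the byte-for-byte sense, which is exactly why low-level comparisons to RAM don't settle the question.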

~~~
sykic
My take on the article and in particular the quote that I referenced in my
original post is that the author does not think memory is stored in the way
that you and I think it is. The way I think of the brain working with regard
to memory is analogous to how computers store information. I’m unable to model
it in any other way. But then there’s the quote in the article that even if I
had a snapshot of the brain at time t I would not be able to reconstruct
something meaningful without knowing the history of that brain’s owner.

I don’t know enough to understand how that is possible or why someone
knowledgeable about this stuff thinks this. I have basically the same
conception of the brain and how it works as you do. But I’m confronted with
the fact that a person far more knowledgeable than me thinks otherwise. It is
that fact that causes me to persist in my view with caution. The author may be
a crank. I don’t know.

------
cuspy
Models are not equivalent to the phenomena they describe.

Computational models are not an exception to this.

There is not even a single "part" or "function" of the brain that we fully,
exhaustively understand through a computational explanation. All claims of
certainty are premature.

What's really fascinating and really needs the attention of historians and
anthropologists is why in this current historical moment so many STEM educated
people who are otherwise very bright end up confused about this. Maybe the
answer is obvious though.

------
your-nanny
The author's notion of computer does not serve him well. It is too grounded in
his experience of digital computing devices rather than an understanding of
computing as a kind of process. Furthermore, the field of computational
neuroscience is doing quite well, thank you. Temporal difference learning is
both an algorithm and instantiated in brains in some form.

------
warent
Isn't this directly contradicted by the grid cells?
[https://en.m.wikipedia.org/wiki/Grid_cell](https://en.m.wikipedia.org/wiki/Grid_cell)

It sounds like because the author doesn't understand how neurons create a
representation of reality, they're splitting hairs and saying it doesn't.

------
8bitsrule
The author mentions Beethoven. Consider what a virtuoso pianist must go
through to create a performance of a 40-minute-long sonata. Yes, the
performance includes her personal interpretation of what's written on a piece
of paper. It is almost certainly influenced by performances others have
created.

But when it comes to the individual notes, their sequence had damn well better
be literally correct for the entire performance. If not, someone in the
audience will certainly notice that one flubbed note.

So in learning the work, "she was changed in some way" all right. As some
members of the audience had been ... _identically_. And that 'some way'
certainly resembles pulling bytes out of 'storage'.

------
0_gravitas
I've been meaning to write an essay for a long time now about how our personal
abstractions of things and how they are defined/work can make nuanced
discussions about certain ideas difficult; I believe this post is a victim of
that.

------
aresant
“The brain has simply changed in an orderly way that now allows us to sing the
song or recite the poem under certain conditions. When called on to perform,
neither the song nor the poem is in any sense ‘retrieved’ from anywhere in the
brain, any more than my finger movements are ‘retrieved’ when I tap my finger
on my desk. We simply sing or recite – no retrieval necessary.”

I actually like this as an idea: our tools for understanding brain
functions are still too primitive, and the traditional computer-based models
are lacking.

~~~
baddox
I’m struggling with the definition of “retrieve” used here. According to the
meanings of “retrieve” that I’m familiar with, I can’t conceive of any sense
in which songs or finger movements _aren’t_ retrieved by the brain.

~~~
through_17
"I can’t conceive of any sense in which songs or finger movements aren’t
retrieved by the brain."

And that's the issue, isn't it? I think he's using "retrieve" to mean
something much closer to what a digital computer does. Ie, there's a single
"place" in the brain where the song is compressed and stored. My definition of
"retrieve" (and yours, I think) is implicitly more relaxed; I say that a rough
distributed system that reacts to stimuli like songs by being able to
approximately reproduce them later counts as "retrieval."

As other commenters have noted, the article uses a very restrictive definition
of computation/retrieval. I mean, earlier in the article, he gives a
definition of the same game that the "Lifelong Learning" and "Reinforcement
Learning" people use.

I think he's actually on the same page as many learning theorists, and is just
trying to make it clear to a general audience that a very "tight" match
between the brain and Von Neumann machines isn't reasonable.

~~~
baddox
Perhaps that’s what he meant, but does anyone even claim that that is the
case?

------
hyperion2010
The author seems to be hung up on a distinction between two representations
and trying to argue that they cannot be the same thing, despite the fact that
we have abundant evidence that both are readily interconvertible. Now, I would
agree that a neural net that converges to a grammar might not be a grammar,
but at that point we would seem to be missing the forest for the trees.

------
ozy
Imagine a database that stores strings using a common-prefix method; one could
make the same claim: this database does not store or retrieve strings. And yet
it does.

The model of what something does is implemented by an underlying mechanism.
But for many reasons the mechanism doesn't have to be, and often isn't, a
naive translation of the model.
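A minimal sketch of that kind of common-prefix storage, a trie in Python (illustrative only, not any particular database product):

```python
class Trie:
    """Minimal prefix tree: shared prefixes are stored only once,
    so no node holds any full string verbatim."""
    def __init__(self):
        self.root = {}

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.setdefault(ch, {})
        node["$"] = True  # end-of-word marker

    def contains(self, word):
        node = self.root
        for ch in word:
            if ch not in node:
                return False
            node = node[ch]
        return "$" in node

t = Trie()
for w in ["cat", "car", "card"]:
    t.insert(w)

# No node contains the string "card", yet it is reliably retrievable:
print(t.contains("card"))  # True
print(t.contains("care"))  # False
```

By the article's logic this structure "doesn't store strings" because you can't point at a byte range containing "card", yet storage and retrieval plainly happen.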

------
radarsat1
> Those changes, whatever they are, are built on the unique neural structure
> that already exists, each structure having developed over a lifetime of
> unique experiences.

What does he mean by "neural structure" here and how is it different from
"memory" and "representations" which supposedly we don't have?

------
jacobmoe
Funny that "computer" was originally a metaphor applied to the machine from
the human occupation.

------
boazbarak
Might be an interesting experiment to train a neural network to distinguish
between different currencies, and then visualize the features that correspond
to the “one dollar neuron”. It might turn out not that far from the drawings
of the author’s students.
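A toy version of that experiment can be sketched with a single perceptron standing in for the "one dollar neuron". The features and data below are entirely invented for illustration:

```python
# Toy "one-dollar neuron": a perceptron trained to separate dollars
# from twenties. Inspecting its weights afterwards shows which
# (invented) features it actually latched onto.
def train(data, labels, epochs=20, lr=0.1):
    w = [0.0] * len(data[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Invented features: [has_portrait, green_ink, says_ONE, says_TWENTY]
dollars  = [[1, 1, 1, 0], [1, 1, 1, 0]]
twenties = [[1, 1, 0, 1], [1, 1, 0, 1]]
w, b = train(dollars + twenties, [1, 1, 0, 0])

# Features shared by both classes (portrait, green ink) end up with
# ~zero weight; only the discriminative ones matter.
print(w)
```

The neuron keeps only what distinguishes a dollar from everything else it saw, which is plausibly why the learned features might resemble the students' sparse drawings more than a photograph.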

------
age_bronze
There's a Dunning-Kruger effect going on with the author of the article. He's
too ignorant of the subject he's writing about to understand the difference
between the low-level way computers work and the high-level sophistication
algorithms can exhibit.

Quite ironically, I think his line of thought shows precisely why the brain is
probably quite like a computer. The algorithm going on in his brain was
probably something like this:

1. Assuming I'm like a computer leads to negative emotions (because of the
lack of free will and reduction in self-esteem it implies).

2. Therefore give high weights to facts contradicting this, and low weights to
facts supporting this.

3. For a range of subjects regarding the behavior of the brain, do:

  3.1. If the subject feels like it's logically supporting my view, add it to
the article.

    3.1.1. Anything I know the brain does but have no clue how a computer
could do will automatically feel like it supports my conclusion. Since I'm
pretty clueless as to how computers work in general, most things are actually
going to seem like something a computer can't do.

  3.2. Otherwise, ignore it and move on to the next example.

------
ilaksh
This article would have been more relevant several decades ago when AI
research started. But now I think it's mainly a strawman argument because the
models are more realistic and different from what he is talking about.

------
zzzeek
The author smugly assumes we all lack imagination in how the human mind might
work, when in reality, it is he who lacks imagination in how computers or
algorithms might someday work.

------
ThomPete
Neither is a computer a brain.

But the brain and the computer are both pattern-recognizing feedback loops;
one just isn't as developed yet.

The computer doesn't see the image but neither do I. We simulate it.

------
dang
Discussed at the time:
[https://news.ycombinator.com/item?id=11729499](https://news.ycombinator.com/item?id=11729499)

------
gus_massa
The article has so many errors that it is hard to write a reply. Let's pick
one:

> _The information first has to be encoded into a format computers can use,
> which means patterns of ones and zeroes (‘bits’) organised into small chunks
> (‘bytes’)_

The author should have read
[https://en.wikipedia.org/wiki/Analog_computer](https://en.wikipedia.org/wiki/Analog_computer)

------
bkdbkd
Meta question: Since this article is a repost, and the comments from 2016 and
2019 say it is largely incorrect, by what process did it make it to the front
page? tl;dr: this is a repost of a panned article; how is it here?

