
A New Theory of How Consciousness Evolved - curtis
http://www.theatlantic.com/science/archive/2016/06/how-consciousness-evolved/485558/?single_page=true
======
tim333
The new theory summarised (from the paper):

>The theory begins with attention, the process by which signals compete for
the brain’s limited computing resources. This internal signal competition is
partly under a bottom–up influence and partly under top–down control. We
propose that the top–down control of attention is improved when the brain has
access to a simplified model of attention itself. The brain therefore
constructs a schematic model of the process of attention, the ‘attention
schema,’ in much the same way that it constructs a schematic model of the
body, the ‘body schema.’ The content of this internal model leads a brain to
conclude that it has a subjective experience.

Kind of makes sense to me - that the mechanism that forms a model of your body
or a place map or whatever also forms a model of your mental setup of senses,
memories and the like.

~~~
nwah1
Everyone in this thread sounds like a p-zombie.

~~~
simonh
Says the guy posting the most p-zombie like post on the entire thread.

~~~
coldtea
What exactly makes it "the most p-zombie like post on the entire thread"?

It might be, but we'll never know from mere snark.

~~~
taneq
It might be, but we'll never know _at all, by definition._

------
niccaluim
This article was a little confusing for me. When I read "consciousness," I
think of private, subjective experience—qualia. This meaning of consciousness
is a very hard problem indeed, and any new theory is bound to be interesting.
(See Chalmers for a great summary:
[http://consc.net/papers/facing.html](http://consc.net/papers/facing.html))
But the article seems to be talking about meta-cognition, not conscious
experience. Meta-cognition is interesting in its own right I suppose, but far
less so than what I normally think of as consciousness.

~~~
thinkloop
I felt the same: it doesn't try to address the "hard" problem of
consciousness, as described in the article you linked.

It's such an interesting problem because, for one, it's hard (perhaps
impossible) to prove consciousness. If you built an android that had every
if-statement imaginable coded into it so that it behaved exactly like a human,
including answering the question of whether it is conscious affirmatively, and
showing signs of pain and suffering in bad situations, how would you ever know
whether it's all just a simulation, or whether real feelies are behind it? At
what point does it cross over from an extremely complex and useful behavior
machine to one that experiences actual qualia?

Another issue is the apparent purposelessness of consciousness. Why does there
have to be _experience_ tied to our reactions to stimuli, if the resulting
behaviors are identical? Isn't there quite a bit of wasted energy and skull
space that could be better used for more survival-oriented if-statements? All
of our behavior seems possible without the associated feelings.

The issue of androids is going to be an annoying one in a thousand years.
People will be polarized, with some championing android rights because the
androids seem to be conscious, while others will be adamant that they are just
complex machines that can be treated any which way. Personally, I'd probably
err on the side of conscious, just in case.

~~~
Houshalter
>Isn't there quite a bit of wasted energy and skull space that could be better
used for more survival oriented if-statements. All of our behavior seems
possible without the associated feelings.

Intelligence isn't about a giant list of arbitrary if-then statements, though.
Think of artificial neural networks: they learn behaviors from experience and
reinforcement, and take actions they predict will lead to reward.

How is that fundamentally different than what humans do? I mean, our brain's
algorithms are probably much more complex, but the principles should be the
same. Shock a neural net with negative reward, and it will strongly desire to
avoid that again. How is that different than pain?

Add on a bunch of hard wired goals and behaviors more complicated than "pain"
and "reward", and you have yourself a conscious being. Not fundamentally
different than a human.
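The reward-avoidance dynamic described above can be sketched with a toy
Q-learning agent (a minimal illustration I made up for this comment, not
anything from the article; the one-state, two-action "world" and the reward
values are invented):

```python
import random

# Toy Q-learning agent: one state, two actions.
# Action 0 yields a "shock" (negative reward), action 1 a small reward.
REWARDS = {0: -10.0, 1: 1.0}

q = {0: 0.0, 1: 0.0}   # estimated value of each action
alpha = 0.5            # learning rate
epsilon = 0.1          # exploration probability

random.seed(0)
for _ in range(100):
    # Mostly pick the action currently predicted to be best,
    # but occasionally explore at random.
    if random.random() < epsilon:
        action = random.choice([0, 1])
    else:
        action = max(q, key=q.get)
    reward = REWARDS[action]
    # Move the value estimate toward the observed reward.
    q[action] += alpha * (reward - q[action])

# After training, the agent "strongly desires" to avoid the shock:
# q[0] is driven far below q[1], so the greedy policy never picks it.
print(q)
```

After a single shock, the estimate for action 0 plunges and the greedy policy
avoids it from then on, which is the sense in which negative reward produces
avoidance, whatever one thinks that implies about pain.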

~~~
adwn
You _do_ have subjective experience, don't you? I mean, you don't just process
external stimuli, combine them with an internal state, and generate output
signals, like a program would – you can actually "feel" things and see them
with "your inner eye"?

I'm not being sarcastic here. Maybe there are people that don't have
subjective experience, just like there are people that can hear colors.

~~~
eli_gottlieb
> You do have subjective experience, don't you? I mean, you don't just process
> external stimuli, combine them with an internal state, and generate output
> signals, like a program would – you can actually "feel" things and see them
> with "your inner eye"?

Hold on. Did you just claim that people with aphantasia (i.e., possibly an
inability to feed causal models from someplace (probably the neocortex?
IANANS, I just work with some) back to the visual cortex and get the visual
cortex's "portrayal" of the conditional simulation embodied in the model) have
no subjective experience?

What if someone has no "inner eye", but does have "inner ears", an "inner
tongue", "inner hands" and all the rest? Or do you group all possible "inner
senses" used for imagining "what something is like" into "qualia" without
separating them into distinct functionalities that are in bijection with the
available physical senses?

Because you might have actually said something interesting here!

~~~
adwn
> _Did you just claim that people with aphantasia [...] have no subjective
> experience?_

I don't think so. I wasn't talking about imagination (as in: thinking of a
cube and rotating it in your mind), but rather about experiencing the input
from my physical eyes. It's as if there's someone inside my head that
experiences and feels what I physically see. Of course, this doesn't help: the
homunculus model is thoroughly debunked. [1]

> _Or do you group all possible "inner senses" used for imagining "what
> something is like" into "qualia" without separating them into distinct
> functionalities that are in bijection with the available physical senses?_

There are qualia like anxiety or restlessness that are not linked to physical
senses, so that's a no on the bijection. I have no idea whether those "inner
eyes, ears, etc." that link imagination to physical senses are related to
qualia – it's so difficult to imagine myself into different minds. Besides,
coming up with an objective, clear, non-recursive definition for subjective
experience is way above my paygrade.

Another random thought that I just had: People with depression sometimes hurt
themselves physically, because that's the only way they can still feel
something (according to them). Maybe this is related to a dampened subjective
experience, which allows only extreme inputs to be experienced?

[1]
[https://en.wikipedia.org/wiki/Homunculus_argument](https://en.wikipedia.org/wiki/Homunculus_argument)

~~~
eli_gottlieb
>It's as if there's someone inside my head that experiences and feels what I
physically see.

No, you only imagine that. And I'm being serious: people who aren't brought up
being told that consciousness is like having a little homunculus in your head
simply never feel "as if" there was one.

>There are qualia like anxiety or restlessness that are not linked to physical
senses, so that's a no on the bijection.

I would normally call those emotions rather than qualia. Hmmm. Would you say
the "qualia-ness" of things comes from the internal perception that the
"feelings" are _external_ to the person who feels, and _impinge_ on them?

>Another random thought that I just had: People with depression sometimes hurt
themselves physically, because that's the only way they can still feel
something (according to them). Maybe this is related to a dampened subjective
experience, which allows only extreme inputs to be experienced?

But dampened affect (dampened emotional signaling in the brain) is already a
known and seemingly somewhat understood symptom of depression.

~~~
aninhumer
>people who aren't brought up being told that consciousness is like having a
little homunculus in your head simply never feel "as if" there was one.

So you're saying people brought up differently are p-zombies? Because that's
the only way I can interpret that.

Feeling like there's a homunculus is fundamentally what experiencing the
world is like for me, and has been since long before I was able to understand
complex philosophical questions about consciousness.

~~~
eli_gottlieb
>So you're saying people brought up differently are p-zombies?

No, I'm saying that those of us brought up differently identify our selves
with our actual, physical bodies, and don't imagine/visualize a tiny little
figurine in our heads. We have a _different_ experience of the world, not _no_
experience.

To me it seems weirder to say, "When I get bitten by a dog, the little
homunculus gets a loud alarm ringing in his little room saying, 'BITTEN BY
DOG!'" than "When I get bitten by a dog, its teeth feel sharp and painful."

------
acqq
The paper from the same author where he formally presents his ideas:

[http://journal.frontiersin.org/article/10.3389/fpsyg.2015.00...](http://journal.frontiersin.org/article/10.3389/fpsyg.2015.00500/full)

~~~
kanzure
I thought I had read a very similar idea in
[http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4010745/](http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4010745/)
but on closer inspection I see that it's referencing that same author...

------
ttctciyf
This passage from the article:

> Even if you’ve turned your back on an object, your cortex can still focus
> its processing resources on it. Scientists sometimes compare covert
> attention to a spotlight. (The analogy was first suggested by Francis Crick,
> the geneticist.) Your cortex can shift covert attention from the text in
> front of you to a nearby person, to the sounds in your backyard, to a
> thought or a memory. Covert attention is the virtual movement of deep
> processing from one item to another.

very much put me in mind of an unusual book: _A Life of One's Own_ (Marion
Milner, 1934) - in particular, this:

      At any moment there exist in the fringes of my thought
      faint patternings which can be brought to distinction
      when I look at them. Like a policeman with a flash-light
      I can throw the bright circle of my awareness where I
      choose; if any shadow or movement in the dim outer circle
      of its rays arouses my suspicion, I can make it come into
      the circle of brightness and show itself for what it is.
      But the beam of my attention is not of fixed width, I can
      widen or narrow it as I choose.[1]

1:
[https://books.google.co.uk/books?id=ntg6OE7haSgC&pg=PA77](https://books.google.co.uk/books?id=ntg6OE7haSgC&pg=PA77)

------
robg
A theory that's not testable is mere conjecture. Worth reading is Nagel's
_What Is It Like to Be a Bat?_

[http://organizations.utep.edu/portals/1475/nagel_bat.pdf](http://organizations.utep.edu/portals/1475/nagel_bat.pdf)

------
amelius
I'm stuck with a related philosophical problem, perhaps somebody here can help
me out.

The problem is that consciousness happens "now", but "now" only has meaning
within an inertial frame of reference, and simultaneity in physics depends on
the chosen frame of reference.

To phrase it differently: you generally don't feel what you felt yesterday,
or what you will feel tomorrow. You only feel (or experience) what you are
feeling at this very moment. But if one (mathematical) point in the brain is
conscious, by the time it has communicated with another point in the brain,
that "feeling" has been lost to time.

So either consciousness happens only at distinct (mathematical) points in the
brain, or, somehow, consciousness can span a short non-zero time interval. You
experience not just "now", but also briefly in the past and perhaps into the
future.

This seems contradictory, so a better way of looking at it is needed.

~~~
carapace
This is a very interesting point.

The article takes "consciousness" to mean the contents of awareness, and not
awareness (subjectivity) itself, so from my point of view it doesn't even
begin to touch on the "hard problem".

One interesting clue is that, as you point out, subjectively "it" is _always_
now and you are _always_ here. Whatever "you" are, you're the origin-point in
spacetime for the contents of your awareness.

This is true even when you are dreaming, which seems significant.

We know that phenomena that are too fast or too slow cannot be perceived, and
I think you're right that this indicates something important about how
subjectivity functions.

------
corecoder
The article (I haven't read the paper yet) resonates with things Dennett has
said, as many commenters have pointed out. It goes further, though: it details
a few specific structures and moves beyond speculation, since most of its
theses are empirically testable, at least in principle.

Two questions:

* Are there any objections to the theory, apart from: but consciousness is magic?

* What are good resources to learn the history of the idea that complex brain activity emerges from neurons competing with each other?

------
pointernil
"New"

The papers and work on this theory go back years already, and it's due time it
gained more/wider public attention, imo.

I wonder if it will cause (or is already causing) a kind of
anthropo-"disappointment", just like heliocentric theories did when they
dethroned the earth-centric ones... we humans love magic, especially when we
are told, or like to think, that we ourselves are magic, right?

~~~
wallacoloo
I love the seemingly improbable, not "magic" - it just so happens that they
overlap.

Children are still fascinated by flying machines. Adults generally aren't _as_
fascinated, just by having grown up with them, regardless of how well they
understand them. But both groups know the machines are grounded in reality, as
opposed to being "magic".

So no, I disagree. And I think it's irrelevant. The people who want to believe
in science will be happy to expand their understanding (this is still theory).
The people who don't will keep on believing whatever it is they believe. There
are still people out there who think there's a god that raises the sun each
morning and sets the moon at dawn, despite what science tells us.

------
dr_
For those who are interested in hearing Graziano discuss consciousness
further, alongside Chalmers and Tegmark, I recommend this NYAS video from last
month:

[http://www.nyas.org/events/Detail.aspx?cid=43d077ca-947f-4f4...](http://www.nyas.org/events/Detail.aspx?cid=43d077ca-947f-4f40-b01e-71569a0719e8)

------
sillysaurus3
Are insects conscious? Why or why not?

~~~
thinkloop
It's impossible to be sure, but since consciousness seems to follow a spectrum
among living beings, with more complex beings having "more" consciousness and
less complex ones having less, my guess is that all living things have some
form of consciousness, including plants and insects.

As soon as anything "prefers" any stimulus over another, I think consciousness
becomes an emergent property.

~~~
back_beyond
Does this spectrum imply that humans have varying levels of consciousness? If
so, I think this presents an interesting moral question.

~~~
simonsquiff
I'd certainly say there is a spectrum of varying human consciousness, from
fetus > baby > young child > child

~~~
back_beyond
Does it vary within each category?

~~~
codeulike
Let's say it does. Perhaps a brain-damaged person in a coma has less
consciousness. Or perhaps someone who is just asleep has less consciousness.
What's your interesting moral question?

~~~
back_beyond
I seem to have forgotten

