
Consciousness Is Not Mysterious - callumlocke
http://www.theatlantic.com/science/archive/2016/01/consciousness-color-brain/423522/?single_page=true
======
o_nate
Don't waste your time. This article is deeply confused. Even if consciousness
is just a brain-created illusion, who is experiencing the illusion? That
experience is what we mean by consciousness in the first place, so this
explains nothing.

~~~
snowwrestler
The argument of the article is that no one is experiencing the illusion. The
entirety of the experience _is_ the illusion itself.

Essentially: consciousness does not exist. We only think it does because the
thing we think with is wrong. But because we think with it, we can't tell the
difference. So asking "who is experiencing the illusion" is begging the
question.

Look at it this way: what evidence could you use to distinguish between an
object that is actually conscious and an object that is merely faking
consciousness? This is commonly known as the Chinese Room argument.

[https://en.wikipedia.org/wiki/Chinese_room](https://en.wikipedia.org/wiki/Chinese_room)

But whereas the Chinese Room has been used to argue that computers can never
be fully conscious, this article reverses the equivalence: maybe the Chinese
Room proves that _nothing_ is fully conscious, and it is our concept of
consciousness that is mistaken, not the rest of the observable world.

This is conceptually similar to the mental leap behind General Relativity:
since we can't find any way to distinguish between gravitational and inertial
acceleration, maybe they are in fact physically equivalent and
indistinguishable--then build the theory from there.

Starting with the concept that all consciousness is an illusion allows us to
side-step a lot of metaphysical conversation and just try to figure out the
mechanics of the illusion--how it works, not whether it exists.

~~~
objectivistbrit
An actual Chinese Room would be unimaginably vast - whole planets, solar
systems, devoted to the knowledge we hold implicitly.

To illustrate - imagine you give the room (in Chinese) the text of Lord of the
Rings, and then ask it which member of the fellowship corresponds to the "Wise
Old Man" archetype. Any reasonably smart human will give the answer,
"Gandalf". But the room would need to (in Chinese) make systematic notes of
the characters, their relationships, their actions, and then somehow integrate
all this into a categorisation under the appropriate archetype.

I'm actually agnostic as to the question of whether such a room could be built
-- as I'm agnostic towards the possibility of AI -- but assuming the room
would not be inconceivably complex masks the complexity of the ordinary
workings of human consciousness.

---

    Essentially: consciousness does not exist. We only think it does because
    the thing we think with is wrong. But because we think with it, we can't
    tell the difference. So asking "who is experiencing the illusion" is
    begging the question.
As for your essential point, I hold that words like "think", "experience",
"illusion" all rest on the more fundamental concept of "consciousness".

~~~
snowwrestler
What if the Chinese Room already exists, and we call it a brain? It's the most
complex object we know of in the universe and it collects and processes input
for decades. That's a pretty vast amount of stored information from which to
produce an answer to your question.

We wouldn't expect a 3-year-old to provide the correct answer, but we would
expect a 30-year-old. This is a big hint that the brain does in fact collect
and store information for later use. This would be analogous to filling in the
books in the Chinese Room. If there hasn't been enough time for all the books
to get filled in, we won't expect the room to give correct answers.

In addition, if someone has never heard of Lord of the Rings, we would not
expect to get a correct answer either. That page of that book in the Room
would be blank.

As for which concepts rest on which, we have to be careful, when discussing
consciousness, not to mistake grammar for reality. There is no way in English
to express the concept of an illusory "I" in the first person. That doesn't
mean it is an impossible concept.

~~~
objectivistbrit
Whether the mind is equivalent to the brain is an open question. If someone
built an artificial mind, that would massively raise my belief in the
possibility that the mind is purely physical. But I hold that this is vastly
more difficult than assumed.

My point with the Lord of the Rings was precisely that you give the books to a
person who hasn't read them, and give them a few weeks to work through them. A
Chinese Room would take vastly longer to "process" the book, and its
"understanding" (represented symbolically) would take up vastly more room than
the book itself. Even to understand simple word problems (to do with arranging
blocks, for example), the algorithm used by the Room would have to explicitly
encode our implicit knowledge of spatial relationships, and be able to apply
this knowledge to arbitrary problems. To understand literature at a human
level, the algorithm would have to encode standard knowledge of human
psychology, society, physics, etc.

In principle this could all be performed with a giant lookup table -- but such
a table would be the size of several solar systems' worth of matter. So I
don't hold that a Chinese Room is impossible in principle, but I do hold that
building a practical AI would require great advances in the understanding of
our own consciousness.
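The lookup-table framing above can be made concrete with a toy sketch (the
table contents and function names here are invented purely for illustration).
The "room" answers by matching symbols against canned entries, with no
understanding involved; the point is that covering every input a human could
answer would require exponentially many entries.

```python
# Toy sketch (hypothetical): the Chinese Room as a pure lookup table.
# The "room" performs no reasoning, only exact symbol matching.

RULE_BOOK = {
    "Which member of the fellowship is the Wise Old Man?": "Gandalf",
    "Who carries the Ring?": "Frodo",
}

def room_reply(symbols: str) -> str:
    """Return the canned response, or a fixed token for unknown input."""
    return RULE_BOOK.get(symbols, "???")

print(room_reply("Which member of the fellowship is the Wise Old Man?"))  # Gandalf
print(room_reply("Who forged the Ring?"))  # ??? -- not in the table
```

For inputs of length n over an alphabet of size k, an exhaustive table needs
up to k**n entries--exponential growth, which is the intuition behind the
"several solar systems' worth of matter" estimate.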

