
Why Minds Are Not Like Computers (2009) - prostoalex
http://www.thenewatlantis.com/publications/why-minds-are-not-like-computers
======
le0n
A great article on similar issues is "Turing Test: 50 Years Later":

[http://crl.ucsd.edu/~saygin/papers/MMTT.pdf](http://crl.ucsd.edu/~saygin/papers/MMTT.pdf)

It touches on the "physics is computable, therefore the mind is computable"
argument, and similar ones, in a fascinating and IMO balanced way.

------
taniraja
Is cognitive dissonance computable?

------
sudo_free_cake
This is why people not trained in CS shouldn't speculate on it; when they do,
you get garbage like this.

The brain performs computation, so it is by definition a computer. Before
modern computers there was a human job called "computer": one who performs
computations. The analogy of the brain to a computer is perfect, because the
brain is a computer.

~~~
justinpaulson
Except the author does in fact have a BS in CS. IMO, the brain does far more
than perform computation. Sure, humans were once "computers," but our ability
to perform computation is not a valid argument that that computation, or the
inner workings of our mind, is the same as a digital computer performing
computation. You failed to look up the author's credentials, and I suspect you
may also have failed to read the entire piece.

~~~
theaiguy
> the brain does far more than performing computation

In your opinion? Can you be more specific, and give examples of things the
brain does that are not computable?

This is trivially true in some senses: neurones have analogue responses that
the biology does a good (but not perfect) job of thresholding. But then, the
same thing can be said of transistors; it's just that we're able to engineer
their analogue responses out much more successfully than evolution has. It is
also true that the brain is connected to a much broader system that is
undoubtedly analogue (i.e. the body), but then again, it isn't clear that this
isn't true of any non-abstracted computer.
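
The thresholding point can be sketched in a few lines (a toy illustration, not
a biological model; the sigmoid curve here is just a stand-in for any smooth,
graded response):

```python
import math


def analogue_response(x: float) -> float:
    """Smooth, graded response in (0, 1) -- a stand-in for a neuron's
    firing-rate curve or a transistor's transfer characteristic."""
    return 1.0 / (1.0 + math.exp(-x))


def thresholded(x: float, cutoff: float = 0.5) -> int:
    """Digitise the graded response into a clean 0/1 signal."""
    return 1 if analogue_response(x) >= cutoff else 0


# Near the cutoff the analogue value is ambiguous...
print(analogue_response(0.1))  # ~0.525
# ...but thresholding yields an unambiguous digital level.
print(thresholded(0.1))   # 1
print(thresholded(-0.1))  # 0
```

The same trick, applied at scale, is what lets us treat inherently analogue
transistors as reliable digital switches.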

Comparing theoretical and idealised computing to embodied brains might feel
insightful, but it doesn't actually resolve any of the real issues in the
philosophy of AI.

Also, a BS in CS isn't a good minimum qualification for competency in the
philosophy of AI. I wouldn't read much into that.

~~~
justinpaulson
I was mainly commenting on the OP's position that the author was 'not trained
in CS'.

Here is a list of things I believe the human brain does outside of
computation:

The irrational motivations that take us over when we feel love. The way that
our mood can affect a decision. Taking a walk to enjoy the beauty of the
sunset. Writing a satirical short story to express a political fallacy.
Painting an image that we saw in a dream. Using metaphors to explain an idea
or to validate an argument. Telling a joke and understanding why it is funny.
Buying a shirt because it looks cool.

There are many more. But to me, there is definitely more to the mind than
simple computation. There is a quality to our experience that is completely
lost when our inputs and outputs are equated to the workings of a digital
computer.

~~~
theaiguy
Ultimately these arguments come down to qualia. But there is no reason to
think qualia are a) present in brains other than your own, but b) not present
in any non-brain information-processing system. Philosophers of mind have
tried to make that argument, but they end up appealing to intuition.

In terms of computability, folks like Boden and Sloman showed in the 90s that
emotion is compatible with computation; more than that, that emotion is
implementable in symbolic computation. Of course, one could declare that such
systems have no qualia of emotion. But can you do more than declare it,
without simultaneously creating arguments that could apply equally to other
brains?
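
To make "emotion in symbolic computation" concrete, here is a toy appraisal
rule in the general spirit of that line of work (the rule structure, predicate
names, and emotion labels are my own illustrative assumptions, not anyone's
published model):

```python
# Toy symbolic appraisal: map discrete judgments about an event to
# an emotion label. Entirely illustrative -- not Boden's or Sloman's
# actual architecture.

def appraise(goal_relevant: bool, goal_congruent: bool,
             expected: bool) -> str:
    """Classify an event by symbolic appraisal dimensions."""
    if not goal_relevant:
        return "indifference"        # the event doesn't matter to any goal
    if goal_congruent:
        return "contentment" if expected else "joy"
    return "frustration" if expected else "fear"


# An unexpected, goal-threatening event:
print(appraise(goal_relevant=True, goal_congruent=False,
               expected=False))  # fear
```

Whether such a system *feels* anything is exactly the qualia question above;
the point is only that emotion-like state transitions are expressible in
purely symbolic terms.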

I get that you are working from intuition. But there are fifty years of actual
research on this. Waving your hands and appealing to your 'humble opinion'
isn't how scholarship works.

~~~
justinpaulson
Firstly, I never said my opinion was humble.

Secondly, a good argument is not a proof. We have such a limited understanding
of our own emotion and cognition that it is impossible to say we have proven
anything about how compatible emotion is with computation. Hands are
continually waved over the existence of qualia, and the argument, as you
presented yourself, is that we have no way of proving that qualia won't exist
for an artificial intelligence that has been programmed for emotional
compatibility. I believe that is a fallacy, and it side-steps the root of the
argument about qualia, and the real root of what self-realization and
consciousness are.

By saying "look, I can make this machine do everything you do and act as if it
feels like you do," you are not proving that you have encapsulated all that is
cognition. You are only proving that you can mimic cognition. It could be
retorted, "Well, how do you know this machine doesn't really feel like you
feel?" I don't know that. But we really aren't learning anything about
consciousness by ignoring the question with such a fallacy.

I believe AI has a very important role in rapidly evolving our way of life as
we continue in this technological evolutionary cycle. However, I think it does
nothing to teach us about ourselves and how our minds actually work. It is
nothing more than mimicry. And nothing can be proven from how an AI bot
operates, for either the materialist or the idealist, so it is aimless to
think that AI is how we will come to understand our own cognition.

~~~
theaiguy
> is that we have no way of proving that qualia won't exist for an artificial
> intelligence

You misunderstood my objection. The issue is not whether we can prove such a
thing, but whether we can differentiate between different
information-processing systems in arguments that qualia exist at all. You
assume qualia are something that brains do. On what basis do you assume that
anyone other than you has them, in a way that implies no other system does?

Qualia is one of those topics academics tend to roll their eyes at when it is
brought up, because it appears to do a lot but in most cases is a variation on
"because I feel like it should be true."

~~~
cactusface
If you can simulate matter, certainly you can simulate a brain. Qualia appear
to derive not from pixie dust but from accumulated experience in the world.
You might need to train a brain for a long time to develop qualia. You might
not be able to copy qualia from one brain to another. I think these are the
main objections, but I'm not sure.

Personally, I'm not convinced that simulating matter is feasible, never mind a
living organism, never mind an intelligent living organism, which is what the
brain is when you account for all of it.

We know how far away Andromeda is. We know how to build a spaceship.
Therefore, it is possible to go to Andromeda. Is it though? What if the Earth
doesn't have enough resources? How big is your brain simulator allowed to be?

------
palosanto
"If we achieve artificial intelligence without really understanding anything
about intelligence itself then we will have no idea how to control it."

Exactly. It's fascinating/scary to me how people still talk about A.I. or
machine consciousness when we basically have no idea how consciousness works
in general.

~~~
sullyj3
People who say things like that don't understand how to reason from limiting
cases. We have artificial intelligence right now. Maybe not artificial general
intelligence, but plenty of things traditionally thought of as solely the
domain of minds are now performed by software. I'd say we understand it pretty
well, and we certainly have the ability to control it.

~~~
scarmig
The one iron-clad rule of artificial intelligence: any successful attempt at
implementing it makes it not artificial intelligence, just something that
computers are able to do.

(Though I guess there is the second rule of over-optimistic timelines, to be
fair.)

