
On the Analogy Between Mind/Brain and Software/Hardware (1992) - wsgeorge
http://www.massline.org/Philosophy/ScottH/mindsoft.htm
======
svara
The article makes an interesting point in fleshing out an analogy between (a
specific reading of) the concept of "software" and the mind, but does so in a
needlessly condescending tone, which is very distracting.

Take this example: _" To begin with you may think of software as a series of
instructions to the whole system (including both the hardware and the rest of
the software) [...] This is as far as programmers in business data processing
centers usually get. That is why so many of them keep insisting that "no
computer can possibly think; if you know anything at all about computers you
should know that they just follow directions". This is an incredibly naive
view which only reflects the narrowness of their own conceptions and
programming practices."_

I would argue that defining software as "a series of instructions to the
whole system" is perfectly valid. The author instead takes "software" to mean
the functional properties of dynamic states of the system, which also sounds
like a valid definition, and which leads to the mind/brain analogy.

edit: formatting

------
joe_the_user
This 1992 article definitely reads like the Mind/Brain concept in the era of
GOFAI (good old-fashioned AI).

It's quite interesting just for its historical outline, even if the outline is
pretty sparse (you gotta start somewhere). The main thing that I think modern
AI/Machine-learning has discarded is the concept that intelligence is based on
logical specification.

And I think it's entirely reasonable to say that a mind isn't "like software"
if you mean human-constructed, top-down software built from logical
intentions.

The thing is, though, that neural networks and similar advances haven't left
us with any alternative analogies for the mind/brain relationship. Should we
come up with a new one?

~~~
goatlover
Also, the mind isn't like software in that it doesn't get copied to other
minds or mediums. Yeah, there are sci fi scenarios or futuristic predictions
for this sort of thing, but it's not a reality at this point. Another
difference is that there is no machine code for the brain. There isn't some
set of primitive instructions programmers could target to instruct the
connectome.

~~~
titzer
> There isn't some set of primitive instructions programmers could target to
> instruct the connectome.

I wouldn't be so quick to rule that out. Think about the difference between
short term memory and long term memory. Short term memory is on timescales far
shorter than synaptic formation, so it stands to reason that it might take the
form of electric patterns held in buffers in the brain (i.e. it is not encoded
using synapses). If that is true, then why can't buffers that hold electric
patterns also hold a set of instructions that can be retrieved from storage?

E.g. maybe short term memory is more like an FPGA (easily reprogrammable
circuits) and long term memory is more like an ASIC (hardwired synaptic
connections that are far more dense and long-lasting).
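The two-tier analogy above can be sketched as a toy model: a volatile buffer
standing in for "electric patterns" and a durable store standing in for
synaptic connections. Everything here (the names, the decay rule, the
one-shot consolidation) is my own illustrative guess, not a claim about how
the brain actually works:

```python
# Toy model of the FPGA/ASIC memory analogy: a fast, volatile short-term
# buffer whose contents fade unless rehearsed, next to a slow, durable
# long-term store.

class Memory:
    def __init__(self, buffer_ttl=3):
        self.buffer = {}        # item -> remaining lifetime ("electric patterns")
        self.long_term = set()  # consolidated items ("synaptic connections")
        self.buffer_ttl = buffer_ttl

    def perceive(self, item):
        """New items land in the volatile short-term buffer."""
        self.buffer[item] = self.buffer_ttl

    def rehearse(self, item):
        """Rehearsal refreshes the buffer and consolidates to long-term."""
        if item in self.buffer:
            self.buffer[item] = self.buffer_ttl
            self.long_term.add(item)  # crude stand-in for synapse formation

    def tick(self):
        """Time passes: unrehearsed buffer contents fade away."""
        self.buffer = {k: t - 1 for k, t in self.buffer.items() if t > 1}

    def recall(self, item):
        return item in self.buffer or item in self.long_term

m = Memory()
m.perceive("face")
m.perceive("phone number")
m.rehearse("face")       # only "face" gets consolidated
for _ in range(5):
    m.tick()             # short-term contents decay

print(m.recall("face"))          # True: held in long-term storage
print(m.recall("phone number"))  # False: faded from the buffer
```

The point is only the asymmetry: writes to the buffer are cheap and
reversible, while the long-term store is slower to change but survives decay.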

------
ozy
A fascinating subject. From a medium can come a new system of patterns. Like
waves: their math is really hard to figure out if you study only water
molecules. Plus, waves are medium-independent, and it gets really interesting
when a wave crosses into a different medium (it bends).

Software is like waves: it needs a compute medium but is independent of it.
And taking software in the broad sense of information processing, your brain
indeed looks like a compute medium, and your mind is the information
processing.

As for consciousness, the article mentions a follow-up; does anybody know if
that exists somewhere? Personally I think consciousness is a pattern of
information processing, a lot like reinforcement learning[1]: keeping a
history of observations, actions, and plans, and continuously introspecting
and judging those plans.

So I wonder, if you run such an algorithm on a computer, would that feel like
something to that program?

[1]
[https://en.wikipedia.org/wiki/Reinforcement_learning](https://en.wikipedia.org/wiki/Reinforcement_learning)
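The loop described above (keep a history, act, then judge your own past
choices) can be sketched in a few lines. The agent, the toy environment, and
the scoring rule are all hypothetical stand-ins of my own, far cruder than
real reinforcement learning:

```python
import random

# Minimal sketch: an agent keeps a history of (observation, action, reward)
# triples and continuously "introspects" by judging which past actions
# worked out best, revising its plan accordingly.

def act(plan, epsilon=0.2):
    """Follow the current plan, but sometimes explore a random action."""
    if plan is None or random.random() < epsilon:
        return random.choice(["left", "right"])
    return plan

def environment(action):
    """Toy world: going 'right' is rewarded, going 'left' is not."""
    return ("obs", 1.0 if action == "right" else 0.0)

def introspect(history):
    """Judge the past: return the action with the best average reward."""
    totals = {}
    for _, action, reward in history:
        totals.setdefault(action, []).append(reward)
    if not totals:
        return None
    return max(totals, key=lambda a: sum(totals[a]) / len(totals[a]))

random.seed(0)
history = []
plan = None
for step in range(100):
    action = act(plan)
    observation, reward = environment(action)
    history.append((observation, action, reward))
    plan = introspect(history)  # continuously revise the plan
```

After enough steps the plan settles on the rewarded action; the point is only
the shape of the loop (history, action, self-judgment), not the learning
rule, and nothing here says anything about whether running it would "feel
like something".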

------
xg15
The article is interesting, but the historical overview seems dubious to me
and quite biased toward showing that people "originally" already thought the
mind was a physical thing.

What about ancient Egypt, for example? It has an extensive philosophy of the
afterlife and the "soul", yet it is not mentioned at all.

~~~
nemo1618
The author states that:

> Until recently there were really no very good analogies available

On the contrary, our theories of mind have mirrored our latest technology for
thousands of years: first a hydraulic model, with "humours" flowing through
the body's plumbing; then a mechanical model with tiny machinery; then an
electro-chemical model with charges and currents.

This article argues strongly that our present computer-based metaphor is
likewise off-the-mark:
[https://aeon.co/essays/your-brain-does-not-process-information-and-it-is-not-a-computer](https://aeon.co/essays/your-brain-does-not-process-information-and-it-is-not-a-computer)

~~~
wsgeorge
> This article argues strongly that our present computer-based metaphor is
> likewise off-the-mark

Well, it is a metaphor, so it is by definition off the mark. The article only
does well to hammer that fact home, as does the OP (at least early on).

Any metaphor eventually breaks down when you take it too far or too
literally; otherwise it wouldn't be a metaphor, it would be the exact same
thing.

Brain-as-hardware, mind-as-software is a useful model for dealing with the
mind-body problem, and it gives us a fair idea of how a virtual mind can
exist in (on?) a physical brain.

Of course, you can't fault earlier ways of thinking about the brain: it is
only natural (and beneficial) to think of the mind in terms of the most
complex man-made (or man-understood) technology currently available, without
necessarily holding fast to it dogmatically once a better model is found.

Think of it as necessary stepping stones to the destination.

If anything, it is naive to think that the brain/mind works exactly like our
current computing architecture. It is also naive and premature to completely
banish the idea that the brain is some sort of computing mechanism.

~~~
mannykannot
Whenever this topic comes up on HN, a sometimes quite large fraction of the
comments make, or are predicated on, what seem to me to be unjustifiably
specific analogies to current software technology (I guess that's no
surprise).

~~~
lgas
I imagine it's because they're implemented in PHP.

~~~
xg15
That's totally flawed thinking.

The brain is of course written in JavaScript.

------
shujito
I think this should have a [1992] tag on the title

~~~
Animats
Good point. It sounds very similar to the mind/brain stuff in Snow Crash.

------
pmontra
I doubt that somebody else's mind (the program) would run well on my brain
(the hardware), and vice versa. Mind/brain are probably more like the first
computers, programmable by manually reconfiguring their hardware: more like
FPGAs than general-purpose CPUs. Interestingly, we're moving in the direction
of blurring the difference between hw and sw. For example, all the work that
Apple is moving into custom-made processors on their phones that was once run
by the main CPU.

~~~
bognition
I'm curious as to why you think this

~~~
pmontra
Because even if our brains share a common general design, they grow and
become different. Synaptic connections are created or destroyed according to
our personal history, the experiences we've had, etc. To upload your mind
into my brain would mean rewiring my brain at least down to the synapses.
There could be some excess neurons left over, or not enough neurons for you
to be there. Not to mention possible pre-existing damage that would prevent
recreating the exact structures that let you be you. Plus, there are all the
connections with the rest of the body, and the influence that my other
systems (endocrine, hormonal, etc.) could have on the way your mind is
allowed to perform. In the best case we would have to find an alternate
configuration of my brain that would produce your mind, and that
configuration would be different from the configuration of your brain. I
concede that's not impossible, but it's not how we do it with computers, and
I wonder how we're going to test it. Make the two copies talk together for a
while until they agree they're the same person?

Compare that with computers, which are built from standard parts. Every
processor of the same model is the same down to the nanometer scale; every
RAM module, bus, HDD, or SSD is the same. Reboot with a different OS or image
and you get a totally different system. Boot two identical systems (because
they can be identical) with the same OS and you get two exact copies. After
all, we built general-purpose computers to be general purpose, didn't we?

