

The Philosophy of ‘Her’ - wallflower
http://opinionator.blogs.nytimes.com/2014/03/02/the-philosophy-of-her/?hp

======
dwd
I'm disappointed there has never been a serious discussion of the
philosophical aspects of 'Red Dwarf', which explores the ideas of a digital
afterlife and human-like AI.

------
thenerdfiles

        There is nothing like “my” mind or “your” mind,
        each one having its own personal history that could,
        in principle, be tracked back from birth to death.
        Pretty much like single particles don’t have any 
        individual trajectory in Quantum Physics, single 
        subjects don’t have any individual history either. 
        — Quantum Ethics[0], Sébastien Fauvel
    

So, in a word, yes, "you" can be uploaded to a digital world. Perhaps many
different digital worlds. I mean, let's turn the question into one that is
ontologically similar: What about multiverses? Are _you_ your double at some
nearby possible world? Admitting that you are, does that "destroy" you at this
world? Can a _decision_ do this? Can a decision do the opposite, that is,
preserve your "you"-ness, or let's call it _existential integrity_?

I think the true philosophy of "Her" is Henry Miller's assertion that

        Words are loneliness.

But there isn't any metaphysic here that reinforces the divide between mind
and body — it is a philosophy of _how we feel_ about the _supposed_
bifurcation of reality (into easily digestible categories like "the physical"
and "the mental"). How we _feel_ about Mind-Body dualism's _possible truth_ is
not a justification for its truth. That's pragmatism. Who cares about boring
old pragmatism? Or at least let us admit that this is only _one_ viewpoint on
the whole debate, and it isn't a very compelling one.

I absolutely do believe that "uploading one's mind" is possible — but if we
treat "copy" just as we do today, we need to be thoroughgoing. When I copy a
file to a new OS, that new OS either understands it or it does not, runs a
program by default or it does not. But now wait — we're talking about a file
here — a file. All of the existential questions we're asking, we're basing on
the notion of "copy" as it relates to files. Why should we suppose that we
even know what we're talking about, using this clearly antiquated grammar? I'm
not a file! (Put another way: Are files ever really "destroyed"? — No! Their
"space" is made "available" on the disk. So following your metaphor, if I'm
"destroyed" it's merely that the space I take up is made available to
something else — which presupposes that everything once maintaining that space
is accidental, that it's all accidental properties of the World). So I'll say here:
forget that "categorical divide" which is just another fashion of dualism —
humans _can_ extend their consciousness to digital worlds.
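The file analogy above can be made concrete. Here is a minimal sketch (Python; the function name is just illustrative) of what "copy" and "destroy" actually mean for files: a copy is byte-identical yet a distinct object on disk, and "destroying" the original merely unlinks its name, marking the space available for reuse.

```python
import os
import shutil
import tempfile

def copy_and_unlink_demo():
    """Illustrate the file-grammar of 'copy' and 'destroy'."""
    tmpdir = tempfile.mkdtemp()
    original = os.path.join(tmpdir, "original.txt")
    with open(original, "w") as f:
        f.write("the same bytes")

    # "Copy" the file: the duplicate has identical contents...
    duplicate = os.path.join(tmpdir, "duplicate.txt")
    shutil.copy(original, duplicate)
    with open(original) as a, open(duplicate) as b:
        identical = a.read() == b.read()

    # ...yet it is a distinct object on disk (a different inode).
    distinct = os.stat(original).st_ino != os.stat(duplicate).st_ino

    # "Destroy" the original: os.remove only unlinks the name; the
    # underlying blocks are simply marked available for reuse.
    os.remove(original)
    gone = not os.path.exists(original)

    shutil.rmtree(tmpdir)
    return identical, distinct, gone
```

On most POSIX filesystems the copy gets a fresh inode, which is the sense in which two perfect copies are still two things, and deletion never "annihilates" anything, only frees its space.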

Chalmers et al. are talking about the extended mind — we're all talking about
extending our consciousness into the world through augmented reality. This
talk of "copy-destroy" isn't using the same grammar as philosophical research.
We're talking about "me" looking at "me" sans mirrors. Could I have 7
eyeballs? Possibly! Is my neural system plastic enough to manage multiple
memories and personalities? That has already been demonstrated in cases of
mental illness!

Can _I_ be _extended_? _That_ is the relevant question — not "can I be
copied?" "Copy" is such a barbaric nonterm here.

There shouldn't _be_ such a thing as _copying one's mind_ — in order to
upload a mind, the project is far more complex than talk about the neural
substrate of one person — we need to be talking about Operating Universes. I'm
afraid this discussion, as it is currently constructed, is a false start, a
non-starter — a category mistake.

[0]: http://quantum-ethics.org/

