
This is similar to the Moravec Transfer [1], a proposed method of (slowly) transitioning to a non-biological brain. Indeed the whole concept of "the self" is important to maintain, even if it is to some degree an illusion.

[1] http://everything2.com/title/Moravec+Transfer




There is no guarantee that the subject will not lose a bit of consciousness with every neuron that is replaced (even if, to outsiders, it appears not to be the case).


I would expect a hybrid meat-computer personality to notice that at some point in the transition. If not -- if consciousness is so profoundly ethereal that a person's own consciousness can be gradually replaced with a simulated one without the person themselves even noticing -- then what's the difference between the "real" consciousness and the simulation?


>then what's the difference between the "real" consciousness and the simulation?

Well, isn't that the whole idea? Objectively, there is no difference. If there was, uploading wouldn't ever be possible.

The only potential difference is the subjective impression of "this consciousness that I am now is the same consciousness that I was yesterday" versus "this new consciousness is an exact clone of mine, but is not really 'me'".

This incremental upload technique simply tricks your subjective self into thinking there is absolutely no change in continuity.


> This incremental upload technique simply tricks your subjective self into thinking there is absolutely no change in continuity.

In other words, your subjective consciousness is somehow changed but you can't subjectively tell? What does that even mean?


No - rather there is no change in consciousness but if you do the upload in one jump, your subjective self sees the discontinuity and mistakenly thinks there was a change in consciousness. You have to do the process gradually to trick it out of its error.


These are tiny, atomic, piecemeal changes that happen slowly over time.

If one of your neurons randomly died right this instant, you would probably not subjectively notice, even though that neuron is part of what composes your subjective consciousness. Your neural network is highly redundant and fault-tolerant.
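For intuition, a toy sketch (assumptions: an artificial feed-forward net with made-up sizes and random weights, nothing biological): zeroing one random hidden unit out of thousands barely perturbs the output, which is the flavor of redundancy being described.

    import numpy as np

    rng = np.random.default_rng(0)
    n_hidden = 10_000

    # Random weights stand in for a wide "trained" network.
    W1 = rng.normal(0, 1 / np.sqrt(50), size=(n_hidden, 50))
    W2 = rng.normal(0, 1 / np.sqrt(n_hidden), size=(1, n_hidden))
    x = rng.normal(size=50)

    h = np.tanh(W1 @ x)            # hidden activity
    baseline = (W2 @ h).item()

    h[rng.integers(n_hidden)] = 0  # "kill" one random hidden neuron
    damaged = (W2 @ h).item()

    # Typically a ~1% relative change in the output for 10,000 units.
    print(abs(baseline - damaged) / abs(baseline))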


Yes, neurons in our brains are dying every day, but it doesn't affect our subjective consciousness. Are you simply saying that uploading, if it were done carefully, would not make any more difference to our subjective consciousness than neurons dying every day?


Well, suppose the entire self-awareness part of your brain simultaneously dies. You wouldn't notice.


You wouldn't notice because "you" would no longer exist, by hypothesis. But I would expect that there would also be a huge objective difference in your behavior, easily noticeable by others. The "self-awareness part of your brain" is not disconnected from the rest of your brain and body; if it dies, the rest of your brain and body is going to be drastically affected.


What if you kept all the original neurons and re-assembled them into your original brain? Who would be the real "you"?


Well, you'd have 2 mental clones of yourself, essentially.

It depends where those neurons were sent off to in the meantime and what they were doing, but with such a big jolt, it's quite possible they would perceive themselves as the "clone" (after understanding the situation at large) while you, in your new machine self, would be the "original you".


Why would it be a clone, and why would it perceive itself as a clone, considering that it is made up of the original matter in the original configuration? I would assume that a technology powerful enough to re-assemble disintegrated matter into something as complex as a brain would also restore the state to its original form. Assuming that thoughts and perceptions are completely physical, and the brain is in the same state as before, there is no reason for it to experience a jolt or anything else that indicates something special occurred.


The difference would be that, in the case of a simulated consciousness, I would be dead and wouldn't get to experience any of the pleasure that fake-me experiences, for making a simulated consciousness is the same as making a philosophical zombie.


A philosophical zombie is an incoherent concept. The idea that your consciousness could disappear while leaving all of your external behaviors unchanged makes no sense. Your consciousness affects your external behaviors; if you were not conscious, your external behaviors would be different.


There is no reason to think that it would, either. If you did a Moravec transfer with real neurons that were grown from your own stem cells, such that the resulting brain is more-or-less physically identical to the original brain, do you still think the subject would "lose" consciousness?


I rather think the appeal of a Moravec transfer is that a single result-mind with an upgraded substrate is preferable to two result-minds, one on the superior substrate and one still on the original substrate. Doing it that way is cruel since the original will still decay and eventually die.

(Thanks for this by the way, I've long thought about this idea but didn't know it had a name.)


Wow! And here I thought Greg Egan invented this [1]. Thank you so much for the reference.

[1] Greg Egan - The Jewel


Nitpick: Egan's design is different in principle; it's a self-contained device running a bog-standard neural network that, over a number of years, trains itself on the brain's inputs and outputs. So the actual mechanics of how it operates don't necessarily reflect the structure of the biological brain (roughly sketched below).

I found that really fascinating as an idea. Being identical is one thing, but does being arbitrarily close to your behaviour make it you?
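For the curious, a loose sketch of that idea (everything here is invented for illustration: the stand-in brain() function, network sizes, and training loop are not from the story): a generic network fitted purely to observed input/output pairs, whose internals owe nothing to the system it mimics.

    import numpy as np

    rng = np.random.default_rng(1)

    def brain(x):
        # Stand-in "teacher": some opaque input -> output mapping.
        return np.sin(3 * x) + 0.5 * x

    # "Years of observation": recorded inputs and the brain's responses.
    X = rng.uniform(-2, 2, size=(2000, 1))
    Y = brain(X)

    # A bog-standard one-hidden-layer network, trained by gradient descent.
    W1 = rng.normal(0, 1.0, (1, 64)); b1 = np.zeros(64)
    W2 = rng.normal(0, 0.1, (64, 1)); b2 = np.zeros(1)
    lr = 0.05
    for _ in range(5000):
        H = np.tanh(X @ W1 + b1)
        pred = H @ W2 + b2
        err = pred - Y                    # gradient of squared error
        dH = (err @ W2.T) * (1 - H ** 2)  # backprop through tanh
        W2 -= lr * (H.T @ err) / len(X); b2 -= lr * err.mean(0)
        W1 -= lr * (X.T @ dH) / len(X); b1 -= lr * dH.mean(0)

    x_test = np.array([[0.7]])
    mimic = np.tanh(x_test @ W1 + b1) @ W2 + b2
    print(brain(x_test), mimic)  # the two outputs should roughly agree

The mimic's weights are just curve-fitting artifacts; nothing in them need map onto the structure of the thing being imitated, which is what makes the "is it you?" question interesting.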


That's all just boiling frogs. Why should we consider a structure that subsumes the space of an organic human brain to be a living continuation of the original brain?

The deconstructionist concept that our constituent atoms cycle through a complete change-over in less than a decade does not make that equivalent to total organ replacement with a non-biological surrogate. Those sorts of ideas sweep a grand swath of what-it-is-to-be-alive under the carpet with a sentence-long sound bite.

Replace a knee or a hip with a comparable structure of equivalent practical necessity, and that person will still be missing those body parts.

Claims that, simply because the replacement is complex down to the infinitesimal scale, it sufficiently satisfies the requirements of "being a living human" will still be wrong.

You might enjoy a puppet that acts just like your dog, or your grandma, but it won't bring them back, or keep them alive beyond their expiration date. Put your grandma's Moravec replacement brain inside your dog's body, and tell me she's still alive.

A human without a human brain is not a human, but instead a soothing, reassuring puppet, perhaps hoisted aloft by hypothetical autonomous, and nigh-imperceptible nano-cyber-strings.

But still a lifeless corpse all the same.


>Put your grandma's Moravec replacement brain inside your dog's body, and tell me she's still alive.

That's an ontological conundrum that's far from settled. I'd argue the information we have points to our biological intelligence not being particularly special or more "real" than other forms.

What I can tell you is that your hypothetical monstrosity there would probably play fetch and bake some really good cookies. Heck, I'd probably name it Dogma.


> But still a lifeless corpse all the same.

How can you possibly know this? And I'll point out that no one but you has posited that the result would still be a human brain.

What do you think happens if you freeze a brain in liquid nitrogen, then safely thaw it out and restart it?



