The I that is me doesn't survive a night's sleep. Every night my brain does a re-shuffle of my personality and memories, creating a new incremental version. Same goes for head injuries and reading a book. I'm still the me from before, just more so, and not. I have a vague feeling of continuity, but it's almost non-existent. I am so different from myself ten years ago as to be a distinct person.
I have an intuition that any direct brain interface with an outside neural network would require small incremental changes, similar to what we experience when we sleep. Then again, I also have the feeling that early adopters of this tech will be lucky to get away with their minds intact.
Maybe in later versions it will be like flipping a switch and it's just there, like opening your eyes or waking up from a dream. But I doubt that will happen in the next century.
This sort of meditation fills me with existential dread, namely the idea that every moment you are annihilated and replaced by a new person with most of your old memories, and nobody even notices.
And when "I" wake up I'm like, wait why do I have all of these bills to pay, and why is everything in such a mess? Clearly nobody asked me about any of that.
Sorry, future "me" - if you had kept me around and not insisted on replacing me wholesale, I might have been able to help you out, but given that you're getting rid of me and taking all of my stuff I think I have no reason to help you in any way.
I come to terms with this by realizing that the alternative is true horror. Can you imagine eternal stagnancy as a disembodied human soul? Never experiencing anything new, endless bliss or pain would become indistinguishable, or lead to withdrawal and self-annihilation. At least if my body dies, it will still change over time and become something else.
I think of a soul as the wave of my life passing through time. It will continue after I die; the ideas and things I create will propagate through time like the wind from a butterfly's wings, touching the souls and lives of those around me. It could create a hurricane, or dissipate into meaningless tiny changes. Eventually, though, chaos mixes everyone's souls into entropy.
I think this will happen within the next 10 years even without a brain interface. VR is already quite immersive even with low resolution and only vision + sound.
I firmly believe that the brain is a form of quantum computer. I also believe that moving the consciousness of a 90-year-old close to death into a cloned body containing a newly grown brain, or later a silicon-based one, will be possible in the future.
The best TV show that encapsulated this was Travellers[0]. A damn good show which was cancelled just when things were getting interesting.
I would say it's possible by entangling the particles of the source brain with those of the recipient one. Once completed, the original source is destroyed, so it's effectively a move and not a copy.
There's also the movie The 6th Day [1]. Imagine being able to spin up multiple copies of yourself and 100x your own output. I believe that will also be possible in the future, though probably outlawed for lots of ethical reasons. But... I'd like to think, why not? Me[1] lives on Earth. Me[2] lives on Mars. Me[3] lives on a moon in the asteroid belt. Me[4] is currently navigating out of the solar system. Since all the brains are entangled, I'd know everything in an instant.
How to deal with that is another story. Unless I'm an AI at that point, which means the collective are all just a bunch of VMs anyway :P
Can you expand on your entanglement idea? I'm not sure how it would work. You could entangle particles without connecting neural systems functionally (although I'm not sure we can show where qualia/consciousness is actually located).
The issue, of course, is that due to the nature of subjective experience, it cannot be empirically tested.
I found the article's view of the self as a 'worm' through time and space, one supposedly not preserved by recreation, irritating and naive. Go to sleep on a plane and you are effectively transported to a new place 'disconnected' from where you were. That doesn't cause confusion, nor does it bring entirely new beings into existence that would be confusing to the "old you".
Whether people like this pretty distinguished author like it or not, there will be scanning, uploads, and eventually entire worlds where people live.
The worlds of Reamde and "Fall, or Dodge in Hell" look like the extremely obvious and likely future. These heavy philosophical pieces are also extremely unpersuasive; they come across like something from a bizarre academic world disconnected from what is likely in reality. And I'm an academic (but in CS, not this stuff).
Identifying with the self, attachment to the illusion of being a person, is the source of all suffering. So wouldn't this be a technological nirvana?
Edit: Especially given that selfishness is the root of all evil.
Fortunately for justice, the guilty poster won't survive contact with my complaint: they'll be perpetually changed by it into a new and different person.
The ideas in that article suffer from the same lack of creativity that a lot of these philosopher-written pieces suffer from (including the main article here). Someone will program the AI, the robot, to want to survive, to want to destroy competitors. It's frankly incredibly irritating to read things like that Sci Am article you cite, because there's some bizarre belief that only how "things evolve" will govern them. Well, I've got news for everyone who supports this naive view: people will program, adapt, and alter the AIs to act as they desire, at least as a starting point. Whether it's a drone that kills people autonomously or a future potential AI, someone will have influence over its beliefs and point it in a direction they desire.
That's the entire point of the article. The AI itself will not develop a need to harm; users/creators of the AI will program that into it. But AI by itself, even with creators of the best intentions, has real problems that affect society today. And talking about a hypothetical future has distracted the public narrative from the real problems with AI that we need to solve now.
They even address what you are saying in the article: they mention the looming danger of weaponized AI, which is exactly the case where someone has altered the program to do harm.