Granted that consciousness is not fully understood, can we not agree that it is entirely manifest in the brain? I mean, you don't believe that human consciousness somehow survives the body after brain death, do you?
Even if the cloned memories could be streamed from your brain into a running simulation, allowing a memory of crossing the threshold from the physical world to the digital one to be created in the simulation, the 'patient' will not have this experience. The patient will still have to experience death.
Going further, let's imagine that we want it really, really badly. Let's destroy each neuron immediately after taking its state and pushing it into the stream, so that uploading and death are completely synchronized and nothing of the patient's consciousness remains afterward.
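Put in pseudo-procedural terms, what's being proposed is a "move" implemented as copy + delete. A toy Python sketch (every name here is hypothetical, and it assumes a neuron's relevant state can be captured as plain data):

    # Toy sketch of the destructive read-out described above. The point:
    # the procedure is a copy followed by a delete, and only data crosses
    # over; the running original is never "transferred", only destroyed.

    class Neuron:
        def __init__(self, state):
            self.state = state
            self.alive = True

        def read_state(self):
            return self.state

        def destroy(self):
            self.alive = False  # the original unit is gone after read-out

    def destructive_upload(brain, stream):
        """Stream each neuron's state out, destroying it in the same step,
        so that uploading and death stay perfectly synchronized."""
        for neuron in brain:
            stream.append(neuron.read_state())  # copy into the simulation
            neuron.destroy()                    # delete the original

    brain = [Neuron(s) for s in ("a", "b", "c")]
    stream = []
    destructive_upload(brain, stream)
    assert stream == ["a", "b", "c"]           # the copy has everything
    assert not any(n.alive for n in brain)     # the patient has nothing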
I'm sorry, but I still do not think that the patient will experience being uploaded into a computer. It will only experience having its brain fried until death. For the clone, the memory might be glorious, like some Hollywood special-effects sequence, but personally I find this wholly unsatisfying.
I'm not saying it shouldn't be done, either. I'm only saying that I suspect some people may have the wrong perspective about it. If you sign up for this procedure, you are not signing up for immortality, but merely to donate a copy of your memories to whoever is running the simulation.
You would be like an organ donor, but instead of saving a life, you'd be feeding your mind to the new Zombie Second Life.
It would not benefit you, but it might benefit someone else.
As for me, I'm a selfish, arrogant bastard with a desire for physical immortality and superhuman capabilities.
I'm not sure. Your body changes all the time. Your mind is always running on different hardware than it was just a few moments before. The changes are just small at each step.
>I'm sorry, but I still do not think that the patient will experience being uploaded into a computer.
If the copy/delete process was (perceptually, hence effectively) instantaneous, what is the difference?
Let's say your biological body was going to die in a day.
But you have the option of being perfectly copied into a new biological body now, with the current body being instantly destroyed. Would you take the option? I think I would.
What's the problem with that?
How is it any different than what happens to your body on a day-to-day basis, now, anyway?
I don't see the problem, when we put it in these terms.
The question goes to consciousness, and our special identification with our own perspective.
We can posit that our "consciousness", as we experience it, resides entirely in our physical bodies, and that exact duplication of some subset of those bodies can create a consciousness exactly the same as ours. So any observer outside our body will be indifferent to whether they deal with the original or the copy.
But unless there is some communication between the two consciousnesses, you will experience the copy as something outside yourself. You won't have its memories after the moment of duplication, and you won't experience continuation. If your body dies, your consciousness dies. So while the world can have endless copies of you, and perhaps each copy feels like you, you still experience mortality. Your consciousness ends with the particular physical construct sustaining it.
At least, in the posited model.
The gholas of Dune are aware of this, and don't imagine themselves immortal even though they know copies of themselves will be made after their departure.
It's the same problem as teleportation (Star Trek teleportation, specifically).
They scan you, break you down, then transmit and create a new you.
Now re-order this a little:
Scan, transmit, create, and then break down the original.
There are two of you, then the original is killed.
This shows that in the original process you are scanned and killed, and then a new person is created. This is exactly how mind uploading would work in terms of continuity for the person being uploaded. (It's also why I would never, ever use a teleporter.)
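To spell out why the re-ordering changes nothing essential, here's a toy Python sketch; all four operations are hypothetical stubs, and the point is only that both protocols consist of exactly the same steps:

    # The two teleporter protocols contain the same four operations;
    # only the ordering differs, which is why the original is destroyed
    # either way.

    def scan(person):        return dict(person)  # capture state as data
    def transmit(data):      return data          # send to the destination
    def create(data):        return dict(data)    # assemble a new person
    def break_down(person):  person.clear()       # destroy the original

    def star_trek_teleport(person):
        data = scan(person)
        break_down(person)             # original gone before the copy exists
        return create(transmit(data))

    def reordered_teleport(person):
        data = scan(person)
        copy = create(transmit(data))
        # at this point there are two of you...
        break_down(person)             # ...then the original is killed
        return copy

    kirk = {"memories": "five-year mission"}
    kirk_copy = reordered_teleport(kirk)
    assert kirk == {} and kirk_copy == {"memories": "five-year mission"}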
This is exactly what mind uploading is: they scan you and upload you, creating a new you. Then you die and the other you lives on.
This becomes very clear if you scan and activate the copy before the original person dies. Are you trying to imply that you would suddenly, as a single entity, experience two perspectives at once? That just wouldn't be possible.
It is technically immortality for your memories, but not for the specific physical instance of you.
Going to sleep does not cause the brain to actually power off. It's still running, your consciousness is just different. You still wake up from loud noises and light, and external stimuli can still influence your dreams.
>Going to sleep does not cause the brain to actually power off. It's still running, your consciousness is just different. You still wake up from loud noises and light, and external stimuli can still influence your dreams.
Anesthesia, then, or fainting. You cannot wake up from loud noises, and you have no consciousness.
One can also imagine a transfer process that works just like what naturally happens with regular body cells, which are replaced every seven years or so.
E.g., your brain is slowly, over a period of years, replaced cell by cell with new, compatible neurons. One by one, your old neurons are replaced with the new, designed ones.
These can read the state of the neuron they replace AND exchange signals with your remaining biological neurons, so before and after each replacement your brain is still functioning normally. There is no "power off" phase.
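Here's a minimal sketch of that rolling swap in Python, assuming (generously) that a synthetic neuron can read out and reproduce its predecessor's state in place; the network is never torn down or rebooted:

    # Rolling replacement, one cell at a time. Everything here is a
    # hypothetical stand-in; the structural point is that there is no
    # step where the whole network is offline.

    import random

    class BioNeuron:
        def __init__(self, state):
            self.state = state

    class SyntheticNeuron:
        def __init__(self, copied_state):
            self.state = copied_state  # takes over mid-function

    def gradual_replacement(network):
        """Swap one neuron at a time, in place, while the network runs."""
        order = list(range(len(network)))
        random.shuffle(order)
        for i in order:
            network[i] = SyntheticNeuron(network[i].state)
            # ... normal signalling continues between swaps ...

    network = [BioNeuron(s) for s in range(100)]
    gradual_replacement(network)
    assert all(isinstance(n, SyntheticNeuron) for n in network)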
That is what the person above describes when they say they want to be injected with nanobots to be converted. And that would, I believe, give a seamless transition with no break in the stream of consciousness.
However, this is completely different from the concept of mind uploading, where your brain is scanned and simulated separately.
> can we not agree that it is entirely manifest in the brain?
This is the old materialist vs. idealist, monist, dualist, etc. debate in the philosophy of mind. It isn't as much of a slam dunk as you may think, and the gulf in opinion is broad and branches out into many tributaries.
Materialism is the belief that everything in the world, including mind, is material; it has since evolved into physicalism, after late-nineteenth-century physics showed that not every force is made up of matter.
I used to be very interested in this topic and leaned towards the idealist argument that mind is made of more than matter, but I lost interest as the debate on both sides tends to spiral into religion and spirituality. I still tend to believe that if we completely cloned a brain materially, we would still not have a mind; it would be missing something.
My own conclusion is that we simply do not know, and to me that is more interesting than knowing, as the pursuit is more rewarding than the end goal. There will always be open and unresolved questions in science and philosophy. We have yet to explain seemingly simpler phenomena such as gravity, so explaining the mind (let alone altering or cloning it) seems far out of reach.
You express views and attitudes very similar to mine.
For people who find these things interesting, I recommend checking out the so-called hard problem of consciousness.
James Trefil, physicist - so they aren't all dopey philosophers ;) - notes that "it is the only major question in the sciences that we don't even know how to ask."
> I still do not think that the patient will experience being uploaded into a computer
Creating a copy is like having a Siamese twin you didn't know about, because until now it shared 100% of your body and mind. With the copy, you are split apart for the first time.
If you are being uploaded at the time of the split, then one of you experiences the seamless first-person transformation into digital form. The other you stays biologic and dies, immediately or eventually depending on the procedure.
The irresistible urge is to mope that "you" are the biologic one, the one that dies. But this isn't true; you really are both. The one that became digital is you in every sense: it woke up in your shoes, it took your date to the prom, it has your personality, and it will make the decisions you would make going forward.
If you think the copy is inferior or inconsequential by virtue of not being the original, then consider: what if we, right now, are all copies? Would our lives and first-person experiences be less genuine and less valuable if it were revealed that the true original versions of us existed elsewhere? In an upload scenario, the copy is you.
But why bother? Unless you really think the world is worse off without you and your contributions, why would you participate in a service like this? It's not going to do you any good or let you live a day longer. Your copy will go on having a perfectly happy life being the new you, but why should any of us be excited about that?
You're discounting the experience of the copy and identifying solely with the original biological human. Instead you have to really internalize that both experiences are equally yours.
That said, I wouldn't get excited about it. Uploading is a bald afterlife myth with the same capacity to beguile as the religious versions. Better to focus on the here and now.
I may as well internalize that Warren Buffett's experiences are equally mine, or a strain of bacteria. If we're not defining "self" through any real continuity and instead just making it an arbitrary label, there's no reason everybody can't be me. It becomes a bit meaningless.
Sleep and anesthesia break continuity, as has been reported elsewhere in this thread, so self is not about moment-to-moment continuity.
The reason your copy is equally you, versus Warren Buffett, is that your copy shared every molecule in your body and every thought in your head for your entire life, up until the moment of splitting off.
Let's imagine you run the simulation while you are still alive. Are both copies you? Wouldn't each copy believe it's a full consciousness and not half a consciousness?
Assume you do a regular (or even automatic and continuous) sync between the copies. Which copy would you prefer survived, if one of them were to die?
Both copies would claim to be you, and would have equal right to that claim. Yet each copy would be a fully conscious person and would immediately diverge into its own individual from the shared point on.
This of course sounds like a contradiction: both are you, and both are individuals? That's because our language and concepts just don't have the muscle for this situation. "Both are you" is shorthand, since the word "you" becomes ill-defined, or at least radically transformed. Consider Hofstadter's "twin world" concept from his "I Am a Strange Loop" for how a single "individual" could really be made up of multiple individuals.
From an objective point of view it's much better if the digital version survives, because we're assuming the biological original has a shorter life span.
You avoided the question: what would you prefer? There would exist two (or n) instances of you, with a shared history but no shared present. Maybe you posit the question is moot because the concept of you dilutes at the point where a copy is made? I can't imagine how it would dilute enough for the physical you to change its self-preservation instinct, though.
On the other hand, I'm not so sure the digital version is preferable; there are lots of harms that would be trivial to inflict on a digital version, for example destroying the being, controlling it, or altering it in any way. All of that is (so far) harder in the biological world.
(PS: Thanks for the Hofstadter pointer, I stopped following him at The Mind's I)
I don't like "would you kill X or Y" questions. For one, details matter, and we have no details. For two, it's just impossible to say, sitting here in my comfy chair, how I would react in some dire life-or-death situation.
Let's go over this carefully, as the concepts are slippery.
Lack of continuous experience of an event does not mean you died, it is merely memory loss.
Assuming the very reasonable theory that consciousness is classical and emerges from the network structure and interactions of neurons and glia, it should be possible to encode this consciousness on a Turing machine. If we go a step further and replace certain collections of simulated cells with black boxes that behave the same way given an isomorphic set of inputs, then we can have consciousness even more cheaply.
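As a toy illustration of that black-box substitution step (the "detailed subcircuit" below is a hypothetical stand-in for a costly cell-by-cell model), note that if only the input/output mapping matters, the rest of the simulation cannot tell the two apart:

    # Replace an expensive simulated subcircuit with a lookup table that
    # reproduces the same input/output behavior at far lower cost.

    def detailed_subcircuit(inputs):
        """Pretend this is an expensive biophysical model of a cluster."""
        a, b = inputs
        return (3 * a + b) % 7

    # Tabulate the mapping over the (tiny, finite) input space once.
    input_space = [(a, b) for a in range(7) for b in range(7)]
    lookup = {x: detailed_subcircuit(x) for x in input_space}

    def black_box(inputs):
        """Isomorphic inputs in, identical outputs out, much more cheaply."""
        return lookup[inputs]

    # The rest of the simulation cannot distinguish the two:
    assert all(black_box(x) == detailed_subcircuit(x) for x in input_space)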
Your idea is not necessarily more grounded. The assumptions you make are: that the substrate does not matter as long as it is not digital, and that the replacement material will not itself have effects that result in vastly divergent actions in the long run. Yet the perturbations a neuron suffers will differ from those of nanodiamonds; will this be significant? You also take for granted that the new replacement brain will not allow a very large viewpoint shift, due to the new complexities and speeds of thought possible; that some invariance of self remains from morphism to morphism; and that there is more resemblance between you and the final being than between you and a lemur-like ancestor.
It is likely that the beings of the future will have far more sophisticated definitions of self and identity.
I personally think that physical bodies will give way to digital minds; it's only a matter of time. Whether linear time or log time, I can't say. But physical bodies are resource hogs. Progress is energy intensive, and so is ever more complex thinking. Imagine a being whose memory is so dense that its thoughts have a gravitational pull of their own and its mind risks collapsing into a black hole... Eventually there's going to be a lot of pressure to compress thinking beings and squeeze as much thinking capacity from matter as efficiently as possible.
You not only misstate my assumptions, but seem to miss my point entirely.
Here is my point: the original human will not experience continuity, therefore its instinct for self-preservation will not be satisfied. The best it can hope for is to take comfort in knowing that a copy will survive. Personally, this does not comfort me.
I don't doubt the possibility of a very high fidelity copy and supporting simulation, for all intents and purposes. I also think there may be sound reasons to pursue it. But I don't think it will benefit those who are copied, beyond any positive thoughts and feelings it may give them before they die.
Being selfish and programmed for self-preservation, I desire physical immortality instead. I have no interest in donating a copy of my memories to a simulation project.
I hope this clears it up; my interest is quickly waning, and I have work to do.
The claim to the identity is not the issue, the issue is that the person who got copied will die and experience that death.
The existence of a copy does not resuscitate the original person or otherwise keep them perceiving and thinking.
To express it in a bad analogy: you can have a bit-by-bit backup copy of a hard drive, but when a power surge burns out the CPU and the disk, you have to throw both away. You can buy a new CPU and restore the backup, but the hardware is different, there is a shutdown moment, and when you power back up, the continuity is lost; it's a different entity that gets booted up.
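The distinction the analogy rests on is the difference between equality of content and identity of instance, which Python makes literal:

    # A restored backup can be bit-for-bit equal in content to the
    # original while still being a different instance.

    original = bytearray(b"memories, personality, state")
    backup = bytes(original)         # bit-by-bit copy

    # ... power surge; the original hardware is destroyed ...
    restored = bytearray(backup)     # "booted up" on new hardware

    assert restored == original      # identical content...
    assert restored is not original  # ...but a distinct entity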
To preserve the consciousness of a person across the "hardware change", I see no option other than the existence of something like a central repository of consciousness, outside both the body and the computer hosting the simulation, that gets automatically attached to a particular set of memories/perceptions/experiences (and whatever else defines a consciousness): when you die, it stores your consciousness, and when the simulation is booted up, the continuity is triggered. I find that far-fetched.
In abstract, philosophical terms, it might not be. Even in policy terms it's probably not (other than that it might be cheaper to store people in SANs). In practical, day-to-day terms, of course it is! My own instance of self doesn't want to cease to exist.
> Granted that consciousness is not fully understood, can we not agree that it is entirely manifest in the brain?
I don't think we can agree on that. There certainly are many claims of consciousness existing beyond the human body. For example, see a "Unified Field" of consciousness.
> I mean, you don't believe that human consciousness somehow survives the body after brain death, do you?
Literally billions of people believe it does (see: pretty much all major religions). Maybe human consciousness doesn't survive brain death, but some other form of consciousness does?
I think it's possible to contend that our physical body is sufficient to provide a framework for our human consciousness to emerge, but may not be necessary to maintain it.
Think of a stroke patient. Often stroke patients lose their memories/personality temporarily, but then regain them some time later. The injured portion of the brain heals in some respect, but the memories/neural functions of the injured area often move/reorganize to a different location in the brain (which is why a stroke patient can sometimes feel sensation in, say, their hand when someone touches their face).
I think this goes to show that our consciousness/memories/etc. are not necessarily tied to a specific piece of physical matter, as even the brain can reorganize the "coded" functions to different areas inside it.
I believe that human consciousness is entirely manifest in the brain (and, of course, it's affected by other nerves elsewhere in the body). But moreover, I believe the brain is performing "normal computation," by which I mean Turing-complete computation. Because of that, there is no real reason why the same computation responsible for human consciousness couldn't run equivalently on other Turing-complete hardware.
I could imagine how a sufficiently advanced machine could simulate one or more consciousnesses.
What I can't imagine is how you, the person sitting in front of the computer, will somehow wake up in the computer. I can only see copies waking up, even if you are dead. For the people who love you, having a copy would be great, but you will still be dead.
I can't believe I'm bringing this up, but the Arnold Schwarzenegger movie "The 6th Day" really matches my views on the subject, and also matches your views.
(Obvious spoilers for the movie)
In the movie, Arnold is cloned, wakes up later, and goes about finding out what's going on. At one point he discovers that HE is the clone! Back to our discussion: from the point of view of the copy, the procedure was successful; the consciousness was transferred to the machine and the copy continues its existence. From his POV, he does wake up in the computer after the procedure. For you, though, it's a copy that wakes up. Perhaps your brain is destroyed during the upload process and you cease to exist. For the copy, you (that is, he) continue to exist and avoid brain death.
If a scientist asks you, "Was the upload successful?", you (the original) will answer "No". The copy of you will answer "Yes".
For the digital copy to not notice the transition, there would probably have to be more going on—like a Matrix-esque simulation of the physical world or a realistic android to house the copy.