Thoughts on how we might be stuck with this mortal coil.
Let me start by defending my claim that I am at least somewhat qualified to talk about this subject (although who is, really?). I’ve worked for 25 years on ‘uploading’ people into virtual worlds as avatars, which is about as close as we’ve yet gotten to the actual idea. I have a physics background and lots of software experience simulating many different kinds of things. And, to give the other side its due, 20 years ago I would myself have argued that uploading was both possible and inevitable.
In a nutshell, ‘uploading’ is the proposal that we could make a copy of a brain’s state (the neurons and synapses and whatever else we need to copy) that could then be uploaded into a virtual environment where it could be connected to a virtual body and re-animated through a sufficiently detailed simulation that copies what the original neurons were doing. You - the adventurous uploadee - would awaken to find yourself in a new digital world, but with your sense of ‘you’ intact. How amazing! But could it really be done? And, perhaps more importantly, would we be happy with the outcome?
Although there is still some debate about it, most scientists believe that we will eventually be able to make simulated versions of brains that actually do think like ours. The recent progress in AI with large language models and image generators reinforces this thinking, since we are clearly getting closer and closer to computers that can generate text and images with a complexity and nuance rivalling our own thought processes. The faster our computers get, the more they validate the idea that we will be able to simulate brains the same way we can now simulate turbulent fluids flowing or robots doing backflips. It seems very likely that once we have computers fast enough to simulate as many neurons, with as many synaptic connections, as our brains have, they will be indistinguishable from us in terms of their capabilities. So it is feasible (at least to most scientists) to imagine AI ‘brains’ that are as powerful as ours. There are holdouts who think that some sort of as-yet-undiscovered quantum-mechanical magic will forever separate digital brains from human brains, but theirs are a shrinking minority of voices. But still… creating digital beings as capable as we are is quite different from uploading a person from the earthly realm into the digital one.
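To make ‘simulating neurons’ a little more concrete, here is a toy sketch in Python of a single leaky integrate-and-fire neuron - one of the simplest standard neuron models. Everything here is illustrative: the parameter values are generic defaults, not biologically calibrated, and a real brain simulation would need on the order of 86 billion such units, each with thousands of synapses, plus far richer dynamics.

```python
# Toy leaky integrate-and-fire neuron: a crude illustration of the kind of
# unit a whole-brain simulation would need billions of. Parameters are
# illustrative placeholders, not measured biological values.
def simulate_lif(input_current, dt=1.0, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0):
    """Return spike times (in ms) given a list of input currents per step."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks back toward rest while integrating input.
        dv = (-(v - v_rest) + i_in) / tau
        v += dv * dt
        if v >= v_thresh:          # threshold crossed: emit a spike
            spikes.append(step * dt)
            v = v_reset            # reset after spiking
    return spikes

spikes = simulate_lif([20.0] * 100)  # 100 ms of constant drive
print(len(spikes), "spikes, first at", spikes[0], "ms")
```

Even this crude model shows the basic behavior: under constant input the membrane potential climbs toward threshold, fires, resets, and repeats - and the whole point of the feasibility argument is that stacking up enough of these (suitably enriched) units is ‘just’ an engineering problem.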
To upload ‘you’, I must first define a boundary between ‘you’ and ‘everything else’. Is that boundary the brainstem (as those who have already chosen to cryogenically freeze their heads must surely hope), or is it closer to the edge of your skin? When we walk, for example, we use circuitry in the spinal column that dictates the details of our gait. People who have suffered certain types of spinal injuries have to re-learn how to walk. The details of which nerve cells in the motor cortex connect to which muscle fibers, and of how patches of skin map to mechanoreceptors, are different for each person. Inhabiting a different virtual body with entirely different nerves would almost certainly not feel like ‘you’, and might well be an experience closer to a continuous and painful white noise - with none of the outgoing and incoming signals matching your expectations. How long (if ever) might it take to re-map this experience to match the new digital body?
But even if we scanned a person out to the surface of their skin, we wouldn’t be done. What about your gut biome? We are donut-shaped creatures (think about it carefully), with half our ‘exterior’ being on the inside of our bodies. The brain actively communicates with the gut, and some of that communication is mediated by organisms in the stomach and intestines that don’t even share your DNA. Do we need to upload those little creatures as part of ‘you’ as well? It may well be (and scientific experiments increasingly suggest) that the behavior of the microbes in your gut can be thought of as part of your mental state. The ‘you’ that you contemplate with such seeming precision is really a big mess of stuff in close formation. There isn’t a hard boundary. Our imaginary uploading scanner will find no markers telling us which things belong to which person - and therefore no way to know when to stop scanning.
In fact, the perceived boundary between the self and the rest of the world seems to be more an artifact of perception than a physical fact. If two people are holding hands and we zoom in with a microscope on the place where their skin touches, until we can see the molecules moving around, we will find the laws of physics being obeyed across the boundary. There will be no evidence of one person ending and the other beginning. They will be as one, in every way we can detect. Yet both of them will still feel very distinct, to themselves. They will claim that they are different - able to move separately and make independent decisions - and yet no proof of that separateness will be visible in the microscope. And what about conjoined twins, who have two minds but one body - how would we upload them?
There is another related thought experiment that you can play out a few different ways. The basic idea is that you are sick and need a surgery that you might not survive, but before you go under it is possible to make a ‘backup copy’ of your brain. Now imagine a couple of ways that could go. The first is the case where you didn’t know you were backed up before going under. This is deeply unsatisfying, since you clearly experience dying - the fact that (unknown to you) a second copy of you was made is no comfort to the dead person on the table. A second option is to make the backup just before the surgery, with your knowledge - in which case the ‘you’ lying on the table should be OK with being killed, right? Again, this is unlikely to feel satisfying to you. Then consider the alternative where an identical backup version is standing right there, talking to you, trying to convince you it is OK to die. Even while talking to it, you would of course have an unshakeable feeling that ‘you’ had not transferred into the copy - that it was somehow an impostor that only ‘seemed’ to be exactly like you. No matter how you think it through, it feels like there is still some version of you that has to die. So what are we missing? Is there something, somewhere, that does not make the leap from old you to new you?
What if the idea of ‘you’ doesn’t have any meaning at all when it refers to less than everything? What if we aren’t able to upload ourselves, but instead discover that we cannot exist separately from everyone and everything else in the universe? And… what if part of what feels so strange about virtual reality is related to this same observation? Virtual worlds are populated with avatars controlled by people who are outside the simulation - puppeteering the avatars with strings leading back to the human world, back to our world. And yet we also want those virtual worlds to feel whole and consistent - with predictable and deterministic laws of physics. But at the boundary between the avatar and the world, those laws must be broken to give the human operator the experience of free will. Maybe the inconsistency at the boundary of the avatar is so deeply unsatisfying because it is impossible and inorganic. Maybe, by failing in our attempts to create these worlds, we will discover that we can never upload ourselves - because we do not exist.