“Take very high resolution scans of brain structure to characterise how someone’s neurons work, recreate it in a digital format then put it in another robotic, virtual or biological body.”
This isn't preserving, it's copying. A copy of your brain/personality/you... isn't you... it's a copy. It's a cassette tape playing back the radio you recorded onto it.
It's really kind of odd that people talk about brain transplants, or living forever, and then talk about uploading your mind to the cloud or swapping someone else's healthy brain into your healthy body, or scanning your brain and recreating it, and make it sound like it is going to help you to live forever. If my brain is scanned and uploaded to the cloud, a copy of me would live on forever, but me, literally, me, will still deteriorate and die, then rot away into nothingness. My mind, my personality, my essence, me, will not live forever under any of these solutions. I'll die. A copy of me, a point in time snapshot of who I was when it was taken, will continue on, with my memories, mimicking who I was. That gives me no comfort at all. I don't care if a copy of me lives on forever if I still die. That's not immortality, not for me personally, not if I die.
Every time you go to sleep your consciousness shuts down. What wakes up is something slightly different due to biological processes. You are not tired, your mood is different, some of the cells in your brain have been replaced, memories have become more stable and perhaps linked to other similar memories, etc. This is far more pronounced for people waking up from a coma. Philosophers have thought about this for a while; see https://philosophy.stackexchange.com/questions/66018/the-bra...
So, each morning a copy of you wakes up? Or do you draw a line in the sand and say that's different?
I raised a similar objection to the GP comment many years ago, and someone made the same response to me. I wonder if I'm alone in that instead of making me feel more optimistic about mind transfer, it just made me slightly afraid to go to sleep.
Consciousness is not a single isolated process, and describing a lull in activity as "shutting down" is incorrect.
> So, each morning a copy of you wakes up?
Obviously not. Many of these processes also happen in reverse when you are awake. So when I go to sleep I'm tired and in a different mood. When did that "copy" get made exactly?
> Or do you draw a line in the sand and say that's different?
You can easily ask "how much of my conscious state is controlled by my body's chemical state?" If that's a non-trivial amount, then simply copying and hoisting my memories into a different body is clearly going to have a non-trivial impact. To the extent that it's easy to say "that would not be /me/."
> Consciousness is not a single isolated process, and describing a lull in activity as "shutting down" is incorrect.
But there is indeed a discontinuity in your consciousness when you sleep. This discontinuity cannot be distinguished from the discontinuity a copy of your brain would experience on first "restart".
> Every time you go to sleep your consciousness shuts down. So, each morning a copy of you wakes up?
I think it doesn't "shut down"; maybe it fades into a different, low-power mode. When you go to sleep and then wake up, there is still a continuity, because of the same underlying structures and processes from which the consciousness emerges. So it is like a 4D thing.
That continuity never really breaks (until death? which is like a singularity) and I think this is what makes you "you". You can't copy/teleport it (kinda by definition), but you can extend/transform it.
Perhaps a "Ship of Theseus"-style approach would work — gradually replacing neurons one by one, or brain sections piece by piece. Alternatively, the brain could be extended with peripheral "devices" connected to it with bandwidth comparable to the interconnections within the brain itself, up to the point until the biological part becomes irrelevant. This is similar to how a portion of the living brain can be removed while still preserving consciousness — as neurosurgeons often demonstrate.
There’s a difference when there could conceivably be two of you.
There will never be two of me in my bed when I wake up, so I’m the same.
If my mind could be copied and embedded in a robot or whatever, then there are two of me, each now diverging. That other one is not me, as far as I, the other, am concerned.
You'll need to be more clear in your last sentence.
If you are cloned while you are sleeping, and both the original and the copy are moved into different rooms before you wake up, each of you thinks that he is you, and each thinks the other is not. So, who is correct?
If you tell me this will happen tonight, I would right now call both "me". Both of them after the event would agree that I am them. Both of them would agree they are different people to each other, even if they say so in perfect stereo unison.
From my own perspective it would be impossible to tell, given the parameters of this thought experiment, but any outside observer who witnessed the cloning process and the staging of the second bedroom would know.
Moreover, if randomly one of the two dies and the other survives then either the original or the clone experiences death. So from the point of view of the original before the experiment, he has a 50% chance of dying. If we change the experiment to say that he has a 100% chance of dying but the clone has a 100% chance of surviving, the original would not agree to the experiment unless he was suicidal; obviously he has no reason to care that a different person will live on assuming his identity after he dies.
You're assuming that there exists such a cloning process, and in doing so, silently ascribing properties to that process that completely change the outcome.
You're in effect saying "Assume there is a cloning process that is instant and creates a complete copy of you, both mentally and physically," and sure, in that theoretical world, you can talk about there being no way to know which was you, the clone or the original.
But IRL, we have no way of knowing what the properties of such a cloning process would be - maybe it's impossible to create a cloning process such that mentally, it is a complete copy. Maybe you can always tell whether you're the original or the clone. Similarly, maybe we simply cannot create a digital copy of a mind that cannot immediately tell it is the copy. We have literally no way of knowing.
So instead of living in the philosophical realm, we can have a lot more productive conversation in the practical one. Practically, it strongly appears to me that I have continuity every time I wake up, and practically, if you copied my mind and ran it on a computer, that mind would not be the "me" that appears to experience continuity. Practically, it would bring me no comfort to know that this other mind that used to be a copy of my own continues onwards while I die, no more so than it would bring me comfort to know a twin of mine continues onwards.
It makes no difference. If it’s possible for the two versions to meet (they exist simultaneously) then what you’re describing is simple deceit.
A world where two of me exist is fundamentally and irreconcilably different than a world where I simply go to bed and wake up each day. It makes no difference if I personally am aware.
it's almost as if your thought patterns and language are a product of your experience of being the only copy of you...
if having copies of you walking around was a normal thing, our language and expectations would reflect that.
right now if another copy of you comes by and claims your wife and your house, you'd be miffed. but the whole concept of one of you owning the house is just an artefact of how individuality and possessions happen to be right now. it doesn't have to be forever. there was a time you could steal a picture and have the only copy - now you can save a copy on your phone and you don't deprive another of theirs. some day people and stuff could behave the same.
There's a difference between "I have changed" and "I have been copied." This seems obvious enough to me that I don't question it, but is it not obvious to everyone?
The fact that "you are different" does not mean a "different you" now exists, though. Change of this specific kind is already intrinsically integrated into your existence and identity.
Honestly I'd like to have my mind copied, but only have it 'simulated' once the current me is dead. From my perspective I'll be dead and there's no changing that, but from the clone's perspective, life just goes on.
I do wonder sometimes if each period of being awake is the entire subjective experience of a sequence of independent conscious entities. Like I woke up today with the memory of yesterday, but possibly that version of me that had the subjective experience of being was an ephemeral phenomenon that ceased to exist when I fell asleep, just as I will tonight.
> Every time you go to sleep your consciousness shuts down.
What consciousness? Yes, I am aware that "sleep" means that a person becomes unconscious (trivially demonstrated), but that is not the same thing as "consciousness shutting down". Supposing there is such a thing as consciousness, it doesn't seem to be anything that is suspended by the sleep process. And that's also a big "if", because consciousness is just woo-woo nonsense talked about but never defined by people who haven't quite gotten over the idea that humans don't have some immaterial soul.
Consciousness has, I'm told, 40 identifiable meanings. Makes it hard to get to grips with it.
For example, I sometimes have direct awareness of the nature of my consciousness changing with tiredness. Sometimes that has been what I can only describe as if my homunculus has switched to autopilot and is now just watching a cinema of my senses.
But that's probably a terrible description if you've never experienced that yourself.
This makes sense if you believe in a non-materialist self, like a soul. That model wouldn't be falsifiable: we can't measure it, so it can have whatever properties are convenient. You could then rule out the possibility that your soul would inhabit the copied 'you' by fiat.
Scientifically, though: of course an exact copy of you is you. If you don't believe in souls but still feel like this is not the case, that indicates your model of 'you' is incorrect.
There's no need to believe in an immaterial soul to think that a copy is a different 'you'.
It is enough to understand that having an independent "command & control" structure (for lack of a better word; C&C) is by definition 'you'. C&C means that the individual can perform a self-contained sense-think-act cycle.
We can infer this from the following:
1. Identical twins, despite having identical genetic structures, have separate C&C, and therefore qualify as separate persons.
2. Siamese twins conjoined by the body (including parts of the head) have two C&Cs, and therefore qualify as two persons (with varying levels of bodily control per person).
3. An individual with additional parts of the body (e.g., in a special case of being conjoined with a twin) has only one C&C and therefore qualifies as one person.
4. Clones are a special case of identical twins, and still have separate C&Cs, therefore qualifying as two persons.
5. A person in a coma or with another condition nonetheless qualifies as a person with some sense-think-act autonomy, despite being dependent on someone else for continued bodily functions.
6. An individual with a transplant (e.g., heart) or an extension (e.g., pacemaker) is nonetheless a person because of their consistent C&C.
7. An individual with extreme memory loss remains a person with C&C (in brain, body, genetics for the most part).
Any other special but naturally occurring cases (e.g., individuals with two brains, individuals with completely separated hemispheres, conjoined twins sharing a brain) would require either that we:
a. understand how brains and bodies work (and therefore, make a more quantified measure of how much of sense-think-act autonomy qualifies as personhood); or
b. decide based on a heuristic (1 brain = 1 person) or principle (protection of bodily autonomy, i.e., 1 body = 1 person).
But none of these need you to believe in a soul, in order to think that a digital clone is not 'you'. Unless, of course, you can prove that an individual either:
1. can sense-think-act with both physical and digital 'bodies' simultaneously and/or sequentially, which share an equivalent experience (i.e., central C&C coordinating distribution or maintaining strict consistency across bodies).
2. can sense-think-act with either physical or digital 'body' at will, upon establishing a 'connection' and successful transfer of state (i.e., remote access).
3. can transfer whatever is necessary for C&C to another 'body' (e.g., brain transplant), and demonstrate beyond reasonable doubt that their C&C signatures in new body match the ones in the old one.
Soul or not, it wouldn't be me. Imagine that I was alive and existed next to the cloud copy of me. It would be 2 different minds with a common core. A database restore to a new cluster, that immediately became different afterward because our two experiences would drift. Funny enough, the db term for a db cluster which has two writable primary instances, which have received differing sets of writes containing different sets of data, is called a "split-brain".
So if I exist and have my own thoughts, and the cloud copy exists and has its own thoughts, and we are taking in reality differently from each other, we are 2 distinct minds. Now imagine I died. What has that changed for the copy of me? Nothing. It's still a different mind than mine. My mind is dead. I'm dead. A copy of me existing after I die does not change the fact that I died. It does not help me, literally me, live forever.
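The "split-brain" analogy above can be sketched in a few lines of Python (the dict is just a hypothetical stand-in for a mind's state; once the "restore" completes, writes to one side never reach the other):

```python
# Sketch of the split-brain analogy: "restore" a copy of some state
# to a new, independent instance, then let each side accept its own
# writes. Illustrative only; the keys and values are invented.
import copy

original = {"memories": ["childhood", "first job"], "mood": "curious"}

# The "restore": a deep copy shares no mutable state with the original
# after this point.
replica = copy.deepcopy(original)

# Each "primary" now accepts its own writes...
original["memories"].append("watched the clone wake up")
replica["memories"].append("woke up in the cloud")

# ...and the two diverge immediately: neither side's writes propagate
# to the other, which is the split-brain condition.
assert original != replica
```

After the copy, the two instances share a common history but no live link, which is why neither side's later writes can be said to belong to the other.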
What if you had the procedure done and afterward woke up in the artificial environment? You'd be aware that there's a biological remnant of the pre-procedure you that will still experience things you won't, including dying and death. But he's a separate entity, and you are glad it's not you that is about to die.
Basically, if you wake up after the cloning as your old biological self, then you just drew the short straw. It's up to you whether it helps that there's another you who will continue on with whatever you were doing, without the nuisance of dying. Some people might find calm in the idea that they can freely die and nothing of value will be lost.
What about Star Trek transporters? Would you refuse to use them because you are getting disintegrated, and what's created at the destination is just a copy of you, not you?
It is the same person but a different human. Even if they drift apart, their personal identity is, in every meaningful sense, the same, just as if we split the universe with 1 human taking path 1 and in another universe the human taking path 2.
The moment they change, they are not the same person, just as I am not the same person I was when I started writing this reply, and yet, I still am "me", irrespective of the drift.
The person who is you exists independently of any copy of yourself. While a replica of you would exist, there would still be the original you, manifest in your brain, who would live an independent life and die at some point without ever “resuming” in another body. This is similar to how you can never time-travel yourself, even through relativity. Your perception of time will be the same no matter what relativistic effects occur outside of your frame of reference. You will live out your life span and die.
Your replica however would have a different experience and be able to draw on your memories etc. But you yourself will in fact live a natural life with no shared experience with your replica and will experience death no matter what. It’ll be little solace to you that a copy got to live your life after your passing.
I don't believe in a soul in the sense of a "ghost in a machine", since it introduces so many problems. Nonetheless, it's pretty clear that there is an immaterial component to thought, and therefore to our existence, for various reasons. Examples:
1. If we are entirely material, we can't know anything. For example, if you're thinking about a tree, there must be something about you that has 'grasped' what it is to be a tree. This is impossible if your idea of a tree is nothing more than a bunch of neurons firing in a given way. There is no correspondence between neurons firing and a tree. So there must be more to your thought than neurons firing (or any other material process).
2. Similarly, if we're entirely material, we can't reason. A reasonable thought differs from an unreasonable thought only by its content. It can't differ in this way by its material components in the brain. Therefore, if we hold that some thoughts are wrong and others are right, we must conclude that there is more to our thought than the material.
Is there no chance that “thinking” (or your term “grasping”) is simply an emergent property of certain matter + states? A computer with a machine learning process seems to be able to “grasp” what it is to be a tree (as opposed to a cat or a crosswalk).
You seem to be engaging in difficult philosophical concepts and jumping over important steps (like rigorous definitions or showing why something “must be”).
> You seem to be engaging in difficult philosophical concepts and jumping over important steps (like rigorous definitions or showing why something “must be”).
Yeah, sorry, I bashed that out too quickly. I'll try to slow down a bit, though I can't promise a full treatise :)
For a thought to be about reality, there must be something about that thought that makes it about reality. How this might be the case is one thing, but the fact that it is the case is nonetheless necessary. And for the thought to be about the reality, there must in one way or another be similarity between thought and reality. Again, the 'how' may be extremely complex, but the existence of the similarity seems necessary. To put it in opposite terms, if a thought is not similar to reality, then it is not about reality, and therefore can't constitute knowledge about reality. Do you agree thus far? If so, let that be premise 1.
I'm obviously talking about the concepts we use to think about reality, not about the words we use to describe it.
I will assume that we agree that thoughts can be about reality (even though they're obviously sometimes not) as premise 2.
So, given our two premises, suppose Bob is thinking about a tree. With materialism, where is the likeness between Bob's mind and the tree? It cannot be the neurons or the physical structure of his brain or body, because these things don't have similarity with a tree. The atomic structure of a brain, in any state, is different from a tree's. That much is clear. And likewise, a tree doesn't somehow appear in Bob's brain. Given our premises, this is a problem for materialism. If the likeness is not physical, and it clearly isn't, then what is it?
We can discuss the idea of a tree as 'emergent' from the matter&state. But honestly, I struggle with this concept. Every time I've heard the word used, it seems to be a way of restating the problem rather than solving it. Perhaps you could explain in a bit more detail how this overcomes the 'similarity' problem I've outlined above? ISTM you still get the problem that a bunch of neurons is not a tree, regardless of their arrangements. Either the neurons in some sense 'cause' you to think about a tree, in which case you need to ascribe to the neurons an ability to cause immaterial effects, which again seems a problem; or it is simply a way of saying "we don't know how it works". Again, so ISTM.
On computers: very briefly, a computer doesn't 'know' anything about a tree, or anything else. A 'tree' on a computer is a bunch of magnetic states on a hard drive, or transistors in particular states, or what have you. The meaning of a computer's output (that is, the link between the computer and reality) exists only in the mind of a human being. An LLM's text about a tree is no different in principle from the binary for the word 'tree' on an hdd, or a jpeg, or whatever. An LLM gives the illusion of knowing, but no more than that. Again, I realize this is a pretty summary treatment, but I think the tendency to describe the mind as a computer is misguided for this reason. You face exactly the same problem with the computer as you do with the mind. For my part, I simply reject premise 2 as far as a computer is concerned.
Anyway, very simply, if materialism is true, a thought cannot be about reality, and it cannot have a similarity with reality. Therefore, we can't know anything about reality. But we can know about reality (premise 2), therefore materialism is false.
There are a number of points where this thesis needs considerable work.
Perhaps the first is that you appear to be ambivalent about what you have in mind by 'similarity'. When you first discuss it, you say "...there must in one way or another be similarity between thought and reality. Again, the 'how' may be extremely complex...", but then you go on to say "...where is the likeness between Bob's mind and the tree? It cannot be the neurons or the physical structure of his brain or body, because these things don't have similarity with a tree..." - so this similarity could be extremely complex, yet you dismiss the possibility of it being anything physical without offering the slightest justification. You might find that convincing, but you have not said anything that presents a challenge to anyone skeptical of this claim.
You do not help your case by continuing "The atomic structure of a brain, in any state, is different from a tree's." This suggests that while you recognize that this similarity could be extremely complex, you do not completely grasp how complex a similarity between a physiological structure and something external to the body could be.
This impression is reinforced by your comments on emergence, which you appear to want to dismiss on the grounds that you struggle with it. Again, this is unlikely to be persuasive to people who do get it. For what it's worth, here's my go-to example of emergence: in the theory of evolution, fitness is a key concept, yet you will not find an organ of fitness by dissection - it is an emergent property of an organism's physiology in relation to its environment.
Much of the rest of your comment is in this vein, and it is rather like saying "how can music be recorded on a compact disk? CD burners use light, not sound." There is no point in belaboring this further, so let us instead return to your starting point - "for a thought to be about reality, there must be something about that thought that makes it about reality." A physicalist could respond by suggesting that there is a chain of physical causality running from the outside world to brain states via sense organs. When we recognize that sense organs do not capture complete information, and also that brain states may be influenced by previous brain states, this view accommodates the fact that we can have incorrect thoughts about reality. How this works is opaque to us at our current state of knowledge, but that does not mean that it does not happen - in fact, it would be quite hard for an anti-physicalist to deny that it does.
The bottom line here is that the correspondence between mental states and reality is not the problem for physicalism that you think it is.
I will return to what I meant by 'complex' in a minute. But nothing in your response undermines my point, which is that similarity between thought and object-of-thought is made impossible by materialism. It is true that "a chain of physical causality [runs] from the outside world to brain states via sense organs", and it is also true that brain states will be different based on the causality imparted by the world, as well as previous brain states, genetics, drugs, etc. But causality is not likeness. A brain-state caused by a tree is not of itself the same as a thought about a tree, any more than something else caused by a tree (like the sound of rustling in the wind) is the same as a thought about a tree. The latter needs likeness, not just causality.
> yet you dismiss the possibility of it being anything physical without offering the slightest justification.
The only way there can be likeness if materialism is true is physical similarity. The brain would have to be physically like a tree when Bob had the thought of a tree for there to be likeness. If we take materialism to mean that everything can be reduced to matter, then there cannot be any likeness that is not of matter. And this means a physical/material/atomic likeness is the only possibility. Yet we know there is no such likeness.
The concept of emergence doesn't overcome this problem. It must either produce a material likeness, which we know doesn't happen, or it must cause some kind of likeness which is real-but-immaterial, which is a problem for materialism.
The 'complexity' I referred to was in the context of how we know there is likeness between thought and reality vs that we know there is said likeness. We can know that the likeness exists without knowing how it exists, or being able to describe it in detail. The point was that no matter how complex the mind/brain/process may be, we know likeness must exist in one way or another, because without likeness there is no knowledge of reality. The complexity of the process, whatever the process is, is irrelevant to the need for likeness. That was my point in raising complexity; I should have been clearer. Complexity does not change the fact that the only likeness for materialism can be a physical likeness.
The CD is not a valid parallel. There is a chain of causality between the laser's interaction with the disc, and the sounds produced by vibrating speakers. A chain of causality is not a likeness, as I stated above. Further, a CD has no knowledge of its contents, so there is no need for likeness (physical or otherwise) between its imprint and the sounds it causes. We have knowledge of the world, so there is need for likeness.
Anyway, the nub of the matter is that you need to show how there can be likeness that is not physical likeness if materialism is true, or you need to show that likeness is not necessary for a thought to be about reality.
Why do you have to show the likeness? There’s a strict level of fidelity being imposed here.
The brain has to only represent the sensations it encodes.
The tree and the brain are both material, so the phenomenon of the brain encoding it will also be material.
There’s no need for a soul to represent a further reality.
Even if a soul exists, wouldn’t that result in the same issue again?
The argument sounds like we are saying that there has to be a tiny tree in our heads for reality to be perceived. Since there is no tiny tree, we have to perceive it in our immaterial soul. Which just bumps the issue to the immaterial.
But then how is the immaterial perceiving the material?
I can see from your response that there are a number of points that I need to clarify.
Firstly, I am not at all surprised that you, personally, do not feel that your position has been undermined, and I don't suppose you ever will, but one of the themes of my response is that neither your intuition, nor that of like-minded people, is particularly persuasive; for that, you need something that would give a skeptic persuasive reasons for abandoning their skepticism.
From your statement "causality is not likeness", I see that I need to clarify the point I made in my penultimate paragraph. To do so, we need to go back to your second post in this thread [1], and in particular, to the paragraph beginning "for a thought to be about reality, there must be something about that thought that makes it about reality... And for the thought to be about the reality, there must in one way or another be similarity between thought and reality."
The point in my penultimate paragraph is this: If we suppose that there is a causal flow from the real world to brain states via the sense organs, and adopt the premise that (changes in) mental states are caused by (changes in) brain states, then we have at least the outlines of an answer to your tacit initial question (what is it that makes a thought one about reality?) that does not invoke or depend on any concept of likeness or similarity. In what I quoted above, you certainly seem to be saying that the only thing that could make a thought about reality is that it has a similarity (in some nebulous sense, at least) to reality, but here we have another way for a thought to be about reality that does not depend on likeness in any sense.
At this point, I can imagine someone saying "... but this causal chain could well result in brain states that are, in some sense, similar to reality." I don't dispute this, as I am not arguing against the idea that such a similarity can be seen, I am arguing that a link between reality and thoughts can plausibly be postulated without an appeal to similarity or likeness in any sense, let alone an appeal to a likeness about which it is asserted, apparently purely on the basis of intuition, that cannot possibly be physical in any sense.
In the light of the above, my misunderstanding of what you were calling complex seems moot - an understanding of causality may potentially lead us to being able to say, with some specificity, in what way thoughts are similar to reality, so it is not inconsistent with the claim that there are no complex issues in seeing that there must be a similarity, even if it does not do anything to endorse that view. In the same way, it is consistent with your claim that the only likeness for materialism can be a physical likeness, without actually requiring it.
The above physicalist story has another thing going for it: it contains at least the outlines of a possible explanation of how thoughts are related to reality, while your story does not - as far as I can tell, it is confined to saying that whatever it is, it can't be physical. If you have anything affirmative to say about how thoughts come to resemble reality and what the similarity is, now would be a good time to present it.
Your response to my CD analogy does not need much attention, as it is just an analogy, but I see I did not make it clear what point I was aiming at, which is that while "how can it be that..." questions can be insightful when the person asking has a good grasp of what is going on, they can merely reflect ignorance when the questioner lacks that understanding - and when it comes to how the mind works, we are all ignorant to a considerable degree.
By now it should be clear that what you claim in your final paragraph is wrong: in order to show that your argument has failed to make its case, I neither need to show how there can be likeness that is not physical likeness if materialism is true, and nor do I need to show that likeness is not necessary for a thought to be about reality. On the contrary, you have chosen to make a strong claim - essentially that the mind cannot possibly be the result of physical processes - and to sustain that, you need more than arguments grounded in appeals to intuition about how things either must or cannot be. In particular, anything resembling 'so prove me wrong' would amount to burden-shifting, and while we are about it, the alternative to 'the mind cannot be a physical phenomenon' is not 'the mind must be a physical phenomenon', it is 'the mind might be a physical phenomenon' (something that I believe is probably true, but which I do not claim to know.)
Hold a tree in your mind. Now describe it aloud. You have now produced a material effect in the physical world that resulted directly from your idea of a tree. Definitionally, that means it is not immaterial: you conjured it with material and measured it with material.
> Scientifically, though: of course an exact copy of you is you. If you don't believe in souls but still feel like this is not the case, that indicates your model of 'you' is incorrect.
How is that "scientifically" an "of course"?
How is it more "an exact copy of you is you" than the alternative claim, "an exact copy of you is 'you (1)'" (to borrow from file manager nomenclature)?
The trivial example of that seems to be that if you make an exact copy, put it in a different room, and then something happens to the exact copy, that thing does not happen simultaneously to the original.
An identical copy necessarily cannot be measured in a way that distinguishes itself from the original. So "scientifically" because we're restricting the space to measurements of the physical world, and "of course" because the conclusion falls almost tautologously out of the definitions, no experiment needed. How can the thing scientifically not be you if it is not materially different from you?
To your example: I am not sure which of two points is being made, so I'll address both. I'm not saying that everything that happens to entity 1 also happens to entity 2, just that both are you. Two things can both be apples, even though biting one leaves the other intact. And if something happening to you makes you not 'you' anymore, 'you' isn't really a coherent concept across even a fraction of a second; you'd cease to exist and be replaced multiple times in that time.
Yes, they are both me. It is admittedly a weird conclusion, but that doesn't make it false- we did a weird thing and got a weird result. Objectively, there is no case for saying either is not you.
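The exchange above can be made concrete with a toy sketch (purely illustrative; the dictionary standing in for a person is of course my own assumption): an exact copy is indistinguishable at the instant it is made, yet an event applied to one instance does not touch the other, so the two immediately diverge.

```python
import copy

# A crude stand-in for a person's state at the moment of duplication.
original = {"memories": ["childhood", "breakfast"], "location": "room A"}
duplicate = copy.deepcopy(original)

# At the instant of copying, no measurement distinguishes them.
assert original == duplicate

# An event then happens to only one of them...
duplicate["location"] = "room B"
duplicate["memories"].append("waking up in room B")

# ...and they immediately diverge, though both descend from the same state.
assert original != duplicate
assert original["location"] == "room A"
```

Both objects were, for one instant, "the same" by every test available; nothing in that sameness prevents them from having different histories afterwards.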
I agree that they'd both be you, but I don't think that's what's at stake. "You" would only get to experience one of the two lives at that point: either the original's or the robot's. Presumably whatever consciousness was present before the duplication still resides with the original.
So there's certainly an argument to be made that if someone created a copy of "you" and put it in a robot, and then destroyed your organic body, that consciousness wouldn't "move" to the robot; it would stay in the organic body and be killed. The robot would be a completely new consciousness, but with all of your memories and behaviors and attitudes.
I'm honestly not sure what I believe around this. Perhaps we will discover "consciousness" as a physical thing and learn how to transfer that as well. Or not. Who knows.
I'm with you on this one - continuity is important too, it's not just about form. A robot version of me might be a suitable substitute to my loved ones, but it would always be outside my head and therefore not a continuous extension of my self.
No, one is a copy of you. There is only one you, literally. A copy can be perceived as being you, but it's literally, physically, not you. If it were you, literally you, it couldn't exist in two places at once. Your copy could answer a question differently than you might, because the copy would immediately start having a different experience than you. If it were you, that wouldn't be possible. If it were really you, you wouldn't exist alongside it, and it wouldn't be able to answer a question differently than you.
Right. You'd have a memory of cloning your mind, and the copy wouldn't have a memory of that. From the first boot of the copy, it would be operating with a different set of data than you. Also, the fact that you exist alongside it means it's not you. It's a copy of you.
Scientifically, though, an exact copy is likely not possible due to the quantum uncertainty principle.
And I'm not sure that's the only issue even for die-hard materialists: think for instance about all the problems that come from multiple exact copies & things like ownership, relationships...
If I copy the contents of an Apple II disk onto my PC and try to "run" it... nothing happens. It isn't mapped to the new hardware. So who is going to write the "human mind/consciousness" emulator that will map onto a hard drive? Will they simulate all the inputs from your body, nervous system, senses, etc.?
And will they perfectly emulate the dynamic situation that is the brain?
You've mapped every neuron perfectly, you've even captured every ongoing signal and the whole thing is recreated perfectly. But that's not the entire story - connections are created and broken all the time, that's what memory is after all. Is it the "same person" if that snapshot is put into an emulator with slightly different results there? They clearly won't be having the same thoughts in a pretty short timescale even if the inputs are perfectly the same.
> This isn't preserving, it's copying. A copy of your brain/personality/you... isn't you... it's a copy. It's a cassette tape playing back the radio you recorded onto it.
That's just one opinion, though. We still don't know what consciousness really is. We don't know what makes us who we are. We don't know if some concept (scientific or supernatural) of a "soul" exists apart from the behavior that arises from the physical structures of the brain.
We just don't know. So maybe you're right, but that's far from clear.
I'd agree with you, except if my brain was scanned and uploaded to the cloud, I'd still exist in my original brain. There can't be two originals of something, so the cloud version would be a copy of me, since the original me would still be alive, able to talk to the copy of me living in the cloud.
Suppose you died every night in your sleep and were replaced in the morning with a perfect copy of yourself with all your memories intact. Would you know the difference?
I had a great deal of difficulty going to sleep the night after this first occurred to me.
I only really managed to assuage my fear by realizing there would be an incredible number of dead bodies to deal with if this were happening to everyone every night, and it was extremely unlikely it was only happening to me.
You're taking what the above person said too literally. They didn't mean physically dying; they meant that your mind dies every night when you sleep, and in the morning a new mind, a copy of your previous mind, is what wakes up and continues on as you.
No, but if I died tonight and a copy of me took over tomorrow, what good did that do for me personally? I'm still dead. Selfishly, I don't care if a copy of me continues on, I only care that I don't.
Let's say you just found out that this is actually happening. Every night you die and are replaced with a perfect copy of yourself. Would you do anything differently?
I would probably get a concentrated dose of existential dread knowing that by EOD I would cease to exist. Usually I'm able to delay that dread with the silly reasoning that I still have many years left before I die, but I wouldn't have that mind hack if I knew I had no time left.
I doubt it. You’d maybe freak out about it for a week but eventually you’re going to come to terms with the fact that this weird setup has absolutely no effect on your actual experience.
Same for other contrived things like dying and getting revived every day, getting frozen and unfrozen every day, taking a teleport dematerializer every day for your commute, having a portion of your brain and organs randomly get swapped out ship of Theseus style, etc.
At some point you would just come to terms with the fact that your existence is really just that of being a mind with a past and present. The future doesn’t really matter.
Why not go out with a bang and spend your life savings on hookers and blow (or whatever decadent thing floats your boat)?
My point is that I suspect most of us wouldn't do anything differently, even if we know it's not our consciousness continuing on, because both scenarios are identical for all practical purposes.
You're right, nothing would change, and both scenarios do play out the same. The difference would be the knowing about it part. That's what changes things for me. If it currently happens that way, I wouldn't be aware of it, but if I knew that it was going to happen, the act of knowing makes it an issue, for me anyway, even if it still plays out the same as every other time.
You remember going to sleep last night. So even if you died and a copy of you was made, intuitively, subjectively, you feel that tomorrow is still you, the same as yesterday was still you, even if there is some technical disconnect.
You are 'caring' objectively, but we live subjectively, so your caring is of purely academic interest, even, dare I say it, to you.
Subjective continuity is illusory; placing import on this illusion is up to you, of course, but there is no substantive difference between the discontinuity experienced by your 'copy' in the cloud and what you experience yourself moment to moment.
A copy of me continuing on not knowing or caring that it's a copy of me is of little importance to me. That's not my point. My point is that I will not be continuing on. Sure, a copy that thinks it's me will, and to the world, that would be me. It would have my memories and make decisions based on those memories in a way that would be the same as I would. But that wouldn't be me, literally. My mind, my inner voice, my experience of reality through my mind's eye, would not exist. I wouldn't exist. A copy of who I was continuing on doesn't help me, literally.
No, but you would not wake up; a clone who has your memories would wake up, but the you that went to sleep will never wake up. That effectively doesn't make a difference, but I find it pretty odd to think about. We as humans lack a way to verify it's really "us" and not a clone.
Instantly ctrl-f'd to see if anyone would mention transporters. I believe in some circles this has been dubbed "The Transporter Problem". It's a thought experiment that already exists.
Invincible also tackled this problem, with someone cloning a new body, and copying his brain to a new body. For a brief moment both bodies perceive the same thing before their experiences split into the two bodies. The copy wakes up, says goodbye to the original, who is dying, and says "I'm sorry it wasn't you."
This is also IMO related to the Ship of Theseus problem. Are you the same person you were 20 years ago? Are you the same person in the morning as the person who went to sleep? Are you the same person as a minute ago? What if you add in concussions/memory loss/degenerative disease?
Star Trek's lore includes some technobabble about transporters operating in "quantum" mode to assuage concerns that the person at the transporter destination is not the same as at the source.
Hans Moravec had a suggestion on how to do this in Mind Children: instead of examining the whole brain, you measure a layer, then replace it, layer by layer, until the last. There is never a copy and an original, it's just a Ship of Theseus self where the neurons are individually replaced with new ones, albeit electronic ones.
I view (hypothetical, sufficiently good) brain upload and emulation the way I view git forks: both children are just as equally "the real one" even if only one of them is on the same physical hard drive.
Looking forward from now, both bio-me and digital-fork-me (or indeed bio-fork-me if the upload gets re-embodied) are equally "my future self", even though they would also each be distinct beings in their own rights and not a shared consciousness over many bodies.
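The fork view can be sketched as a toy model (the `Mind` class and its history list are my own illustrative assumptions, not a claim about how minds actually work): both children carry the full shared history, and neither copy of that history is any more "original" than the other, just as neither git fork owns the commits before the fork point.

```python
from dataclasses import dataclass, field

@dataclass
class Mind:
    history: list = field(default_factory=list)

    def fork(self):
        # Both children start with an identical copy of the whole past.
        return Mind(history=list(self.history))

bio = Mind(history=["born", "grew up", "scanned"])
digital = bio.fork()

# After the fork, each accumulates its own future independently.
bio.history.append("kept living in a body")
digital.history.append("woke up on a server")

# The shared prefix is identical in both; only the futures differ.
assert bio.history[:3] == digital.history[:3]
assert bio.history[3] != digital.history[3]
```

On this picture, "my future self" is simply any entity whose history has my present as a prefix, which both forks satisfy equally.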
I think that from everyone else's perspective, an ideal copy of me would be me; by the definition of "ideal copy". I, however, would not consider the copy to be me; to me, there is only one me.
That all makes sense. But let's run with this way beyond the foreseeable tech. What if you can replace each neuron in situ one by one up to X%. Then what if it was reversible (the neurons were initially just bypassed, commented out). Someone could then dial it up to 5%.. 50% and if they still felt the same throughout and then went up to 100%. In that scenario would they have copied themselves?
I find it fascinatingly coincidental that neurons are the only cells in the body that don't rejuvenate unless there is some sort of injury [0].
Technically, we are not the same people that we were when we were born. The cells in our body have one by one been replaced with new ones and the old ones have died off. The things that make up my body are 100% different from when I was born, so I literally am not the same person, physically.
Maybe this is an indicator that there is more to what makes us, us, than just the physical assembly of atoms in our bodies. There are things I'm not sure we'll ever get a full understanding of.
> The cells in our body have one by one been replaced with new ones and the old ones have died off.
This is not the case for our "talking" neurons, which is what I was trying to limit this thought experiment to. I think a lot more folks would be OK with preserving their biological brain as is within a robot/clone if that was the only option, and understand the body gets (mostly) replaced. Although a few in this thread have alluded to the fact we might be missing important relationships with the rest of the body, such as the nervous system and gut biome.
Categories, numbers, logic, grammar, etc. all map onto physical systems, but aren't necessarily directly linked. This is "metaphysics" in philosophy, and is essential in order to even reason about physical systems. Just the concept of a "physical system" is actually metaphysical, but without metaphysics, you can say nothing to anyone about anything. Metaphysics is just generally taken as a "given", but is worth evaluating itself.
Then you'll run into the "realism" vs. "nominalism" debate and you'll understand the philosophical underpinnings of the current culture wars in America.
That's still a copy, just one where both copies remember their life before being copied and equally think they are the original. The original will still experience aging and death.
If you further develop this thought, systems might be capable enough to implant core desires into you before transferring the copy into your new body. "You'll love Coca Cola no matter what, and capitalism".
The above scenario is if you get re-implanted into a self-evolving, autonomous biological entity, unlinked again from the system. If this is not feasible and the only solution is to embed you into a robot with an uplink to the internet, because "why not?", then my biggest issue with a digital self is that there are no guarantees of having a proper firewall, which would amount to total surveillance:
Thoughts are free, who can guess them?
They fly by like nocturnal shadows.
No person can know them, no hunter can shoot them
and so it'll always be: Thoughts are free!
Your memories and biology are you. If someone makes a perfect copy of you, disintegrates your old "you" and then wakes the "new you" up, how is it any different?
From what my experience tells me, many people prioritize their children's life over their own. Kind of like the insurance policy that lives on carrying their values further after they have died.
So this is not really that much different. Your own body is nearing its end, but a new vessel is carrying your values and your influence on the world further. Even if it is physically separate from your own body, it's close enough for me to be considered living beyond my death.
As the article says, a book is not the ink, but the words. I am not my physical atoms, but the connections that form my thought patterns and memories. If it were possible to make a perfect copy of those things, the copy would be "me". If the original still existed, that would also be (a different) me.
Not that it makes things any better for the original you, but the copied you would presumably feel and believe themself to be the real you. From their perspective, "you" really are living forever. To proceed with the copying would be a sort of act of kindness for yourself.
Thinking about this even a little makes me want to throw up, because this can never be tested. Yeah, I'd be dead, but technically, I'm alive. I'm alive according to everyone, including myself (the new me). So I'm not dead. But like you said, I died and am dead and don't experience any more life. If all evidence is that I didn't die and still exist, then that's the fact.
We can theorize that our consciousness continues, but there is NO WAY to ever actually test it, because all experience is that consciousness continues whether or not that is the case. Sprinkle in some quantum suicide and my stomach is doing backflips from the existential OD.
The line I like to use here when people talk about this sort of 'transferring' your consciousness: imagine there was a 'perfect' chatbot trained on your every experience, thought, and sensation.
OK, now your consciousness is 'transferred'. I promise to run you.exe plenty. It's just an exercise in self delusion, even if this was possible. That's not you anymore than a 'you' created by a taxidermist and shoved in a closet is.
We can't copy a person anyway. We're going to find out that it requires an order of events - order of experiences - aspect that can't be recreated to get the actual person. Everything else will end up being a weak simulation (even if it gets a bit closer over time with improvements in simulating).
In theory if you had an atomic replicator you should be able to make a copy of a person. I'm not saying it will ever be practically possible. But I don't see any fundamental laws of nature that make it impossible.
What's the theoretical route around the no-cloning theorem?
Pop-science communicators tend to introduce the no-cloning theorem in the context of making a copy of a person being impossible, but they could be applying it wrong.
Unless some quantum effect doesn't actually allow that to be true, a "great filter" of sorts. You could make a copy and it just doesn't turn on, due to unknown unknowns; we can't know why.
You're essentially suggesting 'maybe the laws of physics would stop applying here for unknown reasons', but it's not plausible that would happen absent evidence.
That is true (in that it is a statement always true of all subjects), and I don't think anyone was disputing that. Simply pointing out that we don't know of any reason an identical copy in an identical state wouldn't obey the laws of physics and perform identically.
This is a pretty big philosophical question. There's no universal answer, just opinions. Your conclusion is not the obvious one for everybody.
What happens if you have an identical copy down to the atom. Totally impossible to distinguish. You're put to sleep and your mind is copied over. When you both wake, which one is "you"? Both copies think they're "you" and both are correct. Each has the same sense of self and continuity of identity. Maybe at the moment of synchronization "you" ceased to exist as you branched into two new identities.
Say you upload into a computer. From the copy's perspective, it is "you", it's the same self. It might view your biological body like you'd view an old computer after you finish migrating your files to a new one.
Say you destroy the biological body, or wipe its brain once you copy. Does that change the equation? If you destroyed one of the identical clones, is it even relevant to ask which is "you"?
Personally, I think Altered Carbon comes close to how our society will solve this problem. The original copy has its brain wiped and the new copy is "you" by cultural consensus. Truly duplicating a single identity is strongly taboo and illegal.
I think this is a question that either matters to you or it doesn't. In my opinion, it's irrelevant. I, the original "me" am totally free to agree with the copy that it is "me". I can choose to give it my social and legal identity and agree that "me" is no longer "I". My personal choice is to destroy the original, but one could also choose to let the original body continue and assume a new identity, live a new life or whatever.
I view this the same way I do my past self. The person I was ten years ago is not "me", it was a totally different person. That past self lived a different life and had a sense of identity that isn't at all like what I have today. That past me, the ego that called itself "me" died a long time ago and the "me" now is a different self identity built out of new pieces. In my worldview, "me" is a transient thing. The self is not one fixed thing, it changes over time and sometimes it's replaced. I don't see the idea of transferring my sense of self to a new body as anything more extreme than any other kind of ego death.
I choose to resolve this problem with practicality. I agree with myself that the new copy is "me". My social and legal identity, as well as my own sense of self transfer to the copy. My choice would be to destroy the original copy. Barring that, it would assume a new identity and live a different life far away. It'd get a memory wipe if available. I can make the choice to sacrifice my ego and allow "me" to be assumed by someone else. To me, even to the original copy, the new copy is me. In my opinion, "me" is immortal in the digital copy, even if "I" remain in a mortal body.
Is your savegame you loaded today still the same game you started yesterday? Or is it a copy of yesterday's game running forward? Does it matter?
Are electrical processes rebooted from chemical memory every morning when you wake up or after you have a seizure still the same you? Or is it just a fresh copy of your mind that dispersed when electrical signals in your brain lost continuity?
I don't think that's the thought experiment. We're not talking about physical bodies, we're talking about consciousness. When you go to sleep, does your consciousness cease to exist, to be replaced with a new one, with all your memories and behaviors and attitudes, when you wake up?
I have memories of being me and doing me things yesterday, but was that really me doing those things, or was that a different consciousness that doesn't exist anymore, and my memories are just the high-fidelity recorded experiences of someone else?
And on top of it all: if that's the case, does it matter?
Something I think about a lot is that people tend to compare whatever the most recent innovation was to humans.
It used to be that what made you alive was electricity; you could Frankenstein shock yourself back to life.
It used to be that you were a clock. Your gears wound up at birth, and then eventually you wore out. You needed repairs.
People love to use metaphors, but none of these things are the full picture. Just because computers are so complex doesn't make you more correct. Your brain isn't the whole of your mind; we already know that's true. Why is this silly nonsense entertained?
I used to buy into this kind of stuff, but I've become more and more skeptical of the idea that you would still be yourself if your brain could be preserved/emulated/transplanted/whatever.
Our nervous system extends into our bodies. We feel emotions in our bodies. People with certain kinds of brain damage that prevents them from feeling emotions normally also experience trouble making rational decisions.
More recent research has been hinting that we may even hold certain types of memories outside our brains.
Humans have always been drawn to neat, tidy ideas, especially ones that draw clean boundaries: it's an appealing idea that our consciousness lives solely in our brains, and that our brains could function independently of our bodies, but it seems unlikely that it's really that simple.
As a neuroscientist working on brain computer interfaces, it's painfully clear to me that we are absolutely nowhere close to understanding the full complexity of the human brain in a manner required to simulate or reboot someone's consciousness. It's not even clear yet what level of abstraction is required. Do we need to map all of the synapses to get a connection graph, or do we need to map all synapses plus the synaptic proteins to assign connection weights too? This is ignoring other types of connections like gap junctions between cells, ephaptic coupling (the influence of local electric fields on neurons firing), mapping neuromodulator release, etc. On one hand, it feels like irreducible complexity. On the other hand, however, you can lose about half of your neurons to neurodegenerative diseases before you start noticing a behavioral effect, so clearly not every single detail is required to simulate your consciousness. It would be a MAJOR leap forward in neuroscience to even understand what level of abstraction is necessary and which biological details are essential vs. which can be summarized succinctly.
Anyone claiming to take your brain and slice it up and have a working model right now is currently selling snake oil. It's not impossible, but neuroscience has to progress a ways before this is a reasonable proposition. The alternative is to take the brain and preserve it, but even a frozen or perfused brain may have degraded in ways that would make it hard to recover important aspects that we don't yet understand.
It is, however, fascinating to do the research required to answer these questions, and that should be funded and continue, even if just to understand the underlying biology.
In addition to all that we don't know about synapses etc., I've often wondered if even mapping all the "hardware connections", so to speak, would be enough. You'd have everything in the right place, but what about the "signals" running on it? Does a certain amount of constant activity on these circuits constitute signs of a "living" brain vs a dead one? How much of our consciousness is really in the topology of the circuits, and how much of it is simply defined by the constant activity running around in them? I assume neural circuits form loops that consist of synapses that reinforce or suppress activity. If these signals going around and around ever "stop", can they ever be started again with the same "patterns"? What if these patterns, the living "software", are at least partially what define you?
Well anyway, that's my armchair crackpot neuroscience theory for the world to consume ;). I'm sure there must already be a name for the idea, though.
Six of the sheep were given a single higher dose of ketamine, 24mg/kg. This is at the high end of the anesthetic range. Initially, the same response was seen as with a lower dose. But within two minutes of administering the drug, the brain activity of five of these six sheep stopped completely, one of them for several minutes – a phenomenon that has never been seen before.
“This wasn’t just reduced brain activity. After the high dose of ketamine the brains of these sheep completely stopped. We’ve never seen that before,” said Morton. Although the anesthetized sheep looked as though they were asleep, their brains had switched off. “A few minutes later their brains were functioning normally again – it was as though they had just been switched off and on.”
On one hand, I wonder if a gradual transition would work. Spend enough time over the years mirroring your conscious patterns onto a computational substrate, and they might get used to the lay of the land, the loss of old senses and the appearance of new ones. There might not be an ultimate "stepping in", but something like you might be able to outlive you, on a substrate that it feels happy and comfortable on.
On the other hand, the idea of "simulating your consciousness" raises questions beyond just cognition or personality. A mechanistically perfect simulation of your brain might not be conscious at all. Spooky stuff.
There's gonna be a million artificial minds of various levels of capacity before the first human mind is accurately simulated.
By that time we are going to be so accustomed to glitching artificial minds being created, modified, bugged, and debugged that the current moral conundrums, "is the copy me or not", "is it OK to create a hobbled copy of someone", are going to seem quaint, a bit akin to counting angels on the head of a pin. Mangled and molded consciousness will be as mundane as computation itself.
In my PhD work, I helped conduct the human portion of a study on this topic, contributing to some discussions at the FDA [1]. The idea was a bit controversial then, and I've had a few anesthesiologists get mad at me for it, but the general pattern has now been replicated quite a few times, such that the field has largely moved on from 'Is something bad happening?' to 'Why does it happen, and how do we prevent that bad thing from happening?' [2]. So it has been a gratifying excursion from my typical research before and since then.
Thanks. This is purely anecdotal, but we had a family member whose child was under anesthesia for a severe respiratory infection. He's been severely developmentally delayed in his first year, and it's unclear to us what damage was done.
Thanks for sharing. It is difficult to know for certain. If the respiratory infection led to hypoxic damage, then that could also contribute. I have not kept up with the field, but generally the most sensitive period for anesthesia was before 4 years or so. As I mentioned briefly, most of my work is in different areas of research so I haven't kept up to date.
Is there any reason to suspect that adults suffer the same effects as infants? (Not asking to be combative, just curious whether children are uniquely affected because their brains are still cooking.)
You and the other commenter bring up good points. Developmental neurotoxicity (with lesser or no effects in older children and young adults) is, I speculate, probably due to differential gene expression during early development versus later, when genes related to development are suppressed and genes related to maintenance are more abundantly expressed. The developmental neurotoxicity probably works through different mechanisms than what is termed "postoperative cognitive dysfunction" in the elderly after general anesthesia [1][2], which, all I know is that it is a thing. If I were to speculate, it would be that in the elderly there are fewer redundant cognitive resources, and so detrimental effects to cognition are magnified. I know that it used to be thought that post-operative dysfunction is temporary, but it seems likely to me (again, speculation) that there is both recovery and permanent dysfunction, but the dysfunction becomes a little more difficult to detect. Going back to my paper: we used a method to disentangle two types of memory processes, i.e. recollection (explicit recollection of experiential details) and familiarity (a general feeling of familiarity with things you've seen previously), which contribute to memory performance but tend to be differentially affected by neurodegeneration (recollection is more affected, and generally more hippocampal). Sometimes, when not accounting for these processes, a memory test will fail to find differences because patients rely on familiarity to answer memory questions.
Why would you want to go on in a world that has either left you behind or keeps making the same mistakes over and over in a cycle and won't listen to you because you're too old to understand?
And conversely, I think Kim Stanley Robinson puts it best in the Mars trilogy. Scientific progress often has to wait for the old guard to die so new ideas can be tried. Sometimes there are actually new things and they need to be allowed to cook.
A scientist like Einstein experienced scientific revolutions within his lifetime. That's hardly going to be the norm in the history of science, and it's also a horrible assumption to think revolutions would endlessly be occurring and recurring.
Also, we know when we're on the edge of knowledge, especially in cosmology and physics. We're waiting for revolution there. There's dark energy and dark matter. It doesn't matter if you're old or young, you know that our theories aren't good enough to explain whatever these are.
Scientific knowledge doesn't get swept away, especially if it's rock solid. Newtonian physics still has a lot of relevance, after all. It's just that relativity is even more accurate.
Just imagine someone who died 50 years ago coming back and hearing skibidi toilet, no cap, ohio, etc. Then not being allowed to board a plane without a body scan, and not having money for a plane anyway, since bread was a dime and a gallon of gas was a quarter last you checked. You can't even get a job; you're just a brain, and all the knowledge work you could do is 50 years out of date.
German physicist Max Planck somewhat cynically declared that science advances one funeral at a time. Planck noted that "a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents die, and a new generation grows up that is familiar with it."
There’s a short story about uploaded consciousnesses being used as AI slaves. They go bad once enough years have gone by that they can’t speak the modern language anymore. Then they usually lapse into insanity or depression.
I don't think I agree with you. There are multiple examples in society of damaged nervous system connections with the brain, spinal cord damage for example, where the personality of the patient changes little. In the same sense, losing limbs or entire sections of your body (aside from psychological trauma and other psychological consequences) doesn't affect personality that much.
Of course the nervous system is much more complex, but damage to the brain itself almost always results in some sort of cognitive dysfunction or personality change; see the Phineas Gage case for example.
>In the same sense, losing limbs or entire sections of your body (aside from psychological trauma and other psychological consequences) doesn't affect personality that much
"There aren't any changes except for all of the changes, but those changes don't count because reasons."
I don't know how many amputees you know; you may know many. I was in the army for 10 years during the height of the global war on terror and know more than most. Not a single one is the same as they were pre-amputation. Could be the trauma that caused the amputation, could be the amputation. I'm not an amputationologist.
I do assert that a holo-techno-brain will need a shit-ton of e-drugs to deal with being amputated from its fucking body.
The bacteria in your butthole are a part of you just like your brain, maybe less, but they ARE a part of you.
> Could be the trauma that caused the amputation, could be the amputation.
Given the personality changes seen in people who go off to fight in the military and who end up coming back fully physically intact, I think it's more likely that the personality changes here were caused by the trauma, not by the amputation.
I'm not saying the latter isn't possible, but absent evidence to the contrary, it doesn't make much sense to assume the personality changes occurred because of the amputation alone.
Also consider that amputation -- even ignoring whatever trauma precipitated it -- is its own sort of trauma. I imagine if someone came up to me, perfectly physically healthy, knocked me out, and cut off my leg, I would wake up and develop emotional trauma that would cause personality changes.
I see what you mean, but consider that the gut does seem to play a significant role in mood and mental health. The enteric nervous system may not hold memories, but it seems to have something to do with personality, and digestion issues can have negative cognitive effects.
Agreed that discomfort can cause temporary problems, and sometimes chronic problems in parts of the body can cause lifelong cognitive impairment. But that is not to say that these represent "you" or your personality. Your brain could still function perfectly without those body conditions.
And for the gut example, the brain actually does keep working normally: stomach and intestine removal (and other related surgeries) are fairly common procedures, and I don't hear of people complaining about personality changes. Of course, those types of procedures are extremely invasive in a systemic way, and not only your mental state but multiple other parts of the body need to re-adapt. But I truly believe "you" will still be "you" inside your brain.
PS: I put "you" in quotes because discussions about the identity of the self are much more complex; just regard it as the highest-level definition of the concept.
>Our nervous system extends into our bodies. We feel emotions in our bodies. People with certain kinds of brain damage that prevents them from feeling emotions normally also experience trouble making rational decisions.
I think that may be true enough, but it doesn't have the upshot you seem to think it does.
It just means that we need to sustain not just the brain itself but the totality of the environmental conditions on which it depends. No easy task for sure, but not something that presents an in-principle impossibility of preserving brains.
I think there's a major philosophical error here in thinking that the added logistics present this kind of in-principle impossibility.
Also, talking like this starts to flirt with anti-science speculation a bit. Octopuses actually have neurons extending through their limbs. We don't. So when we talk about consciousness being "embodied", I'm sorry, it's an attempt to romanticize the question in a way that loses sight of our scientific understanding. Consciousness happens in the brain.
Sure, the brain needs stimuli from its embodied nervous system, and may even depend on those data and interactions in significant ways, but everything we know about consciousness suggests it's in the brain. And so the data from "embodied" nervous systems may be important, but there's no in-principle reason why they can't be accounted for in the context of preservation.
I consider that I have likely died more than twice in my lifetime already. And before this body gives up, I will have already died more times. Must simply enjoy the present and give gifts to my future self.
The way various hormones influence the brain alone makes it pretty clear to me already that you'd be a completely different person when taken out of your body, and I'm pretty sure that's just the tip of the iceberg.
Very well, you think that preserving the brain, or even preserving the nervous system, is futile. But what of total biostasis, preserving the entire organism, just like the archaebacteria that live for thousands of years in ice or other extreme environments by slowing their metabolisms to a crawl?
To me, excessive negativity about the possibility of immortality smacks of weakness and defeatism. You either love life and want as much of it as possible, which makes you a friend of humanity, or prefer death, which makes you an enemy of humanity. I take a stronger line than the neuroscientist in the article. “Death positivity” like that of Viktor Frankl, anti-natalism, even faith in magical spiritual resurrections—all are anti-human viewpoints, only excusable in the past because they were copes with the inevitability of death. Now that we have reason to believe it can be averted, we owe our potential future selves every possible effort to save them from oblivion.
There is a bit of research and effort into head transplants. I wonder, if and when that becomes successful, how it would impact the individual: possibly retaining memories of the body, or changing personality.
First Human Head Transplantation: Surgically Challenging, Ethically Controversial and Historically Tempting – an Experimental Endeavor or a Scientific Landmark? (2019)
I’m not sure I actually believe in quantum immortality, but I do find it slightly suspicious: out of all the people you could have been born as, you just happen to have been born in a timeframe where brain preservation might be possible before you die?
Most people are alive right now. The population historically has been much lower, so odds are you would be born around the time high technology would support a high population.
> So what are the figures? There are currently seven billion people alive today and the Population Reference Bureau estimates that about 107 billion people have ever lived.
> This means that we are nowhere near close to having more alive than dead. In fact, there are 15 dead people for every person living.
So it is not wildly impossible that you’d be alive now, but it is fairly unlikely.
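For concreteness, here is a back-of-the-envelope check of the odds implied by the two figures quoted above (a sketch using only the numbers already cited, not a demographic model):

```python
# Figures quoted above: ~107 billion people have ever lived,
# ~7 billion alive at the time of the estimate.
ever_lived = 107e9
alive_now = 7e9

# Crude "chance" that a randomly chosen human who ever lived
# happens to be alive today.
p_alive_now = alive_now / ever_lived
print(f"Share of all humans ever born who are alive now: {p_alive_now:.1%}")
```

So by this rough count, being alive now is roughly a 1-in-15 proposition: unlikely, but nowhere near astronomical.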
Also, hard to say what’s in the future of course, but even if population growth levels off, you’d expect to be born in the future, right? Which brings up another question, why not?
If we are going to go along on the fully ridiculous implications here and reinterpret all probabilities as conditioned on your immortality, why weren’t you born in the far future? I’d expect people born in the future to have easier access to immortality.
Maybe birth rates will go way down if we discover immortality (lowering your odds of being born later). Or maybe pre-immortality minds will be seen as more interesting and worth preserving (increasing your odds of being kept around).
I'm also skeptical of the idea that one can "upload" consciousness and it would still be "you". I suppose this is true in a philosophical sense, but in a practical sense, subjective experience of consciousness rules the roost. It's inevitably going to be a mere copy of you. You don't get to experience any of it. Similar to a software project which is forked, I think it makes more sense to classify it as an entirely different entity at that point.
I suppose there are valid use cases for this, but I'm not that narcissistic to think the world needs eternal copies of me.
The continued subjective experience of the original consciousness is where I believe the real value lies. Digitisation of consciousness, assuming it has any sound scientific basis in the first place, would practically need to look more like the gradual replacement of brain (and bodily) matter with something more durable, enduring, and controllable. A slow process in which carbon is exchanged for silicon, or cellular damage is continuously reversed and aging kept at bay.
> It's inevitably going to be a mere copy of you. You don't get to experience any of it.
You can make the same argument for 'you before you went to sleep' and 'you after you woke up'. The only real link you have to that previous consciousness are memories of experiences, which are all produced by your current body/brain.
Think about this: For every consciousness (including you right now) it is _impossible_ to experience anything other than what the thing producing that consciousness produces (memories, sensations, etc.). It doesn't matter whether the different conscious entities or whatever produces them are separated by time or space. They _will_ be produced, and they _will_ experience exactly what the thing that produces them produces.
With an analogy: If you drop pebbles in either the same pond at different times or in different ponds at the same time, waves will be produced in all cases. From the perspectives of the waves themselves, what they interact with is always _exactly_ the stuff that interacts with the water they're made up of. To them, the question of identity or continuity is fully irrelevant. They're just them.
Similarly, it makes no difference whether you only have the memories of the previous conscious experiences, or if 'you' really experienced them. Those situations are indistinguishable to you. The link to future consciousnesses inhabiting your body is effectively the same.
>> It's inevitably going to be a mere copy of you. You don't get to experience any of it.
> You can make the same argument for 'you before you went to sleep' and 'you after you woke up'. The only real link you have to that previous consciousness are memories of experiences, which are all produced by your current body/brain.
Except I know, empirically, that people go to sleep all the time and wake up, and remain the same person. And I know (for practical purposes) I do the same. I -- my mind/body composite -- lie down, and get up the next morning. I remain the same person.
Simply 'copying' or 'uploading' my consciousness, like a computer file, is impossible even in theory, because I'm not just a conscious mind, but a conscious mind which is also a body. Consciousness cannot be split from the material body, even in theory. Somebody upthread said that he'd seen many amputees undergo personality changes as a result of their operations -- this is an informative (if very sad) example.
In computing terms (a machine we do not understand), it would be like cloning a live machine by taking only the CPU die, or maybe only the hard drive. How many parts do you need to take away from a computer for it to stop being the same machine? It's easy, though, with a VM, or a kernel that supports many kinds of hardware. Kind of a digression, but I liked this idea.
I don't think this is a great analogy because computers don't have consciousness (yet).
But I usually move the hard drive (or at least its contents) between machines when I get a new computer, and that's enough for me to think of it as the "same", even if I reinstall the OS on the new machine and just copy my home directory onto the new one.
> More recent research has been hinting that we may even hold certain types of memories outside our brains.
Not just hinting - the evidence is strong and accumulating rapidly. The gut, in particular, has so many neurons that it is considered the body’s “second brain”, to say nothing about the impact that gut bacteria have on your mind.
If you really wanted to create a copy of your “mind”, you’d have to image every neuron in your body for a thoroughly accurate copy. And then accept the fact that your entire behavioural profile is then missing the input of your gut bacteria, which appears to have a significant and non-trivial impact.
Transferring our consciousness into "the net", and other similarly fuzzy concepts, are so far removed from reality as to be complete fiction. This includes freezing our brains and reanimating them later to resume our lives.
They not only massively overestimate the functionality of today's tech to receive something like our consciousnesses, but even more so, by orders of magnitude, underestimate just how complex our living bodies are.
We have only the vaguest of ideas about how our physiology works (while we might be able to replicate flesh cells for "fake meat", we have zero understanding of, or control over, how those cells organize to form macroscopic organs). Applying this to the brain, our understanding is even more primitive. An example would be the recent news that the brain may not be sterile but may host a microbiome; whether or not it does is still controversial.
We're still hundreds of years away from a comprehensive understanding of physiology.
But of course, we're never going to live that long, because we still believe (statistically as a species) in invisible guys in outer space that tell us we need to dismember people who believe in the WRONG invisible guy in outer space.
Our primitive, violent ape species will drive itself extinct long before we ever have a comprehensive grasp of how life works, let alone an understanding of consciousness...
Ultimately, I am going to quote one of my favorite writers [0] and say that I am not afraid of a life that ends.
I don't want to be a brain in a jar. Or in a computer either. I enjoy experiencing physical sensations and interacting with the world in meatspace. And if I can't enjoy either, then just let me die.
And I apply this to not just brain preservation, but any attempt to artificially prolong the quantity of my life at the expense of the quality of my life. I do not want to spend my last years in a hospital bed hooked up to machines and unable to move. That was how my dad died, and even then he was lucky enough his partner (who he had discussed this with before and who had the authority to make the decision) eventually agreed to switch him to palliative care in his final hours. Similarly, I have seen what chemotherapy does to people, and I have long since decided that if I ever get cancer, I will refuse chemo and let myself die. I am also having a living will drawn up that includes a DNR order, multiple scenarios where doctors will be ordered to pull the plug, and a prohibition against anyone amputating any of my limbs or sensory organs even if it's necessary to save my life.
I will make sure I die with my autonomy and my dignity intact.
[0] Al Ewing. He writes comics. Read his stuff, he's good.
Do you have a source for this quote? Googling just returns this page.
I was particularly struck by:
> if I ever get cancer, I will refuse chemo and let myself die
And figured this quote must be at least 20 or 30 years old? Cancer isn't necessarily a death sentence, and many treatments are much less harsh than they were 20+ years ago.
This seems a bit extreme. Chemotherapy and its effects can be a very temporary thing, and your quality of life can go back to normal after you've finished your course and the cancer has gone into remission. Certainly there are aggressive cancers where you'd be fighting a painful battle of attrition, but there are many cancers where prognoses are good, and quality of life once treatment is done is more or less the same as before. A blanket personal ban on chemo is reckless and shortsighted.
The prohibition against amputation and sensory organ removal is a bit nuts too. You'd rather die than have someone remove one of your eyes or ears, or say a hand or arm or foot or leg? That is profoundly sad, and intensely insulting to anyone who has had to deal with that sort of thing and has nonetheless lived a full, rich life.
I get that many medical interventions do actually have a terrible, permanent effect on quality of life, but these seem like pretty extreme views that ignore reality.
I don't know what the commenter who posted about chemo and amputation actually thinks or believes. But I hesitate to call them "nuts" or to lecture them about how they have a wrong opinion. And I would not expand their personal opinion as a judgment on people who decide they can live with the effects of chemo, or amputation, or loss of an eye, because nothing in the original comment included a judgment on other people. Everyone has their own threshold for what they consider a life worth continuing, but we should not impose our own thresholds on other people, or judge them for making different choices.
For me the question goes beyond "Can I survive chemo (or amputation) and resume something like a normal life?" When you have to face cancer or loss of a limb or any illness or injury that threatens your life, or perceived quality of life, or dignity and autonomy, you necessarily have to think about what that means for your future. Until you get a diagnosis of (for example) cancer you don't know what it feels like, or how you will react, to the fact that no matter if you survive the treatment or not, you will always have that threat and reminder of your mortality in your conscious thoughts. You think about how you might not get so lucky the next time, how much your treatments might cost, what your illness might put your loved ones through, how far you will go to keep yourself alive even when it imposes costs and obligations on other people. And you think that maybe other people will have to make hard decisions about your future if you can't. A cancer diagnosis doesn't just affect me, in other words. If I lost a leg or arm that would impose burdens on my wife and family, affect my ability to make a living. Those thoughts more than the medical condition itself lead people to arrive at opinions such as the original commenter expressed.
Having faced my own mortality already I know I think more about how my own end of life scenarios affect other people more than how they will affect me. I worry that I will suffer a stroke, or slip into dementia, before I can pull my own plug, leaving people I care deeply about with that awful obligation, and the burden of caring for me rather than living their own life. And it's that thought, not the fear of disease or dying, that leads me to my own ideas about how much I might endure, because I won't endure it alone or without cost to others.
I suspect part of extending human life much beyond 120 years is going to be finding ways to delay physical adulthood, so that proportionally you still have the same time to learn and grow, and those growth hormones are still kicking around repairing things for longer. Because the quality of life 100 years after your organs have stopped repairing themselves is not going to be that great, but if you could reduce that to 80-90 years then maybe.
> (…) and a prohibition against anyone amputating any of my limbs or sensory organs even if it's necessary to save my life.
> I will make sure I die with my autonomy and my dignity intact.
Amputees have autonomy, dignity, and rich lives. To believe that the loss of a limb is so severe that death is preferable is absurd and insensitive.
What if instead of requiring an amputation, he loses faculties by accident like suffering from parosmia due to COVID or having a weight crush a body part? Did he suddenly lose his dignity? He certainly lost some autonomy. What’s the next step then?
Many people end their life when they find it's too painful to live. Many more wish they could -- the debate around end-of-life issues is raging in many countries.
If having to undergo a few months of chemotherapy in order for your cancer to go into remission is "too painful to live", then I think someone's threshold for pain is way below that of the average person, to a point where that's kinda sad.
I know several people who have gone through chemo and came out the other side happy and healthy, after recovery. They live full, rich lives. They are much happier living than dead.
Sure, there are some cancers where you end up with declining quality of life for months or years before you eventually die. I wouldn't fault anyone for deciding to opt out of that from the very start. But that's not exclusively what we're talking about: the person upthread was very absolutist and rejects chemotherapy in its entirety.
What’s your point? I support the right to euthanasia, nothing in my comment contradicts that.
We’re not talking about someone in pain wishing to die, we’re talking about someone vehemently arguing they would rather die than live without a limb, without having experienced it. And their reasoning is a lack of autonomy and dignity, none of which are a given.
There are literally millions of people without limbs, half a million new ones per year in the US alone. They’re not poor invalids, they’re people who adapt and can do things we only dream of, while living normal lives.
> To believe that the loss of a limb is so severe that death is preferable is absurd and insensitive.
No. Denigrating someone for expressing their personal opinion seems absurd. Since the commenter did not impose their opinions on other people, you had to put words in their mouth in order to call them insensitive.
I prefer to die with autonomy and dignity as well, meaning I would like to pull my own plug. That other people might have a different threshold, or want to die differently than I might, seems neither absurd nor insensitive. The commenter just described their threshold, they didn't judge other people.
> Denigrating someone expressing their personal opinion seems absurd. (…) The commenter just described their threshold, they didn't judge other people.
My sentence does not judge the person, it criticises the belief. Learn to differentiate or you’ll be doomed to a life of ad hominem attacks and taking things personally.
If person A says they love spiders and person B replies they find spiders repulsive, there’s no value judgement passed on person A.
My remark was not a commentary on yourself, your world view, the author, or your approval of them. I don’t know you.
> I prefer to die with autonomy and dignity as well
Who wouldn’t? By itself that statement is meaningless. What’s in question is how one defines the terms.
I invite you to take a closer look at that quote and understand what it means to the people who live those situations. Let’s exaggerate to make a point: If someone said they refused to be treated by a black doctor even if their life depended on it, and followed up with the remark they would make sure to die with dignity, do you not see how that would be insensitive to black people? A writer, especially an ostensibly good one, would understand that basic sentence structure.
Again, that is a purposeful exaggeration to make a point. I’m not making a remark on yourself or the author, I am disagreeing with the belief.
> My sentence does not judge the person, it criticises the belief.
An opinion or belief can't "be" insensitive. A person may intend to say something insensitive, another person may interpret an opinion as insensitive (as you did when dragging in amputees and people suffering from other conditions and injuries). "Insensitive" can only refer to a person's intention or another person's reaction. So calling someone insensitive for their expressed opinion does indeed judge the person.
> Learn to differentiate or you’ll be doomed to a life of ad hominem attacks and taking things personally.
Surely someone as skilled in rhetoric as yourself can see the irony of you warning me about "a life of ad hominem attacks" embedded in an ad hominem attack. Then you followed up with the implication that I don't understand "basic sentence structure." Address my actual comment rather than telling me what I need to learn and how I will get doomed for not thinking like you.
As for spiders and racists, those have nothing to do with anything in this thread. If someone says they don't want to live if they lose a limb or face chemotherapy, whether you agree with their stated choice or not, no other person or race got mentioned or implicated in the comment you replied to. Setting up a false and deliberately inflammatory analogy to make your point, equating an opinion about perceived quality of life with racism, doesn't help your argument. Try sticking with countering the arguments the commenter (and I) expressed.
Personal opinions about end-of-life care, personal autonomy, dignity have the same flavor as religious beliefs: you can't counter them with logic. Just calling someone wrong or "insensitive" or "nuts" as some other commenters have misses the mark, because the subject involves beliefs, not facts that we can argue. One can express their own different opinion, but going beyond that starts to verge into attacks on personal beliefs, which requires making assumptions about another person's faculties, judgment, and ad hominem, all of which you have deployed in your comments.
If someone else is free to decide that they'd rather die than lose an eye, or rather die than have to experience a few months of chemotherapy in order to be cancer-free, then I am also free to decide that those views are absurd and extreme, and reflect a deep misunderstanding of medical outcomes.
> Denigrating someone expressing their personal opinion seems absurd.
There's a difference between saying someone is foolish and saying their beliefs/opinions are foolish. The former is not what the GP did.
> then I am also free to decide that those views are absurd and extreme, and reflect a deep misunderstanding of medical outcomes.
I don't agree. You can decide that another person's expressed opinions don't align with yours, according to what you believe and think you understand about medical outcomes. The original comment didn't mention medical outcomes so I hesitate to judge how much the commenter knows about that. And I hesitate to call someone's personal views absurd. They have opinions I may or may not share. I can't make a rational argument to prove them wrong.
A person's beliefs can't "be" foolish or even wrong. Belief by definition does not come from an objective and rational evaluation of facts and probabilities. I can say I hold different beliefs, but no more.
We most often encounter this kind of argument around religion. Someone can sincerely hold religious beliefs that don't submit to rational and objective argument. We can have different beliefs but we can't prove someone else's beliefs wrong. To call a belief that you can't argue against with reason "foolish" or wrong equals calling the person holding the belief foolish and wrong. You can show that chemo can work and people with cancer can recover. You can't say how any individual should feel about that, or how they should choose to deal with a cancer diagnosis. The original comment didn't make any statement about whether chemo works or not, or whether some people can thrive with dignity after losing a limb. Rather the original comment expressed one person's belief about how they feel about those possibilities, for their own definitions of autonomy, dignity, and quality of life.
One of Don DeLillo's later good novels is about this stuff (Zero K).
I always think people's attitude toward possible future worlds is interesting. You can see a wide spread of opinion in this thread -- whether you think functional immortality would be a good thing says a lot about who you are. Ditto for colonizing other planets, automating all work, building AGI, and so on.
I suppose I'm on the side of the technologists. I think immortality is probably possible and humans should try to achieve it. But along the way it will mostly be snake oil and cults. And, of course, it's all but guaranteed that everyone in this thread isn't going to make the cut-off.
I'm certain immortality is possible, and it's also likely to be achieved, because we always do everything we can do, regardless of consequences.
But I think this is the acme of selfishness. I don't want to be immortal, and I wouldn't want to live in a world with 500-year-old know-it-alls running around "oldsplaining" everything to everyone else.
I have, thankfully, a fairly good chance of dying before that happens.
How is immortality selfish? Selfishness requires taking from other “selves” who have unmet needs of their own. But there’s every reason to believe a society of immortals could either function perfectly well without producing new selves, or that it could choose to reproduce at a slow rate sustainable with its ability to extract resources to support itself. Any new selves that were born would be provided the same opportunities that we provide new selves in the present day—breastfeeding, education, healthcare. How would that be “selfish?”
Is it selfish when a centenarian lives past 100? Is each additional year of life obtained by a centenarian “selfishly” stolen from some hypothetical unborn self?
They'll decide when they want to decide. Some might choose to actually live forever, and that's fine. Others will choose a more current-human type lifespan, and that's fine. Some will choose 150, some 300, some 1000, some 10,000. All of those numbers are fine.
> And how will they do it?
There are already humane forms of medical euthanasia performed in progressive places in the world; this question already has answers, and likely more will be developed over time. I don't think it's an important question or issue to discuss, as long as people have legal options.
They'll decide when they're ready. I would love a little more time on this planet. And when it's time, I'll hop in the nitrogen pod. People are already making that decision in some parts of the world.
Life expectancy has been increasing over time, especially in the past century or so. I don't think it's credible to suggest that civilization has progressed meaningfully slower now that people live to be 80 or so instead of only 30, as was common historically.
And even if immortality "stalls" humanity, so what? People matter, not technology or some amorphous concept of "progress".
Why 100? You can also make way for the next generation by living 80, or 60, or 40 years. Yet no one would be okay with that option. Funny thing is historically speaking 40 is a lot closer to the useful human lifespan than 100. So this strong belief of yours is really driven by advancements in society and healthcare over the last few decades. Why do you think that won't change drastically another few decades from now?
"Funny thing is historically speaking 40 is a lot closer to the useful human lifespan than 100."
That isn't really true. Life expectancy was historically driven down by high infant mortality and lack of medicine. The meaningful human lifespan has been in the 70s for the majority of history. (Lifespan is different from life expectancy)
"So this strong belief of yours is really driven by advancements in society and healthcare over the last few decades."
Who says it's a strong belief?
"Why do you think that won't change drastically another few decades from now?"
Because there's no real evidence to support that. Life expectancy hasn't gone up drastically over the past 50 years. Rates of chronic illnesses, including things like dementia, have gone up drastically. So even if people are living a couple years longer, they're generally sicker and it's costing more. Even if medicine makes drastic improvements, 100 is still a lofty goal. I'd be fine making it only 80 too. I'm actually skeptical that will even happen. What I do know is that I don't want it to take more than 100 years.
You’re being defeatist and ignoring the evidence presented in the article—even hospice patients want to live longer. You, too, will desire to live before (and hopefully: if!) you breathe your last.
This is because the entire goal of the sentient consciousness is simply to preserve itself as long as possible. DNA has the essential goal of replicating itself in reproduction. Consciousness, by contrast, appears to have no goal other than self-preservation. People sometimes choose to sacrifice themselves, but usually only when death is inevitable and they wish to save someone else from it (Lily/Harry Potter and Medal of Honor type situations).
I'm not really being defeatist nor ignoring evidence. Perhaps I just have a different perspective. There can be moral/ethical arguments for why mortality is a good, or at least useful, thing.
That's fine, but please don't stand in the way of those of us who would love to experience the world on a longer time frame, and are frustrated that the current level of medical knowledge doesn't allow it.
Why shouldn't we be frustrated by aspects of the natural world? Bad weather, disease, death and so on. Was eliminating smallpox odd because we had no entitlement to expect that?
Those things are about the timeliness - bad weather one day vs another, some people get the disease and others don't, early death versus a longer life. It's about what is reasonable to expect. It might be reasonable to expect good weather on a specific day, or even to live past the age of 50. It's unreasonable to expect to live indefinitely.
Yeah, I find the need to live forever kind of.. juvenile? You can’t let go of your ego for long enough to realise that at some point it’s better to make room for a new human with new perspectives and new ideas?
I like to think of it this way: if life was a game would you want to play the same character forever? No.. if you’re gonna keep playing the game it’s more interesting to start from scratch now and then. I don’t believe in reincarnation. There’s no need to. What you really are deep down is an instance of humanity. Almost all your genes and all your culture comes from and is shared with other humans. Any new instance (new human) is you playing a new character, essentially. If you’ve contributed to shaping the world you’re leaving behind this is even more true.
Unless you believe in a soul in the Christian/Jewish/Muslim sense, I guess, but then why would you fear death?
IMO the pursuit of immortality is far more dangerous and far more likely to kill humanity than AI. At the least it may make us deteriorate into insignificance. Humanity is a superorganism, and we have a name for the phenomenon where part of an organism figures out how not to die and yet still replicates: cancer
We don't need to live forever, as shown by the fact we've got by without it so far, but death is kind of depressing. I've never really got the distinction that, say, killing millions in the Holocaust is terrible but similar millions dying through age is desirable.
Eternal life doesn’t necessarily mean being impervious to harm. If you live indefinitely because your flesh brain was preserved or there’s a digital copy of you on a hard drive, a simple drop on the floor could terminate your existence.
And if we’re talking about fiction, there’s no obligation to make those immortal lives unbearable either.
> Our flesh is indestructible. Our lives are never ending. But not even in the dumb vampire way where after a while you hate it and you can’t die. We can die whenever we want. We just don’t have to.
Catholic theology actually justifies the belief in Hell by arguing that an eternity of suffering in Hell is a blessing, because it admits the one benefit of existence itself, while total annihilation has no redeeming factors whatsoever.
I've been thinking a bit about how living on could play out without being too futuristic with the technology. Coming back from cryopreservation is probably many decades out but we are quite close to being able to make a virtual AI version of you, working a bit like an actor playing you. Tech like Heygen is quite good with avatars and voices, and chatgpt type AI could pretend to be someone in a pretty terrible way just now but that stuff will get better and the virtual you could act as an assistant, learn your ways and help out.
Then when the physical you passes and maybe is cryopreserved the virtual you could conduct the funeral type service and say hey guys physical tim333 is gone but I'm still here and you can chat to me on the web etc. Virtual you could maybe have some investments and could buy gifts for the grandkids and post on HN etc. Then in a few decades virtual you could get data from cryopreserved you and incorporate it.
Part of our personality lives inside our stomachs as well, apparently, and so much of who we are is driven by our interaction with our nerve endings. I think even with a perfect copy of consciousness, it's pointless without a perfect copy of our bodies as well.
I think that AGI, when and if we ever reach that stage of technological innovation, will enable us to live in a utopian world where our bodies are impervious to biological defects or age and we have little to no reliance on food and water — where we can control all aspects of how we feel with a slider, or a thought.
Like the Lotus-eaters from Homer's Odyssey, some alien tribe will eventually show up and decide we're imperiled instead of living a perfect life in blissful stasis, and kill us to stop whatever afflicts us from spreading to them.
Anyway, I don't want the brain preservation thing, thank you.
One of the absolute best short movies I've ever seen is The World of Tomorrow.
The whole thing is cute, but every time I read about brains in boxes I can't help but think of this scene about grandpa's consciousness being uploaded into a cube:
We are also able to download correspondence from him. [...] I will read one of his letters to you now.
"Oh. Oh God. Oh God. Oh my God. Oh Holy Mother of God. Oh, oh, oh, oh, oh God."
I like how the video game Soma showed it. If you fork a brain you kind of have a 50% chance of it continuing into the copy, and 50% chance of you being "left behind".
https://www.youtube.com/watch?v=x790AjID0FA
I get the point, but I think the most self-consistent answer is that your conscious experience has a 100% chance of staying in the original body. (And similarly for destroy-and-remake teleporters.)
Indeed, the coin toss explanation "Catharine" uses on the Simon copies is merely a manipulation to ensure he continues following her instructions so that the ARK is launched, obscured to the player by the necessary mechanic of always controlling the surviving Simon copy. The only "real" Simon died in Toronto.
If all the theory is correct, yes you are still you. Maybe damaged you, but you nonetheless.
Imagine the Ship of Theseus thought experiment, but instead of replacing it part by part you store it in a dry dock (it loses its function), and some time later you put it to sea again (it recovers its function). For all intents and purposes, this is still the Ship of Theseus.
I love how people have wet dreams about living forever by uploading their mind to a computer but put absolutely 0 fucking effort into actually increasing their health/life span in real life, and/or waste their days/weeks/years doing stuff they hate to "enjoy life" later (when their body is already halfway rotten). As long as it's sci fi and requires 0 effort from them, they'll suck up every single drop of hope, but as soon as there is something actually actionable they recoil in horror.
imho if you're not lean and exercising every day you have no business talking about living longer; you've already refused the only magic pill there is. It makes all the difference between having one foot in the grave at 60 or still chopping your own firewood at 80, and all it takes besides a bit of luck is to move your ass 45 minutes a day
There's no contradiction here. You're talking about people who have already detached the concept of self from their physical body. They think of 'me' as their brain, so why maintain the body they're 'trapped' in.
People who try to solve all their problems with intellect tend to suffer from this. And many of them never learned that being in shape makes you feel a whole lot better.
From what I've seen of research on apparently-dispersed storage of memories in worms, I'd not be at all surprised to find that a human brain separated from a body has (assuming we could "boot it up" in that state, as you put it) lost a lot of memories or functionality beyond the obvious, even assuming we could perfectly preserve everything present in the brain per se.
Not volatile storage for most of it. If you get knocked out or similar you keep most of your memories. It's probably partly down to the structure of what's connected to what and partly chemical changes at the synapses, although I don't think it's fully understood.
It'd make it pretty hard to get memories out of a brain sample as it's hard enough just to see the structure in an electron microscope. I don't think they have any way to log the chemical changes in the synapses presently.
We can barely even do “artificial intelligence” correctly, let alone mapping the human brain to a digital representation. It’s an interesting concept, but in order to “preserve” the brain, wouldn’t one have to have it dissected layer by layer, thus killing the subject in the first place?
Then storing all of that information without any loss would likely well exceed current technological limits. And the computing power needed to run the brain would have to be immense.
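For a sense of scale, here is a hedged back-of-envelope sketch. The synapse count is a widely cited rough estimate (around 10^14, with some estimates an order of magnitude higher), not a measurement, and the bytes-per-synapse figures are pure assumptions for illustration:

```python
# Order-of-magnitude sketch of the storage a full brain snapshot
# might need. Figures are rough, commonly cited estimates, not data:
#   ~8.6e10 neurons, ~1e14 synapses (some estimates run to 1e15).
SYNAPSES = 1e14  # assumed synapse count

def storage_bytes(bytes_per_synapse: float) -> float:
    """Total bytes if each synapse needs this much stored state."""
    return SYNAPSES * bytes_per_synapse

# A single byte per synaptic weight: ~100 TB, large but storable today.
print(storage_bytes(1) / 1e12, "TB")

# A kilobyte of molecular-level state per synapse: ~100 PB,
# and that still ignores dynamics, glia, and body chemistry.
print(storage_bytes(1024) / 1e15, "PB")
```

Even the optimistic end of this range says nothing about the compute needed to actually run such a model, which is the harder half of the problem.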
> We can barely even do “artificial intelligence” correctly
Even if we could, the brain is an amazingly efficient mechanism for computation. It uses such a low amount of power that it's hard to imagine a computer-based AI model ever genuinely competing with it.
If the environment and entropy are human-scale concerns, then AI is decidedly a dead end for us.
Leaving aside whether or not I think this is A Good Idea[1], I am fascinated by what other limits people would run up against, if this were possible. Memory, for example. Is it even possible to retain a first-person memory for hundreds of years? Is there some upper limit above which you can no longer form new memories? Would it become much harder to motivate yourself without the pressures of a limited existence? How would it affect our psyche?
> Memory, for example. Is it even possible to retain a first person memory for hundreds of years?
It's not. But it's fine, because it's not even possible to retain memories of typical lifespan or even short one or even a decade or even a year. You lose way more than 90% of it and the rest gets largely altered anyways.
I think that immortality would be a disaster, personally. That we die is a critically important aspect of life. I'd prefer that we work on ways to make the death process less traumatic.
Why? We have already decimated any semblance of natural selection so death is no longer a necessity from an evolutionary standpoint. Maybe immortal humans would be more beneficent because they wouldn't be scrambling to get ahead in their short time here.
If I didn't only have 30-40ish productive years to achieve whatever I will in this life it would be much easier to donate money or time to charitable pursuits.
> If I didn't only have 30-40ish productive years to achieve whatever I will in this life it would be much easier to donate money or time to charitable pursuits.
Spoiler: The haves are not going to get any more generous when they've got thousand-year lifespans. You would just end up having to spend hundreds of years grinding away at the bottom of the ladder instead of 30 or 40.
> We have already decimated any semblance of natural selection
We certainly have not.
> If I didn't only have 30-40ish productive years to achieve whatever I will in this life it would be much easier to donate money or time to charitable pursuits.
True, but if you lived forever that would mean it wouldn't be realistic for new people to come into the world, which means far fewer new ideas and ways of looking at the world. That would be a net loss for humanity.
1) There's no danger of overpopulation. People have a natural tendency to reproduce slower when they feel safer.
2) Trivial argument: if people already lived indefinitely would you advocate murdering them to "make room"? Telling people they shouldn't be able to pursue a longer life is equivalent. Making that decision for yourself is perfectly fine; making it for others is not.
3) 150k people die every day, nearly 2 people per second. If fixing that tragedy creates new problems, bring them on; we'll solve those problems too.
1) Is not clearly true. Yes, 'feeling safer' pushes down on reproduction rates. But the _total_ effect on population growth could still be positive if the total death-rate drops enough -- we don't know enough to say for sure. And frankly, I think the most likely outcome is that people would be more likely to have kids if they didn't have to worry about missing out on their chance at XYZ dream.
2) Not true from most moral perspectives, including 'common sense morality'. In a pure utilitarian sense, sure, but most people don't subscribe to that. For example, choosing to not save someone from a burning fire is not the same as choosing to burn them to death. Both the actor and their intention matter.
3) I don't disagree with the first half of your point (that this is a tragedy) but I cannot share your optimism re.: us solving the consequent problems. If there's anything that the last fifty years of modernity have shown, it's that we're actually quite bad at solving broader social problems, with new and even-worse problems often arising well after we thought the original problem settled. Consider global warming (to which the 'solution' looks to be the further impoverishment of the third world, and probably mass deaths due to famine/drought/heat waves), or how we in the US 'solved' mobility by destroying main streets and replacing established public transportation with cars and mazes of concrete. Now we've "solved" loneliness by giving everyone a phone and -- well, I'm sure you know how that went.
1) We already have a growing population, and I don't think it's inherent that curing mortality must make it grow faster. The net effect would certainly be an ongoing upwards growth (since I would hope that population never goes down), but I'm arguing that the net effect does not inherently have to be unchecked exponential growth. Immortality doesn't solve resource constraints, and resource constraints do influence people's choices. That said, I also believe that even if it did result in faster growth, that isn't a reason to not solve the problem.
2) The equivalence here isn't "choosing to not save". Choosing to push someone back into a burning building, or preventing them from trying to escape, is equivalent to choosing to burn them to death.
3) I am an incorrigible optimist and don't intend to ever stop being one. Humanity is incredible and it's amazing what we can solve over time. I don't believe that any potential solution we might come up with is worse than doing nothing and letting 150k people die every day.
I like knowing that the worlds biggest assholes sometimes lie awake at night fearfully pondering their own death. I don't want to deprive them of that.
Plenty of the most-powerful already keep causing harm just to make number go up even more, well beyond the point at which they can conceivably personally benefit before they die. Imagine if they could conceivably personally benefit from that because they live for centuries. Why would anyone expect that to improve their behavior?
When I look back on my 45 years of life, there are spans which feel like a different life altogether. I thought differently, and made choices that I wouldn't make today. I'd say "in my former life" as if that life ended and a new one began. I suspect youthful immortality would be a sequence of many deaths and rebirths. If you had the neuroplasticity of a 25-year-old and the experience and wisdom of a 50-year-old, I imagine it wouldn't get boring, and perhaps new ideas and modes of living wouldn't require a generation to die and a new one to be born.
What if death is just a 'feature' of how life evolved on this planet? What if we discover life on other planets that is just endless? It seems too anthropocentric to think that all forms of life must die.
Ok, that may be part of how 'WE' define life, but, for me, that looks a lot like an arbitrary definition.
It's a bit like fighting against the ocean imho, it doesn't matter how much you put into it you'll always lose eventually.
It's much simpler to accept and live within your constraints than to waste your life and mental energy wishing you could be/do something you will never be/do.
If you think about it, most of the things we "fix" are extremely wonky. Even something such as a bone fracture isn't guaranteed to heal 100%, most medicines have massive side effects, organ transplants have something like 50% survival at 15 years on average, &c. We think we're getting more and more control over things, but most of it is a hack job temporarily delaying the inevitable.
Also, anyone thinking being uploaded to a computer forever is heaven on earth must live a pretty fucking terrible life to begin with
To be frank, the extent to which the very old now dominate science, business and politics is already unhealthy. I shudder to think what our world would look like if our most important positions of power were dominated by men born 110 years ago.
They say science advances one death at a time. Looking at congress, I think you can say the same for politics as well.
It doesn't matter how much you like sci fi; even if it were technically possible it would be reserved for the 0.001% and you'd still be grinding your whole life just as you are doing now.
Yes. I have long believed that all those who applaud death as a good thing — Steve Jobs most memorably in his address at Stanford — will be first in line for life extension once it becomes an option.
There are some externalities at play here though...
- We currently only have one world that all living humans must share.
- Imagine the sickening amount of power some people would be able to gather given a few centuries. That can't be good for everyone else.
Well, people already gather that sickening amount of power - not for themselves, for their heirs, but still - take the soon-to-be-again US president: he probably wouldn't be where he is right now if his father hadn't amassed a considerable fortune which he inherited.
Imagine people like Putin pouring billions in, easily sacrificing millions to achieve such a thing for themselves, and ideally nobody else. A truly terrible scenario worth fighting against.
Some folks are scared of rogue AI, when the biggest threat to mankind always was, is, and will be other, properly messed up humans with certain capabilities.
I don't know why you're getting so much pushback... Planetary resources are finite. If you give up dying, you have to give up reproducing beyond the replacement rate. People like to imagine they'll be part of some small tribe of a lucky few immortals, but the reality is we'd be in exactly the same situation as today, but with a population rapidly screeching beyond all known sustainable limits far faster than it is today. To name just one obvious problem.
Success of something like this entails a way to regulate reproduction at a far more draconian level than even China's one-child policy. I don't think any civilized nation could impose a "no child" policy and remain intact.
> but the reality is we'd be in exactly the same situation as today, but with a population rapidly screeching beyond all known sustainable limits far faster than it is today
I don’t think this is a given. Most developed nations have bad birth rates for example
Setting aside the actual physical and technological limitations here, I think such immortality would create a whole new kind of population problem. Population growth, in some sense, would explode. Or, more realistically, you’d have two classes of people: those living, and those “immortal,” those who could afford immortality, and those who were poor and would have to die permanently.
That said, I think this is all a pipe dream and totally infeasible.
Alastair Reynolds wrote some great books which touch on this concept - the Prefect series has an AI character who originally started out as a fatal brain scan of a person and who eventually escaped...
A fun way to die would be to transition digitally into the cloud as data for some sort of future "humanity LLM" that would be queried in the future. The Council of Elders.
I'm reading a book from 1974 right now. The book I read before that was a 2013 adaptation of a book from 1605. The book I read before that was from 1956.
The book I read before was from 1985: I started it while the author was still alive but he died before I finished it. The book I read before that was from 1998.
I'm kinda surprised everyone in that show doesn't walk around with full neck armor to protect their stacks. Metal gorgets should be all the rage in this universe.
Jaron Lanier in his book "You Are Not a Gadget" uses the example of MIDI files: they can describe music, but even though they sound like the real thing, they are limited by the digital world. E.g. according to ChatGPT the minimum interval between 2 MIDI messages is 0.77 milliseconds.
And then he asks what sort of limitations might we have if our minds are software, and how would we not notice it?
> according to ChatGPT the minimum interval between 2 MIDI messages is 0.77 milliseconds.
Thank you for stating your source. However, ChatGPT isn’t deterministic. I asked it the same thing and it responded it depends on several factors, including the MIDI protocol version and the device or software used, and that the minimum between two messages is between 1 and 2 milliseconds.
Which of those is true? Perhaps neither. A quick web search didn’t provide a straightforward answer. Point being that we should avoid propagating even more wrong information, especially since it’s not relevant to your point (which makes sense).
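For what it's worth, neither figure needs an oracle: for classic 5-pin DIN MIDI the floor follows directly from the spec's 31250 baud serial link and 10-bit byte framing. A minimal sketch (the helper name is mine; this assumes the original DIN transport, not USB-MIDI or MIDI 2.0):

```python
# Wire time for a MIDI message on the classic DIN transport:
# 31250 baud, each byte framed as 1 start + 8 data + 1 stop bit.
BAUD = 31250
BITS_PER_BYTE = 10  # 1 start + 8 data + 1 stop

def message_time_ms(num_bytes: int) -> float:
    """Minimum gap between message starts, in milliseconds."""
    return num_bytes * BITS_PER_BYTE / BAUD * 1000

print(message_time_ms(3))  # a 3-byte Note On: 0.96 ms
print(message_time_ms(2))  # 2 bytes with running status: 0.64 ms
```

So the true floor depends on message length, and neither 0.77 ms nor "1 to 2 ms" is the whole story, which rather reinforces the point about not trusting a chatbot where a one-line calculation will do.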
So, I've seen Bredo Morstoel, the Frozen Dead Guy of Nederland, Colorado[0]. The 'tour' you could go on was ... something. But, at the end, we were asked to help keep Bredo frozen and helped pile on dry ice. So, they had to open the sarcophagus and get the CO2 in there.
And, yeah...
Bredo looks like Otzi the ice man [1]. He's just a dead body, there's no saving him, he's gone.
It was a good lesson in any thoughts I ever may have had about cryopreservation. Unless you pay someone a LOT of money, for a very long time, and somehow manage to get them to actually really truly care about you, you are just having a really strange and long funeral.
Good excuse for a party though. Those dead salmon tossers are something else.
I really hate this idea. The wrong people would get to persist indefinitely -- the malignantly greedy who hoard majority of the resources and make life miserable for billions of others.
Like so many scientific pursuits, this one has its roots in science fiction. A terrific trashy futuristic novel from the 70's by Lawrence Sanders called The Tomorrow File[0] features preserved heads that are kept around to spout ideas in the future, when they might become useful, among many other Brave New World-type concepts.
That internet archive version is particularly boring, isn't it. The novel is his best, if you ask me. His detective fiction didn't ever rise to that level and this was the only one he wrote in the sci-fi genre. Perhaps being relegated to the dime store novelist category in his other books prevented this from getting the attention it deserves, but it was wonderfully smarmy and prescient.
I wish we'd reframe the way this gets talked about. It's pedantic, but we can't escape death. All matter in the universe will eventually decay or transmute into something very different from what it now is. And my answer to the question posed of when you'd want to die is unknown. I can't say 150 years. I can't imagine a specific age I'd hit at which I'd want to die. But at the same time, while I might want to outlive the Earth, I don't want to outlive all baryonic matter and somehow persist into the age of the universe in which all other matter is black holes, and there is neither light nor sound, nothing to touch, and all I would ever experience is quadrillions of years of utter loneliness. There is no immortality and nobody would want it if they really thought about what it would entail.
But cessation of aging would be wonderful. I'd love to live indefinitely and not have my body or mind noticeably decay. Research like this should be done, but we need to be honest about what we're trying to accomplish. Nobody will ever escape death, and when you start getting large enough numbers, I'm not sure it would make any difference to live longer. Even if we somehow achieve brain uploading, which I see not as you living longer so much as making copies of yourself with different identities, every storage medium has a capacity limit. At some point, the only way to form new memories is to evict old ones, and the experience of living 90 trillion years won't be any different from 1 trillion years if 1 trillion years of experience is all you can store.
That said, we need to also be humble about what is even achievable. The very idea of a high resolution scan capturing the entirety of your brain state is already science fiction. We have another home page story right now about controversy over whether the brain has a microbiome. The only reason that's a question is because we have no means of opening up a living brain to see. We can't even accurately measure a person's body fat without dissection. The limitations of non-invasive remote imaging that don't kill the animal being imaged are quite severe and constitute a large reason medicine isn't more effective than it is. There is no technology we are on a known arc toward achieving that will make it possible to capture molecular-level detail of an entire brain as it is still running. I don't see how you can base an entire research project on a premise that doesn't exist and we have no idea if it ever will exist.
There are so many ifs and buts in this idea of resurrecting the brain that it's laughable, yet "brilliant scientists" make press with it. What about the completely unknown mind-body problem that scientism pretends doesn't really exist but each one of us knows is very real? What if you wake up 500 years later but plugged into a matrix? Who guarantees cryo-maintenance of the frozen brain?
Those hundred years won't exist for the dying. I would personally find comfort in knowing that I will feel waking up right away into a technologically much more advanced world
It's really debatable if religious people actually believe in eternal life. If they did, they would immediately see that their behavior drives them straight to eternal damnation.
They don't seem to care about that, and the only rational explanation is that they don't believe there's anything after death. They're nihilists. We all are.
What if the constituents of the brain change? Would you still feel like you, or would it be like another person just going on being conscious? If you don't know what makes consciousness, most likely you just "die" after the brain freeze. This isn't like sleeping or a coma, where your system is kept running continuously.
I think it’s more likely we can preserve our ego rather than our consciousness. For instance, create an AI replica of yourself that accurately behaves the same way you do. Although you would be dead, your ego can carry on living and responding to changes in the external world, and people could have interactions with you that accurately simulate how you would respond to them long after you’re gone. And as your ego learns about the world, it develops opinions closely similar to opinions you would have based on your life experience. Perhaps in this way people in power could remain in power indefinitely.
"1918 when diabetes had no known treatment"
What nonsense. A history of diabetes shows knowledge and treatment for the disease for thousands of years. https://en.wikipedia.org/wiki/History_of_diabetes
> your consciousness shuts down.
Consciousness is not a single isolated process, and describing a lull in activity as "shutting down" is incorrect.
> So, each morning is a copy of you wakes up?
Obviously not. Many of these processes also happen in reverse when you are awake. So when I go to sleep I'm tired and in a different mood. When did that "copy" get made exactly?
> Or do you draw a line in the sand and say that is different.
You can easily ask "how much of my conscious state is controlled by my body's chemical state?" If that's a non trivial amount then simply copying and hoisting my memories into a different body is clearly going to have a non trivial impact. To the extent that it's easy to say "that would not be /me/."
> Consciousness is not a single isolated process and describing a lull in activity as "shutting down" is incorrect.
But there is indeed a discontinuity in your consciousness when you sleep. This discontinuity cannot be distinguished from the discontinuity arising from a copy of your brain on its first "restart".
So how do you reconcile this discontinuity?
> Every time you go to sleep your consciousness shuts down. So, each morning a copy of you wakes up
I think it doesn't "shut down"; maybe it fades to a different, low-power mode. When you go to sleep and then wake up, there is still a continuity, because of the same underlying structures and processes from which consciousness emerges. So it is like a 4D thing.
That continuity never really breaks (until death? which is like singularity) and I think this is what makes you "you". You can't copy/teleport it (kinda by definition), but you can extend/transform it.
Perhaps a "Ship of Theseus"-style approach would work: gradually replacing neurons one by one, or brain sections piece by piece. Alternatively, the brain could be extended with peripheral "devices" connected to it with bandwidth comparable to the interconnections within the brain itself, up to the point where the biological part becomes irrelevant. This is similar to how a portion of the living brain can be removed while still preserving consciousness, as neurosurgeons often demonstrate.
There’s a difference when there could conceivably be two of you.
There will never be two of me in my bed when I wake up, so I’m the same.
If my mind could be copied and embedded in a robot or whatever, then there are two of me, each now diverging. That other one is not me, as far as I, the other, am concerned.
You'll need to be more clear in your last sentence.
If you are cloned while you are sleeping, and both the original and the copy are moved into different rooms before you wake up, both of you will think that you are the real you, and each will think the other is not. So, who is correct?
For me, the relationship is non-transitive.
If you tell me this will happen tonight, I would right now call both "me". Both of them after the event would agree that I am them. Both of them would agree they are different people to each other, even if they say so in perfect stereo unison.
Cue the "We are Bob" series, aka Bobiverse, by Dennis E. Taylor!
This entire thread just has me wanting to watch The Prestige one more time.
From my own perspective it would be impossible to tell, given the parameters of this thought experiment, but any outside observer who witnessed the cloning process and the staging of the second bedroom would know.
Moreover, if randomly one of the two dies and the other survives then either the original or the clone experiences death. So from the point of view of the original before the experiment, he has a 50% chance of dying. If we change the experiment to say that he has a 100% chance of dying but the clone has a 100% chance of surviving, the original would not agree to the experiment unless he was suicidal; obviously he has no reason to care that a different person will live on assuming his identity after he dies.
You're assuming that there exists such a cloning process, and in doing so, silently ascribing properties to that process that completely change the outcome.
You're in effect saying "Assume there is a cloning process that is instant and creates a complete copy of you, both mentally and physically," and sure, in that theoretical world, you can talk about there being no way to know which was you, the clone or the original.
But IRL, we have no way of knowing what the properties of such a cloning process would be - maybe it's impossible to create a cloning process such that mentally, it is a complete copy. Maybe you can always tell whether you're the original or the clone. Similarly, maybe we simply cannot create a digital copy of a mind that cannot immediately tell it is the copy. We have literally no way of knowing.
So instead of living in the philosophical realm, we can have a lot more productive conversation in the practical one. Practically, it strongly appears to me that I have continuity every time I wake up, and practically, if you copied my mind and ran it on a computer, that mind would not be the "me" that appears to experience continuity. Practically, it would bring me no comfort to know that this other mind that used to be a copy of my own continues onwards while I die, no more so than it would bring me comfort to know a twin of mine continues onwards.
> So, who is correct?
The one that wasn’t the clone.
It makes no difference. If it’s possible for the two versions to meet (they exist simultaneously) then what you’re describing is simple deceit.
A world where two of me exist is fundamentally and irreconcilably different than a world where I simply go to bed and wake up each day. It makes no difference if I personally am aware.
It's almost as if your thought patterns and language are a product of your experience of being the only copy of you...
If having copies of you walking around were a normal thing, our language and expectations would reflect that.
Right now, if another copy of you came by and claimed your wife and your house, you'd be miffed. But the whole concept of one of you owning the house is just an artefact of how individuality and possessions happen to work right now. It doesn't have to be that way forever. There was a time when you could steal a picture and thereby have the only copy; now you can save a copy on your phone and you don't deprive another of theirs. Some day, people and stuff could behave the same way.
Irrelevant: cloning doesn't, and probably never will, happen.
Reminds me of Filmcow: https://www.youtube.com/embed/ur9jLK_6EcY?start=70&end=100
Every nanosecond the set of atoms constituting you changes, and each of those is a different you.
There's a difference between "I have changed" and "I have been copied." This seems obvious enough to me that I don't question it, but is it not obvious to everyone?
Copies are impossible because every moment is change with bounded but non-zero uncertainty.
The fact that "you are different" does not mean a "different you" now exists, though. Change of this specific kind is already intrinsically integrated into your existence and identity.
Every moment of consciousness is different. It is a different you in that sense. What is the strand that keeps it the same ‘you’?
Honestly, I'd like to have my mind copied, but it may only be 'simulated' once the current me is dead. From my perspective I'll be dead and there's no changing that, but from the clone's perspective, life just goes on.
True quantum-scale copies may be impossible, but that resolution may not be necessary to fully copy a person.
Not if “you” is created by emergence.
Switching a sesame seed on my burger bun is not the same thing as scanning it and uploading the file to the cloud.
I do wonder sometimes if each period of being awake is the entire subjective experience of a sequence of independent conscious entities. Like I woke up today with the memory of yesterday, but possibly that version of me that had the subjective experience of being was an ephemeral phenomenon that ceased to exist when I fell asleep, just as I will tonight.
> Every time you go to sleep your consciousness shuts down.
What consciousness? Yes, I am aware that "sleep" means that a person becomes unconscious (trivially demonstrated), but that is not the same thing as "consciousness shutting down". Supposing there is such a thing as consciousness, it doesn't seem to be anything that is suspended by the sleep process. And that's also a big "if", because consciousness is just woo-woo nonsense talked about but never defined by people who haven't quite gotten over the idea that humans have some immaterial soul.
Consciousness has, I'm told, 40 identifiable meanings. Makes it hard to get to grips with it.
For example, I sometimes have direct awareness of the nature of my consciousness changing with tiredness. Sometimes that has been what I can only describe as if my homunculus has switched to autopilot and is now just watching a cinema of my senses.
But that's probably a terrible description if you've never experienced that yourself.
If I'm not conscious, it's rather difficult to explain what I'm experiencing right now.
This makes sense if you believe in a non-materialist self, like a soul. That model wouldn't be falsifiable: we can't measure it, so it can have whatever properties are convenient. You could then rule out, by fiat, the possibility that your soul would inhabit the copied 'you'.
Scientifically, though: of course an exact copy of you is you. If you don't believe in souls but still feel like this is not the case, that indicates your model of 'you' is incorrect.
There's no need to believe in an immaterial soul to think that a copy is a different 'you'.
It is enough to understand that having an independent "command & control" structure (for lack of a better term; C&C) is by definition what constitutes 'you'. C&C means that the individual can perform a self-contained sense-think-act cycle.
We can infer this from the following:
1. Identical twins, despite having identical genetic structures, have separate C&Cs, and therefore qualify as separate persons.
2. Conjoined twins joined at the body (including parts of the head) have two C&Cs, and therefore qualify as two persons (with varying levels of bodily control per person).
3. An individual with additional body parts (e.g., in the special case of being conjoined with a twin) has only one C&C, and therefore qualifies as one person.
4. Clones are a special case of identical twins, and still have separate C&Cs, therefore qualifying as two persons.
5. A person in a coma or with another condition nonetheless qualifies as a person with some sense-think-act autonomy, despite being dependent on someone else for continued bodily functions.
6. An individual with a transplant (e.g., heart) or an extension (e.g., pacemaker) is nonetheless a person because of their consistent C&C.
7. An individual with extreme memory loss remains a person with C&C (in brain, body, and genetics for the most part).
Any other special but naturally occurring cases (e.g., individuals with two brains, individuals with completely separated hemispheres, conjoined twins sharing a brain) would require that we either:
a. understand how brains and bodies work (and therefore make a more quantified measure of how much sense-think-act autonomy qualifies as personhood); or
b. decide based on a heuristic (1 brain = 1 person) or a principle (protection of bodily autonomy, i.e., 1 body = 1 person).
But none of these require you to believe in a soul in order to think that a digital clone is not 'you'. Unless, of course, you can prove that an individual either:
1. can sense-think-act with both physical and digital 'bodies' simultaneously and/or sequentially*, which share an equivalent experience (i.e., a central C&C coordinating distribution or maintaining strict consistency across bodies);
2. can sense-think-act with either the physical or digital 'body' at will, upon establishing a 'connection' and a successful transfer of state (i.e., remote access); or
3. can transfer whatever is necessary for C&C to another 'body' (e.g., a brain transplant), and demonstrate beyond reasonable doubt that their C&C signatures in the new body match the ones in the old one.
It's not an issue of the soul, if you make a copy there are two people now, which are clearly physically separate, and think independently.
Even if you somehow made a perfect instant copy, they'll start drifting apart, as they'll be experiencing different things.
this ^
Soul or not, it wouldn't be me. Imagine that I was alive and existed next to the cloud copy of me. It would be 2 different minds with a common core. A database restore to a new cluster, that immediately became different afterward because our two experiences would drift. Funny enough, the db term for a db cluster which has two writable primary instances, which have received differing sets of writes containing different sets of data, is called a "split-brain".
So if I exist and have my own thoughts and the cloud copy exists and has its own thoughts, and we are taking in reality differently than each other, we are 2 distinct minds. Now imagine I died. What has that changed for the copy of me? Nothing. It's still a different mind than mine. My mind is dead. I'm dead. A copy of me existing after I die does not change the fact that I died. It does not help me, literally me, live forever.
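The split-brain analogy above can be sketched in a few lines of Python. This is only a toy illustration of the "database restore" idea: two instances start from byte-identical state, then diverge as soon as they receive different "writes". The `Mind` class and its `experience` method are hypothetical names for illustration, not any real API.

```python
import copy

class Mind:
    """Toy stand-in for a mind's state: just a list of memories."""
    def __init__(self, memories):
        self.memories = list(memories)

    def experience(self, event):
        # Each instance appends to its own, now-independent, state.
        self.memories.append(event)

original = Mind(["childhood", "first job"])
clone = copy.deepcopy(original)          # the "restore to a new cluster"

assert original.memories == clone.memories  # identical at the instant of copying

# Two writable "primaries" now receive differing writes:
original.experience("watched the sunset")
clone.experience("read a book")

print(original.memories == clone.memories)  # prints False
```

From the moment of the first divergent write, no later event in one instance affects the other, which is the commenter's point: the copy's continued existence changes nothing for the original.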
What if you went through the procedure and afterward woke up in the artificial environment? You'd be aware that there's a biological remnant of the pre-procedure you that will still experience things you won't, including dying and death. But he's a separate entity, and you're glad it's not you that's about to die.
Basically, if you wake up after the cloning as your old biological self, then you just drew the short straw. It's up to you whether it helps that there's another you who will continue on with whatever you were doing, without the nuisance of dying. Some people might find calm in the idea that they can freely die and nothing of value will be lost.
What about Star Trek transporters? Would you refuse to use them, since you are getting disintegrated and what's created at the destination is just a copy of you, not you?
It is the same person but a different human. Even if they drift apart, their own personal identity is the same, just as if we split the universe, with one human taking path 1 and, in another universe, the other human taking path 2.
The moment they change, they are not the same person, just as I am not the same person I was when I started writing this reply, and yet, I still am "me", irrespective of the drift.
If you were only one person (big if), you would also experience things and drift. Would you stop being you at that point?
The person who is you exists independently of any copy of yourself. While a replica of you would exist, there would still be the original you, manifest in your brain, who would live an independent life and die at some point without ever "resuming" in another body. This is similar to how you can never time-travel yourself, even through relativity: your perception of time will be the same no matter what relativistic effects occur outside your frame of reference. You will live your life span and die.
Your replica however would have a different experience and be able to draw on your memories etc. But you yourself will in fact live a natural life with no shared experience with your replica and will experience death no matter what. It’ll be little solace to you that a copy got to live your life after your passing.
I don't believe in a soul in the sense of a "ghost in a machine", since it introduces so many problems. Nonetheless, it's pretty clear that there is an immaterial component to thought, and therefore to our existence, for various reasons. Examples:
1. If we are entirely material, we can't know anything. For example, if you're thinking about a tree, there must be something about you that has 'grasped' what it is to be a tree. This is impossible if your idea of a tree is nothing more than a bunch of neurons firing in a given way. There is no correspondence between neurons firing and a tree. So there must be more to your thought than neurons firing (or any other material process).
2. Similarly, if we're entirely material, we can't reason. A reasonable thought differs from an unreasonable thought only by its content. It can't differ in this way by its material components in the brain. Therefore, if we hold that some thoughts are wrong and others are right, we must conclude that there is more to our thought than the material.
How sure are you about (1)?
Is there no chance that “thinking” (or your term “grasping”) is not simply an emergent property of certain matter+states? A computer with a machine learning process seems to be able to “grasp” what it is to be tree (as opposed to a cat or a crosswalk).
You seem to be engaging in difficult philosophical concepts and jumping over important steps (like rigorous definitions or showing why something “must be”).
> You seem to be engaging in difficult philosophical concepts and jumping over important steps (like rigorous definitions or showing why something “must be”).
Yeah, sorry, I bashed that out too quickly. I'll try to slow down a bit, though I can't promise a full treatise :)
For a thought to be about reality, there must be something about that thought that makes it about reality. How this might be the case is one thing, but the fact that it is the case is nonetheless necessary. And for the thought to be about the reality, there must in one way or another be similarity between thought and reality. Again, the 'how' may be extremely complex, but the existence of the similarity seems necessary. To put it in opposite terms, if a thought is not similar to reality, then it is not about reality, and therefore can't constitute knowledge about reality. Do you agree thus far? If so, let that be premise 1.
I'm obviously talking about the concepts we use to think about reality, not about the words we use to describe it.
I will assume that we agree that thoughts can be about reality (even though they're obviously sometimes not) as premise 2.
So, given our two premises, suppose Bob is thinking about a tree. With materialism, where is the likeness between Bob's mind and the tree? It cannot be the neurons or the physical structure of his brain or body, because these things don't have similarity with a tree. The atomic structure of a brain, in any state, is different from a tree's. That much is clear. And likewise, a tree doesn't somehow appear in Bob's brain. Given our premises, this is a problem for materialism. If the likeness is not physical, and it clearly isn't, then what is it?
We can discuss the idea of a tree as 'emergent' from the matter&state. But honestly, I struggle with this concept. Every time I've heard the word used, it seems to be a way of restating the problem rather than solving it. Perhaps you could explain in a bit more detail how this overcomes the 'similarity' problem I've outlined above? ISTM you still get the problem that a bunch of neurons is not a tree, regardless of their arrangements. Either the neurons in some sense 'cause' you to think about a tree, in which case you need to ascribe to the neurons an ability to cause immaterial effects, which again seems a problem; or it is simply a way of saying "we don't know how it works". Again, so ISTM.
On computers: very briefly, a computer doesn't 'know' anything about a tree, or anything else. A 'tree' on a computer is a bunch of magnetic states on a hard drive, or transistors in particular states, or what have you. The meaning of a computer's output (that is, the link between the computer and reality) exists only in the mind of a human being. An LLM's text about a tree is no different in principle from the binary for the word 'tree' on an hdd, or a jpeg, or whatever. An LLM gives the illusion of knowing, but no more than that. Again, I realize this is a pretty summary treatment, but I think the tendency to describe the mind as a computer is misguided for this reason. You face exactly the same problem with the computer as you do with the mind. For my part, I simply reject premise 2 as far as a computer is concerned.
Anyway, very simply, if materialism is true, a thought cannot be about reality, and it cannot have a similarity with reality. Therefore, we can't know anything about reality. But we can know about reality (premise 2), therefore materialism is false.
There are a number of points where this thesis needs considerable work.
Perhaps the first is that you appear to be ambivalent about what you have in mind by 'similarity'. When you first discuss it, you say "...there must in one way or another be similarity between thought and reality. Again, the 'how' may be extremely complex...", but then you go on to say "...where is the likeness between Bob's mind and the tree? It cannot be the neurons or the physical structure of his brain or body, because these things don't have similarity with a tree..." - so this similarity could be extremely complex, yet you dismiss the possibility of it being anything physical without offering the slightest justification. You might find that convincing, but you have not said anything that presents a challenge to anyone skeptical of this claim.
You do not help your case by continuing "The atomic structure of a brain, in any state, is different from a tree's." This suggests that while you recognize that this similarity could be extremely complex, you do not completely grasp how complex a similarity between a physiological structure and something external to the body could be.
This impression is reinforced by your comments on emergence, which you appear to want to dismiss on the grounds that you struggle with it. Again, this is unlikely to be persuasive to people who do get it. For what it's worth, here's my go-to example of emergence: in the theory of evolution, fitness is a key concept, yet you will not find an organ of fitness by dissection - it is an emergent property of an organism's physiology in relation to its environment.
Much of the rest of your comment is in this vein, and it is rather like saying "how can music be recorded on a compact disc? CD burners use light, not sound." There is no point in belaboring this further, so let us instead return to your starting point - "for a thought to be about reality, there must be something about that thought that makes it about reality." A physicalist could respond by suggesting that there is a chain of physical causality running from the outside world to brain states via sense organs. When we recognize that sense organs do not capture complete information, and also that brain states may be influenced by previous brain states, this view accommodates the fact that we can have incorrect thoughts about reality. How this works is opaque to us at our current state of knowledge, but that does not mean that it does not happen - in fact, it would be quite hard for an anti-physicalist to deny that it does.
The bottom line here is that the correspondence between mental states and reality is not the problem for physicalism that you think it is.
Thanks for the response.
I will return to what I meant by 'complex' in a minute. But nothing in your response undermines my point, which is that similarity between thought and object-of-thought is made impossible by materialism. It is true that "a chain of physical causality [runs] from the outside world to brain states via sense organs", and it is also true that brain states will be different based on the causality imparted by the world, as well as previous brain states, genetics, drugs, etc. But causality is not likeness. A brain-state caused by a tree is not of itself the same as a thought about a tree, any more than something else caused by a tree (like the sound of rustling in the wind) is the same as a thought about a tree. The latter needs likeness, not just causality.
> yet you dismiss the possibility of it being anything physical without offering the slightest justification.
The only way there can be likeness if materialism is true is physical similarity. The brain would have to be physically like a tree when Bob had the thought of a tree for there to be likeness. If we take materialism to mean that everything can be reduced to matter, then there cannot be any likeness that is not of matter. And this means a physical/material/atomic likeness is the only possibility. Yet we know there is no such likeness.
The concept of emergence doesn't overcome this problem. It must either produce a material likeness, which we know doesn't happen, or it must cause some kind of likeness which is real-but-immaterial, which is a problem for materialism.
The 'complexity' I referred to was in the context of how we know there is likeness between thought and reality vs that we know there is said likeness. We can know that the likeness exists without knowing how it exists, or being able to describe it in detail. The point was that no matter how complex the mind/brain/process may be, we know likeness must exist in one way or another, because without likeness there is no knowledge of reality. The complexity of the process, whatever the process is, is irrelevant to the need for likeness. That was my point in raising complexity; I should have been clearer. Complexity does not change the fact that the only likeness for materialism can be a physical likeness.
The CD is not a valid parallel. There is a chain of causality between the laser's interaction with the disc, and the sounds produced by vibrating speakers. A chain of causality is not a likeness, as I stated above. Further, a CD has no knowledge of its contents, so there is no need for likeness (physical or otherwise) between its imprint and the sounds it causes. We have knowledge of the world, so there is need for likeness.
Anyway, the nub of the matter is that you need to show how there can be likeness that is not physical likeness if materialism is true, or you need to show that likeness is not necessary for a thought to be about reality.
Why do you have to show the likeness? There’s a strict level of fidelity being imposed here.
The brain has to only represent the sensations it encodes.
The tree and the brain are both material, so the phenomenon of the brain encoding it will also be material.
There’s no need for a soul to represent a further reality.
Even if a soul exists, wouldn’t that result in the same issue again?
The argument sounds like we are saying that there has to be a tiny tree in our heads for reality to be perceived. Since there is no tiny tree, we have to perceive it in our immaterial soul. Which just bumps the issue to the immaterial.
But then how is the immaterial perceiving the material?
I can see from your response that there are a number of points that I need to clarify.
Firstly, I am not at all surprised that you, personally, do not feel that your position has been undermined, and I don't suppose you ever will, but one of the themes of my response is that neither your intuition, nor that of like-minded people, is particularly persuasive; for that, you need something that would give a skeptic persuasive reasons for abandoning their skepticism.
From your statement "causality is not likeness", I see that I need to clarify the point I made in my penultimate paragraph. To do so, we need to go back to your second post in this thread [1], and in particular, to the paragraph beginning "for a thought to be about reality, there must be something about that thought that makes it about reality... And for the thought to be about the reality, there must in one way or another be similarity between thought and reality."
The point in my penultimate paragraph is this: If we suppose that there is a causal flow from the real world to brain states via the sense organs, and adopt the premise that (changes in) mental states are caused by (changes in) brain states, then we have at least the outlines of an answer to your tacit initial question (what is it that makes a thought one about reality?) that does not invoke or depend on any concept of likeness or similarity. In what I quoted above, you certainly seem to be saying that the only thing that could make a thought about reality is that it has a similarity (in some nebulous sense, at least) to reality, but here we have another way for a thought to be about reality that does not depend on likeness in any sense.
At this point, I can imagine someone saying "... but this causal chain could well result in brain states that are, in some sense, similar to reality." I don't dispute this, as I am not arguing against the idea that such a similarity can be seen, I am arguing that a link between reality and thoughts can plausibly be postulated without an appeal to similarity or likeness in any sense, let alone an appeal to a likeness about which it is asserted, apparently purely on the basis of intuition, that cannot possibly be physical in any sense.
In the light of the above, my misunderstanding of what you were calling complex seems moot - an understanding of causality may potentially lead us to being able to say, with some specificity, in what way thoughts are similar to reality, so it is not inconsistent with the claim that there are no complex issues in seeing that there must be a similarity, even if it does not do anything to endorse that view. In the same way, it is consistent with your claim that the only likeness for materialism can be a physical likeness, without actually requiring it.
The above physicalist story has another thing going for it: it contains at least the outlines of a possible explanation of how thoughts are related to reality, while your story does not - as far as I can tell, it is confined to saying that whatever it is, it can't be physical. If you have anything affirmative to say about how thoughts come to resemble reality and what the similarity is, now would be a good time to present it.
Your response to my CD analogy does not need much attention, as it is just an analogy, but I see I did not make it clear what point I was aiming at, which is that while "how can it be that..." questions can be insightful when the person asking has a good grasp of what is going on, they can merely reflect ignorance when the questioner lacks that understanding - and when it comes to how the mind works, we are all ignorant to a considerable degree.
By now it should be clear that what you claim in your final paragraph is wrong: in order to show that your argument has failed to make its case, I neither need to show how there can be likeness that is not physical likeness if materialism is true, nor do I need to show that likeness is not necessary for a thought to be about reality. On the contrary, you have chosen to make a strong claim - essentially that the mind cannot possibly be the result of physical processes - and to sustain that, you need more than arguments grounded in appeals to intuition about how things either must or cannot be. In particular, anything resembling 'so prove me wrong' would amount to burden-shifting, and while we are about it, the alternative to 'the mind cannot be a physical phenomenon' is not 'the mind must be a physical phenomenon', it is 'the mind might be a physical phenomenon' (something that I believe is probably true, but which I do not claim to know).
[1] https://news.ycombinator.com/item?id=42312450
Hold a tree in your mind. Now describe it aloud. You have now produced a material effect in the physical world that resulted directly from your idea of a tree. Definitionally, that means it is not immaterial: you conjured it with material and measured it with material.
> Scientifically, though: of course an exact copy of you is you. If you don't believe in souls but still feel like this is not the case, that indicates your model of 'you' is incorrect.
How is that "scientifically" an "of course"?
How is it more "an exact copy of you is you" than the alternative claim, "an exact copy of you is 'you (1)'" (to borrow from file manager nomenclature).
The trivial example of that seems to be that if you make an exact copy, put it in a different room, and then something happens to the copy, that thing does not simultaneously happen to the original.
An identical copy necessarily cannot be measured in a way that distinguishes itself from the original. So "scientifically" because we're restricting the space to measurements of the physical world, and "of course" because the conclusion falls almost tautologously out of the definitions, no experiment needed. How can the thing scientifically not be you if it is not materially different from you?
To your example: I am not sure which of two points is being made, so I'll address both. I'm not saying that everything that happens to entity 1 also happens to entity 2, just that both are you. Two things can both be apples, even though biting one leaves the other intact. And if something happening to you makes you not 'you' anymore, 'you' isn't really a coherent concept across even a fraction of a second; you'd cease to exist and be replaced multiple times in that time.
In reality, no two things are the same. When we categorize things in any way, we make an error in order to achieve some practical goal.
So if we copy you into a robot and the copying process leaves you unharmed and awake, are they both you?
Yes, they are both me. It is admittedly a weird conclusion, but that doesn't make it false- we did a weird thing and got a weird result. Objectively, there is no case for saying either is not you.
I agree that they'd both be you, but I don't think that's what's at stake. "You" would only get to experience one of the two lives at that point: either the original's or the robot's. Presumably whatever consciousness was present before the duplication still resides with the original.
So there's certainly an argument to be made that if someone created a copy of "you" and put it in a robot, and then destroyed your organic body, that consciousness wouldn't "move" to the robot; it would stay in the organic body and be killed. The robot would be a completely new consciousness, but with all of your memories and behaviors and attitudes.
I'm honestly not sure what I believe around this. Perhaps we will discover "consciousness" as a physical thing and learn how to transfer that as well. Or not. Who knows.
I'm with you on this one - continuity is important too, it's not just about form. A robot version of me might be a suitable substitute to my loved ones, but it would always be outside my head and therefore not a continuous extension of my self.
So twins are both the same person? Not sure what you mean. They aren't both you; you create a new person who thinks the same way you do.
No, one is a copy of you. There is only one you. Literally. A copy can be perceived as being you, but it's literally, physically, not you. If it were you, literally you, you couldn't exist as both. Your copy could answer a question differently than you might, because the copy would immediately start having a different experience than you. If it were really you, that wouldn't be possible: you couldn't exist alongside it, and it couldn't answer a question differently than you.
I think it would both be me until it started to drift, which would be almost instantly.
Right. You'd have a memory of cloning your mind, and the copy wouldn't have a memory of that. From the first boot of the copy, it would be operating on a different set of data than you. And the fact that you exist alongside it means it's not you. It's a copy of you.
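That instant divergence is easy to sketch (a toy "mind" as a dict of memories; every name and memory here is made up for illustration):

```python
import copy

# A toy "mind": just a bag of memories. Purely illustrative.
original = {"memories": ["childhood", "learned to code", "initiated the scan"]}

# The scan: a perfect duplicate at one instant.
duplicate = copy.deepcopy(original)
assert duplicate == original  # indistinguishable at the moment of copying

# From the first moment afterward, each receives different input...
original["memories"].append("watched my copy boot up")
duplicate["memories"].append("woke up as the copy")

# ...and they diverge immediately, while sharing all pre-fork history.
assert original != duplicate
```

Both objects share every pre-copy memory, yet neither update propagates to the other, which is the "different set of data from first boot" point in miniature.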
As long as we're dreaming up stuff, why do we assume the consciousness wouldn't be shared? What if you woke up and there was one mind but two bodies?
Scientifically, though, an exact copy is likely not even possible, due to the Heisenberg uncertainty principle.
And I'm not sure that's the only issue, even for die-hard materialists: think, for instance, about all the problems that would come with multiple exact copies, things like ownership, relationships...
Another instance of me is a different instance of me than I am, thus it isn't me.
If I copy the contents of an Apple II disk onto my PC and try to "run" it... nothing happens. It isn't mapped to the new hardware. So who is going to write the "human mind/consciousness" emulator that will map onto a hard drive? Will they simulate all the inputs from your body, nervous system, senses, etc.?
And will they perfectly emulate the dynamic situation that is the brain?
You've mapped every neuron perfectly, you've even captured every ongoing signal and the whole thing is recreated perfectly. But that's not the entire story - connections are created and broken all the time, that's what memory is after all. Is it the "same person" if that snapshot is put into an emulator with slightly different results there? They clearly won't be having the same thoughts in a pretty short timescale even if the inputs are perfectly the same.
> This isn't preserving, it's copying. A copy of your brain/personality/you... isn't you... it's a copy. It's a cassette tape playing back the radio you recorded onto it.
That's just one opinion, though. We still don't know what consciousness really is. We don't know what makes us who we are. We don't know if some concept (scientific or supernatural) of a "soul" exists apart from the behavior that arises from the physical structures of the brain.
We just don't know. So maybe you're right, but that's far from clear.
I'd agree with you, except if my brain was scanned and uploaded to the cloud, I'd still exist in my original brain. There can't be two originals of something so the cloud version would be a copy of me, since the original me would still be alive, able to talk to the copy of me living in the cloud.
Suppose you died every night in your sleep and were replaced in the morning with a perfect copy of yourself with all your memories intact. Would you know the difference?
The copy who woke up wouldn't know the difference.
There would be thousands of copies that experienced going to bed and then nothing afterwards.
I had a great deal of difficulty going to sleep the night after this first occurred to me.
I only really managed to assuage my fear by realizing there would be an incredible number of dead bodies to deal with if this were happening to everyone every night, and it was extremely unlikely it was only happening to me.
This really hit me the first time I went under GA.
You're taking what the above person said too literally. They didn't mean physically dying; they meant that your mind dies every night when you sleep, and in the morning a new mind, a copy of your previous mind, is what wakes up and continues on as you.
Now smeej won't be able to sleep at night again.
I'm not merely a mind. I am an embodied mind.
See The Real Transported Man
Wasn’t The Prestige about this idea, too?
No, but if I died tonight and a copy of me took over tomorrow, what good did that do for me personally? I'm still dead. Selfishly, I don't care if a copy of me continues on, I only care that I don't.
Let's say you just found out that this is actually happening. Every night you die and are replaced with a perfect copy of yourself. Would you do anything differently?
I would probably get a concentrated dose of existential dread knowing that by EOD I would cease to exist. Usually I'm able to delay that dread with the silly reasoning that I still have many years left before I die, but I wouldn't have that mind hack if I knew I had no time left.
I doubt it. You’d maybe freak out about it for a week but eventually you’re going to come to terms with the fact that this weird setup has absolutely no effect on your actual experience.
Same for other contrived things like dying and getting revived every day, getting frozen and unfrozen every day, taking a teleport dematerializer every day for your commute, having a portion of your brain and organs randomly get swapped out ship of Theseus style, etc.
At some point you would just come to terms with the fact that your existence is really just that of being a mind with a past and present. The future doesn’t really matter.
Why not go out with a bang and spend your life savings on hookers and blow (or whatever decadent thing floats your boat)?
My point is that I suspect most of us wouldn't do anything differently, even if we know it's not our consciousness continuing on, because both scenarios are identical for all practical purposes.
You're right, nothing would change, and both scenarios do play out the same. The difference would be the knowing about it. That's what changes things for me. If it currently happens that way, I wouldn't be aware of it; but if I knew it was going to happen, the act of knowing makes it an issue, for me anyway, even if it still plays out the same as every other time.
You remember going to sleep last night. So even if you died and a copy of you was made, intuitively, subjectively, you feel that tomorrow is still you, same as yesterday was same you, even if there is some technical disconnect.
If it calms your dread, know that most people whose death is expected (after a long illness) come to want to die.
So you need not fear the inevitability of death, only the inevitability of the desire for death.
Finally, to cure the fear of the inevitability of the desire for death, live your life so well that even if you die accidentally, it was worth it.
You are 'caring' objectively, we live subjectively, so your caring is of purely academic interest, even, dare I say it, to you.
Subjective continuity is illusory; placing import on this illusion is up to you, of course, but there is no substantive difference between the discontinuity experienced by your 'copy' in the cloud and what you experience yourself moment to moment.
A copy of me continuing on not knowing or caring that it's a copy of me is of little importance to me. That's not my point. My point is that I will not be continuing on. Sure, a copy that thinks it's me will, and to the world, that would be me. It would have my memories and make decisions based on those memories in a way that would be the same as I would. But that wouldn't be me, literally. My mind, my inner voice, my experience of reality through my mind's eye, would not exist. I wouldn't exist. A copy of who I was continuing on doesn't help me, literally.
Every morning you, the new you, would not know. And unless the old you died while sleeping, they would know - however briefly.
No, but you would not wake up; a clone who has your memories would wake up, but the you that went to sleep will never wake up. That effectively doesn't make a difference, but I find it pretty odd to think about. We humans kind of lack a way to verify it's really "us" and not a clone.
How do you think Star Trek transporters would work?
Instantly ctrl-f'd to see if anyone would mention transporters. I believe in some circles this has been dubbed "The Transporter Problem". It's a thought experiment that already exists.
Invincible also tackled this problem, with someone cloning a new body, and copying his brain to a new body. For a brief moment both bodies perceive the same thing before their experiences split into the two bodies. The copy wakes up, says goodbye to the original, who is dying, and says "I'm sorry it wasn't you."
This is also IMO related to the ship of theseus problem. Are you the same person you were 20 years ago? Are you the same person in the morning as the person who went to sleep? Are you the same person as a minute ago? What if you add in concussions/memory loss/degenerative disease?
Star Trek's lore includes some technobabble about transporters operating in "quantum" mode to assuage concerns that the person at the transporter destination is not the same as at the source.
Except all those times people got cloned or worse.
Hans Moravec had a suggestion on how to do this in Mind Children: instead of examining the whole brain, you measure a layer, then replace it, layer by layer, until the last. There is never a copy and an original, it's just a ship of theseus self where the neurons are individually replaced with new ones, albeit electronic ones.
I view (hypothetical, sufficiently good) brain upload and emulation the way I view git forks: both children are just as equally "the real one" even if only one of them is on the same physical hard drive.
Looking forward from now, both bio-me and digital-fork-me (or indeed bio-fork-me, if the upload gets re-embodied) are equally "my future self", even though they would also each be distinct beings in their own right and not a shared consciousness over many bodies.
I think that from everyone else's perspective, an ideal copy of me would be me; by the definition of "ideal copy". I, however, would not consider the copy to be me; to me, there is only one me.
That all makes sense. But let's run with this way beyond the foreseeable tech. What if you can replace each neuron in situ one by one up to X%. Then what if it was reversible (the neurons were initially just bypassed, commented out). Someone could then dial it up to 5%.. 50% and if they still felt the same throughout and then went up to 100%. In that scenario would they have copied themselves?
I find it fascinatingly coincidental that neurons are the only cells in the body that don't rejuvenate unless there is some sort of injury [0].
[0]: https://www.dzne.de/en/im-fokus/meldungen/2021/neurons-able-...
You're asking a question that we've asked ourselves for millennia:
https://en.wikipedia.org/wiki/Ship_of_Theseus
Technically, we are not the same people that we were when we were born. The cells in our body have one by one been replaced with new ones and the old ones have died off. The things that make up my body are 100% different than when I was born, so I literally am not the same person, physically.
Maybe this is an indicator that there is more to what makes us, us, than just the physical assembly of atoms in our bodies. There are things I don't know that we'll ever get a full understanding of.
Perhaps but with a modern/futuristic twist.
The cells in our body have one by one been replaced with new ones and the old ones have died off.
This is not the case for our "talking" neurons which is what I was trying to limit this thought experiment to. I think a lot more folks would be ok with preserving their biological brain as is within a robot/clone if that was the only option and understand the body gets (mostly) replaced. Although a few in this thread have alluded to the fact we might be missing important relationships with the rest of the body such as the nervous system and gut biome.
Categories, numbers, logic, grammar, etc. all map onto physical systems, but aren't necessarily directly linked. This is "metaphysics" in philosophy, and is essential in order to even reason about physical systems. Just the concept of a "physical system" is actually metaphysical, but without metaphysics, you can say nothing to anyone about anything. Metaphysics is just generally taken as a "given", but is worth evaluating itself.
Then you'll run into the "realism" vs. "nominalism" debate and you'll understand the philosophical underpinnings of the current culture wars in America.
That's still a copy, just one where both copies remember their life before being copied and equally think they are the original. The original will still experience aging and death.
If you further develop this thought, systems might be capable enough to implant core desires into you before transferring the copy into your new body. "You'll love Coca Cola no matter what, and capitalism".
The above scenario is if you get re-implanted into a self-evolving, autonomous biological entity, unlinked again from the system. If this is not feasible and the only solution is to embed you into a robot with an uplink to the internet, because "why not?", then my biggest issue with a digital self is that there are no guarantees of having a proper firewall, which would equal to total surveillance:
https://en.wikipedia.org/wiki/Die_Gedanken_sind_frei
Your memories and biology are you. If someone makes a perfect copy of you, disintegrates your old "you" and then wakes the "new you" up, how is it any different?
In my experience, many people prioritize their children's lives over their own. It's kind of like an insurance policy: they live on, carrying their values forward after they have died. So this is not really that much different. Your own body is nearing its end, but a new vessel carries your values, your influence on the world, forward. Even if it is physically separate from your own body, it's close enough for me to be considered living beyond my death.
As the article says, a book is not the ink, but the words. I am not my physical atoms, but the connections that form my thought patterns and memories. If it were possible to make a perfect copy of those things, the copy would be "me". If the original still existed, that would also be (a different) me.
Of all things, The Venture Bros convinced me otherwise that the copies “are” me, enough to matter anyway.
quiet, Dean
Not that it makes things any better for the original you, but the copied you would presumably feel and believe themself to be the real you. From their perspective, "you" really are living forever. To proceed with the copying would be a sort of act of kindness for yourself.
It's Villeneuve's Enemy.
Thinking about this even a little makes me want to throw up, because it can never be tested. Yeah, I'd be dead, but technically, I'm alive. I'm alive according to everyone, including myself (the new me). So I'm not dead. But like you said, I died and am dead and don't experience any more life. If all evidence is that I didn't die and still exist, then that's the fact.
We can theorize that our consciousness continues, but there is NO WAY to ever actually test it, because all experience is that consciousness continues, whether or not that is the case. Sprinkle in some quantum suicide and my stomach is doing backflips from the existential OD.
The line I like to use here when people talk about this sort of 'transferring' your consciousness: imagine there was a 'perfect' chatbot trained on your every experience, thought, and sensation.
OK, now your consciousness is 'transferred'. I promise to run you.exe plenty. It's an exercise in self-delusion, even if this were possible. That's no more you than a 'you' created by a taxidermist and shoved in a closet is.
Reminds me of the scene from the show Pantheon where the scanning process kills you, because you need to scan and peel the brain layer by layer.
> A copy of your brain/personality/you... isn't you... it's a copy
This reads a bit like "a copy of Super Mario Bros isn't Super Mario Bros, it's a copy".
It has all of the bits that let you distinguish Super Mario Bros from Donkey Kong or Super Mario World. Why isn't it also Super Mario Bros?
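The "copy of the bits is the game" intuition can be made concrete: two byte-for-byte copies are indistinguishable by any measurement of their contents, yet remain two distinct objects. (The "ROM" bytes below are a made-up stand-in, not actual game data.)

```python
import hashlib

# Stand-in for a ROM image; any byte string behaves the same way.
rom = bytes(range(256)) * 4

copy_a = bytes(rom)   # "original"
copy_b = bytes(rom)   # "copy"

# No examination of the bytes can tell the two apart.
assert hashlib.sha256(copy_a).hexdigest() == hashlib.sha256(copy_b).hexdigest()

# They are still two distinct objects: corrupting one leaves the other intact.
corrupted = copy_b[:-1] + b"\x00"
assert hashlib.sha256(corrupted).hexdigest() != hashlib.sha256(copy_a).hexdigest()
```

That is the crux of the disagreement in this thread: identity of content versus identity of instance.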
I'm not a huge Trekkie but I recall the reason Bones McCoy wouldn't go in a transporter was he believed it killed the person and created a copy.
There was an episode where it didn't go as planned.
We can't copy a person anyway. We're going to find out that it requires an order-of-events (order-of-experiences) aspect that can't be recreated to get the actual person. Everything else will end up being a weak simulation (even if it gets a bit closer over time with improvements in simulating).
In theory if you had an atomic replicator you should be able to make a copy of a person. I'm not saying it will ever be practically possible. But I don't see any fundamental laws of nature that make it impossible.
What's the theoretical route around the no-cloning theorem?
Pop-science communicators tend to introduce the no-cloning theorem in the context of making a copy of a person being impossible, but they may be applying it wrong.
Unless some quantum effect doesn't actually allow that to be true, a "great filter" of sorts. Could make a copy and it just doesn't turn on, due to unknown unknowns, can't know why.
You're essentially suggesting 'maybe the laws of physics would stop applying here for unknown reasons', but it's not plausible that would happen absent evidence.
I was trying to elegantly hint there is probably lot we don't understand. :)
https://pubs.acs.org/doi/10.1021/acs.jpcb.3c07936 / https://pmc.ncbi.nlm.nih.gov/articles/PMC10671017/
That is true (in that it is a statement always true of all subjects), and I don't think anyone was disputing that. Simply pointing out that we don't know of any reason an identical copy in an identical state wouldn't obey the laws of physics and perform identically.
That law of nature is quantum mechanics. Quantum state can be teleported, but cannot be duplicated.
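The no-cloning theorem referenced here has a short standard derivation from unitarity; a sketch (nothing brain-specific, just the textbook argument):

```latex
% Suppose a single unitary U could clone arbitrary states onto a blank register:
%   U(|psi>|0>) = |psi>|psi>   for every |psi>.
\begin{align*}
U\bigl(\lvert\psi\rangle \lvert 0\rangle\bigr) &= \lvert\psi\rangle \lvert\psi\rangle,
\qquad
U\bigl(\lvert\phi\rangle \lvert 0\rangle\bigr) = \lvert\phi\rangle \lvert\phi\rangle. \\
\text{Unitaries preserve inner products:}\quad
\langle\psi\vert\phi\rangle \, \langle 0 \vert 0\rangle
&= \langle\psi\vert\phi\rangle^{2}
\;\Longrightarrow\;
\langle\psi\vert\phi\rangle \in \{0,\, 1\}.
\end{align*}
```

So any one cloner can only handle states that are identical or mutually orthogonal; no single physical process copies an arbitrary unknown quantum state, which is what a literal atom-for-atom "person copier" would require. Whether the brain's relevant state is quantum in any behaviorally meaningful way is the open question the surrounding comments are arguing about.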
This is a pretty big philosophical question. There's no universal answer, just opinions. Your conclusion is not the obvious one for everybody.
What happens if you have an identical copy, down to the atom, totally impossible to distinguish? You're put to sleep and your mind is copied over. When you both wake, which one is "you"? Both copies think they're "you" and both are correct. Each has the same sense of self and continuity of identity. Maybe at the moment of synchronization "you" ceased to exist as you branched into two new identities.
Say you upload into a computer. From the copy's perspective, it is "you", it's the same self. It might view your biological body like you'd view an old computer after you finish migrating your files to a new one.
Say you destroy the biological body, or wipe its brain once you copy. Does that change the equation? If you destroyed one of the identical clones, is it even relevant to ask which is "you"?
Personally, I think Altered Carbon comes close to how our society will solve this problem. The original copy has its brain wiped and the new copy is "you" by cultural consensus. Truly duplicating a single identity is strongly taboo and illegal.
I think this is a question that either matters to you or it doesn't. In my opinion, it's irrelevant. I, the original "me" am totally free to agree with the copy that it is "me". I can choose to give it my social and legal identity and agree that "me" is no longer "I". My personal choice is to destroy the original, but one could also choose to let the original body continue and assume a new identity, live a new life or whatever.
I view this the same way I do my past self. The person I was ten years ago is not "me", it was a totally different person. That past self lived a different life and had a sense of identity that isn't at all like what I have today. That past me, the ego that called itself "me" died a long time ago and the "me" now is a different self identity built out of new pieces. In my worldview, "me" is a transient thing. The self is not one fixed thing, it changes over time and sometimes it's replaced. I don't see the idea of transferring my sense of self to a new body as anything more extreme than any other kind of ego death.
I choose to resolve this problem with practicality. I agree with myself that the new copy is "me". My social and legal identity, as well as my own sense of self transfer to the copy. My choice would be to destroy the original copy. Barring that, it would assume a new identity and live a different life far away. It'd get a memory wipe if available. I can make the choice to sacrifice my ego and allow "me" to be assumed by someone else. To me, even to the original copy, the new copy is me. In my opinion, "me" is immortal in the digital copy, even if "I" remain in a mortal body.
Is your savegame you loaded today still the same game you started yesterday? Or is it a copy of yesterday's game running forward? Does it matter?
Are electrical processes rebooted from chemical memory every morning when you wake up or after you have a seizure still the same you? Or is it just a fresh copy of your mind that dispersed when electrical signals in your brain lost continuity?
The trick is to do it slowly, a few grams at a time: a one-way ticket on the ship of Theseus.
Is the person from before you last slept the same you? How do you know?
If not, I should have had to do something with her body when I woke up where she went to sleep.
I don't think that's the thought experiment. We're not talking about physical bodies, we're talking about consciousness. When you go to sleep, does your consciousness cease to exist, to be replaced with a new one, with all your memories and behaviors and attitudes, when you wake up?
I have memories of being me and doing me things yesterday, but was that really me doing those things, or was that a different consciousness that doesn't exist anymore, and my memories are just the high-fidelity recorded experiences of someone else?
And on top of it all: if that's the case, does it matter?
Something I think about a lot is that people tend to compare whatever the most recent innovation was to humans.
It used to be that what made you alive was electricity; you could Frankenstein shock yourself back to life.
It used to be that you were a clock. Your gears wound up at birth, and then eventually you wore out. You needed repairs.
People love to use metaphors, but none of these things are the full picture. Just because computers are so complex doesn't make you more correct. Your brain isn't the whole of your mind, we already know that's true. Why is this silly nonsense entertained?
I used to buy into this kind of stuff, but I've become more and more skeptical of the idea that you would still be yourself if your brain could be preserved/emulated/transplanted/whatever.
Our nervous system extends into our bodies. We feel emotions in our bodies. People with certain kinds of brain damage that prevents them from feeling emotions normally also experience trouble making rational decisions.
More recent research has been hinting that we may even hold certain types of memories outside our brains.
Humans have always been drawn to neat, tidy ideas, especially ones that draw clean boundaries: it's an appealing idea that our consciousness lives solely in our brains, and that our brains could function independently of our bodies, but it seems unlikely that it's really that simple.
As a neuroscientist working on brain-computer interfaces, it's painfully clear to me that we are absolutely nowhere close to understanding the full complexity of the human brain in the manner required to simulate or reboot someone's consciousness. It's not even clear yet what level of abstraction is required. Do we need to map all of the synapses to get a connection graph, or do we need to map all synapses plus the synaptic proteins to assign connection weights too? This is ignoring other types of connections, like gap junctions between cells, ephaptic coupling (the influence of local electric fields on neurons firing), mapping neuromodulator release, etc. On one hand, it feels like irreducible complexity. On the other hand, you can lose about half of your neurons to neurodegenerative diseases before you start noticing a behavioral effect, so clearly not every single detail is required to simulate your consciousness. It would be a MAJOR leap forward in neuroscience to even understand what level of abstraction is necessary and which biological details are essential vs. which can be summarized succinctly.
Anyone claiming to take your brain and slice it up and have a working model right now is currently selling snake oil. It's not impossible, but neuroscience has to progress a ways before this is a reasonable proposition. The alternative is to take the brain and preserve it, but even a frozen or perfused brain may have degraded in ways that would make it hard to recover important aspects that we don't yet understand.
It is, however, fascinating to do the research required to answer these questions, and that should be funded and continue, even if just to understand the underlying biology.
In addition to all that we don't know about synapses etc., I've often wondered whether even mapping all the "hardware connections", so to speak, would be enough. You'd have everything in the right place, but what about the "signals" running on it? Does a certain amount of constant activity on these circuits constitute the sign of a "living" brain vs. a dead one? How much of our consciousness is really in the topology of the circuits, and how much of it is simply defined by the constant activity running around in them? I assume neural circuits form loops of synapses that reinforce or suppress activity. If these signals going around and around ever "stop", can they ever be restarted with the same "patterns"? What if these patterns, the living "software", are at least partially what define you?
Well, anyway, that's my armchair crackpot neuroscience theory for the world to consume ;). I'm sure there must already be a name for the idea, though.
This article [0] may help here:
> Six of the sheep were given a single higher dose of ketamine, 24 mg/kg. This is at the high end of the anesthetic range. Initially, the same response was seen as with a lower dose. But within two minutes of administering the drug, the brain activity of five of these six sheep stopped completely, one of them for several minutes – a phenomenon that has never been seen before.
> “This wasn’t just reduced brain activity. After the high dose of ketamine the brains of these sheep completely stopped. We’ve never seen that before,” said Morton. Although the anesthetized sheep looked as though they were asleep, their brains had switched off. “A few minutes later their brains were functioning normally again – it was as though they had just been switched off and on.”
0: https://www.technologynetworks.com/neuroscience/news/sedated...
Just to add a current link to this conversation:
https://www.scientificamerican.com/article/consciousness-mig...
An article suggesting that consciousness is embodied in the active fields, not the synapses themselves.
On one hand, I wonder if a gradual transition would work. Spend enough time over the years mirroring your conscious patterns onto a computational substrate, and they might get used to the lay of the land, the loss of old senses and the appearance of new ones. There might not be an ultimate "stepping in", but something like you might be able to outlive you, on a substrate that it feels happy and comfortable on.
On the other hand, the idea of "simulating your consciousness" raises questions beyond just cognition or personality. A mechanistically perfect simulation of your brain might not be conscious at all. Spooky stuff.
Keying off your comment - in the field of neuroscience, is consciousness viewed as a kind of simulation?
(I'm just a programmer so it's fascinating to me to consider how actual brain scientists model consciousness in their work.)
Losing half your neurons caught my attention. I believe we had a post about how LLMs remained functional even with half their weights pruned. Interesting.
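Magnitude pruning, the usual way half the weights get trimmed, fits in a few lines of numpy (a toy linear layer, not any particular LLM; the 50% threshold and sizes are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "layer": mostly small weights, a few large ones carry most signal.
W = rng.normal(scale=0.01, size=(64, 64))
W[rng.integers(0, 64, 200), rng.integers(0, 64, 200)] = rng.normal(scale=1.0, size=200)

# Magnitude pruning: zero out the 50% of weights with the smallest |value|.
threshold = np.median(np.abs(W))
W_pruned = np.where(np.abs(W) >= threshold, W, 0.0)

x = rng.normal(size=64)
y_full, y_pruned = W @ x, W_pruned @ x

# The outputs stay close because the dropped half carried little magnitude.
rel_err = np.linalg.norm(y_full - y_pruned) / np.linalg.norm(y_full)
assert rel_err < 0.5
```

The analogy to neuron loss is loose, but the mechanism is similar: when the representation is redundant and heavy-tailed, removing the low-magnitude half barely perturbs the output.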
Imagine the bugs introduced in trying to make a digital copy of a brain. Terrifying for the subject.
There are gonna be a million artificial minds of various levels of capacity before the first human mind is accurately simulated.
By that time we will be so accustomed to glitching artificial minds being created, modified, bugged, and debugged that current moral conundrums like "is the copy me or not" and "is it OK to create a hobbled copy of someone" will seem quaint, akin to counting angels on the head of a pin. Mangled and molded consciousness will be as mundane as computation itself.
For example, it’s widely reported that organ transplantation, especially heart transplants, may cause personality changes associated with the donor.
https://www.mdpi.com/2673-3943/5/1/2
https://pubmed.ncbi.nlm.nih.gov/31739081/
Do we think the personality changes are due to the heart itself, or just due to the minor brain damage accompanying an incredibly invasive surgery?
General anesthesia is not good for the brain.
replying to myself to provide more context.
In my PhD work, I helped conduct the human portion of a study on this topic, contributing to some discussions at the FDA [1]. The idea was a bit controversial then, and I've had a few anesthesiologists get mad at me for it, but the general pattern has now been replicated quite a few times, such that the field has largely moved on from 'Is something bad happening?' to 'Why does it happen, and how do we prevent it?' [2]. So it has been a gratifying excursion from my typical research before and since then.
[1] https://pmc.ncbi.nlm.nih.gov/articles/PMC4168665/
[2] https://pmc.ncbi.nlm.nih.gov/articles/PMC9750936/
The two sources you've given are both about general anaesthetic in infancy, though. Are you saying it might extend to adults?
Thanks. This is purely anecdotal, but we had a family member whose child was under anesthesia for a severe respiratory infection. He's been severely developmentally delayed in his first year, and it's unclear to us what damage was done.
Thanks for sharing. It is difficult to know for certain. If the respiratory infection led to hypoxic damage, then that could also contribute. I have not kept up with the field, but generally the most sensitive period for anesthesia was before 4 years or so. As I mentioned briefly, most of my work is in different areas of research so I haven't kept up to date.
Is there any reason to suspect that adults suffer the same effects as infants? (Not asking to be combative, just curious whether children are uniquely affected because their brains are still cooking.)
You and the other commenter bring up good points. Developmental neurotoxicity (with lesser or no effects in older children and young adults) is, I speculate, probably due to differential gene expression during early development; later on, genes related to development are suppressed and genes related to maintenance are more abundantly expressed. Developmental neurotoxicity probably works through different mechanisms than what is termed "postoperative cognitive dysfunction" in the elderly after general anesthesia [1][2]; all I know is that it is a thing. If I were to speculate, it would be that the elderly have fewer redundant cognitive resources, so detrimental effects on cognition are magnified. It used to be thought that post-operative dysfunction is temporary, but it seems likely to me (again, speculation) that there is both recovery and permanent dysfunction; the dysfunction just becomes a little more difficult to detect. Going back to my paper: we used a method to disentangle two types of memory processes, recollection (explicit recollection of experiential details) and familiarity (a general feeling of familiarity with things you've seen previously). Both contribute to memory performance but tend to be differentially affected by neurodegeneration (recollection is more affected, and generally more hippocampal), so that sometimes, when not accounting for these processes, a memory test will fail to find differences because patients rely on familiarity to answer memory questions.
[1] https://scholar.google.com/scholar?as_ylo=2020&q=postoperati... [2] https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&as_ylo...
The massed-spaced effect in organ tissues may be one explanation of this: https://www.nyu.edu/about/news-publications/news/2024/novemb...
Why would you want to go on in a world that has either left you behind or keeps making the same mistakes over and over in a cycle and won't listen to you because you're too old to understand?
And conversely, I think Kim Stanley Robinson puts it best in the Mars trilogy. Scientific progress often has to wait for the old guard to die so new ideas can be tried. Sometimes there are actually new things and they need to be allowed to cook.
A questionable assumption.
A scientist like Einstein experienced scientific revolutions within his lifetime. That's hardly going to be the norm in the history of science, and it's also a horrible assumption to think revolutions would endlessly be occurring and recurring.
Also, we know when we're on the edge of knowledge, especially in cosmology and physics. We're waiting for revolution there. There's dark energy and dark matter. It doesn't matter if you're old or young, you know that your theories aren't good enough to explain whatever these are.
Scientific knowledge doesn't get swept away, especially if it's rock solid. Newtonian physics still has a lot of relevance, after all. It's just that relativity is even more accurate.
Einstein and Feynman both supposedly struggled with unseating some of Dirac's ideas.
Just imagine someone who died 50 years ago coming back and hearing skibidi toilet, no cap, ohio, etc. Then not being allowed to board a plane without a body scan, and not having money for a plane anyway, since bread was a dime and a gallon of gas was a quarter the last time you checked. You can't even get a job; you're just a brain, and all the knowledge work you could do is 50 years out of date.
I'm still alive and this shit is already starting to get to me. This last round of inflation was a kick in the nuts.
I think the idea originated with Max Planck:
German physicist Max Planck somewhat cynically declared that science advances one funeral at a time. Planck noted that “a new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents die, and a new generation grows up that is familiar with it.”
I dunno. I’m a pretty open-minded guy, so if anyone is going to be immortal, it might as well be me. I promise not to shout at too many clouds.
Plus there are a lot of assholes in the world. Come on, there isn’t anybody you’d enjoy watching get Ozymandias‘d? I’d enjoy it.
There’s a short story about uploaded consciousnesses being used as AI slaves. They go bad once enough years have gone by that they can’t speak the modern language anymore. Then they usually lapse into insanity or depression.
Reminds me of: https://qntm.org/mmacevedo
I don't think I agree with you. There are multiple examples in society of damaged nervous-system connections with the brain, spinal cord damage for example, where the personality of the patient changes little. In the same sense, losing limbs or entire sections of your body (aside from psychological trauma and other psychological consequences) doesn't affect personality that much.
Of course the nervous system is much more complex, but damage to the brain almost always results in some sort of cognitive dysfunction or personality change; see the Phineas Gage case for example.
>In the same sense, losing limbs or entire sections of your body (aside from psychological trauma and other psychological consequences) don't affect personality that much
"There aren't any changes except for all of the changes, but those changes don't count because reasons."
I don't know how many amputees you know; you may know many. I was in the army for 10 years during the height of the global war on terror and know more than most. Not a single one is the same as they were pre-amputation. Could be the trauma that caused the amputation, could be the amputation. I'm not an amputationologist.
I do assert that a holo-techno-brain will need a shit-ton of e-drugs to deal with being amputated from its fucking body.
The bacteria in your butthole are a part of you just like your brain, maybe less, but they ARE a part of you.
> Could be the trauma that caused the amputation, could be the amputation.
Given the personality changes seen in people who go off to fight in the military and who end up coming back fully physically intact, I think it's more likely that the personality changes here were caused by the trauma, not by the amputation.
I'm not saying the latter isn't possible, but absent evidence to the contrary, it doesn't make much sense to assume the personality changes occurred because of the amputation alone.
Also consider that amputation -- even ignoring whatever trauma precipitated it -- is its own sort of trauma. I imagine if someone came up to me, perfectly physically healthy, knocked me out, and cut off my leg, I would wake up and develop emotional trauma that would cause personality changes.
I see what you mean, but consider that the gut does seem to play a significant role in mood and mental health. The enteric nervous system may not hold memories, but it seems to have something to do with personality, and digestion issues can have negative cognitive effects.
Agreed that discomfort can cause temporary problems, and sometimes chronic problems in parts of the body can cause lifelong cognitive impairment. But that is not to say that these represent "you" or your personality. Your brain could still function perfectly without those body conditions.
And for the gut example, the brain actually does work normally; stomach and intestine removal (and other related surgeries) are fairly common procedures, and I don't hear of people complaining about personality changes. Of course, those types of procedures are extremely invasive in a systemic way, and not only your mental state but multiple other parts of the body need to re-adapt. But I truly believe "you" will still be "you" inside your brain.
P.S.: I quoted "you" because discussions about the identity of oneself are much more complex; just regard it as the most high-level definition of the concept.
>Our nervous system extends into our bodies. We feel emotions in our bodies. People with certain kinds of brain damage that prevents them from feeling emotions normally also experience trouble making rational decisions.
I think that may be true enough, but it doesn't have the upshot you seem to think it does.
It just means that we need to sustain not just the brain itself but the totality of the environmental conditions on which it depends. No easy task for sure, but not something that presents an in-principle impossibility of preserving brains.
I think there's a major philosophical error here in thinking that the added logistics present this kind of in-principle impossibility.
Also, talking like this starts to play with anti-science speculation a bit. Octopi actually have neurons extending through their limbs. We don't. So when we talk about consciousness being "embodied", I'm sorry, it's an attempt to romanticize the question in a way that loses sight of our scientific understanding. Consciousness happens in the brain.
Sure, the brain needs stimulus from its embodied nervous system, and may even depend on those data and interactions in significant ways, but everything we know about consciousness suggests it's in the brain. And so the data from "embodied" nervous systems may be important, but there's no in-principle reason why it can't be accounted for in the context of preservation.
> Octopi actually have neurons extending through their limbs. We don't.
You don't have neurons extending through your limbs?
I consider that I have likely died more than twice in my lifetime already. And before this body gives up, I will have already died more times. Must simply enjoy the present and give gifts to my future self.
The way various hormones influence the brain alone makes it pretty clear to me already that you'd be a completely different person when taken out of your body, and I'm pretty sure that's just the tip of the iceberg.
Coming back as a brain in a vat certainly sounds awful.
But I would assume that bringing someone back would be tied to a physical or simulated body that provided a compatible context.
Not a bad assumption to solidify in your brain preservation/restoration contract.
The 'people' that undergo this first will experience the most incredible pain imaginable if we do not figure out a way to block such pain first.
Very well, you think that preserving the brain, or even preserving the nervous system, is futile. But what of total biostasis, preserving the entire organism, just like the archaebacteria that live for thousands of years in ice or other extreme environments by slowing their metabolisms to a crawl?
To me, excessive negativity about the possibility of immortality smacks of weakness and defeatism. You either love life and want as much of it as possible, which makes you a friend of humanity, or prefer death, which makes you an enemy of humanity. I take a stronger line than the neuroscientist in the article. “Death positivity” like that of Viktor Frankl, anti-natalism, even faith in magical spiritual resurrections—all are anti-human viewpoints, only excusable in the past because they were copes with the inevitability of death. Now that we have reason to believe it can be averted, we owe our potential future selves every possible effort to save them from oblivion.
There is a bit of research and effort into head transplants. I wonder, if and when one is successful, how it will impact the individual: possibly retaining memories of the body, or changing personality.
Can't wait for C.S. Lewis' That Hideous Strength to become a reality
where?
>HEAVEN: The head anastomosis venture Project outline for the first human head transplantation with spinal linkage (GEMINI) [2013]
https://pmc.ncbi.nlm.nih.gov/articles/PMC3821155/
>World's first human head transplant successfully performed on a corpse, scientists say (2017)
https://nationalpost.com/health/worlds-first-human-head-tran...
>First Human Head Transplantation: Surgically Challenging, Ethically Controversial and Historically Tempting – an Experimental Endeavor or a Scientific Landmark? (2019)
https://pmc.ncbi.nlm.nih.gov/articles/PMC6511668/
>No Doctor Has Ever Performed a Human Head Transplant. This Neurosurgeon Says He’s Ready to Do It. (2024)
https://www.popularmechanics.com/science/a62831709/human-hea...
I’m not sure I actually believe in quantum immortality, but I think it is slightly suspicious: out of all the people you could have been born as, you just happen to be born in a timeframe where brain preservation might be possible before you die?
I see what you're saying...but...
Most people are alive right now. The population historically has been much lower, so odds are you would be born around the time high technology would support a high population.
I think that “most people are alive now” is not true (although it is often repeated).
https://www.bbc.com/news/magazine-16870579
> So what are the figures? There are currently seven billion people alive today and the Population Reference Bureau estimates that about 107 billion people have ever lived.
> This means that we are nowhere near close to having more alive than dead. In fact, there are 15 dead people for every person living.
So it is not wildly impossible that you’d be alive now, but it is fairly unlikely.
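For what it's worth, the arithmetic behind the quoted BBC figures is easy to check (using the ~107 billion ever born and ~7 billion then alive from the article):

```python
# Sanity check of the figures quoted above: ~107 billion humans ever
# born versus ~7 billion alive at the time of the article.
ever_lived = 107e9
alive_now = 7e9

dead = ever_lived - alive_now
print(f"dead per living person: {dead / alive_now:.1f}")              # ~14.3
print(f"chance a random human ever born is alive now: {alive_now / ever_lived:.1%}")  # ~6.5%
```

So roughly one human in fifteen ever born is alive today, consistent with the article's "15 dead people for every person living."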
Also, hard to say what’s in the future of course, but even if population growth levels off, you’d expect to be born in the future, right? Which brings up another question, why not?
If we are going to go along on the fully ridiculous implications here and reinterpret all probabilities as conditioned on your immortality, why weren’t you born in the far future? I’d expect people born in the future to have easier access to immortality.
Maybe birth rates will go way down if we discover immortality (lowering your odds of being born later). Or maybe pre-immortality minds will be seen as more interesting and worth preserving (increasing your odds of being kept around).
I'm also skeptical of the idea that one can "upload" consciousness and it would still be "you". I suppose this is true in a philosophical sense, but in a practical sense, subjective experience of consciousness rules the roost. It's inevitably going to be a mere copy of you. You don't get to experience any of it. Similar to a software project which is forked, I think it makes more sense to classify it as an entirely different entity at that point.
I suppose there are valid use cases for this, but I'm not that narcissistic to think the world needs eternal copies of me.
The continued subjective experience of the original consciousness is where I believe the real value lies. Digitisation of consciousness, assuming it has any sound scientific basis in the first place, would practically need to look more like the gradual replacement of brain (and bodily) matter with something more durable, enduring, and controllable. A slow process in which carbon is exchanged for silicon, or cellular damage is continuously reversed and aging kept at bay.
As long as you believe that any theory of subjective experience will ultimately be physicalist, this argument doesn't really work.
There is no continuity of subjective experience even within the same brain, you can be deeply unconscious for extended periods of time and come back.
Yeah, you can argue the same thing about going to sleep, there's no guarantee that the same "you" wakes up.
From the outside, an "identical clone" is indistinguishable.
On the inside, the clone feels exactly how you would feel.
The only problem is the "I don't want my 'me' to die" feeling.
I bet most people would be fine with death/rebirth teleportation.
> It's inevitably going to be a mere copy of you. You don't get to experience any of it.
You can make the same argument for 'you before you went to sleep' and 'you after you woke up'. The only real link you have to that previous consciousness are memories of experiences, which are all produced by your current body/brain.
Think about this: For every consciousness (including you right now) it is _impossible_ to experience anything other than what the thing producing that consciousness produces (memories, sensations, etc.). It doesn't matter whether the different conscious entities or whatever produces them are separated by time or space. They _will_ be produced, and they _will_ experience exactly what the thing that produces them produces.
With an analogy: If you drop pebbles in either the same pond at different times or in different ponds at the same time, waves will be produced in all cases. From the perspectives of the waves themselves, what they interact with is always _exactly_ the stuff that interacts with the water they're made up of. To them, the question of identity or continuity is fully irrelevant. They're just them.
Similarly, it makes no difference whether you only have the memories of the previous conscious experiences, or if 'you' really experienced them. Those situations are indistinguishable to you. The link to future consciousnesses inhabiting your body is effectively the same.
>> It's inevitably going to be a mere copy of you. You don't get to experience any of it.
> You can make the same argument for 'you before you went to sleep' and 'you after you woke up'. The only real link you have to that previous consciousness are memories of experiences, which are all produced by your current body/brain.
Except I know, empirically, that people go to sleep all the time and wake up, and remain the same person. And I know (for practical purposes) I do the same. I -- my mind/body composite -- lie down, and get up the next morning. I remain the same person.
Simply 'copying' or 'uploading' my consciousness, like a computer file, is impossible even in theory, because I'm not just a conscious mind, but a conscious mind which is also a body. Consciousness cannot be split from the material body, even in theory. Somebody upthread said that he'd seen many amputees undergo personality changes as a result of their operations -- this is an informative (if very sad) example.
In computing terms (for a machine we do not understand), it would be like cloning a live machine by taking only the CPU die, or maybe the hard drive. How many parts do you need to take away from a computer before it stops being the same machine? It's easy, though, with a VM, or a kernel that supports many hardware configurations. Kind of a digression, but I liked this idea.
I don't think this is a great analogy because computers don't have consciousness (yet).
But I usually move the hard drive (or at least its contents) between machines when I get a new computer, and that's enough for me to think of it as the "same", even if I reinstall the OS on the new machine and just copy my home directory onto the new one.
> More recent research has been hinting that we may even hold certain types of memories outside our brains.
Not just hinting - the evidence is strong and accumulating rapidly. The gut, in particular, has so many neurons that it is considered the body’s “second brain”, to say nothing about the impact that gut bacteria have on your mind.
If you really wanted to create a copy of your “mind”, you’d have to image every neuron in your body for a thoroughly accurate copy. And then accept the fact that your entire behavioural profile is then missing the input of your gut bacteria, which appears to have a significant and non-trivial impact.
I agree completely. This is unsubstantiated hype.
Transferring our consciousness into "the net", or some other fuzzy concepts are so far removed from reality as to be complete fiction. This includes freezing our brains and reanimating them later to resuscitate our lives.
They not only massively overestimate the functionality of today's tech to receive something like our consciousnesses, but even more so, by orders of magnitude, underestimate just how complex our living bodies are.
We only have the vaguest of ideas about how our physiology works (while we might be able to replicate flesh cells for "fake meat", we have zero understanding of, or control over, how those cells organize to form macroscopic organs). Applying this to the brain, our understanding is even more primitive. An example would be recent news that the brain may not be sterile but may host a microbiome. Whether or not the brain hosts a microbiome is still "controversial".
We're still hundreds of years away from a comprehensive understanding of physiology.
But of course, we're never going to live that long, because we still believe (statistically as a species) in invisible guys in outer space that tell us we need to dismember people who believe in the WRONG invisible guy in outer space.
Our primitive, violent ape species will drive itself extinct long before we ever have a comprehensive grasp of how life works, especially at the level needed to understand consciousness...
Ultimately, I am going to quote one of my favorite writers [0] and say that I am not afraid of a life that ends.
I don't want to be a brain in a jar. Or in a computer either. I enjoy experiencing physical sensations and interacting with the world in meatspace. And if I can't enjoy either, then just let me die.
And I apply this to not just brain preservation, but any attempt to artificially prolong the quantity of my life at the expense of the quality of my life. I do not want to spend my last years in a hospital bed hooked up to machines and unable to move. That was how my dad died, and even then he was lucky enough his partner (who he had discussed this with before and who had the authority to make the decision) eventually agreed to switch him to palliative care in his final hours. Similarly, I have seen what chemotherapy does to people, and I have long since decided that if I ever get cancer, I will refuse chemo and let myself die. I am also having a living will drawn up that includes a DNR order, multiple scenarios where doctors will be ordered to pull the plug, and a prohibition against anyone amputating any of my limbs or sensory organs even if it's necessary to save my life.
I will make sure I die with my autonomy and my dignity intact.
[0] Al Ewing. He writes comics. Read his stuff, he's good.
Do you have a source for this quote? Googling just returns this page.
I was particularly struck by:
> if I ever get cancer, I will refuse chemo and let myself die
And figured this quote must be at least 20 or 30 years old? Cancer isn't necessarily a death sentence, and many treatments are much less harsh than they were 20+ years ago.
This seems a bit extreme. Chemotherapy and its effects can be a very temporary thing, and your quality of life can go back to normal after you've finished your course and the cancer has gone into remission. Certainly there are aggressive cancers where you'd be fighting a painful battle of attrition, but there are many cancers where prognoses are good, and quality of life once treatment is done is more or less the same as before. A blanket personal ban on chemo is reckless and shortsighted.
The prohibition against amputation and sensory organ removal is a bit nuts too. You'd rather die than have someone remove one of your eyes or ears, or say a hand or arm or foot or leg? That is profoundly sad, and intensely insulting to anyone who has had to deal with that sort of thing and has nonetheless lived a full, rich life.
I get that many medical interventions do actually have a terrible, permanent effect on quality of life, but these seem like pretty extreme views that ignore reality.
I don't know what the commenter who posted about chemo and amputation actually thinks or believes. But I hesitate to call them "nuts" or to lecture them about how they have a wrong opinion. And I would not expand their personal opinion as a judgment on people who decide they can live with the effects of chemo, or amputation, or loss of an eye, because nothing in the original comment included a judgment on other people. Everyone has their own threshold for what they consider a life worth continuing, but we should not impose our own thresholds on other people, or judge them for making different choices.
For me the question goes beyond "Can I survive chemo (or amputation) and resume something like a normal life?" When you have to face cancer or loss of a limb or any illness or injury that threatens your life, or perceived quality of life, or dignity and autonomy, you necessarily have to think about what that means for your future. Until you get a diagnosis of (for example) cancer you don't know what it feels like, or how you will react, to the fact that no matter if you survive the treatment or not, you will always have that threat and reminder of your mortality in your conscious thoughts. You think about how you might not get so lucky the next time, how much your treatments might cost, what your illness might put your loved ones through, how far you will go to keep yourself alive even when it imposes costs and obligations on other people. And you think that maybe other people will have to make hard decisions about your future if you can't. A cancer diagnosis doesn't just affect me, in other words. If I lost a leg or arm that would impose burdens on my wife and family, affect my ability to make a living. Those thoughts more than the medical condition itself lead people to arrive at opinions such as the original commenter expressed.
Having faced my own mortality already I know I think more about how my own end of life scenarios affect other people more than how they will affect me. I worry that I will suffer a stroke, or slip into dementia, before I can pull my own plug, leaving people I care deeply about with that awful obligation, and the burden of caring for me rather than living their own life. And it's that thought, not the fear of disease or dying, that leads me to my own ideas about how much I might endure, because I won't endure it alone or without cost to others.
I suspect part of extending human life much beyond 120 years is going to be finding ways to delay physical adulthood, so that proportionally you still have the same time to learn and grow, and those growth hormones are still kicking around repairing things for longer. Because the quality of life 100 years after your organs have stopped repairing themselves is not going to be that great, but if you could reduce that to 80-90 years then maybe.
> (…) and a prohibition against anyone amputating any of my limbs or sensory organs even if it's necessary to save my life.
> I will make sure I die with my autonomy and my dignity intact.
Amputees have autonomy, dignity, and rich lives. To believe that the loss of a limb is so severe that death is preferable is absurd and insensitive.
What if instead of requiring an amputation, he loses faculties by accident like suffering from parosmia due to COVID or having a weight crush a body part? Did he suddenly lose his dignity? He certainly lost some autonomy. What’s the next step then?
Many people end their life when they find it's too painful to live. Many more wish they could -- the debate around end-of-life issues is raging in many countries.
If having to undergo a few months of chemotherapy in order for your cancer to go into remission is "too painful to live", then I think someone's threshold for pain is way below that of the average person, to a point where that's kinda sad.
I know several people who have gone through chemo and came out the other side happy and healthy, after recovery. They live full, rich lives. They are much happier living than dead.
Sure, there are some cancers where you end up with declining quality of life for months or years before you eventually die. I wouldn't fault anyone for deciding to opt out of that from the very start. But that's not exclusively what we're talking about: the person upthread was very absolutist and rejects chemotherapy in its entirety.
What’s your point? I support the right to euthanasia, nothing in my comment contradicts that.
We’re not talking about someone in pain wishing to die, we’re talking about someone vehemently arguing they would rather die than live without a limb, without having experienced it. And their reasoning is a lack of autonomy and dignity, none of which are a given.
There are literally millions of people without limbs, half a million new ones per year in the US alone. They’re not poor invalids; they’re people who adapt and can do things we only dream of while living normal lives.
> To believe that the loss of a limb is so severe that death is preferable is absurd and insensitive.
No. Denigrating someone expressing their personal opinion seems absurd. Since the commenter did not impose their opinions on other people you had to put those words in their mouth to call them insensitive.
I prefer to die with autonomy and dignity as well, meaning I would like to pull my own plug. That other people might have a different threshold, or want to die differently than I might, seems neither absurd nor insensitive. The commenter just described their threshold, they didn't judge other people.
> Denigrating someone expressing their personal opinion seems absurd. (…) The commenter just described their threshold, they didn't judge other people.
My sentence does not judge the person, it criticises the belief. Learn to differentiate or you’ll be doomed to a life of ad hominem attacks and taking things personally.
If person A says they love spiders and person B replies they find spiders repulsive, there’s no value judgement passed on person A.
My remark was not a commentary on yourself, your world view, the author, or your approval of them. I don’t know you.
> I prefer to die with autonomy and dignity as well
Who wouldn’t? By itself that statement is meaningless. What’s in question is how one defines the terms.
I invite you to take a closer look at that quote and understand what it means to the people who live those situations. Let’s exaggerate to make a point: If someone said they refused to be treated by a black doctor even if their life depended on it, and followed up with the remark they would make sure to die with dignity, do you not see how that would be insensitive to black people? A writer, especially an ostensibly good one, would understand that basic sentence structure.
Again, that is a purposeful exaggeration to make a point. I’m not making a remark on yourself or the author, I am disagreeing with the belief.
I'll judge the person. Someone wanting to die because they lose an arm is nuts.
> My sentence does not judge the person, it criticises the belief.
An opinion or belief can't "be" insensitive. A person may intend to say something insensitive, another person may interpret an opinion as insensitive (as you did when dragging in amputees and people suffering from other conditions and injuries). "Insensitive" can only refer to a person's intention or another person's reaction. So calling someone insensitive for their expressed opinion does indeed judge the person.
> Learn to differentiate or you’ll be doomed to a life of ad hominem attacks and taking things personally.
Surely someone as skilled in rhetoric as yourself can see the irony of you warning me about "a life of ad hominem attacks" embedded in an ad hominem attack. Then you followed up with the implication that I don't understand "basic sentence structure." Address my actual comment rather than telling me what I need to learn and how I will get doomed for not thinking like you.
As for spiders and racists, those have nothing to do with anything in this thread. If someone says they don't want to live if they lose a limb or face chemotherapy, whether you agree with their stated choice or not, no other person or race got mentioned or implicated in the comment you replied to. Setting up a false and deliberately inflammatory analogy to make your point, equating an opinion about perceived quality of life with racism, doesn't help your argument. Try sticking with countering the arguments the commenter (and I) expressed.
Personal opinions about end-of-life care, personal autonomy, dignity have the same flavor as religious beliefs: you can't counter them with logic. Just calling someone wrong or "insensitive" or "nuts" as some other commenters have misses the mark, because the subject involves beliefs, not facts that we can argue. One can express their own different opinion, but going beyond that starts to verge into attacks on personal beliefs, which requires making assumptions about another person's faculties, judgment, and ad hominem, all of which you have deployed in your comments.
If someone else is free to decide that they'd rather die than lose an eye, or rather die than have to experience a few months of chemotherapy in order to be cancer-free, then I am also free to decide that those views are absurd and extreme, and reflect a deep misunderstanding of medical outcomes.
> Denigrating someone expressing their personal opinion seems absurd.
There's a difference between saying someone is foolish and saying their beliefs/opinions are foolish. The former is not what the GP did.
> then I am also free to decide that those views are absurd and extreme, and reflect a deep misunderstanding of medical outcomes.
I don't agree. You can decide that another person's expressed opinions don't align with yours, according to what you believe and think you understand about medical outcomes. The original comment didn't mention medical outcomes so I hesitate to judge how much the commenter knows about that. And I hesitate to call someone's personal views absurd. They have opinions I may or may not share. I can't make a rational argument to prove them wrong.
A person's beliefs can't "be" foolish or even wrong. Belief by definition does not come from an objective and rational evaluation of facts and probabilities. I can say I hold different beliefs, but no more.
We most often encounter this kind of argument around religion. Someone can sincerely hold religious beliefs that don't submit to rational and objective argument. We can have different beliefs but we can't prove someone else's beliefs wrong. To call a belief that you can't argue against with reason "foolish" or wrong equals calling the person holding the belief foolish and wrong. You can show that chemo can work and people with cancer can recover. You can't say how any individual should feel about that, or how they should choose to deal with a cancer diagnosis. The original comment didn't make any statement about whether chemo works or not, or whether some people can thrive with dignity after losing a limb. Rather the original comment expressed one person's belief about how they feel about those possibilities, for their own definitions of autonomy, dignity, and quality of life.
> I enjoy experiencing physical sensations and interacting with the world in meatspace. And if I can't enjoy either, then just let me die.
Who says a brain in a jar can't enjoy either of these? Who says that isn't, in fact, what you are enjoying right now?
More realistic to make plans for dying, than to fantasize about living forever.
One of Don DeLillo's later good novels is about this stuff (Zero K).
I always think people's attitude toward possible future worlds is interesting. You can see a wide spread of opinion in this thread -- whether you think functional immortality would be a good thing says a lot about who you are. Ditto for colonizing other planets, automating all work, building AGI, and so on.
I suppose I'm on the side of the technologists. I think immortality is probably possible and humans should try to achieve it. But along the way it will mostly be snake oil and cults. And, of course, it's all but guaranteed that no one in this thread is going to make the cut-off.
I'm certain immortality is possible, and it's also likely to be achieved, because we always do everything we can do, regardless of consequences.
But I think this is the acme of selfishness. I don't want to be immortal, and I wouldn't want to live in a world with 500-year-old know-it-alls running around "oldsplaining" everything to everyone else.
I have, thankfully, a fairly good chance of dying before that happens.
How is immortality selfish? Selfishness requires taking from other “selves” who have unmet needs of their own. But there’s every reason to believe a society of immortals could either function perfectly well without producing new selves, or choose to reproduce at a slow rate sustainable with its ability to extract resources to support itself. Any new selves that were born would be provided the same opportunities that we provide new selves in the present day—breastfeeding, education, healthcare. How would that be “selfish?”
Is it selfish when a centenarian lives past 100? Is each additional year of life obtained by a centenarian “selfishly” stolen from some hypothetical unborn self?
It's funny people always talk about this in extremes.
"Do you want to live longer?"
"Yes"
"OH YOU WANT TO LIVE A MILLION BILLION YEARS?!?!"
There are values in between immortality and ~80 years.
Oh and when we have defeated death, when will people decide it's time to go? And how will they do it?
> when will people decide it's time to go?
They'll decide when they want to decide. Some might choose to actually live forever, and that's fine. Others will choose a more current-human type lifespan, and that's fine. Some will choose 150, some 300, some 1000, some 10,000. All of those numbers are fine.
> And how will they do it?
There are already humane forms of medical euthanasia performed in progressive places in the world; this question already has answers, and likely more will be developed over time. I don't think it's an important question or issue to discuss, as long as people have legal options.
They'll decide when they're ready. I would love a little more time on this planet. And when it's time, I'll hop in the nitrogen pod. People are already making that decision in some parts of the world.
Which do you think the Jeff Bezos of the world will choose?
People living forever will stall humanity. Generations and their old ways dying with them are necessary for civilizations to progress.
There's no proof of that.
Life expectancy has been increasing over time, especially in the past century or so. I don't think it's credible to suggest that civilizations have progressed meaningfully slower now that people live to be 80 or so instead of only 30, which was common in recent history.
And even if immortality "stalls" humanity, so what? People matter, not technology or some amorphous concept of "progress".
I don't want to live more than 100 years and I don't want to come back from the dead either. Make way for the next generation and their wants.
Why 100? You can also make way for the next generation by living 80, or 60, or 40 years. Yet no one would be okay with that option. Funny thing is historically speaking 40 is a lot closer to the useful human lifespan than 100. So this strong belief of yours is really driven by advancements in society and healthcare over the last few decades. Why do you think that won't change drastically another few decades from now?
"Funny thing is historically speaking 40 is a lot closer to the useful human lifespan than 100."
That isn't really true. Life expectancy was historically driven down by high infant mortality and lack of medicine. The meaningful human lifespan has been in the 70s for the majority of history. (Lifespan is different from life expectancy)
"So this strong belief of yours is really driven by advancements in society and healthcare over the last few decades."
Who says it's a strong belief?
"Why do you think that won't change drastically another few decades from now?"
Because there's no real evidence to support that. Life expectancy hasn't gone up drastically over the past 50 years. Rates of chronic illnesses, including things like dementia, have gone up drastically. So even if people are living a couple years longer, they're generally sicker and it's costing more. Even if medicine makes drastic improvements, 100 is still a lofty goal. I'd be fine making it only 80 too. I'm actually skeptical that will even happen. What I do know is that I don't want it to take more than 100 years.
You’re being defeatist and ignoring the evidence presented in the article—even hospice patients want to live longer. You, too, will desire to live before (and hopefully: if!) you breathe your last.
This is because the entire goal of the sentient consciousness is simply to preserve itself as long as possible. DNA has the essential goal of replicating itself in reproduction. Consciousness, by contrast, appears to have no goal other than self-preservation. People sometimes choose to sacrifice themselves, but usually only when death is inevitable and they wish to save someone else from it (Lily/Harry Potter and Medal of Honor type situations).
I'm not really being defeatist nor ignoring evidence. Perhaps I just have a different perspective. There can be moral/ethical arguments for why mortality is a good, or at least useful, thing.
That's fine, but please don't stand in the way of those of us who would love to experience the world on a longer time frame, and are frustrated that the current level of medical knowledge doesn't allow it.
It seems odd to me that you would be frustrated about a fact of the natural world that you have no entitlement to expect to be different.
Why shouldn't we be frustrated by aspects of the natural world? Bad weather, disease, death and so on. Was eliminating smallpox odd because we had no entitlement to expect that?
Those things are about timeliness - bad weather one day vs another, some people getting the disease and others not, early death versus a longer life. It's about what is reasonable to expect. It might be reasonable to expect good weather on a specific day, or even to live past the age of 50. It's unreasonable to expect to live indefinitely.
Deep down, many people want to be a kind of god. Our technological progress lets that illusion of godhood seem achievable.
Some people just take longer than others. I'm gonna be as patient as I can be with them.
Yeah, I find the need to live forever kind of.. juvenile? You can’t let go of your ego for long enough to realise that at some point it’s better to make room for a new human with new perspectives and new ideas?
I like to think of it this way: if life was a game would you want to play the same character forever? No.. if you’re gonna keep playing the game it’s more interesting to start from scratch now and then. I don’t believe in reincarnation. There’s no need to. What you really are deep down is an instance of humanity. Almost all your genes and all your culture comes from and is shared with other humans. Any new instance (new human) is you playing a new character, essentially. If you’ve contributed to shaping the world you’re leaving behind this is even more true.
Unless you believe in a soul in the Christian/Jewish/Muslim sense, I guess, but then why would you fear death?
IMO the pursuit of immortality is far more dangerous and far more likely to kill humanity than AI. At best it may make us deteriorate into insignificance. Humanity is a superorganism, and we have a name for the phenomenon where part of an organism figures out how not to die and yet still replicates: cancer.
We don't need to live forever, as shown by the fact that we've gotten by without it so far, but death is kind of depressing. I've never really understood the distinction whereby killing millions in the Holocaust is terrible but similar millions dying of old age is desirable.
Required reading for this theme, and one of my personal favorite short stories:
https://qntm.org/lena
> our societal acceptance of death comes from an inability to imagine anything otherwise
CGP Grey captures this sentiment nicely in this animated essay: https://www.youtube.com/watch?v=cZYNADOHhVY "Fable of the Dragon-Tyrant" [2018-04-24]
As usual, it must be warned that living forever means you can be tortured forever. Interesting that an article on Dante is getting upvotes today...
Eternal life doesn’t necessarily mean being impervious to harm. If you live indefinitely because your flesh brain was preserved or there’s a digital copy of you on a hard drive, a simple drop on the floor could terminate your existence.
And if we’re talking about fiction, there’s no obligation to make those immortal lives unbearable either.
https://www.youtube.com/watch?v=dlcxokM970M
> Our flesh is indestructible. Our lives are never ending. But not even in the dumb vampire way where after a while you hate it and you can’t die. We can die whenever we want. We just don’t have to.
Catholic theology actually justifies the belief in Hell by arguing that an eternity of suffering in Hell is a blessing, because it admits the one benefit of existence itself, while total annihilation has no redeeming factors whatsoever.
I was raised Catholic and it's stuff like this that turned me into an atheist.
Don’t threaten me with a good time!
Yep, even a life-sustaining brain vat that keeps you alive for 1000 years while feeding you some bad stimulation is not very hard to fathom.
Even worse: no stimulation... And with enough trickery, maybe prevent your brain from slipping into a coma or even sleep.
I've been thinking a bit about how living on could play out without being too futuristic with the technology. Coming back from cryopreservation is probably many decades out, but we are quite close to being able to make a virtual AI version of you, working a bit like an actor playing you. Tech like Heygen is quite good with avatars and voices, and ChatGPT-type AI can only pretend to be someone in a pretty terrible way just now, but that stuff will get better, and the virtual you could act as an assistant, learn your ways and help out.
Then, when the physical you passes and maybe is cryopreserved, the virtual you could conduct the funeral-type service and say: hey guys, physical tim333 is gone but I'm still here and you can chat to me on the web, etc. Virtual you could maybe have some investments, buy gifts for the grandkids, post on HN, and so on. Then in a few decades virtual you could get data from cryopreserved you and incorporate it.
Perhaps it could be done as a startup?
Those with eternal souls see the deception.
Part of our personality lives inside our stomachs as well, apparently, and so much of who we are is driven by our interaction with our nerve endings. I think even a perfect copy of consciousness is pointless without a perfect copy of our bodies as well.
I think that AGI, when and if we ever reach that stage of technological innovation, will enable us to live in a utopian world where our bodies are impervious to biological defects or age and we have little to no reliance on food and water — where we can control all aspects of how we feel with a slider, or a thought.
Like the Lotus-eaters from Homer's Odyssey, some alien tribe will eventually show up and decide we're imperiled instead of living a perfect life in blissful stasis, and kill us to stop whatever afflicts us from spreading to them.
Anyway, I don't want the brain preservation thing, thank you.
Perfect copies are overrated.
> Part of personality lives inside of our stomachs as well
Is it the part of your personality that you like or maybe it's the part you hate?
One of the absolute best short movies I've ever seen is The World of Tomorrow.
The whole thing is cute, but every time I read about brains in boxes I can't help but think of this scene about grandpa's consciousness being uploaded into a cube:
https://www.youtube.com/watch?v=4PUIxEWmsvI&t=88s
Once the brain has been powered off and then back on, is it even "you" anymore? Hasn't that person died and a new one been "born"?
I used to go a little bit crazy thinking about this when watching people get transported in Star Trek.
I like how the video game Soma showed it. If you fork a brain you kind of have a 50% chance of it continuing into the copy, and 50% chance of you being "left behind". https://www.youtube.com/watch?v=x790AjID0FA
I get the point, but I think the most self-consistent answer is that your conscious experience has a 100% chance of staying in the original body. (And similarly for destroy-and-remake teleporters.)
Indeed, the coin toss explanation "Catharine" uses on the Simon copies is merely a manipulation to ensure he continues following her instructions so that the ARK is launched, obscured to the player by the necessary mechanic of always controlling the surviving Simon copy. The only "real" Simon died in Toronto.
Thinking about that is like getting spooked by your reflection in a mirror - instincts being used in a situation where they don't apply.
If there's only one you at a time, neither you nor an outside observer can tell the difference.
LOL, "we're not married, your husband died in the transporter".
"Ah can I have his laptop? He's got some saved games he wanted me to have"
If all the theory is correct, yes you are still you. Maybe damaged you, but you nonetheless.
Imagine the ship of Theseus thought experiment, but instead of replacing it part by part, you store it in a dry dock (it loses its function) and some time later put it back in the sea (it recovers its function). For all intents and purposes, this is still the ship of Theseus.
See the Bobiverse books for some fun sci-fi on this.
I love how people have wet dreams about living forever by uploading their mind to a computer but put absolutely zero fucking effort into actually increasing their health span and lifespan in real life, and/or waste their days/weeks/years doing stuff they hate in order to "enjoy life" later (when their body is already halfway rotten). As long as it's sci-fi and requires zero effort from them, they'll suck up every single drop of hope, but as soon as there is something actually actionable they recoil in horror.
IMHO if you're not lean and exercising every day, you have no business talking about living longer; you've already refused the only magic pill there is. It makes all the difference between having one foot in the grave at 60 and still chopping your own firewood at 80, and all it takes, besides a bit of luck, is to move your ass 45 minutes a day.
There's no contradiction here. You're talking about people who have already detached the concept of self from their physical body. They think of 'me' as their brain, so why maintain the body they're 'trapped' in.
People who try to solve all their problems with intellect tend to suffer from this. And many of them never learned that being in shape makes you feel a whole lot better.
Do memories and/or brain functionality rely on volatile storage? So if you die, that storage is gone.
How our brain works may rely on more than the structure of the brain. When you are brought back (booted?), things may not work the same.
From what I've seen of research on apparently-dispersed storage of memories in worms, I'd not be at all surprised to find that a human brain separated from a body has (assuming we could "boot it up" in that state, as you put it) lost a lot of memories or functionality beyond the obvious, even assuming we could perfectly preserve everything present in the brain per se.
Not volatile storage, for the most part. If you get knocked out or similar, you keep most of your memories. It's probably partly down to the structure of what's connected to what and partly down to chemical changes at the synapses, although I don't think it's fully understood.
That would make it pretty hard to get memories out of a brain sample, as it's hard enough just to see the structure in an electron microscope. I don't think there's currently any way to log the chemical changes in the synapses.
People get struck by lightning or have epileptic fits, and it doesn't wipe their personality.
We can barely even do “artificial intelligence” correctly, let alone map the human brain to a digital representation. It’s an interesting concept, but in order to “preserve” the brain, wouldn’t one have to get their brain dissected layer by layer, thus killing the subject in the first place?
Then storing all of that information without any loss would likely well exceed current technological limits. Then the computing power needed to run the brain would have to be immense.
> We can barely even do “artificial intelligence” correctly
Even if we could the brain is an amazingly efficient mechanism for computation. It uses such a low amount of power that it's hard to imagine a computer based AI model ever genuinely competing with it.
If the environment and entropy are human-scale concerns, then AI is decidedly a dead end for us.
It'd be a long way into the future to do brain uploading type stuff. We are not close yet but maybe in fifty years?
Leaving aside whether or not I think this is A Good Idea[1], I am fascinated by the other limits people would run up against if this were possible. Memory, for example: is it even possible to retain a first-person memory for hundreds of years? Is there some upper limit above which you can no longer form new memories? Would it become much harder to motivate yourself without the pressures of a limited existence? How would it affect our psyche?
[1] I do not.
> Memory, for example. Is it even possible to retain a first person memory for hundreds of years?
It's not. But that's fine, because it's not even possible to retain memories over a typical lifespan, or even a short one, or even a decade, or even a year. You lose way more than 90% of it, and the rest gets largely altered anyway.
I think that immortality would be a disaster, personally. That we die is a critically important aspect of life. I'd prefer that we work on ways to make the death process less traumatic.
Why? We have already decimated any semblance of natural selection so death is no longer a necessity from an evolutionary standpoint. Maybe immortal humans would be more beneficent because they wouldn't be scrambling to get ahead in their short time here.
If I didn't only have 30-40ish productive years to achieve whatever I will in this life it would be much easier to donate money or time to charitable pursuits.
> If I didn't only have 30-40ish productive years to achieve whatever I will in this life it would be much easier to donate money or time to charitable pursuits.
Spoiler: the haves are not going to get any more generous when they've got thousand-year lifespans. You would just end up having to spend hundreds of years grinding away at the bottom of the ladder instead of 30 or 40.
We grind away to keep the body supplied and satiated.
A transcended/uploaded mind won't have the earthly needs and would be a lot cheaper to maintain, leaving it to achieve otherwise impossible results.
Want some runtime on the good CPU? Got to work.
> We have already decimated any semblance of natural selection
We certainly have not.
> If I didn't only have 30-40ish productive years to achieve whatever I will in this life it would be much easier to donate money or time to charitable pursuits.
True, but if you lived forever, it wouldn't be realistic for new people to come into the world, which means far fewer new ideas and ways of looking at the world. That would be a net loss for humanity.
1) There's no danger of overpopulation. People have a natural tendency to reproduce slower when they feel safer.
2) Trivial argument: if people already lived indefinitely would you advocate murdering them to "make room"? Telling people they shouldn't be able to pursue a longer life is equivalent. Making that decision for yourself is perfectly fine; making it for others is not.
3) 150k people die every day, nearly 2 people per second. If fixing that tragedy creates new problems, bring them on; we'll solve those problems too.
1) Is not clearly true. Yes, 'feeling safer' pushes down on reproduction rates. But the _total_ effect on population growth could still be positive if the total death-rate drops enough -- we don't know enough to say for sure. And frankly, I think the most likely outcome is that people would be more likely to have kids if they didn't have to worry about missing out on their chance at XYZ dream.
2) Not true from most moral perspectives, including 'common sense morality'. In a pure utilitarian sense, sure, but most people don't subscribe to that. For example, choosing to not save someone from a burning fire is not the same as choosing to burn them to death. Both the actor and their intention matter.
3) I don't disagree with the first half of your point (that this is a tragedy) but I cannot share your optimism re.: us solving the consequent problems. If there's anything that the last fifty years of modernity have shown, it's that we're actually quite bad at solving broader social problems, with new and even-worse problems often arising well after we thought the original problem settled. Consider global warming (to which the 'solution' looks to be the further impoverishment of the third world, and probably mass deaths due to famine/drought/heat waves), or how we in the US 'solved' mobility by destroying main streets and replacing established public transportation with cars and mazes of concrete. Now we've "solved" loneliness by giving everyone a phone and -- well, I'm sure you know how that went.
1) We already have a growing population, and I don't think it's inherent that curing mortality must make it grow faster. The net effect would certainly be an ongoing upwards growth (since I would hope that population never goes down), but I'm arguing that the net effect does not inherently have to be unchecked exponential growth. Immortality doesn't solve resource constraints, and resource constraints do influence people's choices. That said, I also believe that even if it did result in faster growth, that isn't a reason to not solve the problem.
2) The equivalence here isn't "choosing to not save". Choosing to push someone back into a burning building, or preventing them from trying to escape, is equivalent to choosing to burn them to death.
3) I am an incorrigible optimist and don't intend to ever stop being one. Humanity is incredible and it's amazing what we can solve over time. I don't believe that any potential solution we might come up with is worse than doing nothing and letting 150k people die every day.
That sounds like a more tractable problem than death.
Everything would scale with it, 30 or 300 years; as long as you're running the same system, it wouldn't matter.
Not even talking about the fact that we live in a very finite system
Wouldn't any change we make to who dies when also be natural? How is that not natural selection? Did earlier generations not attempt to stay alive?
I like knowing that the worlds biggest assholes sometimes lie awake at night fearfully pondering their own death. I don't want to deprive them of that.
The Great Equalizer
Are you serious? Have you been around old rich people? Do you find them generous? People grow more selfish with age, because they care less and less.
The only thing to gain by abolishing biological death is perpetual civil war.
Plenty of the most-powerful already keep causing harm just to make number go up even more, well beyond the point at which they can conceivably personally benefit before they die. Imagine if they could conceivably personally benefit from that because they live for centuries. Why would anyone expect that to improve their behavior?
Well. How would you like a 290 years old Putin keeping a tight grip on Russia in the 23rd century?
I see, so this is where the God-Emperor of Man comes from.
purge the xenos
When I look back on my 45 years of life, there are spans which feel like a different life altogether. I thought differently and made choices that I wouldn't make today. I'd say "in my former life," as if that life had ended and a new one begun. I suspect youthful immortality would be a sequence of many deaths and rebirths. If you had the neuroplasticity of a 25-year-old and the experience and wisdom of a 50-year-old, I imagine it wouldn't get boring, and perhaps new ideas and modes of living wouldn't require a generation to die and a new one to be born.
What if death is just a 'feature' of how life evolved on this planet? What if we discover life on other planets that is simply endless? It seems too anthropocentric to think that all forms of life must die.
OK, maybe that's part of how 'WE' define life, but to me that looks a lot like an arbitrary definition.
It's a bit like fighting against the ocean imho, it doesn't matter how much you put into it you'll always lose eventually.
It's much simpler to accept and live within your constraints than to waste your life and mental energy wishing you could be/do something you will never be/do.
If you think about it, most of the things we "fix" are extremely wonky. Even something such as a bone fracture isn't guaranteed to heal 100%, most medicines have massive side effects, organ transplants have something like 50% survival at 15 years on average, etc. We think we're getting more and more control over things, but most of it is a hack job temporarily delaying the inevitable.
Also, anyone thinking being uploaded to a computer forever is heaven on earth must live a pretty fucking terrible life to begin with
>critically important aspect of life
Nothing says "life" like being dead.
I meant "life" in the big picture, not on the individual level.
To be frank, the extent to which the very old now dominate science, business and politics is already unhealthy. I shudder to think what our world would look like if our most important positions of power were dominated by men born 110 years ago.
They say science advances one death at a time. Looking at congress, I think you can say the same for politics as well.
We could just have mandatory brain preservation at 50. Have to be a biologic to be in a position of authority.
You are welcome to die from any number of preventable deaths.
I choose life.
Nothing stopping you from dying. You make your personal decisions and let others make theirs.
Don't worry you'll die just as they will
It doesn't matter how much you like sci fi, even if it was technically possible it would be reserved to the 0.001% and you'd still be grinding your whole life just as you are doing now
The future never arrives in an evenly distributed fashion, but it reaches everyone eventually.
Yes. I have long believed that all those who applaud death as a good thing — Steve Jobs most memorably in his address at Stanford — will be first in line for life extension once it becomes an option.
There are some externalities at play here though...
- We currently only have one world that all living humans must share.
- Imagine the sickening amount of power some people would be able to gather given a few centuries. That can't be good for everyone else.
Well, people already gather that sickening amount of power - not for themselves, for their heirs, but still - take the soon-to-be-again US president: he probably wouldn't be where he is right now if his father hadn't amassed a considerable fortune which he inherited.
So it’s bad already; let’s make it an absolutely hellish nightmare?
It would literally be hell on earth.
Imagine people like Putin pouring billions in, easily sacrificing millions, to achieve such a thing for themselves, and ideally nobody else. A truly terrible scenario worth fighting against.
Some folks are scared of rogue AI, when the biggest threat to mankind always was, is, and will be other, properly messed-up humans with certain capabilities.
I don't know why you're getting so much pushback... Planetary resources are finite. If you give up dying, you have to give up reproducing beyond the replacement rate. People like to imagine they'll be part of some small tribe of a lucky few immortals, but the reality is we'd be in exactly the same situation as today, but with a population rapidly screeching beyond all known sustainable limits far faster than it is today. To name just one obvious problem.
Success of something like this entails a way to regulate reproduction at a far more draconian level than even China's one-child policy. I don't think any civilized nation could impose a "no child" policy and remain intact.
> but the reality is we'd be in exactly the same situation as today, but with a population rapidly screeching beyond all known sustainable limits far faster than it is today
I don’t think this is a given. Most developed nations have low birth rates, for example.
[dead]
Setting aside the actual physical and technological limitations here, I think such immortality would create a whole new kind of population problem. Population growth, in some sense, would explode. Or, more realistically, you’d have two classes of people: those living, and those “immortal,” those who could afford immortality, and those who were poor and would have to die permanently.
That said, I think this is all a pipe dream and totally infeasible.
Just want to plug Pantheon as a TV series exploring this idea with excellent storytelling. (https://en.wikipedia.org/wiki/Pantheon_(TV_series))
Alastair Reynolds wrote some great books which touch on this concept: the Prefect series has an AI character that originally started out as a fatal brain scan of a person and eventually escaped...
Cool idea, but it's science fiction.
If this intrigues you, try some really good science fiction on the topic, like Neal Stephenson's "Fall; or, Dodge in Hell".
A fun way to die would be to transition digitally into the cloud as data for some sort of future "humanity LLM" that could be queried in the future. The Council of Elders.
When was the last time you read a book from the 1960s? Or browsed files from a hard drive from the 90s? I guess the same will happen with brains.
I'm reading a book from 1974 right now. The book I read before that was a 2013 adaptation of a book from 1605. The book I read before that was from 1956.
The book I read before was from 1985: I started it while the author was still alive but he died before I finished it. The book I read before that was from 1998.
Without spoiling too much of Cyberpunk 2077, that game offers an intriguing exploration of how this could go horribly wrong.
Similarly true for the game SOMA: https://en.wikipedia.org/wiki/Soma_(video_game)
More like, "with brain preservation, almost everybody has to die".
https://news.ycombinator.com/item?id=42006265
https://repaer.earth/
Is that satire?
Even if this worked, I find this idea morally repulsive. It just seems utterly selfish.
Next level transhumanist absurdity..
Waking up in a state of confusion looking down at my new body after my own death.
"What is this, some kind of altered carbon?"
I'm kinda surprised everyone in that show doesn't walk around with full neck armor to protect their stacks. Metal gorgets should be all the rage in this universe.
Jaron Lanier, in his book "You Are Not a Gadget", uses the example of MIDI files: they can describe music, but even though they sound like the real thing, they are limited by the digital world. E.g. according to ChatGPT the minimum interval between 2 MIDI messages is 0.77 milliseconds.
And then he asks what sort of limitations might we have if our minds are software, and how would we not notice it?
> according to ChatGPT the minimum interval between 2 MIDI messages is 0.77 milliseconds.
Thank you for stating your source. However, ChatGPT isn’t deterministic. I asked it the same thing and it responded it depends on several factors, including the MIDI protocol version and the device or software used, and that the minimum between two messages is between 1 and 2 milliseconds.
Which of those is true? Perhaps neither. A quick web search didn’t provide a straightforward answer. Point being that we should avoid propagating even more wrong information, especially since it’s not relevant to your point (which makes sense).
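For the classic MIDI 1.0 DIN-cable transport, at least, the answer can be worked out from the spec rather than asked of a chatbot: the wire runs at a fixed 31,250 baud with 8-N-1 framing (10 bits per byte), so message timing follows directly from byte counts. A minimal back-of-envelope sketch (USB and software MIDI have entirely different limits, which may explain the conflicting answers):

```python
# Timing on the original MIDI 1.0 serial (DIN) transport.
BAUD = 31250        # bits per second, fixed by the MIDI 1.0 spec
BITS_PER_BYTE = 10  # 8 data bits + start bit + stop bit (8-N-1 framing)

byte_time_ms = BITS_PER_BYTE / BAUD * 1000       # time for one byte on the wire
note_on_ms = 3 * byte_time_ms                    # status + note number + velocity
running_status_ms = 2 * byte_time_ms             # status byte omitted (running status)

print(f"one byte:               {byte_time_ms:.2f} ms")       # 0.32 ms
print(f"3-byte Note On:         {note_on_ms:.2f} ms")         # 0.96 ms
print(f"2-byte (running status): {running_status_ms:.2f} ms") # 0.64 ms
```

So back-to-back 3-byte messages are spaced about 0.96 ms apart, and about 0.64 ms with running status; neither of ChatGPT's figures matches exactly.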
[dead]
So, I've seen Bredo Morstoel, the Frozen Dead Guy of Nederland, Colorado[0]. The 'tour' you could go on was ... something. But, at the end, we were asked to help keep Bredo frozen and helped pile on dry ice. So, they had to open the sarcophagus and get the CO2 in there.
And, yeah...
Bredo looks like Otzi the ice man [1]. He's just a dead body, there's no saving him, he's gone.
It was a good lesson for any thoughts I may ever have had about cryopreservation. Unless you pay someone a LOT of money, for a very long time, and somehow manage to get them to actually, really, truly care about you, you are just having a really strange and long funeral.
Good excuse for a party though. Those dead salmon tossers are something else.
[0] https://en.wikipedia.org/wiki/Frozen_Dead_Guy_Days
[1] https://en.wikipedia.org/wiki/%C3%96tzi
I really hate this idea. The wrong people would get to persist indefinitely: the malignantly greedy, who hoard the majority of resources and make life miserable for billions of others.
Like so many scientific pursuits, this one has its roots in science fiction. A terrific trashy futuristic novel from the 70's by Lawrence Sanders called The Tomorrow File[0] features preserved heads that are kept around to spout ideas in the future, when they might become useful, among many other Brave New World-type concepts.
[0] https://archive.org/details/tomorrowfile0000sand_t5i1
Way older
https://en.wikipedia.org/wiki/Rip_Van_Winkle
https://en.wikipedia.org/wiki/The_Sleeper_Awakes
https://en.wikipedia.org/wiki/The_Door_into_Summer
and for anything fantastic you can find it in mythology and ancient literature
https://en.wikipedia.org/wiki/Seven_Sleepers
https://en.wikipedia.org/wiki/Muchukunda
https://en.wikipedia.org/wiki/Eight_Immortals
https://en.wikipedia.org/wiki/Ranka_(legend)
A good recent book series is the Bobiverse:
https://bobiverse.fandom.com/wiki/Bobiverse_Wiki
It might be the most optimistic story of the genre.
Several other books pop into mind, e.g., https://en.wikipedia.org/wiki/To_Live_Forever
I oughta read this one again
https://www.goodreads.com/book/show/21617.Buying_Time
particularly since my esteem for Haldeman has gone up (https://sff180.com/reviews/h/haldeman/worlds.html changed my life)
If we are talking about preserving heads, Futurama is the final word on the subject.
Personally I think the head of Vecna https://www.rpglibrary.org/articles/storytelling/headofvecna... is the ultimate head transplant story.
I was really hoping this was gonna have one of those so bad it’s good science-fiction cover artworks
That Internet Archive version is particularly boring, isn't it? The novel is his best, if you ask me. His detective fiction didn't ever rise to that level, and this was the only one he wrote in the sci-fi genre. Perhaps being relegated to the dime-store-novelist category in his other books prevented this from getting the attention it deserves, but it was wonderfully smarmy and prescient.
I wish we'd reframe the way this gets talked about. It's pedantic, but we can't escape death. All matter in the universe will eventually decay or transmute into something very different from what it now is. And my answer to the question posed of when you'd want to die is unknown. I can't say 150 years. I can't imagine a specific age I'd hit at which I'd want to die. But at the same time, while I might want to outlive the Earth, I don't want to outlive all baryonic matter and somehow persist into the age of the universe in which all other matter is black holes, and there is neither light nor sound, nothing to touch, and all I would ever experience is quadrillions of years of utter loneliness. There is no immortality and nobody would want it if they really thought about what it would entail.
But cessation of aging would be wonderful. I'd love to live indefinitely and not have my body or mind noticeably decay. Research like this should be done, but we need to be honest about what we're trying to accomplish. Nobody will ever escape death, and when you start getting large enough numbers, I'm not sure it would make any difference to live longer. Even if we somehow achieve brain uploading, which I see not as extending your life so much as making copies of yourself with different identities, every storage medium has a capacity limit. At some point, the only way to form new memories is to evict old ones, and the experience of living 90 trillion years won't be any different than 1 trillion years if 1 trillion years of experience is all you can store.
That said, we need to also be humble about what is even achievable. The very idea of a high resolution scan capturing the entirety of your brain state is already science fiction. We have another home page story right now about controversy over whether the brain has a microbiome. The only reason that's a question is because we have no means of opening up a living brain to see. We can't even accurately measure a person's body fat without dissection. The limitations of non-invasive remote imaging that don't kill the animal being imaged are quite severe and constitute a large reason medicine isn't more effective than it is. There is no technology we are on a known arc toward achieving that will make it possible to capture molecular-level detail of an entire brain as it is still running. I don't see how you can base an entire research project on a premise that doesn't exist and we have no idea if it ever will exist.
There are so many ifs and buts in this idea of resurrecting the brain that it's laughable, yet "brilliant scientists" make press with it. What about the completely unknown mind-body problem that scientism pretends doesn't really exist but each one of us knows is very real? What if you wake up 500 years later but plugged into a matrix? Who guarantees cryo-maintenance of the frozen brain?
I wonder how much relief the belief that you will be woken up again would provide as you lie on your deathbed.
Even if the tech doesn't work, I wonder how much relief from suffering the idea that it will work in a hundred years could provide.
Those hundred years won't exist for the dying. I would personally find comfort in knowing that I will feel waking up right away into a technologically much more advanced world
Probably similar to the relief that the belief you will wake up from general anesthesia provides prior to surgery.
Isn't that the main promise of most religion?
It's really debatable if religious people actually believe in eternal life. If they did, they would immediately see that their behavior drives them straight to eternal damnation.
They don't seem to care about that, and the only rational explanation is that they don't believe there's anything after death. They're nihilists. We all are.
What if the constituents of the brain change: would you still feel like you, or would it be like another person just going on being conscious? If you don't know what makes consciousness, most likely you just "die" after the brain freeze. This ain't like sleeping or a coma, where your system is kept running continuously.
People have been brought back to life after being dead for more than an hour after submergence in very cold water.
But did their "consciousness" continue? We currently have no way of telling.
Exactly whose consciousness would it be, if not theirs?
Let’s say you clone yourself 1:1: which one would you see through?
[dead]
The brain is not enough; the massed-spaced effect allows for the formation of memories in non-neural tissues throughout the body.
https://www.nature.com/articles/s41467-024-53922-x
https://www.nyu.edu/about/news-publications/news/2024/novemb...
I view death as the saviour of democracy and the most democratic aspect of life.
Without death, our fragile democracies will die. Tyrants will be in power forever.
Being able to preserve your brain after a natural death won't stop you being assassinated. A few bullets to the brain, game over.
I think it’s more likely we can preserve our ego rather than our consciousness. For instance, create an AI replica of yourself that accurately behaves the same way you do. Although you would be dead, your ego can carry on living and responding to changes in the external world, and people could have interactions with you that accurately simulate how you would respond to them long after you’re gone. And as your ego learns about the world, it develops opinions closely similar to opinions you would have based on your life experience. Perhaps in this way people in power could remain in power indefinitely.
Cue Nixon in Futurama.
I'm gonna break into your houses, and wreck up the place!
yeah but who really wants to keep the c*nts alive who want to live forever, they’d be insufferable. Like the know it all brains in a jar in futurama
Maybe you can just put a towel over the jar for those guys, like a bird cage.
[dead]
"1918 when diabetes had no known treatment" What nonsense. The history of diabetes shows knowledge of and treatments for the disease going back thousands of years. https://en.wikipedia.org/wiki/History_of_diabetes