Ray Kurzweil, Cyberprophet or Crack-Pot?

Discuss life, the universe, and everything with other members of this site. Get to know your fellow polywell enthusiasts.

Moderators: tonybarry, MSimon

JoeStrout
Site Admin
Posts: 285
Joined: Tue Jun 26, 2007 7:40 pm
Location: Fort Collins, CO, USA

Post by JoeStrout »

TallDave wrote:Heh, I wrote a post about the identity and consciousness problem a while back.
Me too, back in 1997. This is an old argument that was (in my opinion, at least) resolved years ago.
TallDave wrote:First off, consider that the you of this moment is not, strictly speaking, the you of a decade ago, a year ago, or even a minute ago -- you constantly acquire new knowledge, evolve new heuristics, etc. So, sadly, you are going to cease to exist by the time you finish reading this.
Only a little bit. I am still almost entirely the same person I was a minute ago.
TallDave wrote:It seems therefore that there is -- and this is a bit disturbing, because it cuts against our compulsion for bodily self-preservation -- no difference between you just floating along evolving in your body as you normally do, and being physically destroyed and precisely recreated anew each second -- or, for that matter, being duplicated into dozens or trillions of new copies. They're all you for an instant, and soon none of them are you, and you're not you anymore either.
Well, no, personal identity is not a Boolean property. They're all exactly you for an instant, and soon they are all slightly less you, and you're slightly less yourself too. But you're quite right that there is no philosophical difference between continuing to exist as we are used to today, and being destroyed and recreated.
TallDave wrote:But, of course, you're not temporally continuous. You experience regular periods of unconsciousness.
Right. And though your brain is still active while you're asleep, there are people who have been completely flatlined for extended periods of time, and we all seem to agree that they're the same person afterwards.
TallDave wrote:As for physical continuity, it seems relevant until you start asking which pieces we can remove and have you still be you. Our soft tissues are replaced every 90 days iirc, so "physically" you are not continuous either. What seems to matter are the patterns, so if we can replicate the patterns in silicon that could be said to be a new you as well - for an instant or so, before it becomes someone else.
All good points, except that adhering to Boolean logic will open you up to logical tangles — the same ones that Locke ran into hundreds of years ago. But Boolean logic was the only rigorous form of logic available to him at the time; we have more advanced logic available today.
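As a toy illustration of that fuzzy-logic alternative (hypothetical code, nothing rigorous): treat identity as a degree of overlap between mental states, a number between 0 and 1, rather than a yes/no equality test.

```python
# Toy sketch (hypothetical): personal identity as a graded (fuzzy)
# property rather than a Boolean one. A person-state is modeled
# crudely as a set of memories; the "identity" of two states is
# their degree of overlap, from 0.0 (different people) to 1.0 (same).

def identity_degree(state_a: set, state_b: set) -> float:
    """Jaccard similarity of two memory sets."""
    if not state_a and not state_b:
        return 1.0
    return len(state_a & state_b) / len(state_a | state_b)

me_now = {"m1", "m2", "m3", "m4"}
me_a_minute_ago = {"m1", "m2", "m3"}   # almost all memories shared
me_a_decade_ago = {"m1"}               # far less overlap

# I am exactly myself, almost entirely the me of a minute ago,
# and only somewhat the me of a decade ago:
assert identity_degree(me_now, me_now) == 1.0
assert identity_degree(me_now, me_a_minute_ago) > identity_degree(me_now, me_a_decade_ago)
```

On this picture, "is the copy you?" has no Boolean answer; it starts at 1.0 and decays as the copies diverge, which is exactly the point above.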
Joe Strout
Talk-Polywell.org site administrator

Post by JoeStrout »

ZenDraken wrote:The issue here seems to be continuity of consciousness. And yes, we do have periods of unconsciousness, but we have a *sense* of continuity. That sense is not possible either in The Prestige scenario, or in the mind-upload scenario because the continuity of consciousness is broken. As long as the process allows you to walk away and leave a separate copy, you have not maintained a continuity of consciousness.
As you point out, there is no continuity of consciousness — just the "sense" or illusion thereof. There is nothing philosophically relevant about that illusion, however. People used to have the sense that their soul was in their heart too, and initially saw heart transplants as an abomination. But they got over it.
ZenDraken wrote:And then there's the Star Trek transporter scenario: would you use that thing knowing that your body is going to be obliterated?
Of course, because I don't care about this illusion of continuity you seem so concerned about.

But then, if we had transporters, you could probably get by avoiding them and taking inefficient means of travel. But when uploading is the only way to avoid death, you will let go of your squeamishness about personal duplication, or you will die.

Cheers,
- Joe

Post by JoeStrout »

93143 wrote:All of this talk of instantaneous identity is somewhat beside the point. I've thought about that before, and it doesn't explain why I am I and you are you, which is the main issue here.
Yes, you are right, this is the issue called "personal identity" in philosophical circles. And here's the answer: you are you because you are the only person in the world that can remember something you were thinking yesterday. (Or put more clearly: if you can remember thinking something, then you are the same person as the one who did that thinking.)

That's Locke's idea from the 1600s, and it's a bit crude: I would go further and say that identity is defined not just by episodic memory, but by all parts of mental structure: the several different forms of memory, plus personality traits, aptitudes, etc.

I am I and you are you because we each have our own unique mental structure.
93143 wrote:As for unconsciousness, brain states and scientific detectability thereby, is it really unconsciousness, or do you experience things that simply aren't recorded in your memory?
A moot question, given that people have been completely flatlined for extended periods of time, their brains not operating and clearly having no more experience than a rock.

Best,
- Joe

Post by JoeStrout »

JohnSmith wrote:I think the 'Prestige' argument is the best one. Questions of identity aside, one of you will die. I'm guessing, since you did upload your brain, you were worried about that. What if it's you?
In this scenario, there are two copies of me. One copy will die; I'm OK with that, as long as one copy lives on, because then I (the person) lives on. (Of course, I'm not OK with experiencing great suffering, whether it results in my death or not. Also, this assumes that the duplication event is recent, so that the two are almost entirely the same person; the longer it's been, the more they are different people, and the more uncomfortable I would get with one of them being destroyed.)
JohnSmith wrote:I can't understand why you wouldn't try for continuity instead. Gradually replace biological brain function with computer, until you're almost entirely computer based. It avoids the problem entirely.
Because that's a silly crutch, and was shown long ago to make no difference to personal identity. (See John Perry, Personal Identity, 1975 for example.)

JohnSmith
Posts: 161
Joined: Fri Aug 01, 2008 3:04 pm
Location: University

Post by JohnSmith »

Joe, I think we've come to an area we won't agree on. You say that you're OK with the idea of dying, as long as there is a 'you' that survives. I think that you will change your mind as you're lying on your deathbed, waiting for the end. Maybe I'm wrong.

TallDave, your comments about how soft tissue only lasts 90 days only emphasize my point. It's not the same 90 days for all the neurons, so you've got turnover, lots of growth and death below the level of the mind. So it would be easy to sneak in and start replacing it bit by bit.

And back to Joe. I haven't had time to find or read the book, but I doubt it would change my mind. (I will try and find it, though!) I see a large difference between a forking of the mind and a slow replacement. Either way, I'm happy with my crutch.

I like thought experiments instead of philosophy.
Just to posit a scenario, say you went to the hospital, got your mind uploaded. Ok, it's you. I don't argue that. And there's a physical you.
As far as the physical you is concerned, nothing has happened. So you walk in, sit down in the chair. Then the nurse comes in. Do you follow her when she says, "Ok, we're all done, this way to the euthanasia chamber?"

Post by JoeStrout »

JohnSmith wrote:Joe, I think we've come to an area we won't agree on. You say that you're OK with the idea of dying, as long as there is a 'you' that survives. I think that you will change your mind as you're lying on your deathbed, waiting for the end. Maybe I'm wrong.
You are. I've been debating this issue with people for over a decade. Usually, if I hang in there long enough and am patient and clear, they eventually come around to the realization that they haven't got a philosophical leg to stand on. You're reasoning from intuition, which is based on experience, and we have no experience living in a world where it is possible to duplicate a person. So you're convinced of your conclusions, even though you can't explain why, and there is no logic behind them. Intuition leads us astray when something as new as this comes along.

(This is similar to the intuitive reaction people had to the idea of heart transplants before they became available, but even more so.)

Of course, when the time actually comes and people have a chance to get used to dealing with duplicates, their intuition will change and it won't be such a struggle to think about it clearly.
JohnSmith wrote:And back to Joe. I haven't had time to find or read the book, but I doubt it would change my mind. (I will try and find it, though!) I see a large difference between a forking of the mind and a slow replacement. Either way, I'm happy with my crutch.
No doubt, but I fear it's going to lead to your unnecessary death. Do read the book if you can; it's a little dated (it too neglects the application of fuzzy logic to personal identity, which is really all that's needed to resolve the problems with Locke's original thesis), but its arguments on gradual vs. discontinuous replacement are detailed and sound.
JohnSmith wrote:I like thought experiments instead of philosophy.
Er... what do you think philosophy is based on? It's all thought experiments. For any given theory of personal identity, you posit various scenarios, see what that theory would imply, and then check that conclusion against standard conceptions of the term. For example: suppose you believe that person A is the same as person B if they have the same color and pattern of socks. We can easily falsify this theory of personal identity with two thought experiments: (1) you and I happen to be wearing the same color and pattern of socks; and (2) you take off the socks you're wearing and put on different ones. In experiment 1, the theory identifies us as the same person, even though all would agree we are not; and in experiment 2, it would identify you as a different person after the sock switch, when all would agree you're the same.

If a theory can't handle these common, everyday cases, then there's no point trying to apply it to the tricky ones. The thing is, most intuition-based theories of identity fall apart on currently-feasible thought experiments, such as deep hypothermic surgery or physical disability. A good theory of personal identity handles these just fine.
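That falsification procedure can even be sketched as a couple of test cases (a hypothetical illustration only; the sock "theory" is of course a straw man): encode the theory as a predicate, then check it against scenarios whose expected answers come from our ordinary use of "same person".

```python
# Hypothetical sketch: a theory of personal identity as a predicate,
# checked against thought experiments with known common-sense answers.

def sock_theory(person_a: dict, person_b: dict) -> bool:
    """Claims A and B are the same person iff they wear the same socks."""
    return person_a["socks"] == person_b["socks"]

# Experiment 1: two different people who happen to wear matching socks.
you = {"name": "you", "socks": "grey argyle"}
me = {"name": "me", "socks": "grey argyle"}

# Experiment 2: the same person before and after changing socks.
you_later = {"name": "you", "socks": "red stripes"}

# The theory gets both everyday cases wrong, so it is falsified:
assert sock_theory(you, me) is True          # wrongly: we're the same person
assert sock_theory(you, you_later) is False  # wrongly: you became someone else
```

The same harness works for any candidate theory; a good one has to return the common-sense verdict on the easy cases before we trust it on uploads and duplicates.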
JohnSmith wrote:Just to posit a scenario, say you went to the hospital, got your mind uploaded. Ok, it's you. I don't argue that. And there's a physical you.
As far as the physical you is concerned, nothing has happened. So you walk in, sit down in the chair. Then the nurse comes in. Do you follow her when she says, "Ok, we're all done, this way to the euthanasia chamber?"
Sure (assuming of course that I believe her, and that she can give me a good reason why I shouldn't be even happier being two instead of one). Why not?

Let me posit a scenario back to you: suppose we discover that, every night for the last three months, aliens have been sneaking into your bedroom at night, killing you in your sleep, taking you apart atom by atom for study, and then replacing you with an exact atom-for-atom duplicate. Would you kill yourself before going to sleep again? Remember, this has been going on for some time.

I sure wouldn't, because who cares? If the duplicate is exactly the same as the original, then it is me, and I survive this nightly procedure just fine — in keeping with the common-sense conclusion that I am the same person today that I was yesterday, regardless of what might have happened while I slept.

Best,
- Joe

93143
Posts: 1142
Joined: Fri Oct 19, 2007 7:51 pm

Post by 93143 »

Okay, it seems on a quick read-through that none of you have understood what I'm talking about.

Google "hard problem of consciousness". You should find the Wikipedia article. The last line in the article is wrong. Try to figure out why.

Unfortunately, I haven't got time for detailed discussion right now due to a project I need to get done...

Post by JohnSmith »

I wouldn't kill myself before I slept, because I'm me. I'd try and kill the aliens!

Your alien scenario is a bit like sleep from your viewpoint, I'll give you that. Not sure if I agree, mind you, but it makes sense.
My problem is simply that after you're dead, the pause in your mind never resumes. THAT's what any copy of me would try and avoid. And you can't avoid one of the copies going 'kaput' without the continuation method.

And I just realized the first line sounds like "I'm the kind of person who'd fight for my life!" That's not what I was getting at. The argument was supposed to be, "I don't care about the guy who died earlier, I don't want to die!" So I'd try and prevent this from happening to me.

And maybe philosophy is supposed to be lots of thought experiments, but I always see arguments about 'logic' and 'obviously true.'

ZenDraken
Posts: 22
Joined: Wed Feb 20, 2008 5:14 pm
Location: Pacific NW

Post by ZenDraken »

MSimon wrote:No. What the Buddhists mean is that we are a collection of programs triggered by events. i.e. there is no gentle flow. There are big jumps. Sometimes very big jumps.
I didn't mean to imply only gentle flow, just the sense of a point of view moving forward in time, gentle or otherwise. An extreme (if fictional) example of "otherwise" is the protagonist of Slaughterhouse Five. He presumably had a sense of self, but his consciousness kept jumping around in time relative to everyone else.

And I would argue that the collection of programs is just mechanism, not consciousness. Consciousness is the observer of the mechanism (to me, anyway).
MSimon wrote:The "cure" for this is to install a "watcher" that can follow the change in programs. Once the watcher is functioning then you can get a "controller" that can remove the calling from the automatic to the autonomous.
Aren't the watcher and controller just more programs? And in terms of self, isn't the Society of Mind just a collection of "selves", which may or may not be well integrated? I think in the SoM model, the consciousness integrates all those selves to a greater or lesser degree. The "selves" are still there, are just as illusory. But there is still a consciousness that is observing the various "selves".

Post by ZenDraken »

JoeStrout wrote:Of course, because I don't care about this illusion of continuity you seem so concerned about.
True enough, I do feel attached to my continuity, even if it's an illusion. You claim to have let go of the self, which is honestly admirable. But you do still seem to be attached to the "pattern" of you. You seem to be OK with painless obliteration as long as your pattern is reproduced elsewhere. Question: Would you require that your replica be instantiated the instant that you die? What if your pattern was stored, and a replica was recreated some time later? How about a thousand years from now? A billion? What if your replica was made before you were killed? Would you have second thoughts?

No criticism here, I'm just exploring.

Post by ZenDraken »

JohnSmith wrote:My problem is simply that after you're dead, the pause in your mind never resumes.
Agreed.

Betruger
Posts: 2336
Joined: Tue May 06, 2008 11:54 am

Post by Betruger »

I don't see it either. The only copies of me I'd consider being me would be those thru whose eyes I could see at the same time as mine. Anything else is just a replica.

Looking into the mirror I see myself.. If a copy of me was facing me, instead of a mirror, and mirrored everything I did, it would only be an illusion of reflection. The two copies are distinct and separate. I wouldn't be living on by such cloning. It would be a sequence of iterations of an initial "me" template. There would be a me_n, me_n+1, etc. None of them would be living the same life at the same time from the same position in space, none of them would share consciousness.. Only a common pattern (at first) of thoughts and an ease of relation to each other as twins have. But the void between them would be the same as between me and anyone else.

TallDave
Posts: 3152
Joined: Wed Jul 25, 2007 7:12 pm

Post by TallDave »

JoeStrout wrote:Well, no, personal identity is not a Boolean property. They're all exactly you for an instant, and soon they are all slightly less you, and you're slightly less yourself too.
It's true you are very similar from moment to moment, but when you start to think about copies, it highlights that you may have diametrically opposed interests simply because you now occupy different physical avatars.

Consider the Prestige situation -- one copy of you is about to die, so now suddenly the tiny difference between you and your copy becomes a life or death detail.
Last edited by TallDave on Thu Aug 21, 2008 8:43 pm, edited 2 times in total.
n*kBolt*Te = B**2/(2*mu0) and B^.25 loss scaling? Or not so much? Hopefully we'll know soon...

Post by TallDave »

93143 wrote:Okay, it seems on a quick read-through that none of you have understood what I'm talking about.

Google "hard problem of consciousness". You should find the Wikipedia article. The last line in the article is wrong. Try to figure out why.

Unfortunately, I haven't got time for detailed discussion right now due to a project I need to get done...
Sounds like mere mysticism to me. I understand but reject it, because subjective experience is now fairly easy to explain at the physical level, if you allow that it can be an emergent property from very complex information processing routines.

While it's hard or even impossible to fully understand a very complex process, that does not make the physical explanation of the process any less likely, any more than the fact one can't fully envision all the details of a Pentium processor means that it can't be built and perform complex functions.

I think one could, in fact, climb inside a human brain and say "here is a group of neurons firing to recognize a friend, here is another worrying about status, here is another processing the feel of wind, here is another sorting the sound input into words and another interpreting those words into meaning" and recognize that all of these together constitute a moment of subjective experience.
Last edited by TallDave on Thu Aug 21, 2008 8:46 pm, edited 1 time in total.

Post by JohnSmith »

Sorry TallDave, but that's a terrible way of rejecting the argument.

"subjective experience is now fairly easy to explain at the physical level, if you allow that it can be an emergent property"

If I have to 'allow' that it is an emergent property, that means it's not explained at all.
