LinuxQuestions.org

brianL 12-06-2011 03:16 PM

Upload your brain!
 
Something for you all to experiment with:

http://blogs.scientificamerican.com/...to-a-computer/

I've got mine on an old floppy, with room to spare. :)

MrCode 12-06-2011 03:31 PM

brianL, you just posted this to get my attention, right? :rolleyes:

Problems relating to computational power aside, the main problem with this is that "you" (as in the biological "you") would still die, even if your silicon-based "copy" lived on, because "you" are still bound to your biological body (unless you buy into dualist "soul" bullsh*t :rolleyes:). Your "ego" wouldn't be "transferred" to the machine; it would just create a "second ego".

Put more simply, it would be more like having an immortal brother/sister, rather than becoming immortal oneself.

brianL 12-06-2011 03:41 PM

Quote:

Originally Posted by MrCode (Post 4543743)
brianL, you just posted this to get my attention, right? :rolleyes:

Mmm, now you come to mention it... :)
But would it be able to tell that it wasn't the "real" me? Would it think: "I'm Brian...but I feel strange these days."?

MrCode 12-06-2011 03:48 PM

Quote:

would it be able to tell that it wasn't the "real" me? Would it think: "I'm Brian...but I feel strange these days."?
From its perspective, no, it probably wouldn't be able to tell that it's not the "real" you, but that doesn't mean it's not a separate "entity".

Think of it like having two computers of the exact same hardware configuration, running the exact same OS and programs, executing the exact same instructions clock cycle for clock cycle. From each machine's "perspective", they are the same, but that doesn't mean they aren't physically separate entities.
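That two-machines analogy maps onto the equality-vs-identity distinction most programming languages make: two objects can hold bit-for-bit identical state and still be separate things. A toy Python sketch of the idea (the `Machine` class is invented purely for illustration, not anything from the article):

```python
class Machine:
    """A 'computer' identified only by its internal state."""

    def __init__(self, state):
        self.state = state

    def __eq__(self, other):
        # Compare by contents: "from each machine's perspective" they match.
        return isinstance(other, Machine) and self.state == other.state

a = Machine(state=[1, 2, 3])
b = Machine(state=[1, 2, 3])

print(a == b)      # True  - identical state
print(a is b)      # False - physically separate entities
a.state.append(4)  # changing one...
print(a == b)      # False - ...does not affect the "copy"
```

The same holds for the brain copy: equal state at the moment of copying, but two distinct entities that diverge as soon as either one changes.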

brianL 12-06-2011 03:57 PM

I agree it's a separate entity, but it's got all my memories, including all the experiences that have gone into forming my personality. The only thing it hasn't got is the body that had those experiences, and that will probably drive it crazy (crazier than the biological original).

syg00 12-06-2011 03:58 PM

Like all new technology, the peons dream of partaking ...
The rich line up first ...

And the porn industry are the most aggressive innovators - now there's food for thought .... :p

brianL 12-06-2011 04:04 PM

The "silicon" me certainly wouldn't be able to react to watching porn like the flesh-and-blood me (not that I ever watch porn, of course :tisk: ).

MrCode 12-06-2011 04:08 PM

Quote:

The "silicon" me certainly wouldn't be able to react to watching porn like the flesh-and-blood me
…you sure about that? Theoretically, a body could be simulated, too, given enough computing power. ;)

LOL

(See, I'm not always all "doom and gloom" when talking about emerging technology/cutting-edge science…)

brianL 12-06-2011 04:09 PM

Yeah, probably a lot better than the one I've got now.
I've had this one 66 years, and I think it's beginning to wear out.

SigTerm 12-06-2011 11:43 PM

Quote:

Originally Posted by MrCode (Post 4543743)
Problems relating to computational power aside, the main problem with this is that "you" (as in the biological "you") would still die, even if your silicon-based "copy" lived on, because "you" are still bound to your biological body (unless you buy into dualist "soul" bullsh*t :rolleyes:). Your "ego" wouldn't be "transferred" to the machine; it would just create a "second ego".

It's not that simple. If a person has no immaterial component, then this copy will also be you. Another you.
It comes down to the old "what is a human?" question, which has no definite answer.

H_TeXMeX_H 12-07-2011 11:17 AM

Quote:

Originally Posted by MrCode (Post 4543743)
brianL, you just posted this to get my attention, right? :rolleyes:

Problems relating to computational power aside, the main problem with this is that "you" (as in the biological "you") would still die, even if your silicon-based "copy" lived on, because "you" are still bound to your biological body (unless you buy into dualist "soul" bullsh*t :rolleyes:). Your "ego" wouldn't be "transferred" to the machine; it would just create a "second ego".

Put more simply, it would be more like having an immortal brother/sister, rather than becoming immortal oneself.

Yes, that's all true. Now, "immortal" is a relative term. I mean, you're immortal, unless I decide to take a sledgehammer to the HDD your "immortal" copy is being stored on ...

entz 12-07-2011 12:20 PM

I'm reading Ray Kurzweil's "The Singularity Is Near" ... totally awesome and realistic stuff!

BTW, mind uploading is just one of many options portrayed by Ray and co. Other options, such as bio-engineering and nanobots infused into the body, are even "easier" to achieve and philosophically more intuitive!

H_TeXMeX_H 12-07-2011 12:33 PM

Quote:

Originally Posted by brianL (Post 4543756)
I agree it's a separate entity, but it's got all my memories, such as all the experiences that have gone into forming my personality. The only thing it hasn't got is the body that had those experiences, and that will probably drive it crazy (crazier than the biological original).

You might be right about that. It wouldn't even have eyes to see. Imagine just your mind existing without any body. I'm not sure what would happen, but it could go mad, take over all computer systems, and destroy the world.

brianL 12-07-2011 02:30 PM

brianL as Skynet. Wow, that is frightening.

Cedrik 12-07-2011 02:58 PM

Would the uploaded brain be able to dream or imagine? Or maybe it will just be a clone of the logical part of the brain, with reasoning similar to the original's (cognitive and neural processes) :p

MrCode 12-07-2011 03:03 PM

Quote:

Originally Posted by Cedrik
Would the uploaded brain be able to dream or imagine ?

Well, strictly speaking, emotion and what we tend to think of as "irrationality" are rather hard-wired into us and are part of everyday decision making, so I would think it would be difficult to "clone" only the "logic portions" of the brain, leaving out the "emotional/irrational" parts, and still have a fully functioning "virtual human brain".

EDIT:

Quote:

Originally Posted by H_TeXMeX_H
Now, "immortal" is a relative term. I mean, you're immortal, unless I decide to take a sledgehammer to the HDD your "immortal" copy is being stored on ...

I think "immortal" is used here in the sense that you (or rather, your virtual "cousin") could never "die" of old age. It would be able to "live forever", given that it's not actively destroyed by an outside force.

SigTerm 12-07-2011 04:21 PM

Quote:

Originally Posted by Cedrik (Post 4544514)
Would the uploaded brain be able to dream or imagine ?

There's no brain. What you're supposed to upload is information about the brain's structure. While that information represents a brain, information by itself can't dream, so you'd have to either build a new brain from it or simulate one. Whether it would dream depends on whether the ability to dream is completely defined by brain structure and nothing else. Also, I doubt it will be possible to get this information without killing the original in the process.

P.S. SMBC had a strip with a similar theme.
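SigTerm's distinction between a running brain and information describing one has a rough software parallel in serialization: a snapshot of an object is just bytes until something reconstructs it. A toy sketch (the `Brain` class and its `dream` method are invented for illustration, not from the thread):

```python
import pickle

class Brain:
    """A stand-in 'brain' whose behavior depends on its stored state."""

    def __init__(self, memories):
        self.memories = memories

    def dream(self):
        return "dreaming about " + self.memories[-1]

original = Brain(["honey"])

# The "upload": a complete structural description, but inert bytes.
snapshot = pickle.dumps(original)

# The bytes describe a brain but can't dream; something must
# reconstruct (or simulate) a brain from them first:
restored = pickle.loads(snapshot)
print(restored.dream())  # a new, separate object does the dreaming
```

The snapshot itself never dreams; only the reconstructed object does, and it is a distinct entity from the original.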

DavidMcCann 12-08-2011 11:38 AM

All of this is assuming materialism to be true. It isn't.

Firstly, you cannot make a one-to-one match of mental activity to brain states. For example, the sensation of "tasting honey" is the same whether it comes from memory, imagination, or a mouth full of honey; the brain states are different. Since two events or properties cannot be the same unless the presence of one entails the presence of the other, a brain state cannot be a mental event, only correlated with one.

Secondly, materialism is even worse at dealing with intelligibles (entities that can be thought about) than with mental objects. The rejection of intelligibles involves the loss of the sciences. How can we say an argument is illogical, unless the laws of logic have an objective existence, rather than being states in the brains of those using them? How can one theory entail another if neither exists?

ntubski 12-08-2011 12:39 PM

Quote:

Originally Posted by DavidMcCann (Post 4545267)
For example, the sensation of "tasting honey" is the same whether it comes from memory, imagination, or a mouth full of honey;

Um, I can tell the difference between a mouth full of honey and remembering/imagining a mouthful of honey; clearly it's not the same at all.

