RE: Ghost in the Machine: What Computer Scientists and Neuroscientists
can Learn from Each Other

From: Phil Osborn (philosborn@altavista.com)
Date: Sat Nov 11 2000 - 22:32:11 MST
Sender: owner-extropians@extropy.org
Precedence: bulk
Reply-To: extropians@extropy.org

Ha!

I told you so.

See my earlier question about what the actual difference is between
having something in my memory and having it stored in a chip in my
head, an implanted hard drive, or a book on my shelf. (The answer is
not access time.)

This "re-entrant" feature is what "consciousness" is really about. You
are aware that you are aware that you are aware, and so on. Not a
"little theatre in the mind," but neural processes that are aware of
each other. Koestler used this idea more or less implicitly, though he
didn't know the mechanisms. As I recall, he thought it was due to
competing releases of hormone-like neurochemicals that cued mental
focus, among other things.
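
To make the wiring concrete, here is a toy sketch in Python - the map
sizes, weights, and update rule are all invented purely for
illustration, not taken from Edelman - of two neural maps connected
reciprocally, so that each map's output continuously re-enters the
other:

    # Toy illustration of "reentry": two neural maps with reciprocal
    # connections, so each map's activity re-enters the other every
    # cycle. Sizes and weights are arbitrary illustration values.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 8                              # neurons per map (arbitrary)
    W_ab = rng.normal(0, 0.3, (N, N))  # map A -> map B connections
    W_ba = rng.normal(0, 0.3, (N, N))  # map B -> map A (the reentrant path)

    a = rng.random(N)                  # initial activity of map A
    b = np.zeros(N)                    # map B starts silent

    for t in range(5):
        b = np.tanh(W_ab @ a)          # A drives B...
        a = np.tanh(W_ba @ b)          # ...and B re-enters A in the same cycle
        print(f"t={t}  mean|A|={np.abs(a).mean():.3f}"
              f"  mean|B|={np.abs(b).mean():.3f}")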

If Edelman is correct, then it also involves layers of
cross-communication between neural networks, to a degree of complexity
that would be virtually impossible to fully emulate with linear
(serial) processing. In fact, it is probably provable right now that
you could not run such an emulation in realtime, simply because no
switching device could possibly operate fast enough. Maybe you could
run an emulation at 1/10,000 of human speed with reasonable accuracy.
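
For a rough sense of scale, here's a back-of-envelope calculation.
Every figure in it is an order-of-magnitude assumption of mine
(roughly 1e11 neurons, 1e4 synapses each, 10 Hz average firing, 1e9
serial synaptic updates per second), not a measurement - and note that
on these numbers a single serial processor misses realtime by far more
than 1/10,000:

    # Back-of-envelope check on the realtime claim. All figures below
    # are rough order-of-magnitude assumptions, not measurements.
    neurons = 1e11         # neurons in a human brain, order of magnitude
    synapses_each = 1e4    # synapses per neuron, order of magnitude
    rate_hz = 10           # average spikes per neuron per second
    events_per_sec = neurons * synapses_each * rate_hz  # ~1e16 events

    serial_updates_per_sec = 1e9  # generous for one serial processor
    slowdown = events_per_sec / serial_updates_per_sec
    print(f"synaptic events per second: {events_per_sec:.0e}")
    print(f"serial emulation runs at ~1/{slowdown:,.0f} of realtime")
    # prints ~1/10,000,000 - even worse than the 1/10,000 guess above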

Furthermore, although neural nets get you through the first set of
problems in an emulation (and note just how much linear processing it
takes to emulate even a simple hardware neural net), I don't see any
indication that neural nets are capable of handling the problem of
reentry at the level Edelman indicates.
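
To see the cost concretely, here is a toy serial emulation of an
all-to-all net. The sizes and step count are arbitrary values of my
own choosing; the point is only the operation count:

    # Serially emulating a tiny "hardware" net: analog hardware settles
    # every synapse in parallel each tick, but an emulation must compute
    # each of the n*n multiply-adds explicitly, tick after tick.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100                          # neurons in the toy net (arbitrary)
    W = rng.normal(0, 0.1, (n, n))   # all-to-all synapses: n*n of them
    x = rng.random(n)                # initial activations

    steps = 1000
    for _ in range(steps):           # one pass = one hardware "tick"
        x = np.tanh(W @ x)           # n*n multiply-adds per tick

    print(f"multiply-adds emulated: {steps * n * n:,}")  # 10,000,000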

So, if we are planning on uploading and retaining real consciousness,
then we are likely to need some radically new hardware designs. Just
faster computing ain't going to make it.

But those are not the end of the problems for uploading, by any means.
I well recall discussions of neural nets in the early '70s in which it
was naively argued that they could substitute for the human brain - by
themselves, as they were then, if sufficiently large. Neural nets have
come a long way, but they still have not satisfied the criterion of
"intent." I argued then that the mind was formed by competing
processes "taking action," not just processing input.

What Edelman's message will hopefully drive home to the naive uploaders
in Extropic circles is that the mind is not a floating consciousness.
It is an integrated part of a biological organism. In order to preserve
the essence of our minds, it will be necessary to look very carefully
at what that mind really is. It is NOT just a set of algorithms and
data. It is part of an integrated "system." "Memory," as Edelman
argues, is a "system phenomenon." It isn't stored someplace separate,
as it is in a PC.
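
One standard way to see what a "system phenomenon" means in practice
(this is the textbook Hopfield construction, not Edelman's own model,
and the pattern and sizes are arbitrary) is a net where the stored
pattern lives in the entire weight matrix and is reconstructed by the
system's dynamics, rather than fetched from an address:

    # "Memory as a system phenomenon": in a Hopfield-style net the
    # stored pattern lives in the whole weight matrix and is recalled
    # by letting the system settle, not by reading an address the way
    # a PC reads a file. Pattern and sizes are arbitrary.
    import numpy as np

    pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
    n = len(pattern)
    W = np.outer(pattern, pattern) / n  # Hebbian weights; no one cell
    np.fill_diagonal(W, 0)              # "holds" the memory

    cue = pattern.copy()
    cue[:3] *= -1                       # corrupt part of the memory
    for _ in range(5):                  # let the whole system settle
        cue = np.sign(W @ cue)

    print("recalled correctly:", np.array_equal(cue, pattern))  # True

Corrupt part of the cue and the whole system settles back to the
pattern; there is no single location you can point to and say "the
memory is stored here."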

The really nasty result of taking the naive upload position, as I have
discussed before, is that we might find ourselves, via any of several
plausible routes (e.g., incremental replacement of biological
sensory/memory/processing with chips like the ones UCI is working on),
with Stepford people - or worse, Stepford SIs - who would be utterly
unconscious and utterly unworried by that fact. Such machines might
easily pass any number of Turing tests and even present themselves as
more plausibly human than the real humans, but they would just be dead
machines, with no more real awareness than my motorcycle.

(I really worry about the people who don't find that to be a problem.)



