"Robert J. Bradbury" wrote:
> Here is an interesting twist, based on some of my comments
> regarding the need for robotic junk retrieval systems and
> the telepresence robots -- say the zombie 'body' or 'hologram'
> is being remotely operated by a real human being (or an SI).
> Now in this case you can do whatever you want to the body/holo
> and you are not harming anyone. However, humans are designed
> to have empathy for things that look, walk, and talk like
> humans (they are also designed to be wary of them until
> a trust-bond is developed). So the tele-operator is going
> to have to be 'tightly' wired to the remote body, such that
> pain caused to the remote body will generate natural reactions
> in the operator. [Presumably sensory safety devices cut
> in at extreme levels.] Now, the operator is getting paid
> quite well for performing this service and has freely
> entered into this situation.
> In this case is the remotely piloted body/holo a zombie?
I know that Robert Heinlein coined the term "waldo" for a
tele-operated body in the 1942 novella _Waldo_
( http://www.amazon.com/exec/obidos/ASIN/0345330153 )
but the most heartbreaking and disturbing fictional exploration of
this question I have read is Alice Sheldon's (as "James Tiptree, Jr.")
"The Girl Who Was Plugged In" which appeared in the
collections _Warm Worlds and Otherwise_ (1975)
and _Her Smoke Rose Up Forever_ (1990)
both of which seem to be out of print.
This story was filmed for TV as part of the short-
lived Sci-Fi Channel series _Welcome to Paradox_, but
I don't recall it being as effective as the story
( http://www.scifi.com/sfw/issue76/screen.html )
> But 'feelings' are gen[e]tico-socio-'thingys' (insert a word
> here to represent a neurocomputational 'pattern') that are
> designed to promote survival. There is no reason to elevate
> them to significance (if that is what you are doing). They
> are subroutines designed to promote behaviors that have
> specific goal seeking strategies in the framework of the system.
It's not at **all** obvious that "feelings" can be dismissed
as insignificant in any entity that could be dignified with
the adjective "intelligent", despite the hoary SF convention
of monotone-voiced computers, robots, and aliens,
including _Trek_'s Mr. Spock and Seven of Nine. My impression
from the little neuroscience reading I've done, including
Gerald Edelman, and a glance at Antonio Damasio's _Descartes'
Error_ ( http://serendip.brynmawr.edu/bb/damasio/descartes.html ,
and see also Damasio's review of Joseph LeDoux's _The Emotional Brain_,
http://www.sciam.com/0697issue/0697review1.html ), is
that the stereotype of emotionless intelligence may actually
be an incoherent one. For Edelman, the "key reentrant loop"
supporting consciousness involves **both** the newer, quicker parts
of the brain (categorizing the rapid flux of sensory
stimuli, or exteroceptive input, and motor behavior) **and** the older,
slower parts of the brain (categorizing interoceptive input, or "feelings");
see http://www.lucifer.com/exi-lists/extropians.2Q00/5580.html .
Cut the connections between these domains, and there is no consciousness.
Without feelings, a Jupiter-sized ball of computronium may
be nothing more than a transcendental adding machine.
Olaf Stapledon also touched on this idea in _Last and First
Men_, when a future race of men called the Great Brains
concluded by means of pure logic that they were fatally
flawed, and designed a successor race to embody a more
desirable balance of logic and emotion.
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:43 MDT