Re[2]: Protean Self-Transformation

Guru George
Sun, 30 Mar 1997 12:24:23 GMT

On Sun, 30 Mar 1997 12:46:36 +0200 (MET DST)
Anders Sandberg <> wrote:

>On Sat, 29 Mar 1997, Gregory Houston wrote:
>> Emotions are felt in a physical manner. I *feel* sexual pleasure via my
>> organs. I *feel* hunger via my organs. I *feel* excitement via my
>> organs. Without this specialized hardware I would *feel* nothing. As a
>> computer I might be "aware" of conditions, but I would not *feel* those
>> conditions unless someone created specialized hardware that would allow
>> me to *feel*.
>OK, what you seem to be saying is that if I put damage sensors in a robot, it
>would be able to feel pain. But this relates to the awfully tricky
>problem of qualia: does the robot *experience* pain, or does it just think
>"pain"? How can we tell?
>(My personal view is that the hardware doesn't matter; after all,
>stimulating a sensor or nerve produces the same sensation, so the qualia
>are likely happening in the brain, which I also think could be run on
>different hardware)
>> If a computer is to feel pleasure from the outside world it will require
>> peripheral nodes which can register sensation (pleasure and pain). If a
>> computer is to feel pleasure from its own internal processes it will
>> require internal nodes which can register sensation (pleasure and pain).
>> We can program computers to think because that is what the hardware was
>> designed for. The hardware has not yet been designed to feel emotion.
>It would be quite possible to create programs that can watch their
>internal states and hence feel internal pleasure, no hardware needed. In
>fact, programs that watch or affect their own states have been written
>(but as far as I know nobody has done anything truly serious with it).
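As an aside, the kind of self-watching program Anders mentions is easy to sketch, though whether the tagging it does amounts to *feeling* anything is exactly the qualia question under dispute. The following is a minimal, purely illustrative example (all names are made up, not any real system): a program that observes changes in one of its own internal variables and labels each change "pleasure" or "pain".

```python
# Hypothetical sketch: a program that monitors its own internal state
# and tags state changes as "pleasure" (improvement) or "pain" (worsening).

class SelfMonitor:
    def __init__(self):
        self.error = 100.0   # the internal state being watched
        self.log = []        # record of the program's self-observations

    def step(self, new_error):
        # The program inspects its own state transition and labels it.
        feeling = "pleasure" if new_error < self.error else "pain"
        self.log.append(feeling)
        self.error = new_error

m = SelfMonitor()
for e in [80.0, 60.0, 65.0, 40.0]:
    m.step(e)
print(m.log)  # ['pleasure', 'pleasure', 'pain', 'pleasure']
```

Of course, this only shows that self-monitoring is trivial to implement in software; it says nothing about whether the labels correspond to any experienced sensation.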
I have to say that I agree with Gregory here. And it's not just sensors
that are the problem. As I understand it, emotions are mediated by the
'R-complex' and limbic system in the brain. These electro-chemical
processes create the kinds of events we call 'pleasure' and 'pain' in
response to stimuli. They comprise primitive forms of cognition, highly
peculiar, particular, and specialised (i.e. they have to do with the
typical concerns of reptiles and lower mammals), that are 'gerrymandered'
by the neocortex. AI people could perhaps duplicate the functions of
these systems electro-mechanically right now, with enough work, but as
Dennett has said, what's the point?

But unless you uploaded into a system that did duplicate them, you
wouldn't experience them, I think. (Then again, maybe that does provide
a 'point' to the exercise of trying to duplicate them?)

Speculation:- Without systems that duplicate emotional functions, the
nearest I can imagine to what it would be like uploading would be the
experience of doing pure mathematics: all experience would occur as a
kind of mathematical trance, except with sensory images as mathematical
objects.

This wouldn't be transhuman so much as totally non-human, perhaps no
different from the experience of a built or evolved AI.

Guru George