Re: Re[2]: Protean Self-Transformation

Anders Sandberg
Sun, 30 Mar 1997 21:25:49 +0200 (MET DST)

On Sun, 30 Mar 1997, Guru George wrote:

> I have to say that I agree with Gregory here. And it's not just sensors
> that are the problem. As I understand it, emotions are mediated by the
> 'R-complex' and limbic system in the brain. These electro-chemical
> processes create the kinds of events we call 'pleasure' and 'pain' in
> response to stimuli. They comprise primitive forms of cognition, highly
> peculiar, particular, and specialised (i.e. they have to do with the
> typical concerns of reptiles and lower mammals), that are 'gerrymandered'
> by the neocortex. AI people could perhaps duplicate the functions of
> these systems electro-mechanically right now, with enough work, but as
> Dennett has said, what's the point?

Actually, duplicating them has a point. As you say, they are a form of
primitive cognition that is quick and not unnecessarily complex. This is
ideal for simple robotics! Being a bit of a connectionist I may be
biased, but subsumption architectures and neural net modules are
clearly very promising for building robots. Giving a robot a limbic
system is actually quite a good idea: it will want to please its owner,
it will avoid harm and mistakes (they feel bad), and it will learn and
associate. For many tasks there is no need for a complex brain; drives
and motivation suffice.
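To make the idea concrete, here is a minimal sketch (my own illustration,
not anything from Brooks' actual implementations) of a subsumption-style
controller with crude "limbic" drives: a pain-avoidance layer suppresses
everything below it, then a please-the-owner drive, then idle wandering.
All the names and thresholds are made up for the example.

```python
class Behavior:
    """One subsumption layer: fires when its trigger condition holds."""
    def __init__(self, name, trigger, action):
        self.name, self.trigger, self.action = name, trigger, action

class SubsumptionController:
    """Layers are checked top-down; the first layer whose trigger fires
    subsumes (suppresses) all layers below it."""
    def __init__(self, layers):
        self.layers = layers  # highest priority first

    def step(self, senses):
        for layer in self.layers:
            if layer.trigger(senses):
                return layer.action
        return "idle"

controller = SubsumptionController([
    # "Pain" overrides everything, like a limbic harm-avoidance drive.
    Behavior("avoid-harm", lambda s: s.get("pain", 0.0) > 0.5, "retreat"),
    # Pleasing the owner comes next.
    Behavior("please-owner", lambda s: s.get("owner_call", False), "approach"),
    # Default exploratory drive, always triggers if nothing above did.
    Behavior("wander", lambda s: True, "explore"),
])

print(controller.step({"pain": 0.9, "owner_call": True}))  # -> retreat
print(controller.step({"pain": 0.0, "owner_call": True}))  # -> approach
print(controller.step({}))                                 # -> explore
```

No planner, no world model: the priority ordering of the drives alone
produces sensible behaviour, which is the point being made above.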

> But unless you uploaded into a system that did duplicate them, you
> wouldn't experience them, I think. (Then again, maybe that does provide
> a 'point' to the exercise of trying to duplicate them?)

It would be meaningless to upload into a system that didn't duplicate the
limbic system - why upload only *part* of the brain? Uploading will most
likely encompass everything from the brainstem upwards (and I'll
admit that I have been thinking about the spinal cord too). The point of
the Strong AI hypothesis is that duplicating emotion is only a matter of
the right software structure; the hardware is irrelevant as long as it
can run it.

> This wouldn't be transhuman so much as totally non-human, perhaps no
> different from the experience of a built or evolved AI.

Would be interesting to try.

Anders Sandberg Towards Ascension!
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y