Re: Why would AI want to be friendly?

From: Brent Allsop (allsop@fc.hp.com)
Date: Thu Sep 07 2000 - 11:01:52 MDT


Hal <hal@finney.org> responded:

> But our decisions to make these changes aren't "really" free. That
> is, if freedom is the ability to change our wants and desires, we
> can't be free by that definition, because only pre-existing wants
> and desires can motivate us to make these changes.

> So you want to make paper work be orgasmic? This isn't freedom, it
> is merely one built-in goal being given more priority.

        I agree with you, but I think you missed what I was trying to
say. Sure, there is the cognitive part of our mind that wants to do,
say, paperwork, and then there is the baser part of our mind that is
trying to get us to be sexual. What I'm talking about is the unfair
advantage the baser part of the mind has, in that it has such a big
hot dog to hold out in front of us to make sex so worthwhile - i.e.
the phenomenal orgasmic reward we get from sex. What I am wondering
is whether we could take such rewards and freely control them, or
wire them to other stimulation and activities, such as doing
paperwork or whatever. Then the rational, more cognitive part of the
mind could fairly compete and have its desires be as phenomenally,
joyfully rewarding to us as other things.

        True, there are always going to be competing things we want
to do, but once we've made that decision as best and as rationally as
possible, if it is to do the paperwork, wouldn't it be nice to be
able to rewire the reward so that we didn't have to give it up to do
the paperwork? Then, when we really do want to have sex, put it back
the way it was...?

                Brent Allsop



This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:37:29 MDT