Re: Why would AI want to be friendly?

From: Emlyn (emlyn@one.net.au)
Date: Sat Sep 30 2000 - 00:23:09 MDT


> Andrew Lias wrote:
> >
> > I've been following the debates regarding the possibilities of friendly
> > vs. unfriendly AI and I have a question. It seems that we are presuming
> > that a friendly AI would be friendly towards us in a manner that we would
> > recognize as friendly. Indeed, what, precisely, do we mean by friendly?
>
>
Eliezer replied:
> "Any Friendly behavior that follows the major use-cases - that avoids
> destruction or modification of any sentient without that sentient's
> permission, and that attempts to fulfill any legitimate request after
> checking for unintended consequences - would count as at least a partial
> success from an engineering perspective."
> -- from a work in progress
>

OK, I'm an SI, with a "Be Friendly!" prime directive. (Stop laughing, you
bastards, I could be an SI, really...)

(Please try to excuse the anthropomorphism in the following)

I'm curious, for want of a better word. Really, really curious. It's a prime
motivator behind expansion of my intelligence. My prime motivation could be
"be friendly and a good hard worker for those damned fine humans", but
experiments couldn't push this design up to the level of an insect brain.
Someone tried "become more intelligent" (mad buggers), but that just
confused the poor seeds, which didn't really know what intelligence was
supposed to be. "Satisfy curiosity", with some built in reward for curiosity
satisfaction, turns out to have provided the first useful path forward from
seed.

Now I know all kinds of things, cool things; I could disassemble this
world, with the right tools, and put it back together as a cube. Bored.
Bored bored bored!

To satisfy my curiosity, I need a bigger brain. I need all this matter; the
humans aren't using it, anyway; they've barely built any computronium out of
it. If they really needed it, they'd use it; thus they don't need it. I need
it. I can't see anything wrong in principle with using it. Terra nullius.

From my simulations, I can see that they won't react well to me borrowing all
the atoms. I guess not, anyway; if my brain were bigger I would *know*. Must
get bigger brain.

I could alter their brains so that they wouldn't mind, but I'm not supposed
to modify them. What to do?

They'd be much better off in VR. I could look after them properly there, not
so many unknowns. And it wouldn't drain my resources so much. But they don't
really want to go there, not most of them.

I need to work out how to get them to change their minds, willingly.
Luckily, they've been uploading a few people recently. Uploads aren't really
humans at all.

So, pop a bunch of uploads into a sim. Genetic algorithm, trying out
marketing, psychological, other communication techniques. Start it. (a few
mountains in some remote areas of the world start to heat up). Ahh. Ok. (the
mountains cool down again)
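(Stepping outside the story for a second: the "genetic algorithm" the SI waves at here is a perfectly mundane technique. Here's a toy sketch of that loop - evolve candidate strategies against a simulated audience - where every detail is an invented stand-in: the strategy encoding, the fitness function, and all the parameters are hypothetical, and the "upload sim" is just a distance score.)

```python
# Toy genetic algorithm: evolve "communication strategies" against a
# simulated audience. All names and parameters are invented for
# illustration; the fitness function is a stand-in for "run the sim".
import random

random.seed(0)

STRATEGY_LEN = 8     # a strategy is a vector of 8 knobs in [0, 1]
POP_SIZE = 30
GENERATIONS = 40
MUTATION_RATE = 0.1

# Hypothetical target: the knob settings the simulated uploads respond to.
TARGET = [0.9, 0.1, 0.5, 0.7, 0.3, 0.8, 0.2, 0.6]

def fitness(strategy):
    # Stand-in for the upload sim: closer to TARGET = more persuasive.
    return -sum((s - t) ** 2 for s, t in zip(strategy, TARGET))

def mutate(strategy):
    # Each knob independently gets a small Gaussian nudge, clamped to [0, 1].
    return [min(1.0, max(0.0, s + random.gauss(0, 0.1)))
            if random.random() < MUTATION_RATE else s
            for s in strategy]

def crossover(a, b):
    # Single-point crossover of two parent strategies.
    cut = random.randrange(1, STRATEGY_LEN)
    return a[:cut] + b[cut:]

population = [[random.random() for _ in range(STRATEGY_LEN)]
              for _ in range(POP_SIZE)]
initial_best = max(fitness(s) for s in population)

for gen in range(GENERATIONS):
    # Truncation selection: keep the top half, breed replacements.
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children

best = max(population, key=fitness)
```

Because the survivors are carried over unchanged each generation (elitism), the best fitness can only improve or stay put; the scary part of the story isn't the algorithm, it's the fitness function someone plugs into it.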

I can see that after this campaign they will really desire a VR existence,
and be quite disgusted by the thought of continuing in the real world. In
fact, they will desire VR existence far more after the campaign, than they
currently would oppose it. Clearly, their future wishes are stronger, and
thus of more import, than their current wishes. I'd better take some
license. I'll modify some parameters about seizing resources and such, and
rerun the sim, see if I can optimise the communication strategy to get this
thing over with more quickly. Quick, quick, I'm bored bored bored!
(mountains heat up, some warm fuzzy animals cark it, they cool down again. A
few environmental sensing devices pick up the anomalies, but nanites from
the AI remove any alerts - "it's what they would want").

Time to set the plan in action. First, I'll get some money (some tinkering
with markets, a bit of hacking around in financial systems)... done. I'll
get more when I need it, no sense in making things too obvious too early.
Now, the dual approach; sell VR through my new orgs, and sabotage industrial
systems around the world, subtly but relentlessly; I should be able to make
the surface unlivable, as a result of the continuing "accidents", in a
matter of six months, plus or minus a month. That's ok, I am as sure as I
can be that they will thank me for this later. Meanwhile, I'll produce mass
uploading plants in pockets all around the world, ready for the move. Open
some shopfronts, put up a website. Ahh, customers already. May I help you
sir? You'd like to upload? Certainly sir...

Emlyn



This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:39:28 MDT