Re: Thinking about the future...
Thu, 29 Aug 1996 18:23:33 -0400

Anders Sandberg wrote:

> I think it would be unlikely that we create successors
> that out-compete us, most likely they will inhabit a somewhat different
> ecological/memetic niche that will overlap with ours; competition a

[ to which Max More wrote:]
> You make good points, Anders, about humans and nanite-AI's having possibly
> different niches. However, there may be a period during which we're very
> much in the same space. That's the period in which humans could be at risk
> if AI/SIs have no regard for our interests. What I'm thinking is that it's
> possible, even likely, that SI will be developed before really excellent
> robotics. AI's in that case would not be roaming around much physically,
> they could exist in distributed form in the same computer networks that we
> use for all kinds of functions crucial to us.
> If they need us for doing things physically, we would still have a
> position. Nevertheless, powerful SI's in the computer networks could exert
> massive extortionary power, if they were so inclined. So I still think it
> important that SI researchers pay attention to issues of what values and
> motivations are built into SIs.

Ah yes, the programmers' (as well as the programmed AIs') motivations could
be really useful or highly destructive! A theme for many a well-loved horror
tale, indeed! ...or a solution to much strife in our world.
Re: values: I am curious - we talk about the AIs replacing, destroying,
or overcoming humans or >H's. Realistically, what would an AI's "needs" be?
Would it have needs? Or, more precisely, would they perceive the concept of
needs as we do, not being subject to the fight-or-flight domain we have to
negotiate? We need food, nurturing, clothing, shelter, etc. What AI conditions
correspond to those? If we (through mimicry of intelligence as we know it)
create them similar to primate intelligence, then (?) reproduction/expansion -
but if NN intelligences program themselves, how could we predict
what the agenda will be? As Max says here, they could exert massive power. Do
we assume they would inherently take our values and expand or pervert them -
an allegiance to their "creators"? Somehow I don't see that, as inviting as
it sounds.

Even if they could "use" us for manual labor - what would we produce for them?

In essence, what would they want to destroy us *for*? Comparatively
aesthetic messiness?

[PS Anders, your post made me want to draw transhuman wet/dry multi-sided
organelles, but then that gave me Borg images again...]