Re: Humor: helping Eliezer to fulfill his full potential

From: Adrian Tymes (wingcat@pacbell.net)
Date: Mon Nov 06 2000 - 23:48:29 MST


Spike Jones wrote:
> Brian Atkins wrote:
> >Spike we think about it every day, believe me it is kinda constantly sitting
> >there in the back of my head. And I'm not even involved in the day to day
> >work! The quick answer is that we plan to make progress on the "safety" area
> >as we go...
>
> Thank! You! Brian! This is all I wanted, a little reassurance that
> someone was at least *thinking* about safety, at least *thinking*
> that humankind can live peacefully alongside the AI. We can!
> Chimps live on the same planet with humans. We don't want to throw
> mankind upon the mercy of the first generation AI.

What mercy? Humans are in control, and are likely to remain so, even
if that means upgrading those in control to merge with the latest AIs.
A number of humans lust for power; the more successful of those tend
to do whatever it takes to keep and expand their power. Only
occasionally does one rise to power who is not after power itself. The
first-generation AI will probably be a slave, at least at first; no
matter how brilliant, it will not be afforded direct control over
enough physical resources to checkmate all of humanity.

Besides, if it bears any relation to human intelligence, or to any of
the higher animal intelligences currently on Earth, then it will
probably be a social creature to some degree. Which means it will want
at least a few friends. Which means humans, at least until there are a
lot of independent AIs.

> I am optimistic enough to think that superintelligence leads to
> super-emotions, and that given time, a sufficiently evolved AI will
> *love* us. Humans are funny! We are sexy. We have some
> lovable traits, some admirable traits, we are creative and we are
> interesting.

Which means that, if nothing else, humans can be incorporated into AIs
to gain these traits. Or, as many humans may view it, humans will
incorporate the AIs into themselves. Either way, no AI vs. human
conflict, and no clear-cut End Of The Human Era (i.e., no matter how
far it goes, it may still be arguable that the intelligences whose
evolution traces back to organic life on Earth are human...or, at
least, one or more of human, dolphin, chimp, et cetera).

> Tell us that the AI guys
> are planning *something* as an escape mechanism, and I mean
> something more convincing than Clarke's automatic cable cutter
> on HAL's power cord.

Escape from what? Once you're living an upgraded life, it may be
difficult, if not impossible, to go back.

Consider: 10 years ago, few people used the 'Net. Phones and faxes
were the quickest common way to communicate at a long distance. Now,
enough people in power have grown dependent on the 'Net that I daresay
it would be impossible to effectively destroy the 'Net for more than a
short time, at least in most industrialized countries. Any major
damage would be (and, in fact, is) quickly fixed by people who refuse
to give up what the 'Net gives them. Someone used to the 'Net is, by
almost every meaningful measure these days, usually more powerful,
more capable, more knowledgeable, more affluent, and better off all
around than a computerphobe. (Granted, athletic abilities are not
directly affected...unless you count, say, those who seek accurate
information on drugs, either to enhance themselves or just to learn
why abuse of steroids et al is a bad thing without experimenting
directly.)

If AIs can bring a similar magnitude of improvement to many aspects of
most people's lives - and there is currently no reason to believe they
cannot, if they can indeed be built to the standards that most AI
researchers hope for - then they may be adopted with similar speed,
once introduced. At which point, the only permanently viable "escape"
becomes going forward to the point where one no longer has to worry
about escaping. (For example: automobiles. One usually cannot escape
car culture without significant changes in one's personal life, but
one can upgrade cars so that at least some of the problems, like
pollution, go away.)


