Re: making microsingularities

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat May 26 2001 - 13:36:44 MDT


Spike Jones wrote:
>
> The term "slow" is not really what I was getting at. AI will be plenty
> fast as far as that goes, but we need to deal with other axes besides
> speed and friendliness, such as ambition.

A Friendly AI will have the ambition to be Friendly. An unFriendly AI
will have supergoals which, if its reasoning remotely resembles what we
regard as normative cognition, will express themselves as an attempt to
fulfill those supergoals maximally. Either way you get what we would call
"ambition". I see no particular reason to speculate about a "lack of
ambition" - I don't see any emergent forces producing it, I don't see
anyone attempting to design it in deliberately, and I don't see any way it
*could* be designed in deliberately under normative goal reasoning.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Mon May 28 2001 - 10:00:08 MDT