Re: Singularity: AI Morality

Eliezer S. Yudkowsky (sentience@pobox.com)
Tue, 15 Dec 1998 15:04:49 -0600

Samael wrote:
>
> 1) One must have a reason to do something before one does it.
> 2) If one has an overarching goal, one would modify one's subgoals to reach
> the overarching goal but would not modify the overarching goal, because one
> would not have a reason to do so.

I suggest that you take a look at
http://tezcat.com/~eliezer/AI_design.temp.html#det_igs
for an explanation of how to build goal systems without initial
("overarching") goals.
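
To make the contrast concrete, here is a toy sketch, in Python, of the
difference between a protected overarching goal and a goal system in
which even the top-level goal is a revisable hypothesis. The Goal class
and the confidence-update rule are illustrative assumptions of mine,
not the design described at that page:

    from dataclasses import dataclass, field

    @dataclass
    class Goal:
        """A goal held as a revisable hypothesis, not an axiom."""
        description: str
        confidence: float                 # degree of belief that pursuing this is right
        subgoals: list["Goal"] = field(default_factory=list)

        def update(self, evidence: float) -> None:
            # No goal is exempt from revision; the same rule
            # applies at every level of the goal hierarchy.
            self.confidence = max(0.0, min(1.0, self.confidence + evidence))

    # Samael's picture: a protected root whose confidence never changes;
    # only the subgoals are open to modification.
    fixed_root = Goal("the overarching goal", confidence=1.0)
    fixed_root.subgoals.append(Goal("some subgoal", confidence=0.9))

    # The alternative: the top-level goal starts as a low-confidence
    # guess and is revised by the same machinery as any subgoal.
    provisional_root = Goal("current best guess at what is right", confidence=0.3)
    provisional_root.update(+0.2)   # reasoning can raise belief in the root...
    provisional_root.update(-0.4)   # ...or lower it; nothing is hard-coded
    print(round(provisional_root.confidence, 2))   # 0.1

The point of the sketch is only that nothing in the update machinery
requires a level that is exempt from revision; whether a goal counts as
"overarching" is a fact about the system's current state of reasoning,
not a fixed feature of its architecture.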

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.