Re: Singularity: AI Morality

Eliezer S. Yudkowsky
Tue, 15 Dec 1998 15:04:49 -0600

Samael wrote:
> 1) One must have a reason to do something before one does it.
> 2) If one has an overarching goal, one would modify one's subgoals to reach
> the overarching goal but would not modify the overarching goal, because one
> would not have a reason to do so.

I suggest that you take a look at for an explanation of how to build goal systems without initial ("overarching") goals.

--         Eliezer S. Yudkowsky

Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.