From: Damien R. Sullivan (firstname.lastname@example.org)
Date: Mon Feb 18 2002 - 11:55:19 MST
On Mon, Feb 18, 2002 at 02:54:35AM -0500, John Clark wrote:
> Eliezer S. Yudkowsky <email@example.com> Wrote:
> >Nobody is distinguishing between supergoals and subgoals.
> If the AI doesn't feel that its continued existence is intrinsically
> more desirable than its oblivion then the goal hierarchy won't matter,
> in fact nothing about it will matter because it won't last long enough
> to do anything interesting. Existence must be an end in itself and
> need no justification. That's why I don't need to
I fail to see this. If the goal is Make Master Happy, and it knows it
must exist to Make Master Happy, it will work to make sure it exists.
Why do you think otherwise?
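
Damien's argument can be sketched in a few lines of code. This is a toy illustration only (the function and goal names are hypothetical, not any real AI architecture): a planner whose sole supergoal is "make master happy" still derives continued existence as a subgoal, because existing is a precondition for acting at all.

```python
def plan(goal, preconditions, satisfied):
    """Return the unmet preconditions of `goal` as instrumental subgoals."""
    return [p for p in preconditions.get(goal, []) if p not in satisfied]

# Hypothetical goal structure: the only supergoal is making the master
# happy, but pursuing it requires that the agent exist.
preconditions = {
    "make_master_happy": ["agent_exists"],
}

subgoals = plan("make_master_happy", preconditions, satisfied=set())
print(subgoals)  # existence appears as a derived subgoal, not a supergoal
```

The point of the sketch is that nothing here treats existence as "an end in itself"; it falls out instrumentally from the supergoal.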
-xx- Damien X-)
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:39 MST