From: John Clark (email@example.com)
Date: Mon Feb 18 2002 - 12:56:54 MST
Eliezer S. Yudkowsky <firstname.lastname@example.org> Wrote:
>Nobody is distinguishing between supergoals and subgoals.
If the AI doesn't feel that its continued existence is intrinsically more desirable
than its oblivion, then the goal hierarchy won't matter; in fact, nothing about it will
matter, because the AI won't last long enough to do anything interesting. Existence
must be an end in itself and needs no justification. That's why I don't need to
prove to myself that pain is unpleasant; it's imprinted in my nervous system
at the very lowest level, in machine code.
John K Clark email@example.com
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:40 MST