Re: Yudkowsky's AI (again)

Bryan Moss (bryan.moss@dial.pipex.com)
Fri, 26 Mar 1999 16:53:06 -0000

Eliezer S. Yudkowsky wrote:
> But the key thing to note is that even this pseudo-
> absolute injunction dissolves under the influence of
> superintelligence. It's not a matter of a conscious
> condition stating: "I can wipe out humanity if I'm
> superintelligent." This trigger can be falsely
> activated - a paranoid schizophrenic may believe
> himself to be superintelligent.

Worse yet, an ultra-egoist[*] such as myself might think that a whimsical urge to destroy humanity amounts to justifiable suicide.

BM

[*] This philosophy, based on discussions of egoism and utilitarianism on this list, attempts to unite the larger ideas of society with those of the individual.