From: Dan Clemmensen <Dan@Clemmensen.ShireNet.com> To: email@example.com <firstname.lastname@example.org> Date: 12 December 1998 01:04
Subject: Re: Singularity: AI Morality
I don't expect it will, because I expect the AIs to be programmed with
strong goals that they will not question.
>> But why would it _want_ to do anything?
>> What's to stop it reaching the conclusion 'Life is pointless. There is
>> no meaning anywhere' and just turning itself off?
>Nothing stops any particular AI from deciding to do this. However, this
>doesn't stop the singularity unless it happens to every AI.
>The singularity only takes one AI that decides to extend itself rather than
>terminate.
>If you are counting on AI self-termination to stop the Singularity, you'll
>have to explain why it affects every single AI.