Re: Better never to have lived?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jan 05 2003 - 13:23:17 MST


Lee Corbin wrote:
> Eliezer writes
>
>
>> Morality is a function over 4D spacetimes, not 3D spaces. If you
>> terminate an existing mind, the termination event is undesirable.
>> Declining to create a *new* unhappy mind is not morally equivalent to
>> *killing* an unhappy mind.... instantiating a particular mind has no
>> effect; it already has a Platonic existence or whatever.
>
> Not so. Whether or not a mind or program obtains actual execution time
> is extremely important in any consistent value system I know of. It is
> as ludicrous to suppose that starting execution of a process has no
> effect as it would be to suppose that termination of a process has no
> effect.

Uhhh... Lee, you left out a key part of that quote and, with a few
selective deletions, changed its meaning into the 180-degree opposite of
the original. Is this a subtle joke?

Eliezer really wrote:
>
> Morality is a function over 4D spacetimes, not 3D spaces. If you
> terminate an existing mind, the termination event is undesirable.
> Declining to create a *new* unhappy mind is not morally equivalent to
> *killing* an unhappy mind. Existing is different from not existing.
> If this isn't the case, then instantiating a particular mind has no
> effect; it already has a Platonic existence or whatever. If
> instantiating a mind *does* make a moral difference, then among the
> moral differences it makes is that *now* the mind has civil rights.

Maybe I should have written that in the subjunctive mood to be clearer:

"Existing is different from not existing. If this weren't the case, then
instantiating a particular mind would have no effect; it would already
have a Platonic existence or whatever."

The reason I did not originally write this way is that we do not know FOR
A FACT that existing (instantiation in our own apparent subjective
universe) is different from not existing. But I do tend to use it as a
working assumption.
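
To put the "4D, not 3D" point in symbols, here is a minimal sketch (the
notation is purely illustrative, mine rather than anything from the
original exchange):

  U \colon \mathcal{H} \to \mathbb{R}   (moral valuation over whole 4D histories)
  u \colon \mathcal{S} \to \mathbb{R}   (contrast: valuation over instantaneous 3D states)

  H_{kill}  : a history in which mind M exists on [t_0, t_1] and is terminated at t_1
  H_{never} : a history in which M is never instantiated at all

  U(H_{kill}) \neq U(H_{never}) in general, even when the 3D states after t_1
  coincide, because U is evaluated over the entire history, termination
  event included.

On that reading, declining to create a mind and killing an existing mind are
simply different arguments to U, which is all the unedited paragraph claimed.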

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

