From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jan 05 2003 - 00:40:16 MST
Well, here's my take on it:
*Before* a new mind comes into existence, I want the mind to have the
maximum potential for fun and the maximum probability of fulfilling that
potential.
*After* a new mind comes into existence, it has citizenship rights,
including the right not to be terminated, and in particular the right not
to be terminated and replaced with another mind that could have 3% more fun.
This holds true *even though*, if I'm considering which mind to create and
*haven't done it yet*, I should create the mind that can have 3% more fun.
Morality is a function over 4D spacetimes, not 3D spaces. If you
terminate an existing mind, the termination event is undesirable.
Declining to create a *new* unhappy mind is not morally equivalent to
*killing* an unhappy mind. Existing is different from not existing. If
this isn't the case, then instantiating a particular mind has no effect;
it already has a Platonic existence or whatever. If instantiating a mind
*does* make a moral difference, then among the moral differences it makes
is that *now* the mind has civil rights.
Which goes to show that parents had better get it right the first time.
I'm crossposting this to wta-talk because, after I expressed my felt
responsibility to the countless zillions of sentients who might come into
existence after the Singularity, some people accused me of advocating
equal rights for all possible people and/or of trying to capture the
unborn as my unwilling political constituency. (For those of you who only
read Extropians: Yes, really.)
I hope this clarifies the moral imperative I see. *Before* the fact, and
only before the fact, a parent has a responsibility to give birth to the
best possible child, with the highest potential and greatest probability
of happiness. *After* the fact, any child, once it exists, has a right to
go on living.
I can treasure the possibility that the future will contain (a) many (b)
high-potential sentients, which is preferable to (a) few (b)
low-potential sentients. This does not mean that all possible sentients
must be instantiated.
The odd idea that I am trying to make the unborn my own political
constituency probably comes from reading too many far-left political
tracts. I'm not from that culture; I don't play by those rules myself and
wouldn't expect anyone else to pay attention if I did.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence