Re: Whose business is it, anyway?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jan 09 2003 - 03:27:31 MST


Lee Corbin wrote:
>
> One solution that appalls me is for certain strident types
> to consult themselves as to what they like and don't like
> ("drugs BAD!", "cutting off arms BAD!") and then rationalize
> like crazy to justify their imposing their conclusions by
> force or by majority tyranny.

Are you really in a position to say that? You've never explained exactly
when you think force is justified... just resorted to your intuitions
about when something is and isn't "your business". If your intuitions
about when something is "your business" end up with you intervening at
much the same points as someone who intervenes depending on their
intuitions as to whether something is, as you put it, BAD, then what have
you accomplished? Even if "Is it my business?" and "Is it BAD?" are
different intuitions, what is it that makes one more interesting than the
other?

I don't see how your reformulation advances on what I regard as the key
questions, such as "What kind of moral statements are communicable between
humans?", and "What kind of communicable moral criteria can third parties
use to agree that third-party intervention is desirable?"

For me the archetypal example of a communicable moral case for third-party
intervention is where party A is attempting to kill nonconsenting party B.
In this case, I will, if I can, intervene to prevent A from killing B.
B will agree with me; A will disagree; but the important question is
whether D, E, F, G, and H, who are also third parties to the interaction,
are likely to agree with me and support my intervention, or intervene
against me to stop me from interfering with A. At this point the question
of *communicable* moral statements becomes important.

I find your morality strange, Lee, because it seems to me inconsistent.
Why permit a woman to mutilate her baby, but intervene against me to stop
me from stopping her? You can, of course, claim that one is "none of your
business" and the other is "your business", but it's not clear to me why I
should pay attention to this moral argument, or whether other third
parties considering an intervention would.

My own argument? I freely admit that a woman mutilating her baby is a
more complex case than a woman mutilating an actively protesting adult,
since it involves an attempt to extrapolate forward what the baby would
want, and since society has an existing concept of parents being allowed
to exert control over children (not necessarily a concept I agree with,
but nonetheless a part of contemporary 21st century morality). One of the
considerations I am taking into account is that an adult may always choose
to clip his own limbs if his mother can convince him to do so, but
surgically clipping an infant's limbs is presently an irreversible
procedure. But my decision will be based on a moral theory in which
babies are sentients who may claim my protection, and not a parent's
property; whether the police side with you, or me, will be determined by
which moral argument is more successfully communicated to third parties.

The same holds for a sentient in a simulation running on a computer you
suppose yourself to "own". You've gone on record as saying that it is
"none of your business" what someone does with "their" simulation. I am
just as much against ownership of a simulation as I would be against the
claim that you "owned" the proteins making up a sentient you claimed was
your "slave". "Ownership" is a communicable moral statement about access
restrictions to certain patterns. Even if your computer is running a
sentient being, I do not in any way acknowledge your ownership of that
sentient being, and neither, I hope, will society at large.

Incidentally, I feel that "should" is a perfectly good word, and one which
has a meaning going beyond who got persuaded of what. But in deference to
your view of these matters, I have attempted to restrain myself to
describing phenomena which you will find more readily observable.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


This archive was generated by hypermail 2.1.5 : Wed Jan 15 2003 - 17:35:51 MST