Re: Beating a dead horse? (Was: Transhumanist Principles)

Sonny B. (sonb@hotmail.com)
Wed, 08 Apr 1998 20:15:51 PDT


>Dan Fabulich, <daniel.fabulich@yale.edu>, writes:
>> Hal Finney wrote:
>> >Based on this rather cautious principle, I would tentatively classify
>> >the above procedures as moral, as long as they were done carefully so
>> >that suffering was minimized.
>>
>> I'm dubious of this conclusion... By this premise, all of the above would
>> necessarily be moral if they were created in such a way that they were
>> always (artificially) happy, no matter how difficult or otherwise painful
>> their circumstances...
>
>What is the difference between artificial happiness and the real thing?
>Can someone be happy and still be in terribly difficult and painful
>circumstances? That seems a bit contradictory.
>
>Perhaps the real difficulty here is the difficulty of knowing what the
>true mental state is. Obviously, creating someone with a fixed smile
>on their face is no indication that they are actually happy. Even if we
>have good general understanding of genetics it may be much longer before
>we can say with confidence that a particular design will lead to a happy
>being, one which does not suffer.
>
>Still, in terms of judging morality, I'd say we have to use our best
>understanding of these matters, just as in other areas of uncertainty.
>We may want to err on the side of caution, in which case creating beings
>with drastically new mental structures could be considered immoral,
>since we might unknowingly create a being who was constantly in terrible
>pain, and who (of course) never volunteered to undergo that experience.
>This would obviously hinder the growth of understanding the mind and
>brain, but it would be wrong to gain knowledge at the cost of involuntary
>suffering by others.
>
>Hal

The idea of gaining knowledge at the painful expense of others is
akin to the abortion controversy. Bringing questions of morality into a
discussion where not everyone agrees on the underlying morals is a waste
of time.
I don't feel that we could ever know for sure whether or not our
procedures/experiments are inflicting unnecessary pain on an individual.
In the name of advancement, it becomes necessary to take those "immoral"
steps to whatever degree is required before dedicating effort to modifying
the experiments to reduce discomfort for the specimens. I know the
preceding statement may sound Hitlerian, but when push comes to shove, and
life to death, how else do we humans operate in this world anyway?
I propose that we could develop interfaces that allow us to become the
specimen being subjected to whatever stresses others impose on it. Later,
upon exiting such transfers, we could more accurately interpret what the
specimens may be "feeling".

Sonny
