Jeff Davis wrote:
>
> There is an un-extropian quality to this, a contradiction of the concept of
> personal responsibility, a communitarian rather than rugged individualist
> undertaking. And it might even tend to encourage some to further
> procrastinate in making their own arrangements. (Eliezer's immortal,
> right?) On the other hand it might serve to provoke--shame them into it,
> if you will--the very personal-responsibility-driven action which will make
> the entire business unnecessary. But in the end, it is, to me, an act of
> deliberate self-interest: for my sake, not theirs, I wish to ensure that
> this sort of tragedy does not happen again.

No, I'm not signed up for cryonics. But it's not because I'm immortal.
I can get hit by a truck, same as anyone else. My personal estimate,
however, is that cryonic revivification requires post-Singularity
technology. In this world, I'm the author of "Coding a Transhuman AI"
and "The Singularitarian Principles"; post-Singularity, my life has no
greater importance than anyone else's. If I died with a $30K insurance
policy, that $30K would save more lives in the hands of the remaining
Singularitarians, or even the Foresight Institute, than the one life
that might be saved by cryonics.

I would never, ever suggest that this logic ought to apply to anyone
other than myself. There is absolutely nothing morally wrong about
cryonics. There is absolutely nothing morally wrong about saving your
own life with your own money. But, by a similar token, there is
absolutely nothing morally wrong with choosing to say that your own life
is no more important than anyone else's. It's my choice. I've made it.

--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/beyond.html
Member, Extropy Institute
Senior Associate, Foresight Institute