> Brent Allsop wrote:
> > I think that it therefore could be argued that it is very
> > selfish on your part to put this suffering and loneliness on,
> > and make things so hard for, those who remain after you are
> > gone. (Of course we could be considered selfish for not wanting
> > you to die, if such is truly against your will! ;) Your life may
> > not be more important than anyone else's, but it is no less
> > important either. Every single life is important, isn't it!? Any
> > singularity must make this more true, not less true, don't you
> > think!? How or why would anyone ever think otherwise?
I don't want to die. I very much want to see the Singularity. And it
goes without saying that I would care about the suffering and loneliness
of the people who cared about me. One, you're humans, and suffering is
wrong; two, you're humans who tend to be important to the Singularity,
and suffering might detract from that; three, I care about you, too.
Even so, the decrease in pain and suffering that might result from
signing up for cryonics is not as great as the decrease in pain and
suffering that could be brought about by helping the Singularitarians
who remain, or (since I haven't, in fact, signed up for insurance)
brought about by spending my money on Singularitarian efforts now.
Besides which, I don't care if it's selfish, ha, ha! I'm not the kind
of altruist who feels guilty about being called selfish!
I'm the kind of altruist who feels guilty about being called
"irrational", or even worse, "observer-biased".
Dan Fabulich wrote:
> True, he chose his ends, but he quite obviously *hasn't* chosen
> himself as one of them.
Not exactly. As far as I'm concerned, people are ends in themselves,
*including* myself. I will continue in both opinions unless the reasons
behind them are definitely and unambiguously contradicted. It's just
that, at least in theory, my life is not intrinsically worth more than
anyone else's, and is therefore a vanishing quantity in any equation
which refers to six billion people. I will confess to the extremely
immodest but factually correct belief that my actions now have a
significant probability of saving billions of lives, which provides a
utilitarian justification for socially acceptable forms of selfishness
such as not shipping all my money to starving Freedonia or whatever.
But that rationale just doesn't hold for any society advanced enough to
practice revivification, so the good provided by cryonic suspension is a
vanishing quantity as well.
"This is the Alcor Emergency Suspension Team! Freeze!"
-- firstname.lastname@example.org
Eliezer S. Yudkowsky
http://pobox.com/~sentience/beyond.html
Member, Extropy Institute
Senior Associate, Foresight Institute
This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:11:10 MDT