Brent Allsop wrote:
>
> Eli <sentience@pobox.com>,
>
> I'm not sure I completely understand where you are coming
> from. Could you help me out a bit?
>
> > One, you're humans, and suffering is wrong; two, you're humans who
> > tend to be important to the Singularity, and suffering might detract
> > from that; three, I care about you, too.
>
> When you say suffering is wrong, do you mean it is wrong that
> Sasha died, or wrong for us to suffer over his death?
Yes.
> Are you saying
> that we shouldn't be talking about Sasha's death, but instead should
> be working to bring about the Singularity...?
Would "not talking about it" decrease your suffering? I thought not.
> > Even so, the decrease in pain and suffering that might result from
> > signing up for cryonics is not as great as the decrease in pain and
> > suffering that could be brought about by helping the
> > Singularitarians who remain, or (since I haven't, in fact, signed up
> > for insurance) brought about by spending my money on Singularitarian
> > efforts now.
>
> Tell me if I have this right. Are you saying that instead of
> spending $120K on cryonic preservation, we should donate that $120K to
> computer R&D or something that would bring the Singularity here
> faster?
"We"? What's this "we"? What's this unstoppable semantic leap between
"I choose X" and "you should choose X"? I, personally, would rather
spend money on seed AI, and I, personally, would rather spend my money
now than take out an insurance policy.
> > It's just that, at least in theory, my life is not intrinsically
> > worth more than anyone else's, and is therefore a vanishing quantity
> > in any equation which refers to six billion people.
>
> Some day I hope to be like a God that can be very intimate with
> trillions of people and more all at the same time. Some day I hope to
> feel more pain and suffering if one of those trillions dies than I do
> now when one of the 100 or so extropians I know dies. The
> Singularity, or the time when we start approaching the infinite, will
> certainly amplify, not diminish, the importance of 1 in 6 billion,
> will it not?
I don't know.
> > I will confess to the extremely immodest but factually correct
> > belief that my actions now have a significant probability of saving
> > billions of lives, which provides a utilitarian justification for
> > socially acceptable forms of selfishness such as not shipping all my
> money to starving Freedonia or whatever.
>
> Hmmm, I think I almost understand this. So you are saying
> that the $120K spent on cryonic preservation could instead be spent on
> something that has the possibility of saving billions of lives, right?
> If so, this makes some sense, and I must admit it is a good point. So
> you are saying you're going to give up your chance at eternal life,
> forgo cryonic suspension, in the hope that doing so will save 6
> billion others? I guess if it even had a reasonable chance of saving
> 20 or 30 of our children, you'd be a real hero, I'd think. If so, what
> is it you're working on or pushing for that might accomplish this?
Right now? "Coding a Transhuman AI" version 2.0 alpha.
And no, I don't want to be a hero. I think it loses some of the whole
point if you're a hero. I'm not making a permanent moral commitment
which, if abandoned or altered, would mean that I'd betrayed everything
a Singularitarian should stand for. Once you become a hero, then PR
forces you to never re-evaluate your own choices. And there's also a
sort of gloating involved, where, even if you *say* you refuse to
comment on the choices of others - well, obviously, they're still
*wrong*, right? Even if you're too *wise* and *tolerant* to call anyone
else on it?
What I want to do is say that I'm making a particular decision using a
particular type of logic, in total freedom from the usual social
implications that would be attached to altruism. Alas, even here, it's
not easy to do so.
--
sentience@pobox.com           Eliezer S. Yudkowsky
http://pobox.com/~sentience/beyond.html
Member, Extropy Institute     Senior Associate, Foresight Institute