From: Damien Broderick (email@example.com)
Date: Mon Feb 04 2002 - 19:01:01 MST
Eugene Leitl asked me to repost this reply to the list, since he's
On Sat, 2 Feb 2002, Damien Broderick wrote:
> At 10:18 PM 2/1/02 -0500, Eliezer S. Yudkowsky wrote:
> >For what it's worth, Wayne, if we somehow wind up in a scenario where both
> >Eugene and I are around as uploads, and we both have the unrestricted
> >ability to affect Earth, and you're still around on Earth, then I'll
> >protect you from Eugene.
> Gotta love you guys. :)
Wayne Hayes wrote:
> me> You do not interact meaningfully with a statue-littered landscape. By
> me> means other than bulldozing it, I mean.
> I hope this attitude isn't common among those who wish to download
> themselves into AIs. Unless it was a joke, you're advocating
> contemptuous genocide of those who decide to remain in flesh-and-blood
> form. Are you saying that you'll regard flesh-and-blood humans with
> about the same empathy as you do insects or bacteria today?
You misunderstood me. The current me would of course want to prevent that
fate (broken statues do look ugly). However, that's the *current* me, and
the future me will of course have very little power over the actions of
*others*. Game theory and the evolutionary pressures on postbiological
beings clearly spell out some rather nasty things for mankind. A great
motivation not to stay behind, and to proceed slowly and cautiously, imo.
eli> For what it's worth, Wayne, if we somehow wind up in a scenario where
eli> both Eugene and I are around as uploads, and we both have the
eli> unrestricted ability to affect Earth, and you're still around on
eli> Earth, then I'll protect you from Eugene. I can't make any realistic
Yes, and I will protect you from Eliezer, of course. ;P
eli> commitment on behalf of a transhuman future self because my goals
eli> might change, but that's the current output of my goal system.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:38 MST