Re: SI Comparative Advantage [was Re: Free Will]

Eliezer S. Yudkowsky
Wed, 25 Aug 1999 18:18:21 -0500

Lee Daniel Crocker wrote:
> Yup. But for the context of this discussion, the only resources
> threatened by the existence of humanity are the atoms that make us
> up and the energy input required to keep us alive. In order for it
> to benefit an SI to keep us around, it is only necessary that we
> "earn our keep" by producing something the SI values slightly more
> than our atoms, even if it's something they can also produce for
> themselves. Entertainment (such as art, or merely entertaining
> behavior) might be sufficient.

No, it's also necessary that the SIs don't have something *else* they'd rather do with those atoms. I mean, if you're postulating optimization, I bet even I could devise far more entertaining lifeforms that run on our atoms. If we're going to be this silly, you've just proved that they'll kill us off and replace us with entities that are more efficient at entertaining; say, Buffy the Vampire Slayer.

I think it's *more* likely - though not actually likely per se - that they would keep us around to "encourage" other Singularities. If they acted for reasons we could understand and rely on, it might be worthwhile for them to devote a whole quadrillionth of their computing time to making us a little less nervous. Depends on how observant we can be, I suppose.

           Eliezer S. Yudkowsky
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way