Preservation, Rationality, and ETI

Eliezer S. Yudkowsky (Tue, 02 Nov 1999 21:31:30 -0600)
> The next step in this slippery slope is to say that the aliens are here,
> silently monitoring us, but as each person dies they take a snapshot of
> his brain state and take him off to live a new life elsewhere, one free
> of trouble and pain, until they can be rejoined with their loved ones.

The thought has occurred to me.

> This addresses the moral issue in a very familiar manner, one which has
> been taught in Sunday schools for centuries.

And that's exactly why this *is* a significant probability, albeit not one that much affects the choices I make. If the Prime Directive is psychologically plausible, so is mind-state preservation. In fact, I would argue that they're more plausible together than they are apart. Given the number of people on this list who insist that Powers (AI, upload, whatever... Powers are Powers) can be preprogrammed, or retain their human motives, don't you think that the tremendous mortal-level emotional attractiveness of mind-state preservation makes it one of the most likely motives for a persistent-motive Power to have?

Of course, I personally don't care much about such things; in fact, I think I may actively dislike the whole concept, since what I *do* care about is a no-holds-barred Singularity, while *this* scenario implies that the first race to get there imposed limits on everyone else. I know that if my desires were given free rein, there wouldn't be any damn Prime Directive anywhere that people were suffering.

And before you ask: No, I don't care how the concept has been abused in the past. I do my own navigation, and what other people think doesn't enter into it, either way.

           Eliezer S. Yudkowsky
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way