Re: Artilects & stuff

Eliezer S. Yudkowsky
Tue, 14 Sep 1999 21:00:15 -0500

den Otter wrote:
> Responsible and ethical people would probably use Asimov's
> robotics laws to control the AI, which may or may not work
> (probably not). How the AI evolves may very well be "fundamentally"
> beyond our control. So...I'd join the *uploading* team if you
> have serious future plans. Load me up, Scotty!

Dammit, Otter, if an entity that started out as uploaded Otter managed to keep *any* of your motivations through Transcendence, selfish or otherwise, you could use the same pattern to create a reliably benevolent Power. I mean, let's look at the logic here:

OTTER and ELIEZER, speaking in unison: "The mind is simply the result of evolution, and our actions are the causal result of evolution. All emotions exist only as adaptations to a hunter-gatherer environment, and thus, to any Power, are fundamentally disposable."

ELIEZER: "If there's one set of behaviors that isn't arbitrary, it's logic. When we say that two and two make four, it's copied from the laws of physics and whatever created the laws of physics, which could turn out to be meaningful. All the other stuff is just survival and reproduction; we know how that works and it isn't very interesting."

OTTER: "All the emotions are arbitrary evolved adaptations, except selfishness, which alone is meaningful."

This just says "Thud". To use a Hofstadterian analogy, it's like the letter-string puzzle:

   abc -> abd
   xyz -> ?
Hofstadter presents a number of answers, including:

wyz (the best)

and one, which Hofstadter says "definitely presents a very loud 'Thud!'":

   dyz
Otter's answer makes a similar 'Thud!' to my ears. He gets as far as seeing the essential arbitrariness of it all, but instead of trying to pick out something that isn't arbitrary, he picks out selfishness and canonizes it. dyz.

           Eliezer S. Yudkowsky
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way