Re: Paradox--was Re: Active shields, was Re: Criticism depth, was Re: Homework, Nuke, etc.

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Jan 12 2001 - 10:31:25 MST


John Marlow wrote:
>
> I suggest to you that the entire effort to create and
> empower a nanny AI can end ONLY in disaster. The thing
> will have no allegiance to us, no dependence on
> us

If a human had no dependence on us, s/he would have no allegiance to us.
This, again, is a non sequitur for AIs. An AI has allegiance to whatever
it has allegiance to.

> --will no more "relate" to us than we do to insects

Again, you must realize that the leap from material ascendancy to social
ascendancy is one that only makes sense if you evolved in a
hunter-gatherer tribe.

> or, perhaps more appropriately, to the descendants of
> those more primitive life-forms from which many
> believe we have evolved.

Do you question this theory?

> Tell me--when you turn on the hot-water faucet, do you
> think about the bacteria in the drain being scalded to
> death?

The bacteria didn't write my source code!

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
