Re: Why would AI want to be friendly?

From: Bryan Moss (bryan.moss@btinternet.com)
Date: Thu Sep 28 2000 - 13:42:07 MDT


Michael S. Lorrey wrote:

> A slow singularity posits that long before this occurs,
> there will be a long gradual phase of human augmentation
> technology development, where people add more and more
> capabilities to their own minds, to some eventual point
> where their original bodies may die and they do not even
> notice, as the wetware/meatware part of their being has
> become such a small percentage of their actual selves. I
> personally am betting on this occurring, and not the
> punctuated equilibrium that others seem to think will
> occur with a fast singularity.

Eventually everything goes to hell anyway, and one has to
wonder what remains of your self when you enter the
Singularity, since by definition you come out the other side
as something you can't possibly comprehend.

BM



This archive was generated by hypermail 2b29 : Mon Oct 02 2000 - 17:39:19 MDT