Re: Major Technologies

Dan Clemmensen (Dan@Clemmensen.ShireNet.com)
Tue, 19 Jan 1999 20:59:07 -0500

Chris Wolcomb wrote:
>
> On Mon, 18 Jan 1999 21:53:59 Dan Clemmensen wrote:
>
> >For the record, I don't actually intend to destroy the earth, or even threaten
> >it. I intend to continue to try to advance the singularity.
>
> Could you be more precise?
>
> If the singularity is defined as the horizon beyond which we cannot
> conceptualize, how do you intend to advance it? The closer you get,
> the further this singularity=horizon will recede from you.

That is true for an exponential model, but not for a hyperbolic or a discontinuous model: a hyperbolic curve reaches its asymptote at a fixed, finite date, and a discontinuous model simply jumps, so in neither case does the horizon keep receding as you approach it. I feel that the discontinuous model is the most valid.
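
To make the distinction concrete, here is a toy numerical sketch (the particular growth functions are only illustrative, not predictions):

    # Toy comparison of growth models (illustrative only).
    # Exponential: x(t) = e^t grows fast but stays finite for every finite t.
    # Hyperbolic:  x(t) = 1/(T - t) blows up at the finite time t = T.
    import math

    T = 10.0  # arbitrary "singularity date" for the hyperbolic curve
    for t in [0.0, 5.0, 9.0, 9.9, 9.99]:
        exponential = math.exp(t)
        hyperbolic = 1.0 / (T - t)
        print(f"t={t:5.2f}  exp={exponential:12.2f}  hyp={hyperbolic:12.2f}")

    # The exponential curve is finite at every finite t, so the horizon keeps
    # receding as you approach it; the hyperbolic curve diverges at t = T,
    # so the horizon sits at a fixed date.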
>
> If the singularity is more like the noun 'SINGULARITY' as defined by Eli,
> what makes you certain that your intention not to destroy or threaten the
> earth is not in conflict with this Singularity you want to advance? In other
> words, since you imply that the singularity will not threaten the Earth,
> perhaps you could describe specifically how it will not.
>
I didn't say that I intend to protect the earth; I merely said that I will not intentionally harm it. I feel that the singularity is inevitable, and that we cannot predict its outcome. The outcome will be either good or bad. If bad, a delay is of little worth. If good, a delay does substantial harm (specifically, to those who die during the delay). Since any reasonable projection that does not include a singularity ends in catastrophe, a "good" singularity is our best bet.

My preferred SI is a sort of collective consciousness that includes substantially independent sub-entities, some of which are or derive from humans. I have no reason to believe that the SI will organize itself this way, but it's the best chance for individuality.