From: Anders Sandberg (asa@nada.kth.se)
Date: Wed Apr 16 2003 - 12:27:40 MDT
On Wed, Apr 16, 2003 at 10:58:42AM -0700, Charles Hixson wrote:
> Spudboy100@aol.com wrote:
>
> >...
> >The Singularity, as I understand it, is when technology becomes
> >powerful and complex enough to become self-aware, and have "internal
> >conversations with itself" and become asymptotic in intelligence. Most
> >on this list believe it will approach amazing and overwhelming...
>
> As I understand it, that's the way we predict the Singularity will
> happen. But from my understanding of the definition, the Singularity is
> that point at which the past becomes useless for predicting the future.
Both of these definitions differ from Vinge's original one, although they
are close to the general sense in which the word is used. He described the
singularity as a strong feedback loop in technological development causing
a fast transition to a state we cannot at present predict. Note that it
does not have to involve AI; it could be IA (intelligence amplification) or
something else. The feedback loop is the precondition; the unpredictability
is the consequence. It says nothing about indeterminism: even a
deterministic singularity would likely be computationally unpredictable,
since so much would happen that cannot be predicted without the same level
of technology.
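To make the feedback-loop point concrete, here is a small illustration of my
own (a hypothetical toy model, not anything from Vinge): a few lines of
Python in which the growth rate of a technology level T depends on T itself.
A self-reinforcing law like dT/dt = T^2 diverges in finite time, which is
where the "singularity" metaphor comes from; the numbers predict nothing.

    # Toy model (hypothetical): technology level T whose growth rate is
    # proportional to T^2, i.e. improvement feeds back into the ability
    # to improve. dT/dt = T^2 with T(0) = 1 blows up analytically at t = 1.
    def simulate(t_end=2.0, dt=1e-4):
        t, T = 0.0, 1.0
        while t < t_end and T < 1e9:   # stop once T has effectively diverged
            T += dt * T * T            # Euler step: rate depends on T itself
            t += dt
        return t, T

    if __name__ == "__main__":
        t, T = simulate()
        print(f"divergence reached near t = {t:.3f} (analytic blow-up at t = 1)")

The point is only that with this kind of feedback, extrapolating from before
the knee of the curve tells you almost nothing about what lies after it.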
> As it is, it often seems to be that our only hope lies in the
> unpredictable future, because the ones that a straight-line prediction
> yields are so terrible.
Also, an unpredictable future allows freedom of action rather than forcing
us to work toward a single, limited aim. It has room for diversity and
exploration.
-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y