From: Charles Hixson (charleshixsn@earthlink.net)
Date: Wed Apr 16 2003 - 11:58:42 MDT
Spudboy100@aol.com wrote:
> ...
> The Singularity, as I understand it, is when technology becomes
> powerful and complex enough to become self-aware, and have "internal
> conversations with itself" and become asymptotic in intelligence. Most
> on this list believe it will approach amazing and overwhelming...
As I understand it, that's the way we predict the Singularity will
happen. But from my understanding of the definition, the Singularity is
that point at which the past becomes useless for predicting the future.
(For how long? Unknown.) We predict that this is because sentient
electronic intelligences will evolve with unforeseeable rapidity and
results, but all we really know is that the pace of change is
increasing, and increasing ever more rapidly. This isn't all good, and
it isn't all bad. If certain centralists weren't actively grabbing all
the power they could, then I'd be one of the throng shouting, "Let's
take this a bit slower and more carefully!" As it is, it often seems
that our only hope lies in the unpredictable future, because the
futures that straight-line prediction yields are so terrible.
Sentient machines? We can hope and work for Friendly AI. But please
remember that others have other ideas, and we may end up with Fiendish AI.
--
Charles Hixson
Gnu software that is free,
The best is yet to be.