Re: What is the singularity?

From: Michael M. Butler (butler@comp-lib.org)
Date: Tue Jul 31 2001 - 02:30:49 MDT


A superintelligence does not have to be "an AI". I believe Eliezer agrees with me; in fact, I believe that's one of the main
reasons he's trying to get there first with an AI.

If there is some "intelligence 'speed of light'" of which we are presently unaware, it might still cap the rate of change,
leaving the curve looking more sigmoid than exponential.
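
A minimal sketch of the distinction, with made-up parameters: unbounded exponential growth versus logistic (sigmoid) growth
that saturates at some hard ceiling.

    # Illustrative only: compares unbounded exponential growth with
    # logistic (sigmoid) growth capped by a hypothetical hard limit
    # (the "intelligence 'speed of light'"). All numbers are arbitrary.
    import math

    def exponential(t, rate=0.5):
        # Growth with no ceiling: dx/dt = rate * x, x(0) = 1
        return math.exp(rate * t)

    def logistic(t, rate=0.5, ceiling=100.0):
        # Growth that saturates at the ceiling:
        # dx/dt = rate * x * (1 - x/ceiling), x(0) = 1
        return ceiling / (1.0 + (ceiling - 1.0) * math.exp(-rate * t))

    for t in range(0, 21, 5):
        print(f"t={t:2d}  exponential={exponential(t):10.1f}  logistic={logistic(t):6.1f}")

Early on the two curves are nearly indistinguishable; the difference only shows up as the limit is approached.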

Not proposing, just considering.

"J. R. Molloy" wrote:
>
> From: "Michael M. Butler" <butler@comp-lib.org>
> > The working definition I use: _A_ singularity is when everything seems to be
> > happening at once; the rate of change becomes so great as to be incalculable.
>
> Eliezer's definition is when AI exceeds the intelligence of the smartest humans.
> Recursively self-improving Artificial Intelligence tends to parallel
> recursively self-improving complex adaptive systems found in nature, IOW,
> living organisms (at first glance, anyway). This new life form, this RSIAI,


