At 06:43 PM 4/8/99 +0100, you wrote:
>Dan Fabulich wrote:
>> But Singularity somehow, someway, with a species that may
>> not even be our own? Totally inevitable.
>
>What if a 2n increase in intelligence requires an n^2 increase in
>computer power? What if a 2n increase in intelligence requires a 2^n
>increase in computer power? What if intelligence involves
>not-easily-computable elements? What if there's an upper limit to
>complexity? Eh?
>
>Such confidence in the face of such uncertainty!
My definition of Singularity is pretty weak: I'd consider any kind of Borganism a Singularity, and a Borganism requires only that minds be networked together, not that intelligence grow without bound. Thus, even if there *is* an upper limit to intelligence, theoretical or practical, we should still expect some kind of Singularity to result someday.
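
For what it's worth, here's a quick back-of-the-envelope sketch of your two scaling regimes (the cost laws and the numbers are purely illustrative assumptions, not anything we actually know about intelligence):

import math

# Invert the hypothetical cost laws: if reaching intelligence n costs
# compute c, how much intelligence does a given compute budget buy?
def intelligence(c, regime):
    if regime == "polynomial":    # cost ~ n^2, so n ~ sqrt(c)
        return c ** 0.5
    if regime == "exponential":   # cost ~ 2^n, so n ~ log2(c)
        return math.log2(c)
    raise ValueError(regime)

for c in (1e2, 1e4, 1e6, 1e8):
    print(f"compute {c:>12,.0f}: "
          f"poly {intelligence(c, 'polynomial'):>6.0f}, "
          f"exp {intelligence(c, 'exponential'):>5.1f}")

Even in the grim exponential-cost case, intelligence still creeps upward with compute; it just grows logarithmically rather than explosively. Which is exactly why my bar for a Singularity doesn't depend on unbounded growth in the first place.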
-Dan