Re: Singularity: Human AI to superhuman

Peter C. McCluskey (pcm@rahul.net)
Mon, 21 Sep 1998 17:56:24 -0700

hanson@econ.berkeley.edu (Robin Hanson) writes:
>
>>There are many others, not-surprisingly including AIers (like
>>Moravec), and nanotechnologists (like Drexler), and others who actually deal
>>in the technology of transcendence, who are also Strong Singularitarians. In
>>fact, I can't think offhand of a major SI-technologist who's heard of the
>>Strong Singularity but prefers the Soft.
>
>Is it clear they mean the same thing by "strong singularity"? And if so many
>people agree, why can I find no published (i.e., subject to editorial review)
>coherent analysis in favor of explosive growth?

So many people? I doubt that more than a half-dozen people meet Eliezer's criteria, and it's quite possible that none of them are aware of informed disagreement with their arguments. If he thinks more than a handful of people meet his criteria, then he's probably overlooking a fair number of them who don't believe in any kind of singularity.

I'm sure that Drexler would be interested in debating this in the proper forum, although it might be hard to find a forum he would accept.

hanson@econ.berkeley.edu (Robin Hanson) writes:
>If you think that post is a persuasive technical analysis on timing claims,
>rather than suggestive prose, I suspect you don't know what such analysis is,
>and so my requests for it are futile. Does *anyone* reading this other than
>Eliezer think that Eliezer's first post constitutes such a persuasive
>technical analysis?

If he understood your request, I'm sure he would do something more than just tell us what his analysis doesn't consist of.

-- 
------------------------------------------------------------------------
Peter McCluskey          | Critmail (http://crit.org/critmail.html):
http://www.rahul.net/pcm | Accept nothing less to archive your mailing list