Re: What is the singularity?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jul 30 2001 - 19:34:54 MDT


"J. R. Molloy" wrote:
>
> Eliezer's definition is when AI exceeds intelligence of smartest humans.

No, hardware-improved humans also count, as long as it's genuine hardware
improvement and not just using Google or forming corporations or whatever
the latest Fad Pseudo-Singularity of the Month happens to be.

> I ask because a list member has expressed fear that a system
> which identifies incorrect thinking might do so with extropians. Wouldn't that
> actually be a friendly thing to do? I mean, if extropians think incorrectly, a
> friendly AI would be doing all sentient beings a big favor by removing that
> incorrect thinking, right?

By *identifying* the incorrect thinking, to those people who would want an
AI to help them identify incorrect thinking.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:39:59 MDT