Re: Singularity: Human AI to superhuman

Eliezer S. Yudkowsky (sentience@pobox.com)
Fri, 11 Sep 1998 18:18:50 -0500

Robin Hanson wrote:
>
> The idea was to have Vinge here to flesh out and defend his claims.
> If he can't or won't, then I would be content to at least create a consensus
> that he hasn't supported his vision with a coherent analysis.

Vinge is _busy_. If _I_ support his vision with a coherent analysis, that counts too. One of the ideas behind a colloquium is that you don't have to defend every single tiny point someone else doesn't like, just the ones that stump your supporters or for which your supporters give unsatisfactory arguments.

> Your analysis seems to me to be based on unusual enough assumptions that it
> can't really stand for what Vinge probably meant, but didn't get around to
> saying. It also seems opaque enough that in discussing it, one spends more
> time trying to understand Eliezer's concept of singularity, rather than
> Vinge's.

Ah, now we come to a major point of dispute. For you, the Singularity is a concept in Vinge's mind. It gets analyzed as a concept. The criterion of discussion is how neat a concept it is. Coherent discussion requires that we discuss only Vinge's concept and not everyone else's, because a controversial concept becomes incoherent when spread across multiple minds.

For me, the Singularity is a real event that happens to a civilization. Our conceptions of it are irrelevant except insofar as they accurately describe real events. My description of a seed AI is detailed, but if the Singularity actually occurs via neurological enhancement, then both my description and the conclusions I draw are totally irrelevant, no matter how rational they may have seemed at the time. To me, Vinge isn't the person who invented the Singularity, he's the person who noticed it. And the Singularity certainly isn't just a concept in Vinge's mind!

> >... what _you_ (Hanson) originally
> >asked is whether the concept of a Singularity was flawed.
>
> I asked about *Vinge's* singularity concept, exactly to avoid this elephant
> that becomes all things to all people.

Well, there you go. I'm asking about *Earth's* Singularity.

Facts about Earth are accessible to this discussion. Facts about Vinge aren't accessible unless he wants to waste a lot of time. What wouldst thou of me? That I present a detailed deconstruction of Vinge's mind when he thinks about the Singularity? Supposing that I succeeded and supposing that Vinge didn't punch me in the nose, how would we have significantly advanced the knowledge of humanity?

My conception of the Singularity is recognizably an instance of Vinge's concept: it has unknowability, greater-than-human intelligence, and changes in the basic rules - with positive feedback as a fourth principle. If this instance is rationally acceptable, so is the Singularity. If this instance is rationally flawed, the reasons (e.g. "no big wins in intelligence") will probably generalize to any version of the Singularity. Most high-level flaws in Vinge's Singularity would apply to Eliezer's Singularity, and maybe even vice versa. So again, what's the problem?

I'm not sure what you're asking me to say or do. I mean, specifically. I think you're asking for debate on a level that can't be rationally debated.

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.