Wed, 10 Nov 1999 09:11:47 EST

In a message dated 99-11-09 22:22:02 EST, I believe Eliezer S. Yudkowsky writes:

<< Actually, the very eve of Singularity, when superintelligent AI is no
longer some never-never but an immediate prospect, is when I expect some of the largest social problems. Remember, we haven't won until the SI has nanotechnology. >>

What's this "We" stuff, monkey-boy?

Glen "Kowtowing to the AI overlords" Finney

PS - Sorry folks, I just couldn't resist paraphrasing Lithgow's John Whorfin from Buckaroo Banzai.