Re: Singularity: Human AI to superhuman

John Clark (jonkc@worldnet.att.net)
Fri, 11 Sep 1998 00:55:56 -0400

Robin Hanson <hanson@econ.berkeley.edu> wrote:

>If most of AI progress comes from a half dozen big win insights, then you
>should be able to write them all down on one short page. Anyone who read and
>understood that page would be nearly as good an AI programmer as anyone else.
>This is very far from the case, suggesting the importance of lots of little
>insights which require years of reading and experience to accumulate.

That's a good point, but the question remains: even if a billion small insights are needed, what happens if their rate of discovery increases astronomically?

By the way, does anybody know how Lenat is doing on his CYC project? I haven't heard much about it lately.

>there hasn't been much human genetic evolution over the last 50K years!

True.

>There has been *cultural* evolution, but cultural evolution is Lamarckian.

Yes, but the distinction between physical and cultural evolution would evaporate for an AI; it would all be Lamarckian, and that's why it would change so fast. And artificial intelligence is only one path toward the singularity; another is nanotechnology, and perhaps another is quantum computers. You only need one path to go somewhere.

>There may have been breakthroughs within "short" times, though these times
>were only "short" when compared to a million years.

And the singularity may seem like a slow, steady, deliberate process when our super-fast electronic successors look back on it, but that's not the way it will seem to us when we live through it, or die in it.

John K Clark jonkc@att.net
