Looking over Eugene's posts, I find myself confused. As far as I can
tell, Eugene thinks that seed AI (both evolutionary and nonevolutionary),
nanotechnology, and uploading will all inevitably end in disaster. I could be
wrong about Eugene's opinion on uploading, but as I recall Eugene said to
Molloy that the rapid self-enhancement loop means that one mind wins all even
in a multi-AI scenario, and presumably this statement applies to uploading as
well. If this is the case, then I can't imagine how even the use of nuclear
weapons would help, except possibly temporarily. As far as I can tell, in Eugene's
would help, except possibly temporarily. As far as I can tell, in Eugene's
scenario we're flat-out doomed.
So, Eugene, how does humanity win this?
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:50:14 MDT