Re: Paradox--was Re: Active shields, was Re: Criticism depth, was Re: Homework, Nuke, etc..

From: Eliezer S. Yudkowsky
Date: Thu Jan 11 2001 - 23:20:55 MST

John Marlow wrote:
> Yeah but the point is, all power is not concentrated
> in a single individual. A leader who goes berserk can
> be stopped or killed. You hand all weapons of mass
> destruction to an orbiting AI, you got problems.

A transhuman AI doesn't *need* weapons of mass destruction. So we may as
well minimize our problems by keeping WMDs out of the hands of humans.

Obviously, I'm not advocating handing all the WMDs over to an "orbiting
AI" of human-equivalent or lesser capacity; *that* would be stupid.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:18 MDT