> Obviously, I'm not advocating handing all the WMDs
> over to an "orbiting AI" of human-equivalent or
> lesser capacity; *that* would be stupid.
I quite agree. What I suggest is that it would be
orders of magnitude more stupid to hand such things
over to something of greater "capacity"--because the
act would seem irrevocable.
--- "Eliezer S. Yudkowsky" <firstname.lastname@example.org> wrote:
> John Marlow wrote:
> >
> > Yeah but the point is, all power is not concentrated
> > in a single individual. A leader who goes berserk can
> > be stopped or killed. You hand all weapons of mass
> > destruction to an orbiting AI, you got problems.
>
> A transhuman AI doesn't *need* weapons of mass
> destruction. So we may as well minimize our problems
> by keeping WMDs out of the hands of humans.
>
> Obviously, I'm not advocating handing all the WMDs
> over to an "orbiting AI" of human-equivalent or
> lesser capacity; *that* would be stupid.
>
> --
> Eliezer S. Yudkowsky
> http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:18 MDT