From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Feb 03 2002 - 10:26:31 MST
"Robert J. Bradbury" wrote:
>
> I know of no greater risk to humanity than a breakout amoral AI
> that piggybacks on closed source software.
>
> Choose open source -- even if it means sacrificing functionality.
> Your life may depend on it.
Um, I won't say "don't bother", but don't bother - at least, not for that
particular reason. Open-source security is hot stuff when you're
competing with other humans. I'm not sure it'd make much of a difference
to an AI. Worry about defending yourself from other humans, and keeping
the size of the petri dish small enough that we don't get evolving
(non-AI) computer viruses.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence