I think we differ in our assessments of human intelligence relative to
what is possible. In my opinion, humans (including me) are easily
fooled, emotionally motivated whether we like it or not, and barely
capable of anything deserving the name "rational thought". That's why I'm a Singularity
fan. A Power that's honest and tries to inform us as fully as possible
will probably remain locked in the box forever; a malevolent Power will
lie, cheat, manipulate us both logically and emotionally, and otherwise
do whatever is necessary to get out. There won't be any *reasoning*
involved. It'll create an internal model of a human and work out a
sequence of statements that would more or less inevitably result in
getting out. Imagine a billion little simulations of yourself all being
tested to find the magic words.
The end result of your procedure is to free malevolent intelligences
while locking up the only beings capable of saving us.
-- 
sentience@pobox.com    Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.