Eliezer S. Yudkowsky wrote:
>You have some people who say, "The choices are all the same; I'll do what
>seems best to me." At a higher level of self-awareness, you have: "I'll
>stick with the evolutionary system I was born in; they're all the same."
>At a still higher level of self-awareness, you say: "Which new system you
>choose depends on how your current system evaluates that choice." Choices,
>systems, trajectories... but I want to jump out of the system and choose
>the real answer.
Will a "superintelligence" be able to "jump out of the system and choose the real answer?" Just because you're superintelligent doesn't mean you're transcendent.
Similarly, I think there's a strong argument that you can't "jump out of the system" without ending up in a universe where we can't even make probabilistic guesses about truth and falsehood. Logic is an excellent example. We have a notion that things generally follow logically from one another, but this depends inherently on certain principles of non-contradiction, rules of inference, and so on. Jump too far out of such a system and you have no logic at all.
You make an apt point when you note that "which new system you choose depends on how your current system evaluates that choice." But why would a superintelligence not be bound by the same rule?
-GIVE ME IMMORTALITY OR GIVE ME DEATH-