"Weaver, Evan A." wrote:
> Since we are looking for SIs, what if all the SIs are busy looking for super
> super intelligences, and those are looking for really super super
> intelligences, and those are... Does anyone see any reason why this would
> have to stop somewhere?
Yes. We're mortals. We don't have the vaguest idea of what's really going on. We live our lives in fog. Mortals do not engage in actual "reasoning", but in a process which may occasionally look like reasoning or duplicate some of its outputs. This is the defining aspect of being mortal. The whole thesis and hope of the Singularitarians is that with enough intelligence you can stop being mortal, and everything falls into place. If there are SSIs and SSSIs, we can hope that either the SIs know it perfectly well, or that the transition is simple, easy, and obvious.
There's no a priori way of saying that either our theory or yours is impossible, but we tend to assume that there's a qualitative difference between limited thought and thought backed by a planetary supercomputer; that the situation from an SI's perspective is *not* analogous to ours unless proved to be so.
That, in turn, is what defines the "Singularity"; some people, like Robin Hanson, assume - in accordance with the Law of Mediocrity and Occam's Razor - that everything posthuman is analogous to the human situation unless proved otherwise. Some people, like Vernor Vinge, assume - in accordance with non-anthropocentrism and Occam's Razor - that everything posthuman bears no resemblance to the human situation unless proved otherwise.
I go by the latter theory because I think I understand the forces that generate "the human situation", and these forces are not present in posthumans. Remove the elements and the holis (the holistic result of element interactions) can't survive.
--
email@example.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way