Re: Singularity-worship

Twirlip of Greymist
Mon, 9 Dec 1996 15:20:20 -0800 (PST)

On Dec 3, 9:05pm, Eliezer Yudkowsky wrote:

} I have a technical definition of "smartness" in terms of our cognitive
} architectures, available in "Staring." To quote the first paragraph:

Since I had already thanked you for your terms "inventability" and
"comprehensibility", and had used 'Perceptual Transcend' before,
redefining them was a bit pointless.

} straightforward mechanisms, to Powers incomprehensible from any human
} standpoint, with semantic primitives that our brains are inadequate to
} comprehend and could only simulate through billions of years of labor.
} Turing-equivalence is practically irrelevant to this scenario.
} Singularity seems like a fine name.

You've just conceded my points, or I've never understood yours. If a
dog runs for a billion years it will not have achieved human level
thought. A dog is not on our level. There are many possible Turing
machines which could never emulate a UTM. My perception of the strong
Singularity has been that the Powers will work on a level we utterly
cannot comprehend. That we *can* simulate them at all is my contention.
Turing-equivalence may be practically irrelevant between us and a 1000
kilometer brain, but is not theoretically irrelevant. We couldn't
invent what they could, but we could understand it. (And actually we
could invent it, but it would take even longer.)
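The simulation point can be made concrete with a toy example (my illustration, not part of the original exchange; the machine and its transition table are made up): a generic step-by-step interpreter can run any Turing machine given as a transition table. The simulation may be absurdly slow, but it is exact, which is the sense in which Turing-equivalence stays theoretically relevant even when it is practically irrelevant.

```python
# Minimal Turing-machine simulator: any machine given as a transition
# table can be emulated step by step -- slowly, but exactly.

def run_tm(table, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a TM. table maps (state, symbol) -> (write, move, next_state)."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = table[(state, symbol)]
        tape[head] = write
        head += {"L": -1, "R": 1}[move]
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Example machine (hypothetical): flip every bit, halt at the blank.
flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_tm(flipper, "10110"))  # -> 01001
```

The interpreter knows nothing about the particular machine it runs; the dog-vs-UTM distinction is exactly that a sub-universal system cannot host such an interpreter at all, while a universal one can, however long it takes.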

} So what's the basis of your unmodeled, unbacked claim that there isn't
} going to be a Singularity? It seems to me like pure faith, but then I

And I distinctly remember describing one scenario I would call a
Singularity; I've just been challenging the super-strong Singularity.

} There is hence valid logical reason to believe the Meaning of Life to be
} observer-independent, much as one would expect the First Cause to be
} independent of anything else whatsoever. If the Meaning of Life turns

But I don't expect an answer to either question. The universe exists.
Life lives and breeds.

} a valid logical chain for a Power as well. Depending on the ability of
} such a Meaning, Interim or otherwise, to override built-in goals, the
} Power will be ethical. This is a very good reason not to screw around

You don't know what the Meaning is, but assume that a Power that finds
it will be ethical?

} that. If they don't, I - ethically speaking - don't care if we get
} ground up for spare atoms. I do believe, however, that our lives have

I care.

} So yeah, the Powers will be ethical and they won't knock us off. Is
} this statement now sufficiently atheist for you?

I remain highly unconvinced.

Merry part,
-xx- Damien R. Sullivan X-) <*>

"Actually, I've always been rather fond of Lucifer. He was, after all, the
brightest of all the angels before his fall."
-- Rhydon of Eastmarch, _High_Deryni_