Eliezer S. Yudkowsky wrote:
>Do I swear it would work? Of course not. But as a
>premise, it is completely plausible. What is not plausible, IMHO, is
>that this invention would result in an age of world peace. Widespread
>violent chaos followed by totally new forms of government would be my
The phrase "widespread violent chaos followed by totally new forms of
government" could accurately describe the last several hundred years of
human history. An incomprehensibly complex evolutionary development
(such as the Singularity) may either reverse or accelerate this trend.
>The effect would be to strengthen all forms of power.
>enforce honesty; dictators could enforce obedience. The
>would win, but first there'd be an interregnum in the
>politicians and bureaucracies, faced with en masse
>band together and do anything to hold onto power. (The modern U.S. is a
>factionalized oligarchy with the demos holding the balance of power, and
>the oligarchic factions competing to please the demos. The threat of a
>truth machine might cause the factions to unite against the
>term limits but more so.)
The form of power most strengthened by a totally reliable truth machine
would naturally lead to the Singularity, since a true Singularity would
have only peripheral use for lesser powers. Once the Truth Machine Agency
seizes control of human destiny, it declares Judgment Day, and everyone
gets to testify. Like term limits, but you only get fifteen seconds.
"I cannot fear the Singularity, for I've loved its Truth