Eliezer S. Yudkowsky wrote:
>Do I swear it would work? Of course not. But as a science-fictional
>premise, it is completely plausible. What is not plausible, IMHO, is
>that this invention would result in an age of world peace. Widespread
>violent chaos followed by totally new forms of government would be my
>guess.
The phrase "widespread violent chaos followed by totally new forms of
government" could accurately describe the last several hundred years of
human history. An incomprehensibly complex evolutionary development
(such as the Singularity) may either reverse or accelerate this trend.
>
>The effect would be to strengthen all forms of power. Democracies
>could enforce honesty; dictators could enforce obedience. The
>democracies would win, but first there'd be an interregnum in the
>democracy - politicians and bureaucracies, faced with en masse
>unemployment, would band together and do anything to hold onto power.
>(The modern U.S. is a factionalized oligarchy with the demos holding
>the balance of power, and the oligarchic factions competing to please
>the demos. The threat of a truth machine might cause the factions to
>unite against the demos, like term limits but more so.)
The form of power most strengthened by a totally reliable truth machine
would naturally lead to the Singularity, since a true Singularity would
have only peripheral use for lesser powers. Once the Truth Machine
Agency seizes control of human destiny, it declares Judgment Day, and
everyone gets to testify. Like term limits, but you only get fifteen
seconds.
-zen
"I cannot fear the Singularity, for I've loved its Truth
Machines dearly."
--The Psychonomer