Zenarchy wrote:
>
> > I reserve the right to disagree with whatever system of morality it
> > decides upon.
>
> The SI duly accepts your position on this matter... along with billions of
> others. You may practice whatever form of morality you wish, but only in the
> privacy of your own home. You have no right whatever to foist it upon anyone
> else.
Wrong. Sorry, Zenarchy, this isn't the Culture. This isn't even the Archive. Hell, this isn't even A Fire Upon The Deep. The Singularity is not some magic box that contains everything you want. The Singularity is a gigantic tidal wave, and it does not bend to our wishes. There's a chance you'll get what you want, but it's by no means certain.
Once I predicted that the Powers would be ethical, and I was happy until I admitted to myself that I had no idea what "ethical" meant. I had predicted three things: First, that there would be titanic powers, unbound by game theory, following a purpose I couldn't foresee, and with no reason not to flatten anything in their way. Second, that this would be a good thing, the thing that we ourselves would approve of if we were intelligent enough.
And third, that even if we wanted to protect ourselves, the best course would still be to keep the Singularity clear and clean, simple and elegant and direct. Trying to slow down the march of progress will get us all killed, and trying to coerce the Singularity will put us in hell.
(And the Singularity surely isn't inevitable; one nuclear war would be enough to put it off for hundreds of years. As Michael M. Butler says, "Waiting for the bus is a bad idea if you turn out to be the bus driver.")
--
sentience@pobox.com              Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.