>Wrong. Sorry, Zenarchy, this isn't the Culture. This isn't even the
>Archive.
>Hell, this isn't even A Fire Upon The Deep. The Singularity is not some
>magic box that contains everything you want. The Singularity is a gigantic
>tidal wave and it does not turn to our wishes. There's a chance you'll get
>what you want, but it's by no means certain.
Please accept my humble apology, Eliezer. I used the term /SI/ to indicate a superintelligence, not the Singularity, which may of course entail much more than a superintelligence, or even more than a superintelligence may comprehend.
>Once I predicted that the Powers would be ethical, and I was happy until I
>admitted to myself that I had no idea what "ethical" meant. I had predicted
>two things: First, that there would be titanic powers, unbound by game
>theory, following a purpose I couldn't foresee, and with no reason not to
>flatten anything in their way. Second, that this would be a good thing, the
>thing that we ourselves would approve if we were intelligent enough.
"a good thing" ...but intelligence has already moved Beyond Good and Evil
(Nietzsche, ca. 1880s).
May we not therefore supplant moralism with an honest quest for less naive
truths?
To the extent logic and reason remain tethered to glorified superstition in
the form of moral religiosity (and its boring memetic rituals), rationalism
(pancritical and otherwise) may not reach far enough to discover what we
need to live extropically in the age approaching the Singularity.
If we survive the Singularity, I suspect it will have singed off, stripped
away, and denuded us of our belief systems, including ethical constructs.
If, OTOH, I personally fail to survive the Singularity, I write this as my
epitaph:
"I do not fear the Singularity, for I have loved the Truth."
[Stolen from the astronomer's epitaph: "I do not fear the night, for I have loved the stars."]
>And third, that even if we wanted to protect ourselves, the best course would
>still be to keep the Singularity clear and clean, simple and elegant and
>direct. Trying to slow down the march of progress will get us all killed, and
>trying to coerce the Singularity will put us in hell.
This confuses me because, while I have no desire to protect myself, I also have no desire to prevent a march of progress that kills... hell describing, as it does, the rightful place for those who coerce.
>(And the Singularity surely isn't inevitable; one nuclear war would be enough
>to put it off hundreds of years. As Michael M. Butler says, "Waiting for the
>bus is a bad idea if you turn out to be the bus driver.")
Waiting for the bus remains a silly idea if we all turn out to be bozos on
the bus.
Do you remember the play "Waiting for Godot"?
Even "waiting" constitutes doing something.
As Nasruddin used to say, "Don't just do something. Sit there!"
A possibly extropic view:
"Life does not consist in solving the puzzle of life.
Life consists in finding out how to create a great honking wonder,
an endlessly enchanting adventure of discovery,
and enjoying the hell out of it."
-zen