John Grigg wrote:
>
> I want to say how impressed I am by the magnum opus that Eliezer has written
Good term. In fact, the "Low Beyond" index (see .sig) lists it under
"Magnum Opii" along with "Coding a Transhuman AI".
> on his goal to bring about the singularity. It is a tremendous plan and I
> wish you the best of luck. I hope people get behind you on this great
> endeavor.
>
> Eliezer wrote:
> Keeping the Singularity quiet might prove a challenge, but I think it'll
> be possible, plus we'll have a transhuman guidance. Once drextech is
Oops. This should read "have transhuman guidance". (The error was
caused by an incomplete edit of "have a transhuman guiding us".)
Well, it's fixed.
(Also, thanks to Cole Kitchen for spotting that I forgot to take the
"Draft Edition" notices off. Likewise fixed. Only two days and we're
already at v1.0.2.)
> developed, the assemblers shall go forth into the world, and
> quietly reproduce, until the day (probably a few hours later) when the
> Singularity can reveal itself without danger - when there are
> enough tiny guardians to disable nuclear weapons and shut down riots,
> keeping the newborn Mind safe from humanity and preventing
> humanity from harming itself.
> (end of reproduction)
>
> Taking control of things may upset the powers that be! :)
I know; one of the sections-in-progress is 3.5.4 "The Last Minute" (a
more fully developed treatment of the issues raised in 2.2.14
"Transcendence"). Dunno when you'll see it, though.
--    sentience@pobox.com      Eliezer S. Yudkowsky
      http://pobox.com/~sentience/beyond.html
Typing in Dvorak         Programming with Patterns   Writing in Gender-neutral
Voting for Libertarians  Heading for Singularity     There Is A Better Way