Re: >H RE: Present dangers to transhumanism

Eliezer S. Yudkowsky
Wed, 01 Sep 1999 15:05:02 -0500

Stan Kretler wrote:
> I just joined this list recently, but I do not see a lot of what I
> would call "naive technophiles". I do see a lot of technophiles, and a lot
> of naivete, but the naivete seems to be about the philosophical
> undergirding of transhumanism, not about the technology. Taking
> "controlled sips from a tidal wave" seems easy enough.

"Seems" is the glaringly operative word here. Yep, you're a naive technophile, all right.

Look at it this way: One person can take controlled sips from a tidal wave. A thousand people can't. Neither can two competitors.

> Technology can do anything.

And it probably will, whether you want it or not.

> But thought about *why* taking a controlled sip from a tidal
> wave is a worthwhile goal seems to be absent from this list.

> Seems important to think about these things. Is anyone here worried
> about slipping into a psychosis of nihilism once their IQs are high
> enough that they realize that their goals have no philosophical
> foundation (or *might* have no foundation)?

I suppose in theory I shouldn't worry about going nuts if I find out that there are no ultimate goals, because then my going nuts wouldn't intrinsically be worse than not going nuts. In practice, I do worry about it: partially because it sheds light on the "foundations of the foundations", partially because it helps me locate residual desires, partially because I'm still not sure I have the right formulation of anything, and partially because it's tied up with cosmological problems like the Great Filter Paradox.

In any case, I've already designed a philosophical foundation for transhumans and the initial stages of AIs. A fraction of that is in the link above. So yes, the problem has occurred to me, and I took care of it.

> Worrying about grey goo seems less important (and less interesting) than
> worrying about 400 IQ nihilists with a *lot* of technological power.

Only *incorrect* nihilists. If the nihilists are correct, then it's obviously meaningless whether we worry or not.

           Eliezer S. Yudkowsky
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way