"Eliezer S. Yudkowsky" wrote:
> Stan Kretler wrote:
>> But thought about *why* taking a controlled
>> sip from a tidal wave is a worthwhile goal seems
>> to be absent from this list.
> http://pobox.com/~sentience/tmol-faq/logic.html
Very interesting stuff!
>> Seems important to think about these things. Is
>> anyone here worried about slipping into a
>> psychosis of nihilism once their IQs are high
>> enough that they realize that their goals have
>> no philosophical foundation (or *might* have no
>> foundation)?
> I suppose in theory I shouldn't worry about
> going nuts if I find out that there are no
> ultimate goals, because then my going nuts
> wouldn't intrinsically be worse than not going
> nuts.
I'm not sure precisely what Stan meant by nihilism, or by "having no philosophical foundations," but if he means something Nietzschean, which I'm guessing is the case (partly because I know he's sort of a Nietzschean on these issues), you're not exactly speaking to the issue, either here or on your Web page. Nihilism isn't just what happens when you "find out there are no ultimate goals"; it's also about there being no "finding out," since "finding" can't be justified, since "justifying" can't be justified (etc.). All of this means (on the Nietzschean view, which I think is right) that nihilism, in fact, can't even be stated coherently as a position (which problem is, itself, _not_ evidence against nihilism).
> partially because it's tied up
> with cosmological problems like the Great Filter
> Paradox.
Yes indeed (though tied up in an empirical way, not a philosophical way).
> In any case, I've already designed a
> philosophical foundation for transhumans and the
> initial stages of AIs. A fraction of that is in
> the link above. So yes, the problem has
> occurred to me, and I took care of it.
In the fraction I've seen, you haven't taken care of the problem at all. Is there more on some other Web page?
>> Worrying about grey goo seems less important (and
>> less interesting) than worrying about 400 IQ
>> nihilists with a *lot* of technological power.
> Only *incorrect* nihilists. If the nihilists
> are correct, then it's obviously meaningless
> whether we worry or not.
I think nihilism is a far deeper problem than you realize. It's not even clear that it can be judged correct or incorrect (since it calls into question the very notions of judgement and correctness themselves).
Unrelated:
> Voting for Libertarians [....]
I totally agree with what you say about the importance of voting for third-party candidates, and have myself even voted for Libertarians, despite the emptiness of their philosophy (on economics, though, libertarians seem to be right most of the time).
Best,
Brian.
--
Brian Manning Delaney <b-delaney@uchicago.edu>
(No need to CC replies to me.)