Re: The Extropian Religious War--thank you

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jul 14 2001 - 00:32:23 MDT


Samantha Atkins wrote:
>
> "Eliezer S. Yudkowsky" wrote:
> >
> > Samantha, I do think you're being too tolerant here. Can you give me an
> > example of a correct conclusion or useful strategy that can be arrived at
> > by being religious and cannot be arrived at in any other way? Actually,
> > let me ease that requirement; can you show me a strategy or piece of
> > knowledge that is easily *invented* by religion but is harder to arrive
> > at through rationality?
>
> Is life only about "correct conclusions"?

I don't know. But I don't think that incorrect conclusions have any part
to play in it. And the human mind, as we are all fond of saying, is
complicated. If a way of thought - whatever its other effects - leads me
to form incorrect conclusions, not just of a transient type, but of a
long-lasting and self-rationalizing type, then it isn't worth it. I'll
stick with rationality in the here-and-now, and if there's some
irrational element that's really important, I'll get around to it after
the Singularity, when it isn't so dangerous.

Actually, I *don't* believe that any ways of thought that lead to
incorrect conclusions will turn out to be ultimately important. I think
that "being in harmony with the Universe", a "clear pool", an undistorting
mirror, i.e., correct conclusions, is the sigil of all things worthy of
care.

But that's really almost irrelevant. A human being is a seed, in the same
way that present-day civilization is a seed. Most of my effort goes into
protecting the civilization, rather than myself, but I can't really know
what's worth defending unless I at least think about what I'd be defending
if I were defending myself. And the answer, on an individual level and on
a civilization level, is that what must be defended above all is
potential.

So I tend to be really unfond of self-reinforcing incorrect beliefs that
will probably lead a great many people to refuse to take part in the
Singularity for utterly wrong reasons. I'm very suspicious of your
assertion that there might be something to be valued in spirituality,
unless you can give me an example of something specific to be valued. I
expect value only when I have reason to do so. But people often "want" to
see value, in one of the many awful cases where human thinking distorts
the premises to make the conclusions less uncomfortable. I see that as a
negative force in the Universe. So I'm suspicious.

> If a different way of
> looking at my life, what I am doing, what we are all doing and
> what can come of it integrates into something that strengthens
> my ability to contribute maximally and may help others to do so,
> then that is in itself quite important.
>
> Can a vision of the type of future you wish to create come only
> from rationality or do you need to add quite a few different
> elements and include a large dose of what you would most like to
> see, do what you can to weave it into a unity and then look for
> tools and memes to encourage it to come into being? Is this all
> a rational process or are only rational tools such as science
> and logic allowed in forming the vision and in encouraging it
> into reality?

I think perhaps you underestimate the powers of rationality. Perhaps you
have, in your life, drawn on spirituality in places where I have learned
to draw on rationality. I hope not. As far as I'm concerned, I've
achieved what I've achieved because of my determination to always walk the
path of rationality, including in places where rationality is not commonly
used because it looks a little difficult. And the reason is that I had
enough past experience to predict that even if the path of rationality
looked a little difficult, sticking to the problem and not giving up at
first glance would yield results eventually. So far, this strategy has
always paid off really big.

In particular, rationality has the power to judge which "arational"
elements of the mind are important, whether or not they're too dangerous
to be used, and where their place is in the larger scheme of things. One
of the things I really object to is the idea that rationality is somehow
"incomplete", that there's something outside rationality that has the
power to dictate whether I should be rational, rather than the other way
around. I've simply never found anything like this. I've never found
anything even close.

What I have found, though, is that rationalizers and rationalizations of
all kinds are very fond of making this claim, with respect to flawed ways
of thinking that my rational mind can easily understand, encompass,
visualize, and accurately predict.

> Religions are great tools for weaving visions. For good and
> ill. They are also quite good for cohering societies devoted at
> different levels to common goals. And I believe some of these
> systems have quite a lot to say about letting go of your current
> mental/emotional lockstep and redefining and reintegrating
> yourself. This is a very important skill for what most of us
> contemplate doing and becoming.

Again, I use rationality for that.

> > Even so, I wouldn't want to be religious because, to my mind, it's better
> > to be rational and strive to improve that rationality than to do something
> > that has short-term benefits but is crippling in the long term. But what
> > I'm asking for here is an example of even a short-term cognitive benefit,
> > and by that I mean "coming up with a right answer", and not alleged lower
> > stress levels and so on.
>
> Isn't this a bit circular? You say it is better to be rational
> and improve that rationality but exactly what is that to you and
> what do you use to validate that conclusion as best? What makes
> you believe that all spirituality/religion is crippling in the
> long term? In point of fact it seems not to be so for many
> spiritually/religiously inclined people.

Do they still believe wrong things? Are they ready to give up those wrong
beliefs if they are disproved? Or are they more likely to doubt the
(correct) disproof and even begin to inveigh against Bayes'
Theorem itself? It is this perniciously self-protective
quality that I refer to as "crippling" because it results in people
digging themselves deeper into their holes.
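
For concreteness, here is a minimal sketch in Python of the update that
theorem prescribes; the hypothesis and all the numbers are illustrative
assumptions, not anything from this thread:

    # Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E)
    def posterior(prior, p_e_given_h, p_e_given_not_h):
        """Probability of hypothesis H after seeing evidence E."""
        p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
        return p_e_given_h * prior / p_e

    # A cherished belief held at 0.95 meets a disproof it can barely
    # explain (P(E|H) = 0.001) but which rival views explain easily
    # (P(E|~H) = 0.5); the honest update is a collapse to ~0.04.
    print(posterior(0.95, 0.001, 0.5))  # ~0.0366

The self-protective move is to reject that arithmetic rather than the
belief.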

> Is the "right answer" the only thing important? Is growing into
> a saner, more caring, more generous person of less worth than
> getting some cognitive "right answer"? Is living life with an
> inner calmness even in the midst of the most strenuous and
> hectic activity so worthless that, without any extra "right
> answer," it is of no worth?

I use rationality for that, too.

I have lived my life according to the principle that if you pursue the
right answers, other things come to you; if you pursue other things at the
expense of truth, then you will simply wind up building stronger and
stronger shackles for yourself, and you will achieve neither truth nor
those other goals that were so important to you.

My answer, then, is that I don't know whether "correctly modeling the
Universe" is the ultimate meaning of life, but I very definitely know that
correctly modeling the Universe is important to protecting this
civilization as it heads into the Singularity, and even without that, I
would still never step aside from the path of correctly modeling the
Universe, even if doing so made me temporarily happy, because I would fear
that I was destroying a part of my own potential.

I think that the best thing I can do to protect the potential of others is
to encourage them to use self-criticizing systems like rationality, and
argue against any suggestion that some other system is somehow "outside"
rationality. In practice, the sole use of that belief is always to
protect mental wrongnesses from healthy ways of thinking. Of course
mental cancers want to avoid the slings and arrows that are sent their
way, so of course people who haven't learned to think painful thoughts are
very happy with the idea that their pet theories are magically immune to
the discomforts provided by their own innate intuitions and
external sources such as scientists. But I view discomforting intuitions
as part of a healthy mind, and discomforting scientists as part of a
healthy society.

I'll say it again: Painful thoughts are part of a healthy mind. But it
is equally true that it is human nature to avoid painful thoughts. And
the conclusion is that those who love sanity, and wish to engender sanity
in others, should argue against those ways of thinking whose sole motive
force is their power to deflect healthy painful thoughts.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


