From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jun 17 2003 - 21:04:37 MDT
Robin Hanson wrote:
> On 6/17/2003, Eliezer S. Yudkowsky wrote:
>>
>>> The vast majority of humanity believes they are in this situation.
>>
>> What of it? ... As it happens the belief is wrong, and isn't that the
>> point? ...
>
> It sounds as if you don't disagree with my claim; you just don't see why
> I would bother to make such a claim.
That is correct; it seems uncontroversial that the majority of humanity
doesn't know what's coming down the line. Of course, neither may we, but
that would only seem to bolster the point about rationality: if the future
is not as strange as we imagine, it will probably be stranger.
> There are many things that I do not like about academia, but on this
> point I have internalized the academic priority given to intellectual
> modularity. In academia, different people work on different topics, and
> hope to combine their results later. When discussing each topic, one
> tries to minimize the dependencies of results in this area on results in
> other areas. Of course there will have to be some dependencies, but
> avoiding unnecessary dependencies allows more rapid progress, just as
> modularity aids the design of most systems.
>
> If there is a standard position in some area, and a contrarian
> alternative position in that area, and you are doing work in some other
> area which depends on this first area, your first priority is to say
> what the standard position would imply for your research, and only after
> having done that may you turn to seeing what the alternative position
> would imply.
>
> I am trying to make our discussion of "why believe in truth" modular
> with respect to the very contrarian position that our goals are very
> different from what evolution has given us, or that the world will soon
> be very different from what evolution has adapted to. The fact that you
> and I might happen to agree with this contrarian position is beside the
> point. My first priority is to make our conversation accessible and
> relevant to the majority who do not share this contrarian position.
Deliberately strive for modularity? In a consilient universe?
There is only ever one explanation. In it, all the pieces fit together
perfectly, without strain. Any divisions in that explanation are
artificial, the product of human biases. I would not take it as a good sign if my
theories about one part of cognitive science were consonant with many
possible alternatives elsewhere; it would be a sign that the theory was
inadequately constrained by the evidence. I certainly wouldn't
deliberately alter the theory to make it consistent with more than one
alternative elsewhere. Nature doesn't work that way in constructing
explanations; how could modularity be a good method for discovering them? Your
first priority should be to discover what the real answer is about the
usefulness of rationality. When you know the real answer, then worry
about how to explain it - constructing an accessible explanation. How can
you do both at the same time without biasing your conclusions? And if
your conclusion is wrong, what can you expect but that it will be
difficult to explain, since it will not be consonant with many other facts, as truths
are?
This is one of the reasons why I am not an academic...
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence