From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jun 18 2003 - 10:52:16 MDT
Robin Hanson wrote:
>>
>> ... whether many people or a few people are working out their areas of
>> the One Explanation, my point is that I would not strive for
>> modularity in my maps unless I thought that reality itself was modular
>> with respect to the thing I was mapping. This idea of building a
>> philosophy that is modular, where you can stand regardless of who else
>> falls... it may help on public relations but how can it possibly be
>> *right*? Isn't this a sign that one has fallen into "mere
>> philosophizing", unconnected from the rest of the universe? ... When
>> I am *designing* something, then yes, I will try and make the design
>> modular because that is a good heuristic for humans to use. When I am
>> trying to *discover* something I will not try to make the
>> *explanation* modular unless I think the *reality* is modular ...
>
> A *contribution* to knowledge does not equal an *explanation*. If you
> are studying field B that depends on field A, and you happen to believe
> A2 while most people believe A1, it is not enough for you to show that
> A2 leads to B2; you should also show what A1 leads to. If A1 leads to
> B1, you should say so. You are not saying anything false or
> misleading by doing this. You are allowing progress and the division of
> labor to continue without requiring that everyone agree with you on
> everything.
If we suppose the Singularity to be wrong, then we have no idea what the
next century will bring; hence, we do not have enough information to even try
to be clever about what we believe. Ergo, since we can't measure the possibly
catastrophic cost of self-deception, rationality seems like the safest course.
I have a problem with saying: "But what if people who confidently believe
that things will go on just as before are right?" That's not a simple
counterfactual, because it strikes at the heart of one of the basic issues
of rationality - the lack of confidence that our basic beliefs are right, and
the expectation that the future will differ from the past. This is not just
hypothesizing an alternate future scenario; it is hypothesizing that beliefs
don't need to change as much as they actually do, and that they are more
reliable than they actually are. It is one thing to ask about *different
changes*, but to take *no changes* as the counterfactual is to start out by
denying one of the chief things that makes rationality instrumentally
necessary to arbitrary coherent goals.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence