From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jun 18 2003 - 21:39:24 MDT
Dan Fabulich wrote:
>
> I think you're getting yourself confused with the language of
> "unconstrained by territory". I also note that you simply snipped my
> Bayesian argument, which I took to be the meat of my point.
>
> Given a theory T and our background theory B, we regard our theory T as
> *more* likely if P(T|~B) is higher, *regardless* of P(B), all else being
> equal. Do you contest this for some reason?
It's a Bayesian answer to a most un-Bayesian question. We do not want the
probability of our theory P(T) to be as high as possible. Seriously. We
don't. There is no "our" theory in rational thinking. We want P(T) to be
whatever P(T) really should be. We do not want to argue *for* T; we want
to equably assemble all the evidence for and against T. This being the
case, I as a human find it highly suspicious when people try to have their
cake and eat it too.
It moreover follows from Bayes' Theorem that if B is evidence for T, then
~B *must* be evidence against T - you *cannot* have your cake and eat it too.
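To spell out the step: by the law of total probability,

    P(T) = P(T|B)P(B) + P(T|~B)P(~B)

so the prior P(T) is a weighted average of P(T|B) and P(T|~B). If one of
them sits above the average, the other must sit below it; both cannot
exceed P(T). That is the equation form of "~B must be evidence against T".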
> Let's suppose I agree with you that any theory T for which P(T|B) and
> P(T|~B) are both high is less "constrained by the territory" than an
> alternate theory T', under which P(T'|B) = P(T|B), but P(T'|~B) is very
> low. T' is constrained by the territory. T is less constrained by the
> territory. Are you trying to tell me, against Bayes, that we should hold
> T' to be more likely than T, because T is "mere philosophy" whereas T' is
> "constrained by the facts"?
Nope! Reality has no preference one way or the other; therefore neither
should our theories. If T and T' start out roughly equal, then in the
situation you name we should slightly prefer T to T'. *However*, this is
because the prior probability of T' starts out lower; if we then observe
B, that observation is evidence about T' which raises it to around the
level already occupied by T - but we haven't observed B yet. You cannot
say that both B and ~B are arguments for T; that is not possible. The
most you can say is that T is about evenly compatible with B and ~B, so
that B does not interact much with T and is not evidence about T one way
or the other. Finding that P(T|B) and P(T|~B) are both "high" does not
mean that both B and ~B are strong arguments for T; that is *impossible*.
It can only mean that the prior probability of T is high and that B is not
much additional evidence one way or the other.
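To see this with concrete numbers - numbers invented purely for
illustration - here is a quick Python sketch:

    # Suppose both conditional probabilities are "high":
    p_T_given_B    = 0.90   # posterior on T if we observe B
    p_T_given_notB = 0.85   # posterior on T if we observe ~B
    p_B            = 0.40   # probability of observing B

    # Law of total probability: the prior is a weighted average
    # of the two posteriors, so it is trapped between them.
    p_T = p_T_given_B * p_B + p_T_given_notB * (1 - p_B)

    print(p_T)                # ~0.87 - the prior was already high
    print(p_T_given_B - p_T)  # ~0.03 - observing B barely moves it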
> But that's not at all the case in my T & T' example, by construction, and
> it doesn't apply to our "Why believe the truth" argument either.
> Nobody's saying that we shouldn't consider the probability of a
> Singularity [P(T|B)] in our calculations. But we ARE saying that a theory
> that applies well to Singularitarians and non-Singularitarians alike is a
> better theory, more likely to be true, than a theory that applies equally
> well to Singularitarians but not at all to non-Singularitarians, ceteris
> paribus.
The key phrase is that "ceteris paribus". It never is. Ceteris paribus,
what you are saying is true, but ceteris is only paribus if you just
happened to stumble across two theories T and T' and you're wondering
vaguely which of them is more accurate. This business of deliberately
assembling evidence *for* a theory is never Bayesian. You just sum up all
the support and anti-support you run across. If it happens to support the
theory, great, but you can never make a theory stronger by *trying* to
show it is compatible with ~B if that is not a natural way for the theory
to work. It may be natural, in which case fine, but you can never try to
force it. Proving the theory is never a goal, so there is no reason to
cheer if P(T|~B) turns out to be high.
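The theorem underneath this - conservation of expected evidence - is easy
to check numerically. In the sketch below (numbers again invented for
illustration), the *expected* posterior on T, averaged over how the
evidence might come out, equals the prior exactly, so arranging in advance
to "find" support cannot be expected to raise P(T). Note also that
P(B|T) + P(~B|T) = 1, so gerrymandering T to fit ~B better necessarily
makes it fit B worse:

    # Prior and likelihoods (invented for illustration):
    p_T            = 0.30   # prior probability of the theory
    p_B_given_T    = 0.80   # how strongly T predicts B
    p_B_given_notT = 0.20

    p_B = p_B_given_T * p_T + p_B_given_notT * (1 - p_T)

    # Posteriors after each possible observation (Bayes' Theorem):
    p_T_given_B    = p_B_given_T * p_T / p_B
    p_T_given_notB = (1 - p_B_given_T) * p_T / (1 - p_B)

    # Expected posterior, weighted by how likely each outcome is:
    expected = p_T_given_B * p_B + p_T_given_notB * (1 - p_B)
    print(expected)          # ~0.30 - exactly the prior again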
Furthermore, dropping back into anthropomorphism, and remembering the
consilient division of labor, whoever is investigating B will probably be
most displeased if we try to make T fit every possible value of B. And
the fact that it is even possible to try to "make" T fit means that we are
inventing hot air.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence