Re: Do you recognize this equation?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Feb 01 2003 - 15:20:04 MST


Eliezer S. Yudkowsky wrote:
> I recently considered the question of proving that Bayesian observations
> never actually *decrease* your knowledge. To do this, I first had to
> define "knowledge". Let's say your prior knowledge is that the
> probability of case A is p, and hence that the probability of case ~A is
> (1 - p). In this case, in a fraction p of cases you would score p
> points, and in the remaining fraction (1 - p) you would score (1 - p)
> points. For example, say that in a
> sample group of 100 patients, 20% have HIV - this is your prior
> knowledge and it's all the prior knowledge you have. So for any given
> patient, your sole knowledge about them is that they have a 20% chance
> of being infected with HIV. For 20 patients, or 20% of the group, the
> truth is that they have HIV; you score the 20% probability you assigned
> to this outcome. For 80% of the group, the truth is that they don't
> have HIV; you score the 80% probability you assigned to that outcome.
> And your total knowledge is .2*.2 + .8*.8 = .68.

Xiaoguang Li pointed out that "knowledge", as defined above, actually
*increases* in that case given the *mistaken* belief that only 10% of
the group has HIV: you score .1 on the 20% of patients who are infected
and .9 on the 80% who aren't, for a total of .2*.1 + .8*.9 = .74, which
beats the .68 earned by the true belief.
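
For concreteness, here's a quick Python sketch of the score as defined
above (the function and variable names are mine); it reproduces both
figures:

def knowledge(p_true, q):
    # Score q on the p_true fraction of positive cases and (1 - q) on
    # the (1 - p_true) fraction of negative cases.
    return p_true * q + (1 - p_true) * (1 - q)

print(knowledge(0.2, 0.2))  # calibrated 20% belief: .2*.2 + .8*.8 = 0.68
print(knowledge(0.2, 0.1))  # mistaken 10% belief:   .2*.1 + .8*.9 = 0.74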

After doing a bunch of math, I'm pretty sure that the above definition
of "knowledge" turns out to be a crippleware version of the "squared
error". Nonetheless, it still looks like the final formula given
earlier (the formula for how much Bayes' Theorem reduces the squared
error) remains the same:

2[p(1 - p)(m - n)]^2 / {[pm + (1 - p)n][p(1 - m) + (1 - p)(1 - n)]}

If anyone knows the official name for this formula, I'd be much obliged.
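
As a sanity check, here's a quick Python sketch. I'm reading p as the
prior probability P(A), and m and n as the likelihoods P(E|A) and
P(E|~A) of some observation E; that identification is my reading of the
earlier message in this thread, so treat it as an assumption. On that
reading, the formula matches the expected gain in "knowledge" from a
single Bayesian update:

import math

def formula(p, m, n):
    # 2[p(1-p)(m-n)]^2 / {[pm + (1-p)n][p(1-m) + (1-p)(1-n)]}
    num = 2 * (p * (1 - p) * (m - n)) ** 2
    den = (p * m + (1 - p) * n) * (p * (1 - m) + (1 - p) * (1 - n))
    return num / den

def knowledge(q):
    # "Knowledge" score when the assigned probability equals the true
    # frequency q.
    return q * q + (1 - q) ** 2

def expected_knowledge_gain(p, m, n):
    # Expected knowledge after updating on E or ~E, minus prior knowledge.
    pe = p * m + (1 - p) * n        # P(E)
    p1 = p * m / pe                 # P(A|E)
    p0 = p * (1 - m) / (1 - pe)     # P(A|~E)
    return pe * knowledge(p1) + (1 - pe) * knowledge(p0) - knowledge(p)

p, m, n = 0.2, 0.9, 0.1
print(formula(p, m, n))                  # 0.17031...
print(expected_knowledge_gain(p, m, n))  # same value
assert math.isclose(formula(p, m, n), expected_knowledge_gain(p, m, n))

That's consistent with the crippleware remark: at a calibrated estimate
q, knowledge(q) = 1 - 2q(1 - q), which is one minus the expected
two-class (Brier-style) squared error, so any gain in knowledge is
exactly the corresponding drop in that squared error.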

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

