RE: [wta-talk] Re: Bayes vs. LP

From: Harvey Newstrom (mail@HarveyNewstrom.com)
Date: Wed Jun 18 2003 - 05:50:35 MDT


    Amara Graps [mailto:amara@amara.com] wrote,
    > (ccing extropians too, only because this topic has appeared there before)
    >
    > "bzr" <bzr@csd.net>, Sun, 15 Jun 2003:
    > >We have a penny. We toss it. What are the odds that we'll get heads?
    >
    > >The answer: 0
    >
    > >Zero? Yes. This is very counterintuitive, admittedly, particularly for
    > >statisticians. However, the truth is there are no "odds" here at all.
    > >Penny tossing is deterministic.
    >
    > I think that using determinism in this way is putting up a smoke
    > screen in addition to missing the large picture of how scientists
    > intuitively do science. You have a real experiment, so it is
    > physical, and all propositions are testable. How do you define
    > determinism for this system? Your determinism is based on a model of
    > some physics, is it not? No matter how 'deterministic' something may
    > be, your prediction for the outcome of the coin toss is based on
    > data and a model and what other information you have about that
    > system. A Bayes discussion is always in the realm of epistemology,
    > i.e. how we know what we know.

    I agree. In fact, what "bzr" says is NOT counterintuitive to statisticians.
    Anybody who knows anything about statistics knows that it only applies to a
    large-enough, randomly distributed population. Out of a hundred coin flips,
    roughly 50% will be heads and roughly 50% will be tails, but you can't argue
    statistics for a single event. This is a common error (also found on
    extropians): taking statistics that apply to a group and trying to apply them
    to a single individual who is a member of that group. That isn't valid.
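    [Editor's note: a minimal simulation, not from the original message, sketching the point above. The function name `heads_fraction` and the trial counts are illustrative choices. A single flip shows a frequency of 0% or 100%; only over many flips does the frequency settle near 50%.]

    ```python
    import random

    random.seed(42)  # fixed seed so the sketch is reproducible

    def heads_fraction(n):
        """Flip a fair coin n times; return the observed fraction of heads."""
        return sum(random.random() < 0.5 for _ in range(n)) / n

    # One flip: the "frequency" is always 0.0 or 1.0 -- no 50/50 visible here.
    single = heads_fraction(1)

    # Many flips: the frequency converges toward 0.5 (law of large numbers).
    many = heads_fraction(100_000)
    ```

    The group-level figure (about 0.5) emerges only from the aggregate; it is not a property any individual flip exhibits.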

    > Bayes Theorem is only a multiplication rule of probability theory,
    > which shows a relationship between a posterior probability, a
    > likelihood of data to model, and prior probability. The prior
    > probability and posterior probability are not necessarily related in
    > time. These concepts show just a different relationship to the data
    > to be analyzed. The Bayesian methodologies approach the scientific
    > inference from "first principles", grasping an n-parametric event
    > directly with an n-dimensional posterior probability distribution.

    This is a good example to remember. People often think that new ideas
    completely invalidate everything that came before. More likely, new ideas
    evolve our understanding into better models. Thus Newtonian physics still
    works at the macro level for many situations; we just have more information
    about more complex systems. E=mc^2 gives the energy of a stationary object,
    but the fuller relativistic formulas are required for more complex
    calculations. Statistics is still valid for some things, but the real world
    gets more complicated. Bayes' Theorem is a more complete framework, not a
    rejection of previous mathematics.
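    [Editor's note: a small worked sketch, not from the original thread, of the multiplication rule Amara describes -- posterior proportional to likelihood times prior. The grid of bias values, the `posterior` helper, and the 7-heads-in-10-flips data are all hypothetical choices for illustration.]

    ```python
    # Start with equal prior belief in three possible coin biases.
    priors = {0.3: 1 / 3, 0.5: 1 / 3, 0.7: 1 / 3}

    def posterior(priors, heads, flips):
        """Bayes' rule on a discrete grid: multiply prior by the binomial
        likelihood of the data, then normalize so the beliefs sum to 1."""
        likelihood = {b: b**heads * (1 - b) ** (flips - heads) for b in priors}
        unnorm = {b: likelihood[b] * priors[b] for b in priors}
        evidence = sum(unnorm.values())  # normalizing constant
        return {b: p / evidence for b, p in unnorm.items()}

    # After observing 7 heads in 10 flips, belief shifts toward the 0.7 bias.
    post = posterior(priors, heads=7, flips=10)
    ```

    Note that nothing here is "in time": the prior and posterior are just two states of knowledge related through the data, which is Amara's point about epistemology.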

    --
    Harvey Newstrom, CISM, CISSP, IAM, IBMCP, GSEC
    Certified InfoSec Manager, Certified IS Security Pro, NSA-certified
    InfoSec Assessor, IBM-certified Security Consultant, SANS-cert GSEC
    <HarveyNewstrom.com> <Newstaff.com>
    


    This archive was generated by hypermail 2.1.5 : Wed Jun 18 2003 - 06:01:01 MDT