Re: Rightness and Utility of Patriotism

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jun 23 2003 - 18:38:37 MDT

    Harvey Newstrom wrote:
    > Eliezer S. Yudkowsky wrote,
    >
    >> Harvey Newstrom wrote:
    >>>
    >>> This is an extremely important point, I think. Most of the
    >>> "disagreements" on the list are not really disagreements.
    >>> Different people have different data or assign different values.
    >>> Most of the facts themselves are not in dispute. This may be the
    >>> primary root of most if not all semantic misunderstandings. If
    >>> this were recognized more often, perhaps more people would act in a
    >>> more Bayesian manner?
    >>
    >> The above point is what I wanted to warn against when I talked about
    >> the danger of generalizing from ideal Bayesians to humans, because it
    >> is possible to have simplified Bayesians that are missing a kind of
    >> complexity that general Bayesians can have, i.e., nontrivial
    >> structure in computing the utility function. It is possible to
    >> "disagree" over values, not just "assign different" values, but only
    >> if you are a certain class of mind, such that values can depend on
    >> probabilistic external facts or probabilistic approximations of
    >> computations.
    >
    > OK, I must have misunderstood you and gone off on a tangent.
    >
    > But could you clarify which part you disagree with? That my point
    > derives from yours? My analysis of human disputes? My conclusion that
    > maybe humans could act in a more Bayesian manner? Or none of the above?

    I disagreed that "the facts themselves are not in dispute" and that
    disputes derive from assigning different values. It seems to me that much
    of the disputation derives from disagreement over facts, and that the
    remainder derives from differences in values that are perceived as
    disagreements, on which people would not agree to disagree. Little of it
    appears to be genuinely a matter of "assigning different" values.
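
    A minimal sketch of that distinction, in Python, with every name and
    number hypothetical rather than anyone's actual formalism: two agents
    compute the same utility function, but hold different credences in an
    external fact F on which the utility depends. They appear to disagree
    over values, yet shared evidence about F moves them back toward
    agreement, which a genuinely different terminal utility assignment
    would not.

    # Hypothetical sketch: one shared utility function, two credences in F.

    def utility(action, fact_holds):
        """The utility table both agents share, conditional on fact F."""
        return {("act", True): 10.0, ("act", False): -5.0,
                ("wait", True): 0.0, ("wait", False): 0.0}[(action, fact_holds)]

    def expected_utility(action, p_fact):
        """Expected utility of an action under a credence P(F)."""
        return (p_fact * utility(action, True)
                + (1 - p_fact) * utility(action, False))

    def update(prior, lik_true, lik_false):
        """Posterior P(F | evidence) by Bayes' rule."""
        return prior * lik_true / (prior * lik_true + (1 - prior) * lik_false)

    p_alice, p_bob = 0.9, 0.2                  # same values, different facts
    print(expected_utility("act", p_alice))    # 8.5  -> Alice favors "act"
    print(expected_utility("act", p_bob))      # -2.0 -> Bob favors "wait"

    # Shared evidence (five times likelier if F holds) pulls both credences,
    # and hence both apparent "value judgments", back together:
    p_alice, p_bob = update(p_alice, 0.8, 0.16), update(p_bob, 0.8, 0.16)
    print(round(p_alice, 2), round(p_bob, 2))  # 0.98 0.56 -- converging

    # An agent whose utility table itself differs (say, "act" is always
    # -10.0) is genuinely "assigning different values"; no evidence about
    # F will move it.

    The point of the sketch: an apparent values dispute is resolvable by
    evidence exactly when the utility computation routes through a
    probabilistic external fact, and not when the terminal assignments
    themselves differ.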

    -- 
    Eliezer S. Yudkowsky                          http://singinst.org/
    Research Fellow, Singularity Institute for Artificial Intelligence
    

