Altruistic punishment

From: Rafal Smigrodzki (rafal@smigrodzki.org)
Date: Sat Mar 22 2003 - 20:16:11 MST

    On Sat, 2003-03-22 at 15:12, Eliezer S. Yudkowsky wrote:

    >
    > I oppose the punishment of non-punishers; it may be an ESS but I don't
    > think it's a good thing.
    >

    ### I wonder if this is a rational position, or an emotionally based
    one. The following two links point to some evidence that altruistic
    punishment is part of an evolutionarily stable strategy, built into
    the minds of most humans:

    http://bostonreview.mit.edu/BR23.6/bowles.html

    http://www.nature.com/cgi-taf/DynaPage.taf?file=/nature/journal/v415/n6868/full/415137a_r.html

    (the second link might require a subscription to Nature; I can send a
    copy to those who request it)

    In the very long run, maintenance of stable societies made of
    potentially immortal persons is probably possible using strictly
    rational reasoning. If your attention span is capable of encompassing
    very long time spans, you will be able to act according to your
    interests, and this will include cooperation as well as discouragement
    of non-cooperation. Since a failure to discourage non-cooperation can
    in some cases bring about significant losses, it has to be discouraged
    as well, as a second-order consideration. Shirking the duty to punish a
    cheater puts a burden on the other participants in the game, and is
    second-order cheating, calling for second-order punishment. This is a
    concept easy to grasp for a mind looking far forward, without the need
    for the emotional crutches that we rely on.
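
    To make the second-order logic concrete, here is a minimal sketch (in
    Python, with payoff numbers invented purely for illustration, not taken
    from the papers above) of a single public-goods round in which punishers
    fine cheaters at a cost to themselves, and second-order punishers fine
    contributors who shirk the duty to punish:

        # Toy public-goods round with first- and second-order punishment.
        # All numbers (pool multiplier, contribution, punishment cost, fine)
        # are illustrative assumptions.

        def round_payoffs(players, multiplier=1.6, contribution=10.0,
                          punish_cost=1.0, fine=3.0):
            """players: list of dicts with 'contributes', 'punishes_cheaters'
            and 'punishes_nonpunishers' flags. Returns a list of payoffs."""
            n = len(players)
            pool = sum(contribution for p in players if p['contributes'])
            share = pool * multiplier / n
            payoffs = [share - (contribution if p['contributes'] else 0.0)
                       for p in players]

            # First-order punishment: punishers pay a cost to fine each cheater.
            cheaters = [i for i, p in enumerate(players) if not p['contributes']]
            punishers = [i for i, p in enumerate(players) if p['punishes_cheaters']]
            for c in cheaters:
                for q in punishers:
                    if q != c:
                        payoffs[c] -= fine
                        payoffs[q] -= punish_cost

            # Second-order punishment: contributing but shirking the duty to
            # punish is itself treated as cheating, and is fined in turn.
            shirkers = [i for i, p in enumerate(players)
                        if p['contributes'] and not p['punishes_cheaters']]
            meta = [i for i, p in enumerate(players)
                    if p['punishes_nonpunishers']]
            for s in shirkers:
                for q in meta:
                    if q != s:
                        payoffs[s] -= fine
                        payoffs[q] -= punish_cost
            return payoffs

    The point of the second loop is simply that shirking the duty to punish
    now carries its own fine, which is what keeps first-order punishment from
    being eroded by free riders at the second order.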

    On the other hand, for the ephemeral creatures that we are, such
    sustained rational action was not possible in the EEA. Instead, common
    hard-wired responses developed, probably through group selection: an
    innate tendency to seek revenge, and to punish and exclude those who
    fail to seek revenge (derided as "walk-overs", doves, peaceniks,
    etc.).

    I agree that the urge to seek revenge, and the scorn for those who
    don't share this thirst, are dark feelings, but I wouldn't
    unequivocally condemn them. On the one hand, hopefully humans will be
    able to reconstruct themselves to survive without them, reducing the
    risk of useless vendettas, chauvinism, and other forms of tribalism
    caused by obsolete calibration of the hard-wired systems. On the other
    hand, we will need to use some parts of these hard-wired strategies in
    a reasoned way, as we will still need to act against opposing forces,
    dispassionately, efficiently, relentlessly, to improve the long-term
    cooperation and survival outcomes of cooperators. A calm justice.

    It's interesting to think about the programming parameters that will
    need to be entered into our decision systems. To achieve maximum
    survival, you might need to adjust the tendency to reciprocate (the
    trigger-happiness) according to the reliability of the information
    available (the more reliable the information over many iterations, the
    less need to shoot at the first defection), the general level of risk
    (low risk, low response), and certainly many other elements. Very,
    very interesting.
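
    A rough sketch of one such parameterization (again in Python; the
    parameter names and the exact formula are my own illustrative
    assumptions, not anything from the papers linked above):

        # Toy "trigger-happiness" rule for a noisy, iterated interaction.

        def retaliation_policy(observed_defections, info_reliability,
                               expected_iterations, risk_level):
            """Return (retaliate, severity).

            info_reliability    : 0..1, confidence that an observed defection is real
            expected_iterations : how many more rounds of the game we expect
            risk_level          : 0..1, how costly an unanswered defection would be
            """
            # With reliable information over many iterations we can afford to
            # wait for a pattern instead of shooting at the first defection.
            patience = 1 + int(info_reliability * min(expected_iterations, 10))
            retaliate = observed_defections >= patience
            # Low risk, low response: severity scales with the stakes.
            severity = risk_level if retaliate else 0.0
            return retaliate, severity

        # Reliable information and a long horizon: wait for a pattern.
        print(retaliation_policy(1, info_reliability=0.9,
                                 expected_iterations=100, risk_level=0.2))
        # Unreliable information, few rounds left, high stakes: respond now.
        print(retaliation_policy(1, info_reliability=0.1,
                                 expected_iterations=3, risk_level=0.9))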

    Rafal


