RE: The Future of Secrecy

From: Rafal Smigrodzki (rafal@smigrodzki.org)
Date: Wed Jun 18 2003 - 20:00:53 MDT

    Robin wrote:

    >
    > This seems to suggest a discrete split in our futures. Either agents,
    > or certain internal modules, are standardized enough to allow biases
    > to be checked, giving truly unbiased agents or modules, or biases are
    > hard to see but beliefs are not, so that agents self-deceive.
    >
    > To see that there really is a demand for such self-deception, let's
    > work through an example. Let us say I know how much I really like my
    > girlfriend (x), and then I choose my new beliefs (y), under the
    > expectation that my girlfriend will then see those new beliefs, but
    > not any memory of this revision process. (Of course it wouldn't
    > really work like this; analyzing this is a simple way to see the
    > tradeoffs.)
    >
    > I face a tradeoff. The more confident I become that I like her, the worse
    > my future decisions will be (due to the difference y-x), but the more
    > she will be reassured of my loyalty (due to a high y). The higher my
    > x, the higher a y I'm willing to choose in making this tradeoff. So
    > the higher a y she sees, the higher an x she can infer. So this is
    > all really costly signaling.

    ### At this point, the terms signaling and self-deception are perhaps no
    longer quite appropriate. The signal you send and the referent of value to
    your girlfriend are one and the same. Word and deed are one, which is
    different from the usual context of signaling.
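
    To make the tradeoff in the quoted passage concrete, here is a minimal
    sketch (the quadratic decision cost, the linear reassurance benefit, and
    the parameter names are assumptions for illustration, not anything from
    Robin's post): the agent with true affection x chooses a belief y, pays a
    decision cost proportional to (y - x)^2, and gains a reassurance benefit
    proportional to y.

        # Illustrative sketch of the quoted tradeoff; the cost and benefit
        # forms and all parameter names are assumptions, not from the post.

        def optimal_belief(x, benefit=1.0, cost=2.0):
            # Maximize benefit*y - cost*(y - x)**2. The first-order condition
            # benefit - 2*cost*(y - x) = 0 gives y = x + benefit/(2*cost).
            return x + benefit / (2.0 * cost)

        def inferred_affection(y, benefit=1.0, cost=2.0):
            # An observer who knows the tradeoff inverts the rule to recover x.
            return y - benefit / (2.0 * cost)

        for x in (0.2, 0.5, 0.9):
            y = optimal_belief(x)
            print(x, y, inferred_affection(y))

    The only point of the sketch is that the chosen y rises strictly with x,
    so a higher observed y really does let her infer a higher x, and the price
    of the signal is the (y - x) distortion in the agent's own later decisions
    - which is the sense in which this is costly signaling.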

    As in many other cases, sci-fi has already considered the scenario you
    mention - the film Total Recall (based on a Philip K. Dick story) describes
    a state agent who erases his memories to gain access to a rebel group
    (whose leader is a mind-reader), and loses his individuality. One could
    leave the old goal system hidden within the mind as a viral entity, ready
    to reassert itself over the decoy's personality at some predetermined time,
    but this is pure theorizing now - we probably know too little about mind
    structures to predict whether successful use of such hidden layers would be
    easy or difficult.

    It might be useful to restrict the term self-deception to situations where
    an unfulfilled need stemming from one part of a goal system leads to the
    formulation of a belief and a secondary goal which is in conflict with the
    initial goal. This serves to conceal the initial goal from observers and,
    to a lesser extent, from the self (the rest of the goal system). So a
    desire for inclusion in a coalition for selfish reasons will induce
    acceptance of the coalition's goals and their zealous pursuit, marred by a
    partially conscious conflict between the goals. This is self-deception.
    Sometimes the secondary goals can overwrite the initial goal system
    totally, and this is where self-*deception* no longer applies - the
    absolutely committed soldier is no longer deceiving himself about his
    goals; he has new ones.

    I also feel that terminological clarity would be served by differentiating
    the above from deviations from rationality which never enter the conscious
    mind and are due to deep conflicts in the goal system, or inborn habits of
    thought which are not critically evaluated by the individual - a lack of
    insight rather than a lack of intellectual honesty.

    Rafal


