From: Rafal Smigrodzki (rafal@smigrodzki.org)
Date: Tue Jun 17 2003 - 11:24:01 MDT
Robin wrote:
>
> Now I actually do think that the world our descendants will live in
> within the next century may well be different enough that acting with
> the usual level of self-deception may lead to very bad outcomes. So
> I personally can accept this as a strong argument for seeking truth.
> (And I even once had this argument in my self-deception paper.) But
> for most audiences, this is a very weak argument. And my original
> claim was just that I could only find disappointingly weak arguments
> for believing truth.
### The strength of an argument seems to be a function of both the argument
and the mind it is directed at. So whether an argument is strong will depend
on whether it is applied to rational minds or to irrational ones (e.g. minds
evolutionarily programmed to self-deceive and to fall for certain salient
ideas, like religion). In the latter case, arguments for the truth might
indeed fall on deaf ears. But then, if you choose your target appropriately,
any argument will be as weak or as strong as you wish. The only limitation
is that certain minds tend not to survive in our environment for very long,
so certain arguments are likely to remain weak because of the rarity of
appropriate targets.
It's also useful to differentiate between consistent and temporary
self-deception. In the former case, all traces of truth are erased. In the
latter, truth is temporarily suppressed, which makes convincing lying
possible. I would venture that the vast majority of social self-deception is
temporary - it makes sense to believe the girlfriend-lie while trying to
obtain a sexual favor, but not when choosing a long-term mate. If you really
make yourself believe that a sub-par girlfriend is a great choice, you will
damage your reproductive potential. Most people believe their self-deceptions
with only a part of their mind. The truth is almost always there, hidden
under the surface.
For me, the arguments for truth-seeking presented by Eliezer, Dan, Hal, and
Emlyn are compelling. I have an instrumental approach to large parts of my
own goal system, and this reduces the level of reliance on evolutionary
programming. In the Singularitarian future I will have to know the truth
about myself and the world, to avoid damage to my goal system as a result of
self-manipulation (this is in a way quite analogous to the problem facing
the FAI during transformation - how to grow rather than replace itself with
something else). I do not know if in the long run there could be situations
where self-deceiving minds would have a stable and significant survival
advantage over truth-seekers. The paper Robin quoted
(http://econpapers.hhs.se/paper/cwlcwldpp/1319.htm) seems to imply
otherwise. For now I will wait and see. Only when I see, as an empirical
fact, that a mind can lie to itself, live forever, and beat the Bayesian
market competition would I consider switching to the dark side.
Rafal