How rational is nonconformity?

Robin Hanson (hanson@hss.caltech.edu)
Mon, 31 Mar 1997 12:13:32 -0800 (PST)


I wrote:
>the weight you should give to the opinions of others ... can be
>objectively calculated in principle, and is typically large

Carl Feynman suggested:
>any significant evolutionary component, I would suspect that it would
>push people in the direction of lower swayability than is rational
>under current circumstances.
>Under the hunter-gatherer conditions in which our ancestors lived for the
>past seven million years, a typical person only knew a hundred or so other
>people. Half of those were kids, and half the adults were stupider than
>average. The chance of coming up with a good idea that nobody else had
>thought of was much higher than it is today.

Curt Adams responded:
>But all the adults were citing the accumulated knowledge of centuries or even
>millenia. The Australian aborigines' Dreamtime techniques have apparently
>preserved knowledge of geographical features from the last Ice Age. ...
>The punishment for a wrong idea was much more likely to be death. In our
>benign society you can make a lot of boo-boos and come out OK. Without

Robert Schrader also responded:
>2) Usable-idea space is so much larger now than then, the proportions
>may actually be the reverse of what you suggest.

The Low Golden Willow replied:
>On the other hand, the punishment for not being inventive in a hostile
>environment can _also_ be death.

There are many interesting related issues here, but most of these
considerations are irrelevant to the puzzle I was talking about. To
get the prediction of agreement, you need the two sides to know that
the others disagree, and in which direction. Punishment and
idea-space sizes are irrelevant, as are scenarios where you both know
someone else knows something you don't, such as good food plants.
Also irrelevant is disagreeing with people in the past, who certainly
can't know that you disagree with them.
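To make the agreement prediction concrete, here is a minimal sketch (my own illustration, not anything from the thread): two Bayesians share a common prior over a binary hypothesis and each sees one private signal. Because the signal model is common knowledge, learning the other's posterior reveals the other's signal, so each can condition on the pooled evidence, and the disagreement disappears. All numbers and names below are made up for the example.

```python
# Minimal sketch (illustrative assumptions, not from the thread):
# two Bayesians with a common prior over a binary hypothesis H each
# observe one private binary signal. Announcing posteriors reveals
# the signals, after which both condition on all the evidence and
# must agree exactly.

P_H = 0.5                    # common prior P(H)
P_SIG_GIVEN_H = 0.8          # P(signal=1 | H); model is common knowledge
P_SIG_GIVEN_NOT_H = 0.3      # P(signal=1 | not H)

def posterior(signals):
    """Posterior P(H | signals) for independent binary signals."""
    like_h, like_not = 1.0, 1.0
    for s in signals:
        like_h *= P_SIG_GIVEN_H if s else 1 - P_SIG_GIVEN_H
        like_not *= P_SIG_GIVEN_NOT_H if s else 1 - P_SIG_GIVEN_NOT_H
    num = like_h * P_H
    return num / (num + like_not * (1 - P_H))

a_signal, b_signal = 1, 0           # private observations
a_view = posterior([a_signal])      # initially the two disagree
b_view = posterior([b_signal])

# Each learns the other's posterior, and hence (here) the other's
# signal; both then hold the same pooled posterior.
pooled = posterior([a_signal, b_signal])
```

The key point the sketch illustrates: it is the mutual knowledge of *which way* the other leans that carries the information, which is exactly what disagreement with the dead, or with unknown strangers, cannot provide.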

The size of the population who disagrees is relevant, but our
ancestors did have to deal with belief communities of varying sizes,
so they presumably learned to condition their strategies on this
context, though the maximum size they ever dealt with was much smaller
than what we face.

I'd be more tempted to look at evolutionary strategies for pretending
to disagree when you really didn't, in order to promote information
aggregation, to take credit for originating discoveries, and to
demonstrate strength of character.

Robin D. Hanson hanson@hss.caltech.edu http://hss.caltech.edu/~hanson/