Re: Secrets of The Mind [ie, brain]

From: Eliezer S. Yudkowsky
Date: Fri Oct 26 2001 - 13:49:58 MDT

> "J. R. Molloy" wrote:
> >
> > Another file concerned John, who has temporal lobe epilepsy, and his seizures
> > leave him with the absolute conviction that he is God. The doctor suggests
> > that the seizures may have disrupted a part of the brain "whose activity is
> > somehow conducive to religious belief," and that such a receptive region may
> > have evolved because "it's conducive to stability of society."

Which goes to show that being a doctor of neurology doesn't make you an
evolutionary psychologist.

ERROR 514: group selection hypothesized

> > with the environment. The best part of all was when he described the neural
> > pathways between vision and emotional salience, in the segment about David,
> > the man with a condition called Capgras Delusion. He knew who and where he
> > was, but he was sure that the people who said they were his parents were
> > imposters. Dr. Ramachandran says David's injury destroyed pathways for
> > emotional information, and since he didn't feel the appropriate emotions
> > toward his parents, he concluded that they were lying look-alikes. (Reminder
> > to uploaders: Don't forget to include the enteric nervous system.)

There is NO WAY in HELL that Capgras's Delusion routes through the enteric
nervous system. Temporal lobe damage would be my guess.

Molloy, every time you use the words "enteric nervous system", your
apparent IQ drops another 10 points. These are not magic words. Use the
magic words "limbic system" instead when talking about emotions. You'll
still sound like you picked up all your neurology from pop science tracts,
but at least you won't sound like a raving lunatic. Lest you think I
exaggerate, what you just wrote above is, neuroanatomically speaking,
something along the lines of: "Scientists have hypothesized a new form of
neutron star called a 'magnetar'. (Reminder to space travelers: Don't
run into one of these things on your way to the moon!)"

Brian Atkins wrote:
> So can we draw the following conclusions?
> 1. Up until recently, the DEMM (deep emotional meaning manufacturing) chunk
> of the mind was a positive factor for society overall. While it led to
> the development of religions which became rather rigidly enforced on the
> society, this also allowed the society to remain stable, which probably
> allowed science and technology and trade etc. to grow. Societies of
> individuals without this hardware must not have been as stable.

Error 514 - you have to explain how this hypothesized DEMM hardware caused
a specific individual to have a larger number of surviving grandchildren.
Furthermore, the hypothesized DEMM hardware has to be an advantage in the
absence of any previous DEMM hardware in other tribemembers, or there's a
chicken-and-egg problem.

Also, why are we assuming that emotional meaning, especially of the type
hypothetically involved in Capgras's Delusion, and the God Module, are the
same piece of hardware? Is there fMRI evidence? I hypothesize that if
you took an atheist patriotic libertarian (but not an exceptionally bright
or self-aware one), and checked the hardware that lit up while meditating
on the Constitution or Liberty, it wouldn't be the same hardware as the
God Module.

Incidentally, I'm not sure that their explanation for Capgras's Delusion
is the right one. I'd be more inclined to guess that there's a hardware
route linking visual identifications to social/persona identifications.
Damage that link and the result is someone who looks just like the person
but is identified as a stranger.

> 2. Unfortunately, this same DEMM hardware is now very dangerous to the
> future.

Half the brain is dangerous in one way or another, and I can think of at
least three more dangerous modules offhand: The rationalize-flawed-idea
module, the them-and-us module, and the ends-justifies-means module.
(Assuming that these are single modules.)

There isn't really any good way to deal with this - for individuals,
maybe, but not for whole countries. You just go on.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:16 MDT