# Meaning of life and calculating the human brain's computing power

From: Dickey, Michael F (michael_f_dickey@groton.pfizer.com)
Date: Fri Feb 15 2002 - 14:42:03 MST

I posted the web page that dealt with the 'meaning of life'
<http://sysopmind.com/tmol-faq/tmol-faq.html> with an extropian viewpoint
on the Skeptic forum. I received some posts discussing how to calculate the
computing power of the human brain, and thought others might be interested in
contributing to the discussion; hopefully some will have more knowledge on
the subject than I. The thread is below. You can get to the forum at
http://forums.delphiforums.com/skepticsociety/ ; it's under the 'meaning of
life' topic.

Michael

----------------------

<<the raw power of a human brain - i.e., around a hundred million billion
(10^17) operations per second >>

I'm interested to know how you got to that number.

<< As of 2000, computing power has doubled every two years, like
clockwork, for the past fifty-five years. This is known as
"Moore's Law". >>
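As a quick sanity check on the quoted claim, doubling every two years for fifty-five years works out to 27.5 doublings, a growth factor of roughly 2 x 10^8. A sketch of the arithmetic (illustrative only):

```python
# Rough check on the quoted claim: computing power doubling
# every two years for fifty-five years.
years = 55
doubling_period = 2                    # years per doubling
doublings = years / doubling_period    # 27.5 doublings
growth_factor = 2 ** doublings         # roughly 1.9e8

print(f"{doublings} doublings -> factor of about {growth_factor:.2e}")
```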

Moore's Law suffers from what I call the "extrapolation fallacy". It
consists in drawing a curve through past data and predicting
that future data will follow that curve, without any consideration
of cause-effect relationships. However, there are
examples showing that extrapolation does not always work. For example, if
you look at bacteria multiplying in a culture vat, extrapolation
would lead you to predict that you will soon have an infinite number of
bacteria (a Singularity!). This is because bacteria
multiply exponentially (at first!). However, after a while the growth
curve breaks away from the exponential and goes into a
plateau. This happens because previously unseen cause-effect
relationships (depletion of nutrients, accumulation of toxins,
etc.) kick in. I think that we will soon see growth in computer
technology start to deviate from Moore's Law.
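The bacteria example can be made concrete with a toy simulation. The numbers below (carrying capacity, growth rate, starting population) are entirely made up for illustration: naive exponential extrapolation of the early phase overshoots the logistic curve by many orders of magnitude once the plateau kicks in.

```python
import math

# Toy comparison: logistic growth vs naive exponential extrapolation.
# All numbers are illustrative, not real culture data.
K = 1e9      # carrying capacity (max bacteria the vat supports)
r = 1.0      # growth rate per hour
N0 = 1e3     # starting population

def logistic(t):
    """Closed-form logistic growth: plateaus at K."""
    return K / (1 + (K / N0 - 1) * math.exp(-r * t))

def exponential(t):
    """Naive extrapolation of the early exponential phase."""
    return N0 * math.exp(r * t)

for t in (0, 5, 10, 20, 40):
    print(f"t={t:2d}h  exponential={exponential(t):10.3e}  logistic={logistic(t):10.3e}")
```

By t = 40 hours the extrapolated curve predicts ~10^20 bacteria while the logistic curve has settled at the carrying capacity of 10^9.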

Beware of the "extrapolation fallacy"! We see more and more examples of
it - Bjorn Lomborg uses it a lot in his book. Any
prediction based on extrapolation instead of rigorous cause-effect
relationships is highly suspect!

Juan Carlos

-------------------------

<<the raw power of a human brain - i.e., around a hundred million billion
(10^17) operations per second >>

"I'm interested to know how you got to that number. "

Well, I did not do the calculation myself, but if you check the source
page, I am sure the author has a link to it. Actually, I
checked again: the author cites his sources right after each
statement. As far as the brain's computing power goes, the
author says "2: I use the generous estimate of a hundred billion
neurons, with around a thousand synapses apiece,
sending around two hundred signals per second, plus a factor of five for
good luck. For a much more detailed (albeit
outdated) analysis, see "When will computer hardware match the human
brain?
<http://www.transhumanist.com/volume1/moravec.htm>"
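Multiplying the quoted assumptions together does reproduce the 10^17 figure; a quick check:

```python
# Reproducing the cited estimate from the quoted assumptions.
neurons = 1e11             # "a hundred billion neurons"
synapses_per_neuron = 1e3  # "around a thousand synapses apiece"
signals_per_sec = 200      # "around two hundred signals per second"
fudge = 5                  # "plus a factor of five for good luck"

ops_per_sec = neurons * synapses_per_neuron * signals_per_sec * fudge
print(f"{ops_per_sec:.0e} operations per second")  # 1e+17
```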

<< As of 2000, computing power has doubled every two years, like
clockwork, for the past fifty-five years. This is known as
"Moore's Law". >>

"Moore's Law suffers from what I call the "extrapolation fallacy". It
consists in drawing a curve through past data and predicting
that future data will follow that curve, without any consideration of
cause-effect relationships."

This is true, and I agree. I thought it was an interesting site.
Nonetheless, there are different types of 'extrapolation fallacies',
like Ehrlich's 'hundreds of millions will starve to death in
the 80's because food production can not keep up with demand'. What
happened to those widespread famines? Ehrlich
drew trends out of food production, food demand, and growth rates; he
did not take into account advances in agricultural
technologies which increased food production (even though these had been
obvious from the historical record). This is a limit-predicting
extrapolation fallacy, and it assumes a technological stasis. I
find these criticisms highly dubious and on par with
the 'everything that can be invented has been' mentality.

Conversely, Moore's law is not describing the limits of growth; it is
making a reasonable projection based on historical
trends. Ray Kurzweil (http://www.kurzweiltech.com/aboutray.html) makes a
good case in his book 'The Age Of Spiritual
Machines' that computing technology has followed Moore's law since
Babbage's mechanical computer. Critics of Moore's law
often cite technological limits, things like the size of the wires,
quantum mechanical interference, or the like. But I categorize
these with the 'everything that can be invented has' mentality, and they
do not take into account technological innovations.
Kurzweil tracks Moore's law from the mechanical computer, to the
mechanical punch cards of the early part of this century,
to the vacuum tube computers, to the transistor. Each time the practical
limit of the previous technology was reached, the
successor took over. When the limits of the transistor are hit, any
one of numerous technologies is poised to take over:
optical computers, quantum computers, molecular computers, etc.

You place a limit on Moore's law due to some type of cause-effect
relationship, yet it's very obvious that one type of
computing technology, the organic brain, is still vastly superior to
conventional transistor-based technologies. Here we have
a tangible example of a far more sophisticated computing machine than we
can make, so it is reasonable that some trend will
continue until *at least* a computer matches the power of the human
brain. To think otherwise would seem, at least to me,
to be attributing some special supernatural quality to organic
computational systems that inorganic or 'man made' ones will
never surpass.

It seems odd that you cite an 'extrapolation fallacy' when you are one
of the biggest arguers for global warming (which, it
seems, is SOLELY based on extrapolative predictions).

"I think that we will soon see that growth in computer technology starts
to deviate from Moore's Law. "

I think we may see a limit at some point, but not (I think) until a
computer can at least match a human mind. But only time will
tell.

Regards,

Matus

---------------------------

<< the Author says "2: I use the generous estimate of a hundred billion
neurons, with around a thousand synapses apiece,
sending around two hundred signals per second, plus a factor of five for
good luck. >>

So we have (^ means exponent):

10^11 neurons x 10^3 synapses/neuron = 10^14 synapses in the human brain

This is the standard estimate of synapses in the human brain, and I
agree with it. Let's look at the following step:

10^14 synapses x 200 signals/synapse/sec x 5 (good luck) = 10^17
signals/sec

The factor of 5 for "good luck" is just trash.
Synapses don't process signals at 200 Hz (per sec). The typical synapse
signal is the EPSP or the IPSP, which has a
duration of 5-20 millisec. So it's more like 100 Hz:

10^14 synapses x 100 signals/synapse/sec = 10^16 signals/sec, the
absolute maximum possible in the human brain

Or, if you want, NEURONS do send signals (action potentials) at up to 500
Hz, so

10^11 neurons x 500 signals/neuron/sec = 5 x 10^13 signals/sec, a smaller
number
number

Realistically, taking into account that not all synapses work all the
time (that would be an epileptic seizure), that the brain is
not a perfect parallel computer, and that there is a lot of redundancy
in it, I would put that number at 10^13 Hz
(signals/sec). And I think that even that is optimistic.

So the human brain is not so far above computers when it comes to
processing information. The PC I use to write this works at
10^9 Hz, so my brain has the computing power of ten thousand PCs -
disappointing!

I don't think that you can invoke "biological computer" technology to
push Moore's Law forward. Brains are not a whole lot
better than computers when judged purely in terms of information
processing.