UPLOAD: advocatus diaboli

John K Clark (johnkc@well.com)
Tue, 7 Jan 1997 23:39:56 -0800 (PST)


On Tue, 07 Jan 1997 Eliezer Yudkowsky <sentience@pobox.com> wrote:

>Ho, ho, ho! Scientists discover that neurons can influence
>each other in yet another completely unexpected fashion.
>Does this make them less complicated?

The article was about information stored in the synapse, not the neuron.
It doesn't make the synapse less complicated, and it CERTAINLY doesn't make
it more complicated; it DOES mean that the effective number of synapses you
need to worry about is MUCH less.

>Ho, ho, ho! My right foot it does. Just because a LTP
>change releases nitrous oxide that causes another LTP change
>doesn't change the amount of information stored by one

But the article proves that there are lots of synapses that hold the same
information, so they can't all count as independent storage.
>To put it another way, instead of each neuron storing
>500 bits of information, a configuration of 500 neurons
>holds 500 pieces of 500-bit data, one bit from each piece of
>data in each neuron.

That will not work: change one value and lots of others will change too. If
I am forced to write the same thing into 500 notebooks at the same time,
they will not hold more information than one notebook used by itself.
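The notebook argument can be sketched in a few lines. This is a toy
illustration (the sizes are made up, not from the article): if one write
is forced to land in every copy, the number of distinguishable joint
states is no larger than for a single copy.

```python
BITS = 4    # bits per notebook (kept small so we can enumerate)
COPIES = 5  # stand-in for the 500 notebooks

# Independent notebooks: every combination of contents is reachable.
independent_states = (2 ** BITS) ** COPIES

# Forced-copy notebooks: one write goes to all copies at once, so the
# joint state is fully determined by a single notebook's contents.
forced_states = len({(w,) * COPIES for w in range(2 ** BITS)})

print(independent_states)  # 1048576 distinguishable states
print(forced_states)       # 16 -- exactly what one notebook holds
```

Five forced copies can be told apart in only 16 ways, the same as one
notebook alone; independence is what would multiply the capacity.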

>To put this all in perspective, imagine this: "Scientists
>have discovered that when a neuron fires, it releases
>chemicals that cause other neurons nearby to fire."

LTP is not about neurons firing but about the long-term strength of
connections. If you change the LTP of one neuron, lots of others will change
too, so they can't be independent places to store information.

>Does this reduce the computational complexity of the brain?

Very definitely. If the article had said that they also found that each
synapse held 1000 times as much information as previously thought then the
information storage capacity of the brain would be unchanged. The article
said not one word about that.
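The capacity point above is simple arithmetic, sketched here with
hypothetical numbers (the synapse counts and bit figures are
placeholders, not claims from the article): if synapses change in
groups, the effective count drops, and total capacity drops with it
unless each synapse also held proportionally more information.

```python
# Hypothetical baseline: a million synapses, one bit each.
synapses = 1_000_000
bits_per_synapse = 1
baseline = synapses * bits_per_synapse

# Suppose groups of 1000 synapses are forced to share one value:
# only the group count is effectively independent storage.
effective = synapses // 1000
reduced = effective * bits_per_synapse

# Capacity would be unchanged only if each synapse ALSO held
# 1000x as much information -- which the article did not say.
unchanged = effective * (bits_per_synapse * 1000)

print(reduced)     # 1000 bits -- far below the baseline
print(unchanged)   # 1000000 bits -- equal to the baseline
```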

John K Clark johnkc@well.com
