On Fri, 1 Oct 1999, Eliezer S. Yudkowsky wrote:
> > "Neurons process information structured in time," he
> > explained. "They communicate with one another in a
> > 'language' whereby the 'meaning' imparted to the receiving
> > neuron is coded into the signal's timing. A pair of pulses
> > separated by a certain time interval excites a certain
> > neuron, while a pair of pulses separated by a shorter or
> > longer interval inhibits it.
I believe there have been one or more articles in Science Mag. over the last couple of years documenting this and/or the use of "amplitude coding".
The interesting thing about this is the possibilities it opens up for artificial brains based on the same method. Since in electronics the rise & fall times are faster and the pulse widths can be *much* smaller than those in the brain, we would get much higher overall system throughput than the raw computational ability of the nodes would suggest.
Of course we are going to have to get the processor voltages down to brain levels (a drop of 1-2 orders of magnitude?), or else your ABrain is going to have to run in a big freezer.
Has anyone noticed that, with the trend in existing processors toward lower voltages and higher power dissipation, the current delivered to the chip has to go up? I think towards the end of the decade they may have to push 100+ Amps into the chip. Ouch!
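The arithmetic behind that: chip power P = V * I, so at fixed or rising power, every drop in supply voltage pushes the current up proportionally. The wattage and voltage values below are illustrative, not datasheet numbers:

```python
# Why supply current climbs as core voltage drops: P = V * I, so I = P / V.
# The 100 W / voltage pairs here are illustrative, not real chip specs.

def chip_current(power_w, voltage_v):
    """Supply current in amps for a given power draw and core voltage."""
    return power_w / voltage_v

for v in (3.3, 1.8, 1.0):
    print(f"100 W at {v} V -> {chip_current(100, v):.1f} A")
```

At 100 W and 1 V, that is already 100 A — which is where the "Ouch!" comes from.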
For those who didn't catch it, there was a good summary article on the limits of semiconductor device technology in Science last week:
URL is: http://www.sciencemag.org/cgi/content/full/285/5436/2079 (I believe subscription or E-signup is required).
The short summary is that hard limits are looming large. Though people may find clever ways around them, the interesting thing is that an expansion of the phase space of designs & architectures (including nanotech) seems very probable.