Paul Hughes wrote:
> Impressive! Using their somewhat simplified simulation of the most
> complex neuron in the nervous system (the Purkinje), it still took their
> i860 processor almost an hour to run a simulation of a single firing.
> How this translates into computational requirements for the typical
> neuron remains to be seen, but I think it proves that simulating a single
> neuron is not a trivial task.
> I argued, somewhat incorrectly, with Hans Moravec during his peer review
> of the paper he published in the 'Transhumanist Journal', that his
> estimate of 10^15 ops/sec for the human brain may need to be adjusted
> upward to account for intraneuronal complexity. However, this study
> would appear to support my claim.
Maybe, maybe not. Until we have a good understanding of how low-level
electrochemical activity produces computation, we really can't say how much
of this detail needs to be simulated. Reproducing a neuron's computation is
certainly going to be much less complex than simulating the neuron in detail
(because you don't need to fully simulate metabolism, self-repair, etc.),
but working from first principles like this, it is hard to say whether we
are talking about a difference of one order of magnitude or several.
However, I think you are wrong to dismiss indirect measurements based on
comparisons between the human retina and computer vision systems. My
understanding is that we know exactly what image processing occurs in the
retina (as opposed to elsewhere), and that makes it easy to compare it to
artificial systems. If it takes X FLOPS to duplicate the retina's image
processing, it seems reasonable to expect that duplicating other
retina-sized slices of neurons would also take X FLOPS (give or take an
order of magnitude).
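That retina-scaling argument can be made concrete with a back-of-the-envelope
calculation in the style of Moravec's estimate. All of the figures below
(retinal throughput, tissue masses) are illustrative assumptions chosen to
show the method, not measured values:

```python
# Back-of-the-envelope brain capacity estimate by retina scaling.
# Every constant here is an illustrative assumption, not a measurement.

RETINA_OPS = 1e9      # assumed ops/sec to duplicate retinal image processing
RETINA_MASS_G = 0.02  # assumed mass of retinal processing tissue, in grams
BRAIN_MASS_G = 1500   # assumed mass of the whole brain, in grams

# Treat the brain as a stack of retina-sized slices of neural tissue,
# each requiring roughly the same compute as the retina itself.
scale = BRAIN_MASS_G / RETINA_MASS_G
brain_ops = RETINA_OPS * scale

print(f"retina-sized slices in a brain: {scale:.0f}")
print(f"whole-brain estimate: {brain_ops:.1e} ops/sec")  # 7.5e+13 with these inputs
```

With these particular assumptions the estimate lands around 10^14 ops/sec,
within an order of magnitude of Moravec's 10^15 figure, which is exactly the
"give or take an order of magnitude" slack the argument claims.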
This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:09:38 MDT