>>I calculated that a human-equivalent-intelligence (HEI) would fit
>>into several thousand cubic microns,
On Sun, 13 Jun 1999 19:16:20 +1000 I wrote: any chance of passing on the algorithm you used to calculate this? ;)
and ray responded:
> Let me show you the numbers: The human brain has 10 billion neurons.
Maybe an underestimate by an order of magnitude, but that won't matter.
> Each neuron has between 30 and 10,000 synapses, with one
> associated dendrite for each.
Well, let's just say a neuron has 30 to 10k synapses; that is what has been counted.
>Taking the geometric mean of the number of dendrites, say
> that each cell has 300 dendrites.
You need the arithmetic mean to work out dendrites = neurons times dendrites/neuron. In the cortex there are many neurons with thousands of synapses, mostly on unmyelinated axons, and a myriad of local and diffuse neurotransmitter effects. These should be counted as well.
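The gap between the two means is large enough to matter. Here is a quick check of both means over the quoted 30-to-10,000 range (illustrative only; the true mean depends on the actual distribution of synapse counts per neuron, which neither of us has):

```python
import math

# Synapse counts per neuron quoted above: 30 to 10,000.
lo, hi = 30, 10_000

geometric = math.sqrt(lo * hi)   # geometric mean of the endpoints
arithmetic = (lo + hi) / 2       # arithmetic mean of the endpoints

print(round(geometric))   # 548, not 300
print(arithmetic)         # 5015.0
```

Even the geometric mean of the endpoints comes out near 550, not 300; and if the heavy tail of thousand-synapse cortical neurons dominates, the arithmetic mean drags the total more than an order of magnitude higher.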
> Then there are roughly 3x10^12
> synapses in the proposed human-equivalent brain.
Except this could be as great as 10^13 or 10^15. But that won't matter.
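The spread follows directly from the ranges already on the table (a sketch using only the figures quoted above):

```python
# Neuron count: 10 billion, possibly 10x more (see above).
neurons_lo, neurons_hi = 10**10, 10**11
# Synapses per neuron: the counted range quoted above.
syn_lo, syn_hi = 30, 10_000

print(neurons_lo * syn_lo)   # 3x10^11: the low corner
print(neurons_hi * syn_hi)   # 1x10^15: the high corner
```

Ray's 3x10^12 sits near the bottom of that four-orders-of-magnitude range.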
> Now, I thought about ways to reduce this by editing the system,
> but they won't work. Like most real computing systems, the majority of
> the logic (>95%) by weight or volume is I/O. (The cerebrum, cerebellum,
> gyrii, and most of the encephalon)
The gyri ("outwellings", no more important than the sulcal surfaces), the cerebrum, and the "encephalon" (= telencephalon = the cerebrum again?) are not "I/O", unless you mean something much broader than "interface" typically implies.
> Neural networks are great for I/O:
What is I/O?
> they're robust and compact compared to the digital systems they replace.
> You would not want to use anything else to construct the phenomenal
> systems of a robot.
How do you "know" that a digital implementation of the brain will be more compact?
> So, for a first approximation, let's say we can custom-design the
> system so that we can store one synaptic weight per byte. This generously
> assumes that the connection pattern (i.e. which neuron has the synapse)
> is hard-wired or hard-coded into the simulation program. The synaptic
> weights have to change, because that's how the system learns. Since they
> change, they have to be recorded.
A synapse in the cortex is an analog element: it passes along cable currents in proportion to its inputs as well as discrete action potentials. In addition it performs spatial and temporal summation of action-potential inputs. Synapses can also be mediated by second messengers, voltage-gated, and auto-inhibitory, just for starters. This is MUCH more than 1 byte. It is several kB plus an on-board processor programmed in occam. Per synapse!
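To make the objection concrete, here is the kind of per-synapse state a faithful simulation might have to carry. Every field name and the toy summation rule are my assumptions for illustration, not a model of real cortex:

```python
from dataclasses import dataclass, field

@dataclass
class AnalogSynapse:
    # Hypothetical state, far beyond one byte of weight:
    weight: float = 0.0                # graded efficacy
    cable_current: float = 0.0         # passive analog signalling
    psp_history: list = field(default_factory=list)  # for temporal summation
    second_messenger: float = 0.0      # metabotropic modulation
    voltage_gated: bool = False
    auto_inhibitory: bool = False

    def summate(self, inputs):
        # Toy spatial/temporal summation: weighted sum of recent inputs.
        self.psp_history.extend(inputs)
        return self.weight * sum(self.psp_history[-10:])
```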
> Therefore, the computer needs at least one byte per synapse,
> 3x10^12 bytes of storage.
I think that from here on it becomes irrelevant, given the above limitations.
> Using Drexler's estimates for fluorine/hydrogen carbyne tapes,
> this could be stored in at least 1500 cubic microns (Drexler roughly
> estimated 2GBytes/cubic micron; see the notes for Engines of Creation,
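For what it's worth, the division behind the 1,500 figure checks out under those two premises (one byte per synapse, and Drexler's rough 2 GByte per cubic micron for the tape):

```python
bytes_needed = 3 * 10**12    # one byte per synapse, from above
bytes_per_um3 = 2 * 10**9    # Drexler's rough carbyne-tape density

tape_volume_um3 = bytes_needed / bytes_per_um3
print(tape_volume_um3)  # 1500.0
```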
> Now, we want the brain to run at human speed. Let's say that
> nanocomputers run 1 million times as fast as neurons; this is roughly
> right, because I'll assume mechanical nanocomputers. Mechanical
> nanocomputers would be more compact than quantum electronic computers.
> They also have a speed that more closely matches the mechanical carbyne
> tape drive. If we use the QE computers, they will run 100x faster, while
> only being about 50x bigger, but the apparent advantage will be cancelled
> because they will stall waiting for the tape drives. The result will be
> a slower or larger computer than the mechanical systems. This might be
> fixable; quite possibly an experienced nanoengineer could finesse this, if
> such a person existed. However, note that it just divides the
> computer-volume by 2, and the tape remains the same size.
> So, to get at least human speed, we need roughly 1/1,000,000 the
> number of processors, about 3x10^6. I assume that each one of these is
> servicing a million simulated synapses. I'm going to throw in the CPUs
> for free (I know pretty good CPUs that have as few as 7,000 gates; see
> the web site for computer cowboys).
> Using Drexler's estimates for random-access memory
> (20MBytes/cubic micron), we can fit 305 64-KByte computers in a cubic
> micron. The computers therefore take roughly 9.8x10^3 cubic microns.
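Replaying the processor arithmetic under Ray's own assumptions (10^6 speedup over neurons, 64-KByte program memory per CPU, 20 MByte of RAM per cubic micron) gives about 9.8x10^3 cubic microns, matching the 9,800 figure he uses later:

```python
synapses = 3 * 10**12
speedup = 10**6                 # nanocomputer vs neuron, per the post

cpus = synapses // speedup      # each CPU services 10^6 simulated synapses
cpus_per_um3 = (20 * 10**6) // (64 * 1024)  # RAM density over per-CPU memory

print(cpus)                          # 3,000,000 processors
print(cpus_per_um3)                  # 305 per cubic micron
print(round(cpus / cpus_per_um3))    # ~9,836 cubic microns, i.e. 9.8x10^3
```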
> The computers' program memories are therefore the major system
> expense. Can we get rid of them? Now let's say that the engineer goes
> for broke, and designs a system with no computers. It's totally analog,
> maybe with frequency-modulated hysteresis devices acting as neurons, and
> carbyne pushrods acting as dendrites. In this case, the system volume
> should grow substantially, because the dendrites have to physically
> exist, each with a few thousand carbon atoms, rather than just being
> simulated from 8 bits on <50 atoms of tape.
> Possibly one could substitute a custom logic machine that _only_
> processes neural nets? The problem with these is that they tend to be
> larger and more complex than the computers they replace. Random logic is
> bulkier and more power-hungry than the random-access memories that store
> software. Faster, maybe, but then we might stall waiting for the tape.
> The computers therefore take about 9,800 cubic microns. The tape
> storing the synapses takes about 1,500 cubic microns. Now remember, this
> is a _low_ estimate. I actually think that the storage for a synapse
> would have to store an address of a neuron as well, thus having 4 bytes
> of address in addition to the byte of weight. This quintuples the tape
> system to 7,500 cubic microns. Also, the tape drive and computers might
> double in size. Drexler doubled them.
> 11,300 cubic microns is small. It's a cube about 22.5 microns on
> a side, about a fortieth of a millimeter, about 1/8 the size of a
> crystal of table salt. 17,300 cubic microns (storing synaptic addresses)
> is still small, about 25.9 microns on a side. Even 34,600 cubic microns
> (double everything) is small, maybe 32.6 microns on a side, the size of a
> crystal of table salt.
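The three headline volumes and their cube sides can be checked in one pass, using Ray's component figures of 9,800 cubic microns for the computers and 1,500 or 7,500 cubic microns for the tape:

```python
cases = {
    "1 byte/synapse": 9_800 + 1_500,            # 11,300 um^3
    "5 bytes, with addresses": 9_800 + 7_500,   # 17,300 um^3
    "everything doubled": 2 * (9_800 + 7_500),  # 34,600 um^3
}
for name, volume in cases.items():
    side = volume ** (1 / 3)  # edge of an equivalent cube, in microns
    print(f"{name}: {volume} um^3, ~{side:.1f} um on a side")
```

The sides come out roughly 22.4, 25.9, and 32.6 microns, in line with the quoted numbers (the first rounds to 22.4 rather than 22.5).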
> <snip stuff about Halperin's book>