IMHO, we are pretty much at human brain bit rate
equivalency right now. According to one set of calculations,
the human brain approximates 1e14 flops of computing power.
(That calculation seems substantially better than the
calculations based on number of neurons x number of axons x
axon firing frequency, since it is based on functional
equivalence instead.)
Clearly we could have a 1e14-flop computer network today
if we so chose, simply by buying enough computers. (Note
that a network of computers makes a good approximation of
the human brain, since both are highly parallel in nature.)
However, that could be quite expensive. What we are
interested in then is not so much the raw processing power
of a potential computer cluster, since we could, given
enough money, have built a 1e14-flop network of machines in
the 1950s. Rather what we are interested in is the point at
which the cost per flop of processing power is cheaper in a
computer than in a human engineer. At that point, given the
right software, we can replace our engineer with our
network of computers (for the same cost), and thus start
the accelerating improvements the singularity predicts.
How many flops of engineer can we buy for one dollar?
(Excuse the US-centric approach, feel free to recalculate
in local currency.) Well first of all, we need to remember
that a typical engineer works 40 hours a week, or about 20%
of the time. Secondly, a typical engineer spends a huge
amount of time communicating (writing and reading design
documents, looking up reference material etc.) I would say
that it is an extremely conservative estimate to say that
the typical engineer spends less than 50% of their work
time in actual engineering. (Computers do have to perform
similar tasks, however, the amount of time they take is
probably negligible.) Also, only a small portion (perhaps
10%) of the brain is dedicated to the task of conscious
thought. Also, humans are greatly impaired by the fact that
their consciousness is essentially serial, meaning that they
cannot perform many engineering tasks in parallel
(something which engineering strongly lends itself toward.)
Finally, human engineers' brains are designed for picking
berries and spearing rabbits, not for designing computers.
Consequently, they do not have various specialized hardware
units that would optimize this task, (such as the ability
to quickly and accurately do arithmetic, the ability to
look up databases, the ability to convert abstract concepts
into manufacturing machine instructions etc.) For these last
two intangibles, let me assume that the human flop is
degraded by 50% for each as compared to a computer flop.
Based on all of this, the average human engineering flop rate is
1e14 x 20% x 50% x 10% x 50% x 50% = 2.5e11 flops per second.
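As a sanity check, the chain of discount factors above can be multiplied out directly. Every number here is an assumption carried over from the text, not a measured figure:

```python
# Back-of-envelope check of the effective human engineering flop rate.
brain_flops = 1e14           # assumed raw processing power of the brain
work_fraction = 0.20         # 40-hour week: working ~20% of the time
engineering_fraction = 0.50  # at most half of work time is real engineering
conscious_fraction = 0.10    # share of the brain doing conscious thought
serial_penalty = 0.50        # consciousness is essentially serial
hardware_penalty = 0.50      # no specialized arithmetic/lookup hardware

effective_flops = (brain_flops * work_fraction * engineering_fraction
                   * conscious_fraction * serial_penalty * hardware_penalty)
print(f"{effective_flops:.3g}")  # 2.5e+11
```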
An engineer typically costs $80,000 per year (including
various employment costs). So a human provides
2.5e11 flop per sec x 3e7 sec per year / $80,000 ≈ 94e12 flop/$ (call it 100e12).
You can probably buy a Pentium computer board for about
$500, giving you approximately 1e9 flop per second.
Assuming such a board lasts for 3 years, running continuously, this gives
1e9 flop per sec x 3e7 sec per year x 3 years / $500
= 180e12 flop/$.
To put this in a different light, 500 1GHz computers cost
about the same (over a 3 year period) as an engineer, and
offer more flops for your money.
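The two flops-per-dollar figures can be checked the same way, using the prices and lifetimes assumed in the text (an $80,000/year engineer, a $500 board running flat out for 3 years):

```python
seconds_per_year = 3e7

# Engineer: 2.5e11 effective flops/sec, $80,000 per year all-in.
human_flops_per_dollar = 2.5e11 * seconds_per_year / 80_000

# Pentium board: 1e9 flops/sec, $500, assumed to run continuously for 3 years.
computer_flops_per_dollar = 1e9 * seconds_per_year * 3 / 500

print(f"human:    {human_flops_per_dollar:.3g} flop/$")   # ~9.4e+13
print(f"computer: {computer_flops_per_dollar:.3g} flop/$")  # 1.8e+14
```

The computer comes out ahead by roughly a factor of two, which is the whole point of the comparison.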
This is of course a very approximate calculation, since a
human flop is only roughly analogous to what a Pentium can
do in one cycle. However, it surely gives us a general
order of magnitude. What it tells us is that today, raw
computer brain power is probably cheaper than human
engineering brain power, a precondition for the singularity.
Allow me a moment’s idle speculation if you will. Although
computers have not replaced engineers, it is clear that
they offer a great deal of leverage to real world
engineers: I doubt a Pentium designer has thought much about
NAND gates for many years now, never mind transistors.
Given this, are we already seeing the leading edge of the
singularity? Careful observers have noticed that the Moore
doubling time has been decreasing for some years now, from
two years, to eighteen months, to its present value of
around 12 months. I wonder, as I say, if this is the
start of the singularity?
However, having said that, I am not convinced that the
“singularity” will in fact take place over a very short
period of time as has been widely speculated, because I
think there is a major flaw in the argument. The argument
goes something like this: when AI takes over computer
design we can expect Moore’s law to continue. However, as
computers become more intelligent (or at least better
engineers), they will begin to bring about engineering
changes faster than human engineers. Hence the first Moore
doubling takes place in two years, but the next takes place
in half that time, one year, then six months, three months
and so on, until a singularity is reached at four years.
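The schedule sketched above is just a geometric series: doublings at 2, 1, 0.5, 0.25, ... years, whose total converges to 4 years:

```python
# Sum of the first 60 terms of the doubling schedule 2, 1, 0.5, 0.25, ...
# The series converges to 4 years, which is where the "singularity" lands.
total_years = sum(2 * 0.5 ** n for n in range(60))
print(round(total_years, 6))  # 4.0
```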
The flaw in this argument is that it assumes that design is
the only cost associated with the production of better
computers. Design is undoubtedly a dominant cost right now,
however, as design cost becomes vanishingly small,
other factors begin to dominate. For example, the
time necessary to modify the equipment in a plant is small
in the current scheme, however, when design iterations take
less than a day, this will become a more significant cost.
The problem is that under the AI-only scenario humans and
capital are still in the loop, executing the design
produced by the AI. AI can modify its own code, and can
undoubtedly make great strides in its productivity that
way, however the substrate on which the software runs
provides a maximum speed. Replacing that substrate requires
human and capital intervention, and it is the regular
replacement of the substrate (with rapidly diminishing time
between replacements) that produces the singularity effect.
However, should we develop molecular nanotechnology, and
empower the AI with that technology, then we can remove
humans and capital from the loop entirely. So, it
seems then that both AI and MNT are requirements for the
successful execution of the singularity.
However, call me a luddite if you like, but the idea of
giving an AI unrestricted access to MNT is a little
frightening to me.
It hardly need be said that the other major problem is that
our software is so badly lagging behind our hardware, and
that there are AFAIK so many unsolved problems with AI
software, that speculating on when human engineer
replacement AI will be available is a crap shoot.
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:20 MDT