Re: Goo prophylaxis

CurtAdams@aol.com
Fri, 29 Aug 1997 16:38:54 -0400 (EDT)


In a message dated 8/29/97 8:17:17 AM, bostrom@mail.ndirect.co.uk (Nicholas
Bostrom) wrote:

>Carl Feynman wrote:
>
>> >Isn't the
>> >design work fairly tractable (Drexler has already produced some nice
>> >designs) and it is mainly the lack of molecular tools that prevent us
>> >from starting building things? Better CAM would help a lot, and it is
>> >on its way.
>
>> (1) The minimum self-reproducing device (Mycoplasma genitalium) seems to
>> require about a million bits of information. I don't think we'll be able
>> to get much smaller than that with our artificial equivalents. That's
>> about as much information as is embodied in a car or medium-size piece of
>> software. To develop from having no experience in automotive technology
>> whatever to the point where you can build a reasonably effective car took
>> at least thousands of genius-years. Ditto software. The first piece of
>> software that I'm aware of that was over a million bits long was OS/360,
>> developed in 1964, at a cost of 5000 man-years, to say nothing of all the
>> research that it took to bring software technology to the point where
>> they could even start the project.
>
>There is a difference between the three systems you mention, on the
>one hand, and a nano self-replicator on the other. Mycoplasma
>genitalium, automotive vehicles and commercial software are all
>required to be fairly optimized.

No, none of those are "optimized". Insofar as they are optimized at
all, germs and commercial software are heavily optimized for minimum
design requirements, even at the cost of performance. Living systems in
particular are bizarre
cobbled-together things. Their energy use is a reasonable approximation of
optimal not because they are well-designed, but because the thermodynamic
processes they use are very efficient. Even with "inefficient" and indirect
chemistry, they still come out pretty well.
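
(An aside on Carl's million-bit figure: it checks out. The M.
genitalium genome, sequenced in 1995, is roughly 580,000 base pairs,
and a base carries at most 2 bits. In Python, if only to show the
arithmetic:

  bases = 580000      # approx. M. genitalium genome size, base pairs
  bits = 2 * bases    # at most 2 bits per base, since there are 4 bases
  print(bits)         # 1160000, a little over a million bits

So "about a million bits" is fair, and really an upper bound, since the
genome is surely not incompressible.)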

>To build an optimised nano
>self-reproducing device would be much harder than simply to make
>something useful that can replicate. For example, a universal Turing
>machine has been constructed in Conway's Life world. The entity is
>very big and it was hard to do, but nowhere near a task of
>thousands of genius-years.

Nobody has presented a self-replicating Life system. All Conway did was
produce a feasibility proof, so you know it *can* be done. Actually
designing such a system remains out of reach.
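
For anyone who hasn't played with Life: the entire "physics" of that
world is a single update rule on a grid of cells. A minimal sketch in
Python (the set-of-live-cells representation is my own choice, nothing
canonical):

  from collections import Counter

  def life_step(live):
      # One generation of Conway's Life. `live` is a set of (x, y)
      # coordinates of live cells; every other cell is dead.
      neighbors = Counter((x + dx, y + dy)
                          for (x, y) in live
                          for dx in (-1, 0, 1)
                          for dy in (-1, 0, 1)
                          if (dx, dy) != (0, 0))
      # Alive next step: exactly 3 live neighbors, or alive now with 2.
      return {cell for cell, n in neighbors.items()
              if n == 3 or (n == 2 and cell in live)}

  glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
  step4 = life_step(life_step(life_step(life_step(glider))))
  print(step4)  # the same glider, shifted one cell diagonally

The rule really is that trivial; every bit of the difficulty is in
arranging huge numbers of such cells into machinery, and for a
self-replicator nobody has managed it.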

>The feasibility stems from the fact that you have
>identical components that you can put together into bigger identical
>components, and so on, and at each step you need only consider the
>apparatus at a certain level of abstraction. If this is the right
>analogy for nanotech, then the design work would seem tractable, once
>the right tools are there.

I think machine-phase nanotech will be far harder to design than
current biological nanosystems. Chemical thermodynamics does a lot of
the work in current enzyme-driven systems: locating substrates, moving
substrates, and performing the reaction. The biological system need
only design the enzyme. In current machine-phase proposals, all of that
has to be done deliberately, which greatly adds to the complexity of
the design and to the energy costs. Frankly, the proposals for wild
machine-phase nanotech sound
to me like they will need a host of Maxwell's demons inside instructing the
device what to do to obtain its stupendous energy, and then telling it how to
find all the individual atoms it needs to make itself. Locating vast numbers
of halogens in a non-aqueous environment is a really tough task. Even in a
saline solution, grabbing and using the chlorine won't be easy.
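
To put a rough number on how much of the "locating" an enzyme gets for
free: the standard Smoluchowski estimate for a diffusion-limited
encounter rate is k = 4*pi*D*R. A back-of-envelope in Python, using
generic textbook values rather than measurements of any particular
enzyme:

  import math

  D = 1e-9        # relative diffusion coefficient, m^2/s
                  # (small molecule in water)
  R = 1e-9        # encounter radius, m (about 1 nm)
  N_A = 6.022e23  # Avogadro's number, 1/mol

  # Convert from m^3/s per molecule pair to the usual 1/(M*s):
  # multiply by 1000 L/m^3 and by N_A.
  k = 4 * math.pi * D * R * 1000 * N_A
  print("%.1e per molar per second" % k)  # ~7.6e9, roughly the observed
                                          # ceiling for the fastest enzymes

An enzyme collects encounters at that rate just by sitting in water. A
machine-phase device that must fetch specific atoms mechanically gets
none of this for free, and has to pay for it in design complexity and
energy instead.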

Von Neumann concluded you'd need about 250,000 parts in an environment
specifically designed for replication. I think 1,000,000 in a lab
environment where chemicals and energy are provided for the replicator is a
reasonable lower bound. Chemistry isn't specifically designed for
replication the way von Neumann's environment was.