Oh, this is beautiful: how to make nanotechnology even
harder! I agree that such precautions could be taken.
Will they be, by all parties? Not likely. Limiting the
devices in such ways would drastically reduce their
usefulness.
I also agree that nanotechnological weapons MAY be the
primary threat. After all--a weapon which is not
natural-environment-capable is useless. Weapons will
be designed to disassemble ALL matter, and they will
be able to operate in ALL environments. This is why I
used weapons as an example.
To be truthful, though, I really can't see how AIs
will improve things on the weapons front.
Eliezer S. Yudkowsky wrote:
The problem of not accidentally releasing a replicator can be solved
very easily by never building a natural-environment-capable replicator,
and, of course, making very very sure that replicators don't have the
capability to evolve. Build replicators that incorporate yttrium and
boron and can only reproduce in high vacuum at eighty degrees Kelvin
using broadcast power and broadcast information, and an accidental
spill won't make a difference. Protecting the reproduction information
- if you have it on-board, which is itself a mistake - is also easy;
what you'd have to watch out for would be prions, structural
deformations that result in similar structural deformations in other
replicators. It's nanotechnological warfare, not the sheer stupidity
required for an accidental error, that imposes the time limit on us.
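The containment scheme in the quoted passage - exotic feedstock, cryogenic
vacuum, and externally broadcast power and blueprints - amounts to a set of
environmental preconditions that must all hold before replication can
proceed. Here is a minimal toy sketch of that gating logic; every class
name, field, and threshold below is a hypothetical illustration, not an
actual design:

```python
# Toy model of a "broadcast architecture" replication gate, in the spirit
# of the quoted passage. All names and threshold values are hypothetical.

from dataclasses import dataclass

@dataclass
class Environment:
    temperature_k: float       # ambient temperature in kelvin
    pressure_pa: float         # ambient pressure in pascals
    has_yttrium: bool          # exotic feedstock available?
    has_boron: bool
    broadcast_power: bool      # external power link alive?
    broadcast_blueprint: bool  # replication info supplied externally?

def may_replicate(env: Environment) -> bool:
    """Return True only if every engineered precondition holds.

    A replicator built this way is inert outside its designed niche:
    a spill into the natural environment fails every check at once.
    """
    return (
        env.temperature_k <= 80.0      # cryogenic operation only
        and env.pressure_pa <= 1e-6    # high vacuum only
        and env.has_yttrium
        and env.has_boron
        and env.broadcast_power        # no on-board power source
        and env.broadcast_blueprint    # no on-board blueprint
    )

# A natural-environment spill: warm, at atmospheric pressure, and cut
# off from broadcast power and information - replication is impossible.
spill = Environment(293.0, 101_325.0, False, False, False, False)
assert not may_replicate(spill)
```

The design point is that the conditions are conjunctive: losing any single
one halts replication, so an accident would have to defeat every safeguard
simultaneously rather than just one.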
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:16 MDT