"The idea of someone being smart enough to build a
universal assembler and concurrently being dumb enough
to not build it with sufficient controls is
**Oh? And who said intelligence is related to caution,
meticulousness, or indeed sanity? Teams of highly
intelligent people design software; does that mean it
is bug-free? The same is true of nuke plant
construction and safety systems--does that mean
they are failure-proof?
**Hitler was quite intelligent. Was he sane? Perhaps.
Was he evil? Undoubtedly. So--suppose the first one
smart enough to build the damned things is evil?
Suppose further that he or she is insane--and desires
nothing more than to destroy the world and everything in it?
**Ah--what then, hmm..?
**As Joy has said--a terrifying empowerment of extreme individuals.
At 07:37 PM 1/5/2001 -0500, you wrote:
"J. R. Molloy" wrote:
> > I quite agree that nuclear weapons are an easier problem than biological
> > or nanotech weapons. The original poster said that if "the consequences
> > of failure" of nuclear weapons were the same as nanotech, we would all be
> > dead. Not so, it is the nature of the technology, not the possible
> > consequences, that determines the ease of control. I believe we are in
> > violent agreement.
> > steve
> By "the consequences of failure" one means to say
that control has failed.
> Consequently, since we've failed to control nuclear
> Nagasaki), identical levels of failure in regard to
nanotech or biotech
> warfare would kill us all, since these latter
technologies have such
> killing power.
I fail to see such killing power. Much ado about
nothing: biological warfare has been with us for
centuries to no ill effect. The idea of someone being
smart enough to build a universal assembler and
concurrently being dumb enough to not build it with
sufficient controls is nonsensical. Purely Pollyannaish
Paranoia Posing as proper public politics.
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:17 MDT