ARTICLE: "Tech Patriarch Sees Need to Keep Robotic, Genetic Genie s in Bottle"

From: Vita-More, Natasha (NatashaVi@chadbourne.com)
Date: Wed Apr 12 2000 - 16:21:46 MDT


Below is today's LA Times article regarding the Joy reverberations. If the
formatting is off I apologize -- I'm not in the studio and have only
quasi-control over the output layout.

Natasha

----------------------------------------------------------------------------

"Tech Patriarch Sees Need to Keep Robotic, Genetic Genies in Bottle" by
Charles Piller

     "Is the pursuit of knowledge sacrosanct? Or should technological
developments that promise profound benefits, yet may some day cause
catastrophes, be halted?
 
     Such questions have traditionally been the province of academics. So
when Bill Joy, chief scientist at Sun Microsystems and one of the most
revered figures in high technology, recently suggested that scientists step
back from genetic engineering, robotics and nanotechnology, eyes popped
open.
 
     Most industry leaders responded like polite family members whose aging
patriarch has begun to behave erratically. They discounted his suggestion as
hysterical pessimism.

     But Joy's lengthy polemic in Wired magazine boils down to a simple
premise: Humans are not ready to be gods, so we should pause long enough to
think carefully before passing the point of no return with technologies that
offer God-like powers--including the meaningful prospect of our extinction
as a species due to acts of a misguided few.

     "The only realistic alternative I see is relinquishment: to limit
development of the technologies that are too dangerous, by limiting our
pursuit of certain kinds of knowledge," he wrote.
 
     The counterarguments are obvious: Knowledge is its own justification.
We can't slow--let alone stop--technological progress even if we wanted to.
Such efforts would not only cause great harm, they would also evoke the
nightmare of Stalinesque perversions.

     Of course, any discussion of standing back from the prospect of
technological cataclysm has to start with atomic weapons.
 
     Despite the heroic efforts of many scientists themselves--the Bulletin
of the Atomic Scientists offers an inspiring example--the proliferation of
nuclear weapons continues apace while the U.S. Senate refuses to ratify a
comprehensive test ban.
 
     Joy cites the 1972 Biological Weapons Convention as a hopeful sign. In
that unusual pact, most of the world's nations agreed to refrain from using
a particularly heinous weapon of mass destruction and to destroy their
stockpiles.
 
     Unfortunately, that treaty is a thin reed on which to hang his
argument. Many nations--most notably Iraq and the former Soviet Union--have
ignored the pact's obligations or aided proliferation. And a loophole allows
"defensive" research. More than in any other category of weaponry, offensive
and defensive R&D on biological weapons are nearly identical.
 
     Such treaties are still crucial. But they reflect the limits of any
such process when managed by the very class of scientists, politicians and
technocrats that created the weapons.
 
     "The trouble with arms control is that it is an unnatural act," said
Paul Warnke, former director of the Arms Control and Disarmament Agency.
"Even when in the national interest, it requires countries to tear down what
they have built up."

     So Joy's doubters find discouraging support for their opposition to
unnatural acts in his own examples.

     Yet the lesson of weapons treaties fails to reflect the most
provocative element of Joy's argument--the idea of standing back before the
genie is out of the bottle.
 
     Joy's first two examples, robotics and genetic engineering, are already
well on their way. Deeply entrenched in research, commerce and popular
culture, they have a momentum that seems, for better or worse, inevitable.
 
     In contrast, most of the practical applications of nanotechnology--the
creation of new materials and machines, atom by atom or molecule by
molecule--are as yet unproved.

     Proponents of nano-scale machines foresee enormous potential for human
benefit: microscopic particles with the intelligence of supercomputers that
travel inside your body to destroy cancer cells; limitless, clean, low-cost
fuel; or diamonds created from lumps of coal.
 
     Nanotechnology also suggests horrors: micro-machines designed to
destroy people with specific genetic traits, clouds of self-replicating
nanobots designed for an ostensibly benign purpose--say, weed killing--that
escape and devour all plant life on Earth in a matter of days or weeks.

     Nanotechnology now exists in primitive forms, but its grand practical
applications could make it a self-replicating, mass-production technology
analogous in important ways to gene cloning. All this may still be decades
away, if it arrives at all. So the nanotechnology genie has not been
released, though the bottle is being rubbed.
 
     Can society step back from such a fountain of amazing wealth and social
benefit because it might also be mishandled so dangerously as to destroy the
world?
 
     It won't be easy. First we need to develop greater skepticism toward
expert wisdom. Experts often lack expertise outside their narrow areas and
suffer many conflicts of interest.

     After all, "the sheer joy of resolving problems [is] a hard motivation
to put down, because it does involve some of the best features of human
nature," admits Theodore Roszak, a leading technology critic.

     Indeed, Roszak believes that powerful new technologies are far less
likely to be massively abused by terrorists or governments than to spin
chaotically out of control in the hands of corporations. Competition to
exploit new possibilities will always bury the cautious few. Corporate
adherence to the imperative of capitalism--grow or die--forbids any backward
glance.

     Of colleagues who dismiss him, Joy wrote: "I don't know where these
people hide their fear."

     When one of the wisest of high tech's wizards questions the idea of
technological inevitability, he offers hope that thoughtful, democratic
management of technology may yet be possible.
 
     It's up to many of us to propose our answers."

* * *
     Times staff writer Charles Piller can be reached at
charles.piller@latimes.com.



