Notes from the Bill Joy radio talk

From: Brian Atkins (brian@posthuman.com)
Date: Thu Apr 05 2001 - 19:48:18 MDT


Nothing much new here... he is working on a book apparently... he thinks
we should get a vote before allowing new species creation :-) He still
comes across as awfully clueless when it comes to having real answers to
the tough questions.

Bill Joy's talk with Katrina of Wired on April 5 2001 on CNET Radio:
-------------------------------------------------------------------

To talk about/criticize this subject you have to accept the underlying
assumptions.
It is not an all-or-nothing type of issue.

Most people responding fall into two groups: they either deny it will happen or
are fatalistic and say there is nothing you can do about it (Newt says we have
to go faster than our enemies).

He wants a Hippocratic oath from scientists in these areas.

He thinks that since there is a chance of total destruction/suffering, that
alone is enough to justify relinquishing pursuit of these areas.

He felt like he got a low response to his article from the technology community.

He feels that, given the pace of modern companies, there is no time to discuss
these issues.

He fears the schizos doing something irreversible.

...so far all the talk has been about biological threats (30 minutes into it)...

He thinks the odds are better than 50/50 that we survive.

He's surprised people are still reading the Wired article.

He isn't worried about human-sized robots so much as nanotech or other
"foraging" robots.

Stressing again that you have to balance the expected pluses of future tech
against the potential suffering caused by misuse.

Evolution has not prepared us for the scenario we are facing.

Questions from the audience:
---------------------------

How do you reconcile relinquishment with civic rights to innovation? He says
the 1st Amendment isn't important since many countries function OK without it.
He really doesn't have an answer to this...

How is the realm of forbidden knowledge going to be defined? He thinks we
should talk to the humanities folks (joking?). What he really fears is
democratized access to advanced tech. No real answer here.

Turning over control to AI because of complexity... this is not Bill Joy's
concern, it is Kaczynski's.

Example of a technology that was relinquished or regulated? People argued
against building the H-bomb... we decided not to pursue bioweapons, but the
Russians cheated. Beyond that he has no answers.

What's your strategy for affecting the behavior of government/business? He is
writing a book to explain it to laymen and present examples of potential
problems. He doesn't like that the issue gets politicized.

How will information get into the real world? People will have advanced
"printing" devices that will be able to produce things... biotech
lab-on-a-chip devices at home.

Does he worry about this stuff in his real-world job? He works on reliable
software, and thinks reliable software is mainly needed by the "good guys".

If we are not the end of evolution, shouldn't we hope that the next species
will be nice to us? He thinks we evolve so slowly (genetically) that the only
way this will happen is if we explicitly try to create a new species. He
thinks we should all get a vote before allowing this.

-- 
Brian Atkins
Director, Singularity Institute for Artificial Intelligence
http://www.singinst.org/


