Notes from the Bill Joy radio talk

From: Brian Atkins
Date: Thu Apr 05 2001 - 19:48:18 MDT

Nothing much new here... he is working on a book apparently... he thinks
we should get a vote before allowing new species creation :-) He still
comes across as awfully clueless when it comes to having real answers to
the tough questions.

Bill Joy's talk with Katrina of Wired on April 5 2001 on CNET Radio:

To talk about/criticize this subject you have to accept the underlying premise.
It is not an all-or-nothing type of issue.

Most people responding fall into two groups: either they deny it will happen, or
they are fatalistic and say there is nothing you can do about it (Newt says we
have to go faster than our enemies).

He wants a Hippocratic oath from scientists in these areas.

He thinks that since there is a chance of total destruction/suffering, that is
enough to relinquish pursuit of these areas.

He felt like he got a low response to his article from the technology community.

He feels that due to the pace of modern companies, there is no time to discuss
these issues.

He fears the schizos doing something irreversible. So far all the talk has been
about biological threats (30 minutes into it)...

He thinks the odds are better than 50/50 that we survive.

He's surprised people are still reading the Wired article.

He isn't worried about human-sized robots, but more about nanotech or other
"foraging" threats.

Stressing again that you have to balance expected future tech pluses versus
suffering caused by misuse.

Evolution has not prepared us for the scenario we are facing.

Questions from the audience:

How do you reconcile relinquishment with civil rights to innovation? He says the
1st amendment isn't important since many countries function OK without it. He
really doesn't have an answer to this...

How is the realm of forbidden knowledge going to be defined? He thinks we should
talk to the humanities folks (joking?). What he really fears is democratized
access to advanced tech. No real answer here.

Turning over control to AI because of complexity... this is not Bill Joy's
scenario, it is Kaczynski's.

Example of a technology that was relinquished or regulated? People argued
against the H-bomb... we decided not to do bioweapons, but the Russians cheated.
Besides that he has no answers.

What's your strategy to affect the behavior of government/business? He is
writing a book to explain it to laymen and present examples of potential
problems. He doesn't want it to get politicized.

How will information get into the real world? People will have advanced
devices that will be able to produce things... biotech labs-at-home-on-a-chip.

Does he worry about this stuff in his real-world job? He works on reliable
software, and thinks reliable software is mainly needed by the "good guys".

If we are not the end of evolution, shouldn't we hope that the next species will
be nice to us? He thinks we evolve so slowly (genetically) that the only way
this could happen is to explicitly try to create a new species. He thinks we all
should get a vote before allowing this.

Brian Atkins
Director, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:44 MDT