Principle of Nonsuppression

Eliezer S. Yudkowsky (Wed, 01 Sep 1999 13:53:41 -0500)

Edwin Evans wrote:
> If governments did start taking nanotechnology threats
> seriously and decided to ban selling/developing
> assemblers or even the sale of AFMs, I wouldn't rail
> against it (would you?). If Japan and a few other
> countries did do this, it seems they may be able to
> postpone the possible future event of nanotechnology
> obliteration. Since it does take a lot of money and
> organization to develop, making it illegal could
> conceivably be quite effective. Are you sure it's
> too late? Regulation will get more difficult and
> dangerous the later it comes.


> If we want to become immortal Superintelligences, we must pass through
> the Singularity. Or better yet, become the Singularity. To achieve this, interested
> parties should cooperate to gather wealth and watch, implement and sponsor
> research of human-enhancing technologies, especially those that may lead to mind
> uploading. As AI (of the conscious kind) is one of the biggest liabilities, this
> field of research should be monitored extra carefully, and curbed if necessary.


I see two possible problems:

1)  We're so busy trying to sabotage each other's efforts that we all
wind up getting eaten by goo / insane Powers.

2)  Iraq gets nanotechnology instead of the US / the AI project has to
be run in secret and is not subject to public supervision and error correction.

"Trying to suppress a dangerous technology only inflicts more damage." 
(Yudkowsky's Threats #2.)

I'm not sure - still running the numbers, and while I'm an idealist, I
don't trust my own idealism - but maybe "Nonsuppression" should be one
of the Singularitarian Principles.  Or non-initiation of suppression, at
least.  If the Transtopians succeed in regulating AI, I might have no
choice but to push for regulations on nanotechnology, which I'm sure
will be made easier by their having laid the groundwork for techno-panic
among the appropriate authorities.  The net effect would be to marginalize
all ultratechnologies into the hands of renegade states and criminals.

Can anyone really think that ve can panic politicians or the media into
pushing for laws that suppress nanotech/AI without their noticing the
existence of AI/nanotech/uploading/neurohacking/genetic engineering? 
You push for the "Nanotechnology Suppression Act" and you'll get the
"Comprehensive Ultratechnology Regulation Bill".

Edwin Evans wrote:

> I sincerely hope I'm not part of the problem.
Puts you one up on the Transtopians.  The problem doesn't even seem to
have occurred to them.

Who are the "Transtopians", anyway?  My guess is den Otter.

-- 
        Eliezer S. Yudkowsky
Running on BeOS           Typing in Dvorak        Programming with Patterns
Voting for Libertarians   Heading for Singularity    There Is A Better Way