Re: Maximizing results of efforts Re: Mainstreaming

From: John Marlow (johnmarlow@gmx.net)
Date: Sun Apr 29 2001 - 23:50:51 MDT


On 29 Apr 2001, at 14:45, Eliezer S. Yudkowsky wrote:
...
> And there are plans like building a seed AI, which require only a finite
> number of people to sign on, but which benefit the whole world. The third
> class of plan requires only that a majority *not* get ticked off enough to
> shut you down, which is a more achievable goal than proselytizing a
> majority of the entire planet.

#Sound reasoning. It may, however, also require that the government not
view your efforts as a threat. See below.

> As Brian Atkins said:
>
"... Advanced technologies like AI
> give huge power to the individual/small org, and it is an utter waste of
> time (and lives per day) to miss this fact."

#Rest assured, the government will not miss this fact--and it is no
fan of power in the hands of individuals or small groups. If and when
you appear to be near-term viable, they (and perhaps others) will
attempt to wreck you, co-opt you, or simply seize the works.

#Count on it.

#It is overwhelmingly likely they will succeed.

jm

>
> -- -- -- -- --
> Eliezer S. Yudkowsky http://singinst.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
>

John Marlow


