Re: Human intelligences' motivation (Was: Superintelligences' motivation)

Jim Legg (income@ihug.co.nz)
Thu, 30 Jan 1997 14:27:27 +1300


> think is the most dangerous thing about extropy, transhumanism, nanotech,
> replicators etc. And that is human goals and motivations.
> We don't need advanced AI or IA but just plain simple exponential growth
> to give extremists and minorities access to weapons of mass destruction.

I think most Uploaders would be happy simply to let the competition die off
naturally. The extremists and minorities who you claim want to use such
weapons would only do so out of hopelessness, because they haven't found a
better way. Educate them. And while you're at it, what's IA?

Best,

Jim Legg http://homepages.ihug.co.nz/~income
Man * Soul / Computer = 12 ^ (I think therefore I surf)