Re: Human intelligences' motivation (Was: Superintelligences' motivation)

Jim Legg
Thu, 30 Jan 1997 14:27:27 +1300

> think is the most dangerous thing about extropy, transhumanism, nanotech,
> replicators etc. And that is human goals and motivations.
> We don't need advanced AI or IA; just plain simple exponential growth will
> give extremists and minorities access to weapons of mass destruction.

I think most Uploaders would be happy to simply let the competition die off
naturally. The extremists and minorities whom you claim want to use weapons
of mass destruction would only do so out of hopelessness, because they
haven't found a better way. Educate them. And while you're at it, what's IA?


Jim Legg
Man * Soul / Computer = 12 ^ (I think therefore I surf)