Re: 5,000,000,000 transhumans?

edward worthington
11 Aug 1998 04:11:52 -0000

I do not believe that all humans will become transhumans; I believe that the population will not allow power-hungry, egotistical maniacs to have transhuman benefits. I know I would not like Saddam Hussein having life extension and superintelligence; that would make him even more of a threat. So I believe that only highly intelligent, responsible, healthy people will be able to become transhuman, because people will not want anyone abusing their transhuman power.

On Sun, 09 Aug 1998 12:09:06 -0700, wrote:
> At 01:37 PM 8/9/98 +0200, Den Otter wrote:
> >
> >Yes, but this is fundamentally different; godhood isn't something
> >that one would sell (or give away) like one would do with minor
> >technological advances such as phones, TVs, cars etc. Just like nukes
> >were (and are) only for a select few, so will hyperintelligence,
> >nanotech, uploading etc. initially be only available to a select
> >group, which will most likely use them to become gods. There is
> >no rational reason to distribute this kind of power once you have
> >it.
> >
> >Powerful businessmen still need others to make and buy their products,
> >and dictators and presidents still need their people to stay in power
> >& to keep the country running, but an SI needs NO-ONE; it's
> >supremely autonomous. I can't imagine why it would share its
> >awesome power with creatures that are horribly primitive from its point
> >of view. Would *we* uplift ants/mice/dogs/monkeys to rule the world
> >as our equals? I think not.
>
> "No rational reason" is a strong claim. I doubt your claim. First, your
> view surely depends on a Singularitarian view that superintelligence will
> come all at once, with those achieving it pulling vastly far away from
> everyone else. I don't expect things to work out that way. I've explained
> some of my thinking in the upcoming Singularity feature that Robin Hanson
> is putting together for Extropy Online.
>
> Second, I also doubt that the superintelligence scenario is so radically
> different from today's powerful business people. [I don't say "businessmen"
> since this promotes an unfortunate assumption about gender and business.]
> You could just as well say that today's extremely wealthy and powerful
> business people should have no need to benefit poor people. Yet, here we have
> oil companies building hospitals and providing income in central Africa. I
> just don't buy the idea that each single SI will do everything alone.
> Specialization and division of labor will still apply, and some SIs will
> want to help the poor humans upgrade because that will mean adding to the
> pool of superintelligences with different points of view and different
> interests.
>
> Let me put it this way: I'm pretty sure your view is incorrect, because I
> expect to be one of the first superintelligences, and I intend to uplift
> others. Or, are you planning on trying to stop me from bringing new members
> into the elite club of SIs?
>
> > In any case, we
> >should all work hard to be among the first SIs, that's the only
> >reasonably sure way to live long and prosper.
>
> No disagreement there. Make money, invest it, and keep on integrating
> advances as they happen.
>
> Max
>
> --------------------------------------------------------------------------
> Max More, Ph.D.
> (soon also: <>)
>
> Consulting services on the impact of advanced technologies
> President, Extropy Institute: <>
> --------------------------------------------------------------------------
