Re: Posthuman Politics

From: Alex F. Bokov
Date: Tue Oct 16 2001 - 20:29:51 MDT


Oops, gonna be a long one.

Executive summary: corporations and governments (I'll charitably call
them collective intelligences, or CIs) are a model system for how
humans can coexist with AI. Want to make AI friendly? Come up with
a way to make CIs friendly, and you'll have something resembling a
game plan.

On Tue, 16 Oct 2001, Dan Clemmensen wrote:

> Yes. However, we still have a disagreement. I feel that we need
> to work toward advancing the technology as rapidly as possible,
> to minimize the window during which humans can self-destruct.
> I have no confidence in our ability to affect the SI that results
> from the singularity, but I can hope that the SI will become "good"
> (Eliezer's "friendly") by self-analysis and adjustment. Thus, I feel
> that Eliezer's efforts to build in friendliness, and your efforts
> at personal/social transformation, are misdirected. I feel that
> we have no basis for belief that such efforts can affect the
> "morality" of the SI, because the SI is not analyzable by humans.
> By contrast, we have a much better chance at analyzing pre-singularity
> human society, and we can see that there is a non-zero chance of
> pre-singularity self-destruction on any given day. Thus, we should try
> to minimize the number of pre-singularity days.

In fact, we already have evidence that superhuman entities will not
be friendly. Governments, mega-corporations, and [other] terrorist
groups demonstrate that collective intelligences (CIs from now on) are
capable of being...

1) More powerful than the sum of their parts.
2) Amoral and utterly indifferent to the welfare of individuals and
   groups unless market forces and/or legislation rub their noses in it.
3) Short-sighted, even to their own detriment (the CIA propping up
   fascist regimes, thus getting us into the mess we're in today; the
   IMF/World Bank working over the same places from the economic angle).
4) At the emotional maturity level of toddlers (witness the US's
   reaction when a Chinese fighter forced down our spy plane; the
   pettiness of both sides in the ad nauseam Israel/Palestine
   escapades; Napster filing a trademark infringement suit against The
   Offspring even while Napster was up to its nosehairs in litigation
   with the RIAA over copyright infringement).

It ain't looking good, folks. Of course, Eliezer would argue that it's
because CIs were designed by humans and inherited our evolutionary,
atavistic, aggressive tendencies. To which I'll reply that...

1) The social graces that allow us to refrain from beating up people
   who cut in front of us in line or step on our feet have emerged from
   evolutionary forces, specifically the iterated Prisoner's Dilemma.
2) It appears that social graces are too complex a behavior to be
   completely instinctual, and we need childhood and adolescence
   to fully develop these faculties. Babysitting an adolescent AI that's
   smarter than you are will be... challenging.
3) I've met Eliezer and he seems human and shaped by evolutionary
   pressures, at least insofar as anybody on this list is. I wish
   him the best in not transferring his human failings onto his
   brainchild. He's a brilliant guy, so he and his collaborators
   just might do it, but I'll remain optimally paranoid for now.
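The iterated Prisoner's Dilemma point in (1) can be sketched in a few lines of code: with standard illustrative payoffs (temptation 5, reward 3, punishment 1, sucker 0 -- my choice of values, not anything from the original post), reciprocating cooperation outscores mutual defection once the game repeats.

```python
# Minimal iterated Prisoner's Dilemma sketch. Payoff values, strategy
# names, and the 100-round match length are illustrative assumptions.

T, R, P, S = 5, 3, 1, 0  # temptation, reward, punishment, sucker (T>R>P>S)

def payoff(my_move, their_move):
    """Return the payoff for my_move against their_move ('C' or 'D')."""
    if my_move == 'C':
        return R if their_move == 'C' else S
    return T if their_move == 'C' else P

def play(strategy_a, strategy_b, rounds=100):
    """Run an iterated match; each strategy sees the opponent's last move."""
    score_a = score_b = 0
    last_a = last_b = None
    for _ in range(rounds):
        move_a = strategy_a(last_b)
        move_b = strategy_b(last_a)
        score_a += payoff(move_a, move_b)
        score_b += payoff(move_b, move_a)
        last_a, last_b = move_a, move_b
    return score_a, score_b

def tit_for_tat(opponent_last):
    # Cooperate first, then mirror whatever the opponent did last round.
    return 'C' if opponent_last is None else opponent_last

def always_defect(opponent_last):
    return 'D'

print(play(tit_for_tat, tit_for_tat))      # mutual cooperation: (300, 300)
print(play(always_defect, always_defect))  # mutual defection: (100, 100)
```

Two reciprocators end up with triple the score of two pure defectors over 100 rounds, which is the evolutionary pressure toward "social graces" the argument leans on.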

> I'm working on delivery of a petabit (i.e., >1Pbps forwarding rate)
> router for the internet core. What are you working on? (Genuine
> curiosity, not sarcasm.)

Cool. Power to you. I'm working on getting all of y'all to age slower
so as to increase your chances of completing your ambitious projects.

--
* I believe that the majority of the world's Muslims are good, *
* honorable people. If you are a Muslim and want to reassure me and *
* others that you are part of this good, honorable majority, all *
* you need to say are nine simple words: "I OPPOSE the Wahhabi cult *
* and its Jihad." *



This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:14 MDT