Re: SI ultimatum speculation

Dan Clemmensen (dgc@shirenet.com)
Sat, 07 Sep 1996 21:32:25 -0400


Michael Wiik wrote:
>
[SNIP]
>
> I asked a friend the other day why there are no groupminds in existence
> now - why specifically some group of people has not gotten together and
> formed some sort of synthetic being, that would exist on the Internet.
> Perhaps they would choose a particularly charismatic personality to
> represent them in real life - such person being linked to the others
> in the group via hip PC's, eyepiece monitors, etc, (the Borg look).
> His response - that people have families, their own concerns, their will
> to be independent, etc, all makes sense, yet I think it is a barrier to
> becoming >H.
>
> Perhaps such groups do exist, yet choose to remain secret. Or maybe I
> just haven't heard of any.
>
> If you're an advanced AI or some sort of transcendent being, who could
> choose to take the current human race with you or not, how long would
> you give them to get it together - to end war, suffering, murder, etc.
> Civilization has had about 10K years, I'm not sure I would wait that
> long. Putting the question as an ultimatum - "get it together in a week
> or be destroyed", *might* provide sufficient incentive, but I doubt it.

My feeling is that speculation by humans about the motivations and actions
of an SI (AI or any other type) is not worthwhile. It's equivalent to
ants speculating about humans, or humans speculating about God.

So, on to the worthless speculation! An SI has no reason to deliver an
ultimatum. The SI can simply take human civilization and modify it at
the optimal rate consistent with the SI's (incomprehensible) goals.
For example, let's arbitrarily assume that the SI has determined that
revealing itself to humanity will slow our "progress" toward whatever
its goal for us is, but, being a benign being, the SI wants to stop wars.
In this (arbitrary) scenario, the SI cannot instantly stop all wars without
revealing itself, but it might guide humanity to prevent big wars and
minimize small ones. I think your "one week" ultimatum is unlikely. The
SI could simply take control of the body of anyone who breaks the SI's
rules, without even letting them know.

Basically, I don't believe that SIs that are only somewhat more intelligent
than humans will exist, except very transiently, because an SI that's only
somewhat more intelligent than a smart human can make itself smarter
still. This feedback mechanism has no obvious limit until the SI has
godlike powers, and the transformation will occur in days or weeks.