Re: The Politics of Dancing [was: SPACE/IDEA FUTURES...]

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Dec 06 2001 - 06:56:51 MST


"Robert J. Bradbury" wrote:
>
> There is a post in the queue that I believe addresses some of this.
> But I have to protest the characterization of being a "singularitarian".
>
> I am an extropian. I would probably classify myself as a "romantic"
> extropian. So I'm all for the creation of new information but *not* at
> the expense of "old" information. *I* consider the label "singularitarian"
> to be individuals who value the acceleration of change without considering
> the value of "old" information. (I'm sure others may differ with definitions
> but we aren't exactly dealing with dictionary terms here.)

Yes, but I've never heard that definition before (and here I thought I'd
heard them all). Furthermore, it should be obvious that this does not
reflect the philosophy actually held by a great many Singularitarians - or
at least, they would not describe it that way - where "Singularitarian" is
defined under either of the two more common definitions: one who predicts
or expects a Singularity, or one who acts so as to bring about a
Singularity.

Whether old information is valued is, strictly speaking, quite
orthogonal - although, in fact, I know of no Singularitarian who assigns
it a definite value of zero.

> I often reflect upon the quote:
> "You must give up everything you are for who you might become".
> (Sources of this are unclear to me at this time.)

So why attribute it to the entire Singularitarian community? Especially
when what binds us together is our common belief that we ought to be doing
something about the Singularity, rather than our beliefs about what we
plan to do afterward? I may think that someone who plans to spend the next
million years waterskiing is being silly, has completely failed to get the
whole point of intelligence enhancement, and will almost certainly give up
this plan once the alternatives become clearer; but that doesn't mean
that, here and now, the person can't contribute to the Singularity.

> Instead, I consider:
> "You should become whatever you can be remembering who you once were."

Is memory enough? Never mind, orthogonal issue.

> I'll simply note the lack of a Sysop AI in either of these scenarios.
> (Something I'm sure will draw flak).

Good call. Why? Because:

THE SYSOP SCENARIO HAS NOTHING TO DO WITH THE SINGULARITY.
THE SYSOP SCENARIO HAS NOTHING TO DO WITH SINGULARITARIANISM.
THE SYSOP SCENARIO HAS NOTHING TO DO WITH THE SINGULARITY INSTITUTE.
THE SYSOP SCENARIO HAS NOTHING TO DO WITH FRIENDLY AI.

Everyone repeat this to themselves until such time as it reverses whatever
damage I inadvertently did by blurting out such an unfortunately monikered
description of A SINGLE POSSIBLE OUTCOME of the Singularity GIVEN A
CERTAIN SET OF STARTING ASSUMPTIONS.

> In a singularitarian scenario
> there is a fundamental question of a democracy of power or a delegation
> of power. In the former situation everyone gets to evolve as fast as they
> choose to do so; in the latter it is a function of the 'overlord(s)' being
> willing to grant favors.

As some of you may have noticed, by this point I've simply given up on the
Sysop Scenario, the reason being that it apparently cannot be explained to
a substantial minority of human beings, including Extropian Elders. Just
as our visual cortex can be made to interpret nonexistent shapes by
optical illusions that incorporate a few simple visual cues, there is
apparently some chunk of brainware somewhere that insists on interpreting
the Sysop as an extremum of the tribal-chief archetype, regardless of all
lectures on anthropomorphism. I am not willing to fight it any more; I
give up. It was never necessary to the description of the Singularity in
the first place, and it is not necessary now.

I will simply confine myself to noting that "in the latter, it is a
function of the overlord being willing to grant favors" implicitly assumes
that the probability of the favor being granted is something less than
100%. It assumes, in short, that the Sysop behaves like a tribal chief:
selectively granting favors, either to enforce the return of reciprocal
favors, or simply for the joy of meddling. It assumes that the presence
of an intelligent substrate for reality retards change because that
substrate interferes with it, rather than improving change because that
substrate assists with it. Above all, it involves the inherent
human distrust of any mind involved in a transaction, based on our always
assigning a significant probability of that mind interfering with the
transaction, which in turn is an evolved instinct resulting from the fact
that our sole experience of minds is with humans who *do* tend to
interfere. When was the last time a law of physics decided to interfere
with a falling rock? In time humanity will learn that not all
minds-in-general are as meddlesome as we, but until then I've given up on
fighting instinct.

Otherwise the distinction is simply that between needing to evolve oneself
using nonsentient technology, and being able to make a much wider range of
choices about the type and speed of evolution using assistance from a
sympathetic superintelligence. Do you really want to try rewiring your
own neurons? Well, apparently you do. But if you do try to go it alone -
which is, of course, possible under a Sysop Scenario; simply not
recommended - it is going to impose certain constraints on the directions
in which you can evolve, just as genuine evolution imposes other
constraints on the design of the organism. You would probably get to the
same place eventually, but your choice of intermediate path - that is, the
kind and number of steps between humanity and superintelligence - would be
much more constrained.

> I'm not an individual who strongly favors "court politics", so I do
> not consider myself to be a "singularitarian".

I'm not an individual who strongly favors anthropomorphic descriptions of
superintelligence, so I'm against the use of the term "court politics".

> It would be nice if this "realm" of perspectives had a positional
> scale similar to that which Max recently provided regarding "IDENTITY".
> That would serve to foster productive discussion instead of unproductive
> sniping.

And on it, beliefs about the Sysop Scenario would be totally orthogonal to
beliefs about the Singularity. Pardon me; beliefs about the Sysop
Scenario would be dependent on beliefs about the Singularity, but not vice
versa.

But you know what? Screw it. It doesn't matter. Consider the Sysop
Scenario junked. Leave the Singularity blank.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


