Re: >H: The next 100 months

Sayke@aol.com
Tue, 19 Oct 1999 01:27:40 EDT

In a message dated 10/18/99 7:33:30 PM Pacific Daylight Time, sentience@pobox.com writes:

> Sayke@aol.com wrote:
> >
> > In a message dated 10/18/99 9:22:53 AM Pacific Daylight Time,
> > sentience@pobox.com writes:
> >
> > 1:30 pm est: eliezer climbs up onto the roof of the singularity
> > institute's main building, pulls a large optic taser out of a suitcase,
> > plugs it into the wall, checks the backup batteries, and sits down on
> > an airvent to await... whatever.
>
> It's entirely irrelevant what I do after this point. Or at least,
> that's the idea. I would simply wait, just like everyone else... And
> will it ever feel good to be able to say that.

in the first paragraph i write, and so as to get it out in the open, i would like to say that i did not intend my neal-stephenson-wannabe rant to be taken nearly as seriously as you seem to have taken it. i was not trying to "shock" you; my opinion of you is higher than that. also, i took it as a given that, if the singularity somehow appears to 'do nothing', you'll try something else. and i don't think i'll be in a position to steal a russian nuclear sub anytime in the next several years... but ya never know... ;)

[snip 'if !singularity();' scenarios]

> > 1:55 pm est: eliezer stands on an air vent casing, looks around, for
> > one last time, at the placid cityscape surrounding him, and starts
> > busting high voltage caps into the air and shouting "hey, elisson! i
> > wanna know the meaning of life, the universe, and everything!!!"
>
> I think you've managed to misunderstand my motives completely.

poetic license, dammit! you had to stand on a high place and shout something!

> > 1:57 pm est: although elisson notices its surroundings almost
> > immediately, it takes a short time for it to realize that the ant on
> > the roof is its creator. its decision-making process is something
> > vaguely like the following: "a monkey is discharging a monkey weapon
> > on the roof. it might do something bad with that. no, there is no way
> > it can damage me with that. this monkey seems to be one of my primary
> > creators. it's asking me questions. it is not necessary to answer its
> > questions. cells 0x9e83fa823 through 0x9e83fc907, disassemble the
> > monkey."
>
> Oh, please! A Power isn't a super-AI any more than it's a super-human.

hey, i've got an idea: how about we predict the by-definition unpredictable? and for the record, i tend towards agreeing with you, but... it's just that you seem to be trying to exude this aura of knowing what you're talking about (even when you can't know what you're talking about). i don't dig those auras. [excuse my wetware; it's annoyed.]

> > 1:58 pm est: on the roof, the wind picks up, and eliezer notices the
> > dust rise from the ground like a fractal wave of soot, and opens his
> > arms in welcome. elisson, like a sandblaster, embraces him. eliezer
> > ceases to exist in a sheet of black razorblade snowflakes.
>
> Am I supposed to be shocked by this scenario? You don't want to know
> what I would consider a bad end.

naaaaaa, you weren't supposed to be shocked. amused, possibly... and i really do want to know what you would consider a bad end. do tell.

> There are two problems with trying to shock me this way. First, unlike
> you and den Otter, I suffer from no illusion that the world is fair.

since when do i think the world is fair? when i put words into your mouth, i at least attempt to be ridiculous to the point of amusement. you actually sound like you think i think the world is fair. amusing, yes, but not quite ridiculous enough for me to infer an attempt at irony and thought-provocation. i am left with no choice but to take you seriously.

> You believe, because it is implicit in the human model of the world,
> that every risk can be ameliorated by your actions.

you believe, because it is implicit in your attempt to rebel against the human model of the world, that the human model of the world is wrong, because it is the human model of the world.

another idea: how about we stop making unsupported assertions about each other's beliefs and just ask each other what the beliefs in question are?

> You'll choose a
> path that's far more dangerous than mine in absolute terms, simply
> because it allows you to "do something", or believe you're doing
> something, about the risk that you'll be destroyed by AIs.

::shrug:: death is probably final. there is no more "absolute term" than "i'm dying". so... resistance is not futile, etc... remember the alamo!

> I choose the
> path with the best absolute probability, even if it isn't as emotionally
> satisfying, even if it contains risks I admit to myself that I can't
> affect, because the next best alternative is an order of magnitude less
> attractive.

best absolute probability of what, exactly? and why is that to be striven for? if you don't trust philosophy and you don't trust your wetware, what do you trust? ("and who do you serve?" sorry... damn that new babylon 5 spinoff...)

and please describe exactly how you figure that the next best alternative (become a singularity, dammit! and please don't tell me how hard it would be. it's a lot like cryonics: better than nothing) is an order of magnitude less attractive?

> If Powers don't like mortals, then mortal life is doomed and there's
> nothing we can do about it - whether we're humans or AIs or neurohacks
> or augments or telepaths or hybrids or anything else on this side of the
> line - that doesn't involve such enormous risks that we'd have to be
> a-priori certain that the Powers would kill us before it would be
> survival-rational to do anything but steer for a Singularity.

there is a nonzero chance that there are no powers, and there is a larger chance that the powers don't know about us, because they can't see us, and even if they did, they wouldn't care, simply because they can't get here in less than a Very Long Time. ftl could very well be impossible, and the economics of being a deity could be such that leaving the solar system of your 'birth' is a waste of time. at least, this is the impression that i get. correct me if i'm wrong.

and anyway, it seems to me that you're basically saying "the powers will eat us if the powers will eat us. their will be done on earth, as it is in heaven, forever and ever, amen." damn the man! root for the underdogs! etc... (yes, i know my saying that probably has something to do with my tribal-issue wetware. so? it makes sense to me. if it shouldn't, point out the hole in my premises.)

> Second, in a case like this, I would have to evaluate whether I wanted
> my ultimate goal to be survival.

does not 'the state of having goals' depend upon personal survival? if so, are not all other goals secondary to personal survival?

> I don't really have to do that
> evaluation now, because the Singularity is intuitively obvious as the
> thing to do next. Which is good, because I don't really trust
> philosophy, even my own; I do, however, trust certain kinds of
> intuitions. Nonetheless, if my choice of actions became dependent on
> philosophy, personal survival wouldn't be my first pick as priority goal.

the singularity is not, to me, intuitively obvious as "the thing to do next." and i do not trust any kind of intuition, if i can help it. why do you? yes, i'm asking you to justify your reliance on intuition (if that's what it is), and that's philosophy. if you will not explain, please explain why you will not explain.... heh.

> > pardon my rehash of what seems obvious, but isn't suicide bad?
>
> Prove it. I don't trust philosophy, but I trust the limbic system even
> less.

that the limbic system agrees with me is completely beside the point. in order to have goals, i must exist. in order for goals to effectively *be*, i must exist. i think significance is an observer-dependent property; even a me-dependent property. quite literally, nothing matters if i'm not there to say "that matters." goals are a function of the-point-of-view-that-i-am, like perceptions. if you disagree, please tell me how and why. as best i can figure, you haven't really answered den otter. i've interpreted you as saying, in response to his questions, "i've been able to change my goal system to something that may seem irrational to you; deal with it." ...?

and are intuitions not a function of your tribalistic and vestigial wetware, as well as my instinct for survival?

and why are you not a functional solipsist? (or enlightened self-interested person, if you want to use that term... but that's so clumsy.)

baaaaaah humbug. may you live in interesting times,

sayke, v2.3.05