Re: transitional thinking

From: J. R. Molloy (jr@shasta.com)
Date: Wed Dec 13 2000 - 22:23:11 MST


Justin Corwin writes,
> I would argue, no offense intended at all towards the singularity folks,
> that it already is a religious meme. The singularity is supposed to solve
> every problem, from the time you burnt your toast to cats and dogs living
> together in harmony, which smacks of faith to me.

Well then, technological singularity could also solve the problem of faith.
Rather like the self-healing/faith-healing,
too-smart-for-mere-mortals-to-comprehend SI, which provides the answer to
why anything exists. No problem.
Nothing offensive in that (unless I'm expected to do something about it).

> Not that I think greater-than-human intelligence won't solve problems we
> can't on our own, but perhaps the singularity will not be the incredible
> force you're expecting.

I cannot expect technological singularity to manifest as a force, unless
that force already operates in reality. As I understand extropy, it
qualifies as a pre-singularity force that fits the profile of
hyper-accelerated positive change, which has the potential to evolve into a
hyper-complex super-adaptive mega-system that will rip itself out of the
constraints of biology and boot itself into galactic commandeering like we
can't imagine. Its progeny shall eat universes for breakfast.
Nothing offensive in that (unless I'm expected to do something about it).

> What if self-modifying humans with their patterned parallel processors are
> never overtaken by machine intelligences? It's just as likely, as a human
> who is self-modifying has less far to go to become more intelligent than a
> human; all he would have to do is become ANY smarter. A machine
> intelligence doesn't exist yet. Going from 60 to 61 is easier than 0 to
> 61, even with our design flaws.
> Just a theory.

I like the idea of self-modifying humans. For starters, I'd modify myself so
that I could earn an enormous amount of money to pay for the operation that
made me self-modifying.
After that, I'd set about teaching the world that Democrats will not let
machine intelligence get out of control, even if they have to hand-count
every ballot in the country.

> I do, however, believe that expert systems and ultratech/ultrasoftware
> will play a role in improving us, as we're unlikely to do it just through
> biological manipulation (gene therapy, bioware, etc.). Although you could
> just add a bunch of grey matter and see what happened.

Yes, let's start with... uh, how 'bout me. Go ahead, strap me down and pour
some grey matter in there. Not too much, though. I might start thinking about
the meaning of "improving us" and go haywire. Relax, I'm just kidding. What
I really want to do is work as hard as I can to become a real live
transitional-post-human. (I'll just die if I don't make it.)

> (A wonderful quote from a bad movie:
> "...so, what does an 18-foot predator with a brain the size of a flat-head
> V12 engine think about?..."
> --Deep Blue Sea)

What would such a predator _need_ to think about? I've been thinking about
the Global Brain lately. (Greg Stock used to call it "Metaman.") Suppose a
super-intelligent super-organism were to solve all our problems for us. Then
what? Isn't that what people invented god for? So there you are, back with
religionism, and like a jack-in-the-box, here comes deus ex machina.

> The bottom line is that transcendent AI is cool, and massively useful
> (theoretically). But it's also not here, so we need to be careful where we
> put our trust (faith?). As long as we're not sitting on our butts waiting
> for momma AI to come and fix our problems, the religious nature of
> "awaiting the singularity" won't be an issue.

You are right to advise caution, I think. Nevertheless, an SI that can solve
all my problems may not be worth the problems it creates by requiring me to
build it. I have no clue where to begin, which tempts me to stop right now.
Suddenly I feel as though I have no problems. Has the singularity happened,
or will even more unproblematical feelings herald its birth...

> My big question is: when the singularity does occur, how do we keep
> people from treating it like a GOD, or a Demon? No matter how logical and
> well constructed it is, being treated like a god would hamper its
> situation, its flexibility, and its psychology. (Assuming a general
> intelligence would have a similar psychology to other general
> intelligences, such as humans, being given absolute worship and power has
> a tendency to mess with your head and hamper your effectiveness.)

I see no reason why people should _not_ treat an SI like a god. A benevolent
god would be nice, but a demon with the right answers would do fine. Don't
forget, the singularity is you... if you successfully self-modify.
Nothing offensive in that (unless I'm expected to do something about it).

Stay hungry,

--J. R.
3M TA3

"Truth is a pathless land."
--J. Krishnamurti
