From: Robert J. Bradbury (email@example.com)
Date: Fri Jan 18 2002 - 12:30:29 MST
On Thu, 17 Jan 2002, Wei Dai wrote:
> Human individuals and societies tend to change because
> individuals die and are replaced, or we adopt some idea that is initially
> attractive and then find out that it doesn't work, or we learn something
> new and surprising about the nature of reality. SIs presumably will not
> have these problems.
I don't see any reason to believe that is likely to be the case.
People may simply tire of life. They may go "offline". A consensus
may develop to completely re-architect the SI to focus on a particularly
interesting problem of art or math or string theory.
> It seems plausible that they will quickly settle into
> a stable state with long term goals that don't change over time.
Well, one stable long-term goal would seem to be self-preservation.
It doesn't require rocket science to realize that one will eventually
(in the far, far distant future) be in competition with one's clones.
How that affects one's reproduction decisions now isn't clear.
If humans are an example, some reproduce and some don't.
> In other words, I think rational design and goal-oriented action will play
> a much bigger role than survival-of-the-fittest kind of evolution in
> governing how SIs change over time.
But what is "rational design"? Isn't it the application of a prior
knowledge base according to principles of what has previously worked? We
now know that "evolved" neural and electrical circuits can go places that
humans can't design, and that we have a difficult time understanding how
they work once they are developed. How can we say SIs will not use similar
methods to explore the space "outside" of what can be seen from an existing
"rational design" perspective?
> Human beings are able to preserve trust relationships over periods
> comparable to their life times. Why shouldn't SIs be able to do the same
> thing? What problem will prevent them from doing this over millions of
Humans don't have the ability to turn themselves into something completely
different from what they are. I cannot currently turn myself into a dog.
I am fundamentally a human with evolved structures that generally dictate my
social interactions with other humans -- just as a dog has different evolved
structures that dictate its interactions with other dogs. SIs do not
generally have significant amounts of "social" interaction. They spend
most of their time alone in the vast reaches of space. So it's entirely
possible that trust relationships and social interaction guidelines might
never naturally evolve. I agree that there might be some universal
optimal behavior from a game theoretic standpoint.
As has been discussed before on the list, one may always play nice
at the SI level because one can never guarantee that an entity one might
betray has not prepared berserker-bots in anticipation of just such
a situation. One plays nice because survival depends on it.
That means that in the final, ultra-long-term future, SIs don't consume
each other but themselves.
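The play-nice argument above is essentially a deterrence calculation. A minimal sketch, with entirely hypothetical payoff numbers (none of them come from the post):

```python
# Hypothetical payoff sketch: an SI weighs the one-time gain from
# betrayal against the expected cost of triggering a pre-positioned
# retaliation (the "berserker-bots" scenario). Playing nice has
# payoff 0; betrayal is rational only if its expected value exceeds 0.

def expected_value_of_betrayal(one_time_gain, retaliation_cost, p_retaliation):
    """Expected net payoff of betraying, relative to playing nice."""
    return one_time_gain - p_retaliation * retaliation_cost

# Even a modest chance that the betrayed party prepared an
# overwhelming response makes betrayal negative in expectation:
ev = expected_value_of_betrayal(one_time_gain=10.0,
                                retaliation_cost=1000.0,
                                p_retaliation=0.05)
print(ev)  # 10 - 0.05*1000 = -40.0: playing nice dominates
```

The point of the sketch is only that when the retaliation cost is large and unverifiable, the break-even probability of retaliation can be tiny, so "one can never guarantee" is enough to sustain cooperation.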
> You can fuse hydrogen into metals and build your reactors in an
> exponential bootstrapping process. It doesn't seem to be an insurmountable
Yes, but it would be nice to know the rate of increase on this.
The flat part of the curve could be many millions of years.
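To put rough numbers on that: under simple exponential replication the total time is (number of doublings) x (doubling time), and the number of doublings from any plausible seed to a stellar mass is modest, so the doubling time dominates. A sketch with hypothetical masses and rates:

```python
import math

# Sketch of exponential bootstrapping, all numbers hypothetical:
# a seed reactor of mass m0 doubles its processing capacity every
# T_d years until it has worked through a star's worth of mass.

def bootstrap_time(total_mass_kg, seed_mass_kg, doubling_time_years):
    doublings = math.log2(total_mass_kg / seed_mass_kg)
    return doublings * doubling_time_years

# ~2e30 kg (roughly a solar mass) from a 1e6 kg seed is only ~81
# doublings, so the answer scales linearly with the doubling time:
print(bootstrap_time(2e30, 1e6, 100))    # fast doubling  -> ~8e3 years
print(bootstrap_time(2e30, 1e6, 25000))  # slow doubling  -> ~2e6 years
```

So whether the curve's flat part lasts millennia or millions of years comes down almost entirely to the per-doubling replication rate, which is exactly the unknown being asked about.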
> BTW, do you consider full matter-to-energy conversion via small black
> holes to be "magical physics"? If not that would allow you to disassemble
> a star even more quickly.
True. I suppose I would have to see some concrete plans for how
to move the black hole someplace where it can be put to good use,
and a way of extracting more useful energy than is wasted.
If most of the energy comes out as gamma and X-rays, then it seems to
be a pretty hellish environment, where even with full-scale nanotech
one might expend a significant fraction of the available energy
dismantling and reassembling the machinery, as well as sorting and
breeding the isotopes being generated, just to harvest some fraction
of that energy.
It might not be "magical physics", simply "impractical physics".
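For scale, the attraction of full conversion is real even if the harvesting is impractical: hydrogen fusion liberates only about 0.7% of rest-mass energy, so complete matter-to-energy conversion yields roughly 140x as much per kilogram. A quick check using the standard E = mc^2 relation (the feedstock mass is illustrative):

```python
# Back-of-envelope comparison of full matter-to-energy conversion
# against hydrogen fusion, which releases ~0.7% of rest-mass energy.

C = 2.998e8  # speed of light, m/s

def rest_mass_energy(mass_kg):
    """Total rest-mass energy in joules, E = m c^2."""
    return mass_kg * C**2

mass = 1.0e6  # kg of feedstock, hypothetical
full = rest_mass_energy(mass)
fusion = 0.007 * full  # H -> He mass deficit is ~0.7%
print(full / fusion)   # ~143x more energy from full conversion
```

That factor is what makes small-black-hole conversion tempting in principle, while the gamma/X-ray harvesting problem above is what may make it "impractical physics" in practice.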
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 13:37:35 MST