Re: >H RE: Present dangers to transhumanism

Stan Kretler (skretler@infinitefaculty.com)
Thu, 02 Sep 1999 11:52:27 -0700

"Robert J. Bradbury" <bradbury@www.aeiveos.com> wrote:

: On Wed, 1 Sep 1999, Stan Kretler wrote:
:
: > From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
: >
: > : Frankly, I think a fairly large percentage of you *are* naive
: > : technophiles. You think you can take controlled sips from a tidal
: > : wave.
: >
: >
: > I just joined this list recently, but I do not see a lot of what I
: > would call "naive technophiles". I do see a lot of technophiles,
: > and a lot of naivete, but the naivete seems to be about the
: > philosophical undergirding of transhumanism, not about the
: > technology. Taking "controlled sips from a tidal wave" seems easy
: > enough. Technology can do anything. But thought about *why* taking
: > a controlled sip from a tidal wave is a worthwhile goal seems to be
: > absent from this list.
:
: I know why -- because I don't want to drown. :-)
: But, the point about the "philosophical undergirding of transhumanism"
: strikes home. We are faced with a fundamental problem of constructing
: a philosophy for a system which we are woefully ill-equipped to
: comprehend.
:
: Humans have been selected to be about two things: survival &
: reproduction.
: Now, all of the technophile discussions, AI discussions, political
: discussions, etc. seem to some degree to revolve around --
: "how can *I* survive?", and as a subtheme --
: "and how can *I* live in an environment that is just the way *I* like
it?".
: Sometimes this gets extended to inviting/dragging other people along
: into your perfect environment or encouraging them to create their own.
:
: Now, leaving aside the technical details of whether it is really
: feasible, let us *assume* that survival is guaranteed. It does seem
: like that is a distinct possibility (ignoring the probabilities).
: So let's say we are surviving in whatever enhanced form and perfect
: environment we have chosen.
:
: How does a species that is to a large degree based on a fight for
: survival develop a philosophy, or a raison d'etre, for this
: environment? Yes, we have all seen the posts -- I want to travel,
: I want to create music/art/explore interesting scientific problems,
: etc. I raised a point a week or so back, that perhaps the purpose
: of it all was to turn all of the available matter & energy in
: the universe into a "perfect" structure or the "ultimate" thought
: machine, or maybe the single most aesthetic work of art. I don't
: recall anyone commenting on that, but I think it begins to chip
: at the problem.
:
: When it is *not* about survival (or reproduction) -- what is it about?

This is an important question, but it is a bit different from the one I was getting at. You are asking about the point of existence once we are fundamentally different beings from what we are now. I'm asking whether there is a solid foundation for the *desirability* of becoming such fundamentally different beings in the first place. Almost everyone on this list, it seems, finds the idea desirable. Me too. But when I ask myself why, I can't come up with very deep answers. "It will be great to be immortal"; "It will be cool to be *really* smart"; "It will be fun to live on other planets"; etc. But "it will be great" isn't a philosophical justification. A friend of mine says the reason for life-extension is to figure out the reason for life-extension. Maybe something similar applies here, although that is still not a fully satisfying answer.

Of course, if we answer *your* question, then mine will probably be taken care of!

:
: Now, the other possibility that seems to fit the available data
: is this -- become an SI/AI, think about a philosophy for existence,
: realize that there is no philosophy (because survival *was*
: the purpose for existence and once survival is guaranteed,
: existence is pointless); then dismantle yourself back into atoms.
: This seems to fit the paradox of "Why do we still see stars?".

Yes, this is the possibility I think is likely.

Cheers,
Stan