"Robert J. Bradbury" wrote:
> T.O. Morrow contributed a nice, interesting philosophy (quasi-religion)
> to the discussion as a framework in which to consider how to
> run simulations, what you can do in them, and how to "hold" the
> simulation directors and producers.
> I'm not going to go into that because I'm not sure I understand it
> well enough.
> However, his discussions with Samantha, placed side by side with
> Eliezer's Sysop discussions, do raise some interesting issues.
> If we assume that extropian principles are those striving for
> increasing complexity and sophistication (this may be vulnerable
> to attack, but let's assume it for a minute) -- and *if* we assume
> that conscious entities are inherently valuable and you cannot
> infringe on their right to self-determine whether or not they should
> continue to exist -- do we not have an inherent conflict?
> I.e., if you exhaust all the possibilities (read: use up all the matter
> to store the memory states of all the conscious entities that have
> ever been created and who cannot be destroyed) -- does not the moral position
> that you "cannot" erase those states imply that you cannot fulfill the
> extropian prime directive?
I would not think so. There is considerable overlap of material among
human mentalities. I rather doubt that all the memories and
fundamental personality structures take up all that much space. After
all, they don't take up that much space even coded very inefficiently and
redundantly in meat-brains. So I would be very surprised if you ran out
of computational space in the manner described. Also, there is
nothing in the implied moral position that says that all beings must be
running at every nanosecond. From the point of view of the beings there
would be no difference as long as their milieu did not change out from
under them too rapidly to cope with at all.
> The fundamental path through which evolution operates is to wipe the slate
> sufficiently clean from time to time to allow it to go off in a completely
> different direction. This is not allowed in a "moral" SysOp world interested
> in protecting everyone's "self"-interests. (I doubt the SysOp has a
> self-destruct sequence tied to a random interval timer.)
AFAIK, evolution does not really wipe the slate clean and go in a
different direction. It mostly builds on what is already present.
Besides, evolution is not some super-morality or model for all that can
or should be forevermore. I do wonder just how the definition of a
"self" that has interests that should be catered to is to be arrived at.
> Now, from an extropian perspective, you desire that the phase space be
> explored as completely as is feasible. Accepting that, you realize that
> sooner or later your sim will have had its run and your bits are up for
> reuse. From a truly extropic perspective this is fine with you.
I couldn't care less whether phase space is explored as completely as possible
as long as the beings present can continue to grow and learn as much as
possible. If that takes exploring phase space completely, then so be
it, but the goals/desires and wellbeing of the beings involved are the first
priority, not to be sacrificed for some other goal imho. I do not see why
all bits need to be reused for something else to the degree that any
being need be wiped, so your dilemma is lost on me.
> Put another way, I think that Eliezer and Greg have a problem adhering
> to extropic principles unless there is some very magic hand-waving
> done on why "extropians" or "SysOps" would choose to limit the
> phase space that can feasibly be explored.
I find this a curious projection of our current scarcity
thinking onto what is not, in any reasonable timeframe, likely to be
scarce. It seems like a projection of the need for death and recycling
onto a totally different context, and a claim that this is required to
actually support extropian principles no less!
> In short -- what do you do in the simulation where you get to the point
> where you cannot explore any further without 'offing' someone against
> their will?
You do not explore further. But I think this is a strawman that in
practice is highly unlikely.
> [Now there may be parts of the Extropian Principles that deal with this
> (I haven't checked them), but it would seem to me that the consequence
> of this is that what we are really discussing is "Limited Extropianism"
> and not a consciously driven (read: probably more efficient) full-blown
> extropianism where you fully explore the phase space.]
What makes fully exploring the phase space a wonderful goal, extropian
or otherwise, if you involuntarily eliminate (murder) some extropians in the
process of chasing after this goal? Goals can only be reasonable in the
context of what they are for. You lose that context when you speak of
offing people imo.
> Note, there may be a form of "moral" extropianism where you explore
> the phase space as fully as possible, but you do it with as little
> "pain & suffering" as possible. This gets extraordinarily tricky
> as for example would be the case where one want to determine whether
> pain & suffering can drive people to self-enlightenment such that
> they realize pain is something that "they" have and can choose
> to experience in a variety of ways.
I doubt we are wise enough to decide exactly what our own wellbeing and best
goals are, and especially doubt that we are wise enough to decide what they are
for others. In light of that, I find it morally objectionable to talk of
offing one another to supposedly better pursue the goals we all agree
to. I don't agree to any such goals at that price.
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:41 MDT