ETHICS: Mind abuse & technology development [was RE: New

GBurch1@aol.com
Sun, 29 Aug 1999 10:26:04 EDT

In a message dated 99-08-27 07:21:42 EDT, bradbury@www.aeiveos.com (Robert J. Bradbury) wrote:

> It seems to me its a really big mess.

We sure do have a mess on our hands. Development of AI, uploading, mind-sculpting and mind-copying will make the philosophical conundra of mediaeval Christian philosophers look like child's play . . .

> On Thu, 26 Aug 1999 hal@finney.org wrote:
>
> > There will be new minds created, conscious and full of potential, but
> > some of them will have different opportunities and training than others.
>
> I think we have a problem with the term "mind" here. We have the
> concept of a conscious entity. We have the concept of creating
> an "unformed" conscious entity and letting it evolve (children).
> We have the concept of uploading oneself onto new hardware.
> We have the concept of creating backup copies to be reactivated
> in the case of an accident.

Yes, the term "mind" seems applicable in all of these situations and, as your questions below point out, we MIGHT mean different things by that term in different circumstances. But then, we may not. One of the most exciting prospects presented by the new technologies listed above is that we may finally come to know whether "mind" is really one KIND of thing in all these different circumstances. My feeling is that we're coming to realize that the word "mind" most meaningfully refers to the intersection of more than one kind of cognitive process; for instance (to cite one small example), the intersection of "episodic memory" and "task analysis" when we "figure out" a generalized lesson about some situation from reviewing one or more memories of personally-experienced past events.

A MORAL theory of mind (which seems to be what you're looking for) may be dimly perceived in this insight, as applied to the questions you've presented. As a first pass at formulating such a moral theory of mind, perhaps we can say that an entity should be treated as both a moral subject and a moral object TO THE EXTENT THAT it exhibits the various distinct elements of "mind". As an example, consider a book or a computer hard drive as an instance of memory. Both are utterly passive repositories of information, incapable by themselves of processing any of the data they record. Thus, a book or a current computer storage device exhibits only one aspect of mind (and that only dimly). Likewise, consider your PC's CPU: it is capable of processing data in a very simple way, but without software running on it and access to memory, it is only POTENTIALLY capable of exhibiting one or a few of the capabilities of "mind". In the proposed moral theory of mind, we do not consider these examples to be very significant moral objects or subjects; although, interestingly, some people DO consider them to be very slight moral objects, in the sense that there is a slight moral repugnance to the notion of burning books (no matter who owns them) or, as has been discussed here recently, "wasting CPU cycles".

From the proposed axiom of "mind morality", one could derive specific propositions of moral imperative. For instance, it would be morally wrong to reduce the mental capacity of any system, and the EXTENT of the wrong would be measured by the capacity of the mental system that is the object of the act. Thus, willfully burning a book would be bad, but not very bad, especially if there is a copy of the book that is not destroyed. It might be more wrong to kill an ant (depending on the contents of the book with which one was comparing it), but not as wrong as killing a cat or a bat.
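
To make that proportionality concrete, here is a minimal toy sketch (entirely my own illustration, not something proposed in this thread) of how one might score a system's "mind capacity" from a rough inventory of which elements of mind it exhibits and how strongly. The element names, weights and numbers are hypothetical placeholders.

    # Toy sketch of the proposed "mind morality" axiom: moral weight is graded
    # by how many distinct elements of "mind" a system exhibits, and how strongly.
    # Element names and weights below are hypothetical placeholders.

    ELEMENT_WEIGHTS = {
        "memory": 1.0,           # passive storage (a book scores only here, dimly)
        "processing": 1.0,       # raw data manipulation (a bare CPU)
        "episodic_memory": 3.0,
        "task_analysis": 3.0,
        "self_model": 5.0,       # reflexivity, what makes a mind a moral subject
    }

    def mind_capacity(profile):
        """profile maps an element name to the degree it is exhibited, in [0, 1]."""
        return sum(ELEMENT_WEIGHTS[element] * degree for element, degree in profile.items())

    def wrongness_of_destroying(profile, copies_remaining=0):
        """Extent of the wrong scales with capacity; surviving copies mitigate it."""
        return mind_capacity(profile) / (1 + copies_remaining)

    book = {"memory": 0.2}
    ant = {"memory": 0.3, "processing": 0.4, "task_analysis": 0.2}
    cat = {"memory": 0.8, "processing": 0.8, "episodic_memory": 0.6,
           "task_analysis": 0.5, "self_model": 0.2}

    for name, profile in [("book", book), ("ant", ant), ("cat", cat)]:
        print(name, round(wrongness_of_destroying(profile), 2))

On these made-up numbers the ordering comes out book < ant < cat, which is all the sketch is meant to capture: the wrong scales with the capacity of the mind destroyed, and an undestroyed copy reduces it.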

Let's see how my proposed axiom of mind morality answers your questions:

> Q1: Do you "own" the backup copies?
> (After all, you paid for the process (or did it yourself) and
> it is presumably on hardware that is your property.)

I don't think you "own" your backup copies any more than you "own" your children or other dependants. Your children are built from the "joint intellectual property" of you and your mate, and all of the atoms making up their bodies started off as your property, but still we don't say you "own" them. Why? Because they are complex minds. Now, you may have special rights and duties with regard to minds that have such a special relationship to you, but "ownership" isn't among them.

Morally, mind is a special sort of "thing". For one, it is a process. Thus, one might be said to have something more akin to "ownership" in the stored pattern of one's backup copies, but once they are "run" or "running", they would take on more of the quality of moral subjects as well as moral objects. Once a system is capable of being a moral subject, "ownership" ceases to be the right way to consider it as a moral object.

Where does this conclusion come from? Simple: The Golden Rule. Minds are a special class of moral object BECAUSE they are also moral subjects. In other words, we have to treat minds differently from other moral objects because they are like "us", i.e. there is a logical reflexivity in contemplating a mind as a moral object.

> Q2: Do you have a right to "edit" the backup copies?
> In some forms, this would be considered "brainwashing", which
> seems morally repugnant.

This is a much more difficult question, especially if one considers that the "editing" can be done before the copy is ever active. Here we face hard questions posed by POTENTIAL minds. I would say that we are not without guidance from traditional moral wisdom: the ethics of child rearing is probably a fairly complete guide to this issue. We "edit minds" all the time: it's called education. It is wrong to fill a child's head with nonsense and hatred because that head is a mind with the potential to become a moral subject just like ourselves. On the other hand, it is not only acceptable but good to do our best to "program" or "edit" the child's mind to make it a BETTER mind. Thus, the proposed morality of mind would find that some "editing" of one's own backup copies would be good, and some bad.

> Q3: If you "edit" the backup copies when they are "inactive"
> (so they feel no pain) and activate them are they new individuals
> with their own free will or are they your "property" (i.e. slaves)?

Without a doubt, they cannot be your slaves, whether you edit them or not. See the response to Q1 above.

> Q4: If you are the "overlord" (an SI evolved from hal with some beta
> test code leased from the Anders & Robert SIs), and you have a
> subprocess running (say an "original" copy of hal, operating in a
> virtual reality to see what would have happened if you *had* tried
> to kiss the girl on your first date). Say, hal[0] kisses the
> girl, goes on to marry her, gets browbeaten into living a
> "normal" life, never discovers the extropian movement, etc.
> The simulation becomes uninformative to the overlord, so
> it is terminated and all copies of hal[0] are erased.
> Have you done something morally wrong?

This is ultimately the hardest question to be found within the general area of the "morality of mind": it is the ethics of godhood. I have proposed one answer in my essay "Extropian Ethics and the Extrosattva" (http://users.aol.com/gburch3/extrostv.html), which is basically an extension of the Golden Rule into the arena of unequal minds with the potential of augmentation, i.e. a godlike being should treat lesser beings as much like itself morally as possible, because those beings may one day be godlike themselves. Even without considering the potential for moral-subject-status equality, though, I believe the godlike SI is not completely without constraint in how it should treat such lesser beings, any more than we are without constraint in how we treat lesser animals.

The morality of mind proposed here would dictate that the subprocess you posit should be treated with the respect due a fairly complex mind, even if that complexity is far less than that of the SI. Consider how we should ethically treat higher mammals who are used in laboratory experiments. Yes, it is perfectly acceptable that even chimpanzees be used in structured psychological testing, but they shouldn't be TORTURED, and the quality of their lives outside the testing arena is very much a matter of moral responsibility for the scientist in whose charge the chimps are placed. In the scenario you posit, it might well be acceptable that the subprocess be subjected to the "test" in question, but I would say that the "ethics of godhood" dictates that the subject of the test be ultimately "freed" from the lab and offered the potential to experience a less constrained life.

> Q5: How does one judge or evaluate moral systems?
> The female praying mantis, eating the male praying mantis
> for protein for the offspring -- is this "natural" or "wrong"?
> A human killing a human is wrong in the context of our current
> reality, but is perfectly reasonable for an overlord who has
> lots of them. It is morally incorrect (and should be punished)
> if SIs throw black holes at other SIs. Moral systems are totally
> irrelevant since in the long run the universe runs down anyway...

I utterly disagree with this last assertion. Just because the universe does not (yet) have a moral purpose does not mean that subsystems within it do not. Yes, there is an inescapable relativism inherent in judging moral systems but, hey, physicists have come to terms with relativity: why shouldn't moral philosophers?

> > Ethical problems arise similar to issues of child abuse, or of children
> > who are not given the same advantages of others. If your neighbor on the
> > next asteroid is creating sentient subminds and not letting them grow and
> > develop, you might be really unhappy about that. You and your buddies
> > might even be able to do something about it. But first you need some
> > ethical guidelines for which kinds of minds are proper and which are not.

Hal's put his finger on the fact that we're not treading entirely virgin moral territory here. We already have to deal with moral questions inherent in interactions of unequal minds and in one person having some kind of moral "dominion" over another. I believe that, ultimately, the same moral principles that guide such questions now CAN be "mapped" onto the harder questions you pose, Robert. But, as I suggest above, it will require that we become clearer than we have been about just what a moral subject IS.

> I've been thinking about this problem -- my moral sensibilities are
> bumping into my freedom seeking libertarian leanings. It gets worse
> because you clearly would argue that someone molesting a child or
> raping another conscious entity (i.e. violating their free will)
> is doing something "wrong". But if you are doing these things in
> "virtual reality", nobody is really getting hurt, so it isn't wrong.
> But once your "virtual reality" expands to include "conscious minds",
> then it is wrong. But a "conscious mind" is to an SI, what a
> thought or a dream is to one of us. So if it is wrong for an SI
> to create scenarios in which conscious minds are molested/raped/killed,
> then it is wrong for us to "imagine" doing those things as well.
> But why would it be wrong to "imagine" something if it isn't real?
> Why is it wrong for an SI to do anything (since it is running nothing
> but a big computer simulation)? The reality is the processors
> orbiting on platforms around a star (or some other architecture).

This is only a difficult problem if we take a simplistic view of what "reality" is. "Reality" for questions of morality and ethics IS mind, so "virtual reality" is in a very real sense more "real" than the underlying physical substrate. (Phew! That was hard to say . . .)

> It gets worse, because if you argue that if its not wrong to
> "imagine" molesting/raping/killing and/or for an SI to freely
> manipulate "conscious minds", then you are on a slippery slope.
> Then you are in a place where "morality" depends entirely
> upon (a) the relative "seniority" (?) of the actor as compared
> with the "person" being acted upon; or (b) the physical
> reality of the actions.

I maintain that the latter factor is all but irrelevant to the moral questions posed here, and that the former bears on the degree of responsibility one can rightfully impose on a moral actor. Thus, the more "senior" the moral actor, the more justified we are in judging its actions (while at the same time those moral judgments become more difficult; thus we eventually encounter an almost Christian moral inequality in which it becomes nearly, but not entirely, impossible for "lesser", "created" beings to morally judge the actions of a much "greater", "creator" being).

As I say, we will eventually find ourselves considering moral problems that would not be foreign to a mediaeval philosopher.

     Greg Burch     <GBurch1@aol.com>----<gburch@lockeliddell.com>
     Attorney  :::  Vice President, Extropy Institute  :::  Wilderness Guide
      http://users.aol.com/gburch1   -or-   http://members.aol.com/gburch1
                         "Civilization is protest against nature; 
                  progress requires us to take control of evolution."
                                      -- Thomas Huxley