John Clark writes:
> >Imagine a machine that you just put on your head ... that
> >gives you instant, safe pleasure like you've never felt before.
>That could be the explanation of the Fermi Paradox, the reason we can't
>find any ET's. If it were possible to change your emotions to anything
>you wanted, alter modes of thought, radically change your personality, swap
>your goals as well as your philosophy of life at the drop of a hat it would
>be very dangerous. Once you change yourself you may not want to change
>even if your behavior became bizarre or suicidal.
If it were me, I would have two backups stored of the latest acceptable
version of my mind (heuristics, knowledgewebs, etc.).
Backup One: The Judge
There would be a gatekeeper AI in charge of one backup. After a 24-hour period (objective time), the gatekeeper would boot up my backup and let it scrutinize the current me (the end state of the last 24 hours of mind morphing). If the backup me thinks the new me is acceptable, the gatekeeper overwrites the backup copy with the new me and the cycle begins anew. If the backup me thinks the new me is unacceptable (e.g., suicidal, psychotic, just plain "weird"), then the new me is overwritten with the backup me and the cycle begins anew. This process might impede the fast evolution of my mind into more robust states, because I cannot appreciate whatever drastic progress my mind made in the previous 24 hours. However, it provides insurance that I don't fall victim to the scenario you've described.
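As a minimal sketch of the Judge's loop (in Python, with hypothetical names like judge_cycle, morph, and acceptable standing in for whatever the real machinery would be):

```python
import copy

def judge_cycle(mind, morph, acceptable, periods, interval_hours=24):
    """Sketch of the 'Judge' backup scheme.

    mind           -- current mind state (any deep-copyable object)
    morph          -- function applying one period's worth of mind morphing
    acceptable     -- predicate the booted-up backup applies to the new me
    periods        -- number of review cycles to simulate
    interval_hours -- length of the review period (24 h by default)
    """
    backup = copy.deepcopy(mind)   # Backup One: the Judge's copy
    archive = []                   # Backup Two: the Archive
    for _ in range(periods):
        # The new me: the state after one review period of morphing.
        candidate = morph(copy.deepcopy(backup))
        if acceptable(backup, candidate):
            # Acceptable: retire the old state to the archive and
            # overwrite the backup copy with the new me.
            archive.append(copy.deepcopy(backup))
            backup = candidate
        # Unacceptable: the new me is overwritten with the backup me,
        # i.e., the next period simply morphs from `backup` again.
    return backup, archive
```

A gentler or more aggressive individual would just change `interval_hours` (and `periods` per unit of time) to get the monthly or near-real-time judging cadence described below.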
Backup Two: The Archive
The second backup would be archived with other backups from previous days. This mind archive could prove useful in a variety of ways. It might serve as a useful data source for my present attempts to improve my mind: What worked? What didn't? What kind of synergistic effects can be discerned? And so on. It could also serve as a data source for other minds, and I could likewise analyze the mind archives of other individuals. These archives would be stored on static storage media, i.e., they would not be operational identities.
This entire setup could be tailored to the aggressiveness of the individual. Some people may wish to have the "judging" period only once a month instead of every day. Others might wish to review their progress in near real time (every second, or less), with each subtle change analyzed and selected for or against: accelerated Lamarckian evolution, in effect.
There are all the concomitant problems with identities and the "rights" of the new me. If the overwriting of the new me becomes a problem, then the new me could be allowed to continue, but the backup me would have the chance to reinstate itself. Thus, the backup me is never at the mercy of an unfortunate choice of modification that might send me off on some psychotic tangent.