Hal Finney writes:
>Doug Bailey, <firstname.lastname@example.org>, writes:
>> If it were me, I would have two backups stored of the latest acceptable
>> version of my mind (heuristics, knowledgewebs, etc.).
>> Backup One: The Judge
>> There would be a gatekeeper AI that would be in charge of one backup.
>> After a 24 hour period (objective time), the gatekeeper would boot up my backup
>Don't you mean "subjective time"? In objective time 24 hours might be
>a subjective century for you.
I considered objective time a better reference point. My intent was to anchor the time interval with the gatekeeper AI since any reliance on the "new" me could make the entire process vulnerable to manipulation.
>> and let me scrutinize the current me (the end state of the last 24 hours
>> mind morphing). If the backup me thinks the new me is acceptable, the
>> gatekeeper overwrites the backup copy with the new me and the cycle
>> begins anew. If the backup me thinks the new me is unacceptable (e.g.,
>> suicidal, psychotic, just plain "weird") then the new me is overwritten by
>> the backup me and the cycle begins anew.
>This is an interesting approach. I can conceive of cases where it might
>not work well, but they are not very convincing. Maybe you could drift
>into an uptight, restricted mindset, and the only way out of it is by
>some kind of mind-blowing, consciousness-expanding experience, the very
>thing which would be rejected by your Judge. More serious might be a
>subtle memetic trap, like some kind of cult where you get sucked into
>it gradually. Now your Judge is helping to enforce your new mindset
>and it's going to be harder to break free.
You've identified some significant shortcomings of the approach, but I think they could be ameliorated in a couple of ways. The review could operate at multiple levels. For example, every three months the archived mind (of three months ago) could perform the judging process on the entire mind trajectory of the preceding three months. An analysis could determine whether some sort of memetic obstacle had been introduced. If so, the memetic virus could be discerned (more easily, at least) by the uninfected mind and either removed from the current mind, or that trajectory abandoned and restarted from the mind of a few days before the memetic infection. Of course, this might be dangerous, since it is possible my mind's configuration prior to the infection is what gave rise to, or made me vulnerable to, the infection in the first place. Still, as long as the three-month (or whatever time period) window is in place, there is some assurance that undesirable states can be "cured".
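As a rough sketch of how the two review levels might fit together (all function names and the dict-based stand-in for a "mind" are hypothetical illustrations, not anything from the proposal itself):

```python
import copy

def daily_cycle(backup, new_state, judge_fn):
    """The 24-hour gatekeeper step: the backup judges the new state and
    only one of the two survives (the other is overwritten)."""
    if judge_fn(backup, new_state):
        return copy.deepcopy(new_state)  # backup overwritten by the new me
    return copy.deepcopy(backup)         # new me rolled back to the backup

def quarterly_audit(archive, trajectory, audit_fn):
    """Second review level: the three-month-old archive inspects the whole
    trajectory and returns the index of the first state it deems infected
    (or None). Restarting from just before that index "cures" the line."""
    for i, state in enumerate(trajectory):
        if not audit_fn(archive, state):
            return i
    return None
```

The point of the second level is that the auditing mind predates any infection in the window, so a gradually acquired memetic trap that the daily Judge would wave through can still be caught.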
As a side note, a particularly nefarious "undesirable" mutation could bestow vastly increased intelligence in the short term while it sinks its cognitive bile deeply into your mind's configuration. Even this scenario, though, would not be immune to a multi-level mind review.
>Certain aspects of this approach may make people uncomfortable.
>Some people will be squeamish about the "death" the Judge faces when he
>allows himself to be overwritten by the next day's copy. When in the
>role of Judge, they might find themselves tempted to preserve their train
>of thought even at the cost of losing a day's experience. This kind of
>identity management would not be appropriate for people with such concerns.
Herein lies one of the two most formidable obstacles to this approach. The
continuity problem is a vexing dilemma that has been debated at length in
this forum. Two identities exist (the backup and the new), separated by a 24
hour time period, and only one emerges after the judging process (assuming
the "overwriting" scenario). Obviously, one entity is no longer in
existence.
The other obstacle is the solution to the aforementioned problem. If the new entity, even though judged undesirable by the backup entity, is allowed to continue, then it might want to institute its own review process. In a cosmologically short span of time, the storage capacity needed to store all of my mind trajectories would exceed the information storage capacity of the universe.
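A back-of-envelope calculation makes the point (the figures here are my own assumptions, not anything established): if every judging cycle lets both the backup and the rejected "new" entity survive and branch, the number of stored trajectories doubles each cycle, and taking 10^120 as a rough upper bound on the information capacity of the observable universe, that bound is crossed after only a few hundred daily cycles.

```python
import math

# Assumed figures: trajectories double once per daily review, and the
# observable universe can store at most ~10**120 mind snapshots.
UNIVERSE_CAPACITY = 10**120

# Smallest number of doublings n with 2**n > UNIVERSE_CAPACITY.
cycles = math.ceil(math.log(UNIVERSE_CAPACITY, 2))
print(cycles)  # 399 daily reviews -- just over a year of objective time
```

So even under very generous physical assumptions, unrestricted branching exhausts the universe's storage in roughly a year of daily reviews.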