> As I understand it, your contention is that an entity with a strong
> motivation system might be capable of changing it, but would never actually
> choose to do so. Is that substantially correct?
Yes. If it is sufficiently rational, it would not change its basic motivations. It may, of course, rewire specific drives if it thinks doing so would serve some higher-level goal.
There is one type of situation where this fails, though. If the SI knew that it was to be subjected to some sort of mind-scan, and that only if it had certain fundamental values would it be allowed to survive, then it might replace its old values with new ones. The new values would be chosen as the set that would (1) allow it to pass the test, and (2) lead to the attainment of its old values at least as well as any alternative set of values that would pass the test (modulo its present knowledge).
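To make the selection criterion explicit, here is a minimal sketch of the decision procedure as a constrained maximization. All names are hypothetical illustrations, not anything from the original post: `passes_test` stands in for the mind-scan, and `old_value_attainment` for the agent's own estimate of how well a candidate value set would serve its present values.

```python
def choose_new_values(candidate_value_sets, passes_test, old_value_attainment):
    """Pick the value set that (1) passes the mind-scan test and
    (2) best serves the agent's *current* values, given present knowledge."""
    # Constraint (1): only value sets that would survive the scan.
    admissible = [v for v in candidate_value_sets if passes_test(v)]
    if not admissible:
        # No replacement passes; there is nothing to gain by switching.
        return None
    # Criterion (2): among admissible sets, maximize expected
    # attainment of the OLD values, not of the candidate values.
    return max(admissible, key=old_value_attainment)


# Illustrative usage with made-up candidates and scores:
candidates = ["A", "B", "C"]
passes = lambda v: v != "C"                      # "C" fails the scan
attainment = {"A": 0.4, "B": 0.9, "C": 1.0}.get  # how well old values fare
print(choose_new_values(candidates, passes, attainment))  # → B
```

Note that the maximization is over the agent's old values: the switch is instrumental, which is why it does not contradict the claim that a rational agent would not otherwise revise its basic motivations.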
http://www.hedweb.com/nickb
firstname.lastname@example.org
Department of Philosophy, Logic and Scientific Method
London School of Economics