From: Hal Finney (hal@finney.org)
Date: Wed Jun 18 2003 - 12:21:04 MDT
Robin writes:
> How leaky will our distant descendants be? How far will they want to go,
> and be able to go, in agreeing to reveal their secrets to each other, to
> avoid the social problems that secrets cause? It seems plausible that our
> descendants will be constructed so that they can allow outsiders to directly
> inspect the internal state of their minds, to verify the absence of certain
> harmful secrets.
Such a practice is common among some of the members of the society
described in John C. Wright's book, The Golden Age, which we discussed
a few months ago. Influenced by that, and by some of the debate over
Microsoft's Palladium initiative, I wrote a message titled "World of
Knights" that explored some of the possibilities that would open up if
people were able to prove their honesty in this way:
http://forum.javien.com/XMLmessage.php?id=id::GQ1aUElJ-EF94-GX4k-C3Z0-PmNcO0MRbkoj
Robin points out that, once this becomes possible, economic pressures
could strongly encourage the voluntary adoption of such policies, making
it harder for "Knaves" (those who refuse this mental transparency) to
compete and survive economically. Indeed, in Wright's book the
"Knights" appear to be the most successful members of society, although it
is not clear what is cause and what is effect.
Here is the tie-in to Palladium that I saw:
: In some ways, Palladium-style "trusted computing" technology provides a
: preview of such a world, in a small domain. It lets you convincingly
: prove to a remote system that you are running a particular program,
: which means that your computer's behavior is trustworthy from the remote
: point of view. That's why you may eventually be forced to run Palladium
: systems in order to legally download movies and music, because only
: this kind of public commitment to "honest" or "trustworthy" behavior
: will win the confidence of the content companies.
I suggested that our experiences with "Trusted Computing" technologies over
the next few years might give us a preview of the future world much along
the lines that Robin describes.
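To make the attestation idea concrete, here is a toy sketch in Python of
the core handshake such "trusted computing" schemes aim at. Everything in
it is hypothetical and greatly simplified: the device hashes ("measures")
the program it is running, authenticates that measurement with a key the
verifier trusts, and the verifier checks the result against a known-good
value. Real Palladium/TPM designs use hardware-protected asymmetric keys,
certificate chains, and so on; a shared-secret MAC stands in for all of
that machinery here.

    import hashlib
    import hmac
    import secrets

    # In a real system the attestation key lives in tamper-resistant
    # hardware. A shared secret (for an HMAC) stands in for it here.
    DEVICE_KEY = secrets.token_bytes(32)

    # The one program the verifier is willing to trust (hypothetical).
    TRUSTED_PROGRAM = b"def play_movie(): ..."

    def measure(program: bytes) -> bytes:
        """Hash the program, as a TPM would 'measure' loaded code."""
        return hashlib.sha256(program).digest()

    def attest(program: bytes, nonce: bytes) -> tuple[bytes, bytes]:
        """Device side: report a measurement plus a keyed MAC over
        (measurement, nonce). The nonce blocks replay of old reports."""
        m = measure(program)
        tag = hmac.new(DEVICE_KEY, m + nonce, hashlib.sha256).digest()
        return m, tag

    def verify(measurement: bytes, tag: bytes, nonce: bytes) -> bool:
        """Verifier side: check that the MAC is genuine and that the
        measurement matches the known-good program."""
        expected = hmac.new(DEVICE_KEY, measurement + nonce,
                            hashlib.sha256).digest()
        return (hmac.compare_digest(tag, expected)
                and measurement == measure(TRUSTED_PROGRAM))

    nonce = secrets.token_bytes(16)              # verifier's challenge
    m, tag = attest(TRUSTED_PROGRAM, nonce)      # honest device
    print(verify(m, tag, nonce))                 # True

    m2, tag2 = attest(b"def rip_movie(): ...", nonce)  # altered program
    print(verify(m2, tag2, nonce))               # False

The point is the direction of trust: the verifier ends up knowing
something about the internal state of the remote machine, which is just
the property Robin extrapolates from computers to minds.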
Robin does raise one cautionary note:
> And one disturbing implication of this is that
> we may well evolve to become even *more* self-deceived than we are now,
> as believing one thing and thinking another becomes even harder than now.
(Actually I think the idea has many disturbing aspects, but that's just
an emotional reaction!)
Wei Dai had made a somewhat similar point in response to my message, at:
http://forum.javien.com/XMLmessage.php?id=id::Wh1DfB54-bjhA-bksC-UyYU-Ghl9HQFcbBUw
: Does
: it prevent you from believing in self-serving rationalizations? Our
: current notions of rationality depend on the assumption that not only are
: your beliefs private, but there is no way you can convince others that you
: truly believe them. If you *are* able to convince others of what your
: beliefs are, it's no longer in your self-interest to only believe in what
: is true. We already see this to some degree because humans are not able to
: lie costlessly. The incentive for self-serving rationalizations becomes
: much higher when lying is impossible. It's not clear whether this could be
: prevented by any kind of technology.
I'm not sure, though, that Robin's and Wei's points are necessarily
valid; just as people could become economically compelled to tell the
truth, they might feel equal pressure to seek the truth, that is, not
to deceive themselves. It doesn't do me much good to know that you
won't lie to me if I can't tell whether you're lying to yourself.
There might also be pressure, along the lines Robin describes for
standardization, towards mental structures that are relatively
transparent and don't allow for self-deception. A simple man is more
trustworthy than one who has layer upon layer of contradictory thoughts.
As long as this "motivational simplicity" is consistent with highly
intelligent and rational analysis, these kinds of minds should achieve
economic success.
Hal