Re: Fw: Today's Headlines from NYTimes.com Wednesday, July 4,

From: Mike Lorrey (mlorrey@datamann.com)
Date: Mon Jul 09 2001 - 09:46:47 MDT


Anders Sandberg wrote:
>
> > On Saturday, July 7, 2001, at 20:32, Harv wrote:
> > On a logical level, Lee makes an excellent case that hiding from the world
> > is not the answer. Gays shouldn't have to hide from gay-bashers. Rich
> > people should feel safe to walk the streets without being assaulted.
> > Governments should be prevented from interfering with their citizens' lives.
> > National security should come from a position of strength and defense rather
> > than security by obscurity.
> >
> > However, I'm not sure that we will ever reach that level. Mike feels safe
> > because he has a gun. I feel safe because my computers have firewalls, IDS,
> > encryption and anti-virus. Eliezer feels safe because AIs are programmed to
> > be friendly. Will we ever be so safe that we drop our guards? Or will we
> > always need to keep our safety protocols in place to maintain the safety
> > that we desire? I am betting on the latter.
>
> Safety is a statistical concept. Just as I argued for friendly AIs, what we
> want is as much safety as we can get, but we will have to accept that it will
> not be perfect.

We also have to look at how that safety is implemented. A top-down,
state-centered approach is what comes from weak smart contracts that are
constructed on the premise that individuals are untrustworthy but
institutions made up of individuals are trustworthy. Such state-centered
security contracts invariably result in fascist repression as they
follow Acton's Law. Arming only the state's police force is a low-trust
social contract. Mandating state-controlled means of disabling private
guns is also a low-trust social contract. Imposing these controls
presumes, as a default, that individuals do not trust each other to
begin with.

Conversely, security contracts in society that are based upon the high
trust principle of 'innocent until proven guilty' are centered on trust
of the individual and pseudotrust of institutions (because under Acton's
Law, institutions are attractors of corruption, and therefore need to be
watched over by armed and vigilant citizens). Arming the citizenry is a
high trust social contract between individuals. That I am armed does not
mean I don't trust my fellow citizens, merely that I am willing to trust
anyone who does not point a gun at me. The gun is a fallback position
for the rare instance that trust fails.

Similarly, giving the government the power to possess the private keys
to your computer systems is a low-trust social contract, while allowing
individuals to control their own key administration without government
oversight is a high-trust social contract.

The difficulty in the area of computer technology is that remoteness and
anonymity make it hard to impose consequences in the event of a breach
of trust (as with the gun analogy: I can impose consequences immediately
if someone breaches trust in person, but not across the net). The threat
of consequences is of paramount importance in maintaining high trust in
a society. If the legal system is distorted so badly that most criminals
escape either apprehension or conviction, or if they are insufficiently
punished for their breach of trust, they will find a positive sum in
their cost-benefit risk calculations, and will commit breaches of trust.
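That risk calculation is just an expected-value computation. A minimal sketch, with purely illustrative numbers (the gains, probabilities, and penalties below are assumptions, not data):

```python
def expected_net_payoff(gain, p_caught, penalty):
    """Expected value of a breach of trust: the gain, weighted by the
    chance of getting away with it, minus the penalty weighted by the
    chance of being caught and punished."""
    return (1 - p_caught) * gain - p_caught * penalty


# Weak enforcement: low odds of conviction, light sentence.
weak = expected_net_payoff(gain=1000, p_caught=0.05, penalty=2000)

# Strong enforcement: high odds of conviction, heavy sentence.
strong = expected_net_payoff(gain=1000, p_caught=0.6, penalty=5000)

print(weak)    # positive: the breach "pays" under weak enforcement
print(strong)  # negative: the breach does not pay
```

When the sum comes out positive, as in the weak-enforcement case, a purely self-interested actor commits the breach; raising either the probability of punishment or its severity flips the sign.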

>
> I think Brin makes a good point in The Transparent Society (which is often
> overlooked as we get into the crypto-vs-camera escalation discussions) that
> the important thing is keeping the society open and healthy. That is the main
> goal. Privacy might be important but is really just a means for making a
> society worth living in, and there might be reasons to regard other means for
> keeping society nice as more important. Accountability is of course the
> obvious one; it can to a large extent deal with the above-mentioned problems
> on various levels.

The only redeeming feature of transparency is that it allows us to live
in each other's skin, in a manner of speaking. The problem with high-trust
societies is that they must remain relatively homogeneous. We essentially
have circles of trust in our lives, and we assign trust ratings to
strangers based on preliminary evaluations of where in our trust circles
they seem most apt. Someone who fits into our family very well earns
high trust. Those who look or behave strangely do not.

We also assign trust based on media programming, as has been shown in
studies in which school kids and adults of various races assigned trust
ratings to pictures of black and white people. It turns out that
blacks in the US in fact tend to trust white people more than black
people, and women trust men more than other women. This helps explain
the prevalence and persistence of white males in positions of leadership
in the US despite huge gains in equal opportunity for blacks and women.

Transparency does allow individuals to experience the lives of others
who are different from themselves and learn that we are all far more
similar, and therefore trustworthy, than our instincts and
ignorance-based fears tell us.

>
> Imperfect societies can be made liveable if we add the right feedback loops
> to reward nice behavior and punish bad behavior (they don't even have to be
> put into place centrally, quite a few are Hayekian spontaneous orders like
> unwritten rules of courtesy and neighbourhood relations). This will not turn
> them perfect, but change the probabilities of things happening (sometimes to
> very large degrees).

Yes, consequences are of paramount importance. They will be far more
important with transparency, but they must be imposed on a
person-to-person basis. If transparency wins, we must significantly
reduce the degree of control the state has over the justice system and
increase personal adjudication.

> I'm less convinced that "barriers", things
> intended to prevent certain actions from being taken (like encryption, locks
> and pre-programmed friendliness), can produce the liveable society. In some
> situations they work by making costs of ill-doing high (it is harder to read
> an encrypted file or pick a lock than it would be without the barrier), but I
> can't see them helping to reward nice behavior.

Oh, I don't know. I think continued existence is a very nice reward for
being a nice person.



This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:39:42 MDT