Re: ULE: covert vs overt

From: Anders Sandberg
Date: Fri Apr 20 2001 - 01:52:34 MDT

On Thu, Apr 19, 2001 at 07:08:40PM -0400, Michael Wiik wrote:
> I don't accept the idea of 'Transparent Society', I see no reason
> whatsoever why governments would go for this. Libertarians who might
> support this idea might see it as 'turnabout is fair play' or something.
> I accept it's not gonna happen. I accept that the government is always
> gonna be way ahead of any (revolutionary) libertarian/anarchist
> movement, and is gonna be able to spin events it's way as long as people
> are uninformed. The idea is to inform the populace and engage them in
> anti-terrorism/accident prevention efforts.

If the libertarians are the only ones demanding openness, accountability
and transparency, then of course nothing will happen - everybody knows
they are weird gun nuts, and now apparently camera nuts too ;-) But
these issues interest far more people than that, and when you phrase
them in terms of controlling corruption, the right to document one's
life to prevent false accusations and especially forcing governments
into accountability, then you will be part of a far larger group than
just the libertarians.

I have noted that it is quite common to ascribe some sort of
well-executed evil agenda to governments, as if they were the clever
bad guys in some Hollywood production. Reification, anyone? Governments
change policies in response to public demands - not perfectly, and
sometimes with notable resistance, but it does happen all the time.
Governments often try to impose their values on society, but that is
also an imperfect process that is definitely getting harder. People are
not as manageably uninformed as they once were.

> In some sense, I think Bill Joy is quite right. I don't think humanity
> is capable of using nanotechnology/genetic engineering/etc without a
> good chance of at least some local disasters. So, I can see the need for
> police-state like ULE to prevent such disasters.

I think such a police state would *promote* local disasters. It creates
a false complacency, and it sets up an organisation that is not very
accountable, has plenty of temptations to misuse its abilities, and has
few incentives to keep to its original agenda of tech protection.

Local disasters will likely occur. But as long as we can keep them local
it is worth it. The danger comes from not being willing to accept *any*
risk - which would mean a halt to all research and innovation - and from
setting up bad institutions that will not lower risks while hurting
openness. There is a real danger that a spectacular nanodisaster would
make these bad solutions politically popular; the best way of combatting
it is to make people understand the value of open societies and why these
solutions don't work, and to involve them in creating their own open
solutions.

> Getting people involved might be initiated via a series of
> future-shock-inducing memetic campaigns.

Future shock usually makes people freeze and do... nothing. "Oh dear,
things are so complicated these days." Just look at how people reacted to
the Internet: plenty of the reactions were resigned, or applied ideas
that were clearly old and inapplicable (like the decency laws) because
people had no new ideas. Hence I think inducing future shock is extremely
dangerous, because it makes people hand their power to whatever experts
or politicians they think know the answers rather than thinking and
taking responsibility themselves.

Anders Sandberg                                      Towards Ascension!                  
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y

This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:59:49 MDT