From: Samantha Atkins (samantha@objectent.com)
Date: Wed Apr 09 2003 - 00:23:50 MDT
Robert J. Bradbury wrote:
> Sheldon Pacotti has written an interesting article for Salon
> on a whole slew of topics, self-replicating nanotech,
> bioterrorism, transparent societies, government control, etc.
> 
> "Are we doomed yet?"
> http://www.salon.com/tech/feature/2003/03/31/knowledge/print.html
> 
> Its a bit long but contains many valid points.  I think he gets
> most of it right.
> 
There are some very important things I believe he gets wrong, although mostly I think he is right. I think we are largely conditioned to get many of these things wrong. Here are a few of my thoughts on what we get wrong and what might be more "right" at the moment.
The revolution of knowledge, in which computation becomes the only real power and that which brings or can bring all else into being, directly questions old values and assumptions: that material things exist only in limited quantities, and that labor is the source of economic power, other real-world power, and well-being. The old way of thinking gave us capitalism, with some caveats at least, as the best workable solution for satisfying the needs and desires of human beings from a finite supply. But now, within the limits of matter and energy themselves, that supply is effectively infinite.
Knowledge of how to produce what we need and desire from raw matter and energy, once created, cannot be depleted by use. As a matter of fact, knowledge grows in scope and refinement the more it is shared and combined with other knowledge.
A great change becomes possible in our economic institutions. The limit of our ability to create and enjoy whatever we wish is no longer the limit of our salary. Salary becomes superfluous because there is no need to put up fences in such an open and boundless commons. There is no need to say that you can only produce X if you pay me because I originally figured out the pattern for the nanobots to make an X. So what? What need have I for a salary, or for charging you and thus requiring you to have one, when we both can produce whatever we want using the technology once it exists upon the common knowledge base of humankind? Why on earth (or off it) would we play silly joy/salary games in such a world? It seems to me we can think up far more interesting and fun games to play. And having thought of those better games, is there any need to police me to ensure I don't use what you figured out without paying you?
Individuals all over the planet can have the capacity to practice boundless creativity. They can have practically everything they wish for. In such a world, where is the allure of "mischief" and of the hatreds of today? Where is there a sense of being threatened, or especially of being unfairly deprived? Where is there a sense of desperation? What would be left as a motive for wrecking paradise?
I don't believe our greatest danger is each other. I think our greatest danger is not having sufficient imagination and ability to embrace the new, to profit from rather than abuse our scientific/technological advances.
Complete surveillance only works in societies that also have a very liberal (in the original sense) understanding of the rights and liberty of sentients. In a society that has any ability or tendency for the majority to force its opinions on the minority, full surveillance is a horror far worse than any form of terrorism.
Why on earth would we believe that "we as a society" possess the 
wisdom to control and police the development of all technology? 
  Doesn't this leave the door open for the majority to forbid 
stem-cell research today or all strong AI tomorrow and most 
nanotech the day after that?  The majority of human beings are 
incapable of remotely understanding the technology and keeping 
up with its development much less regulating it wisely.  Even 
the most elite minority cannot fully produce wise regulations or 
keep up.  As we ramp up toward Singularity this becomes ever 
more true.  So full surveillance backing quite limited human 
abilities to monitor and regulate is very likely to strangle our 
future and our possibilities wholesale.
That speech becomes functional does not mean that speech must be regulated. It means that speakers must be held to account for consequences, and only that. It is perfectly obvious that a tightly regulated society becomes stagnant to the limits of the competency and capacity of its decision-making mechanism. If that mechanism is political/bureaucratic, it is very limited indeed and leads to great stagnation and oppression. To use full technological power to enforce utterly inadequate decision-making is the height of absurdity.
The greatest unexamined assumption in this piece is the "need for government" per se. Actually, the greater unexamined assumption is that we will be much as we are now, only with better toys and larger threats. I feel no "sense of doom and need for autocratic rule" from our ever-growing technological capabilities. I feel great unbounded hope and limitless possibility. Why are so many people today so concentrated on the negative? That is admittedly a mystery to me.
Another archaic aspect of the article is the producer-consumer distinction. As information increasingly becomes king and tools to work with that information become available to all (which, after all, is assumed to power the threat scenarios), all persons who wish become producers as well as consumers. Both terms increasingly disappear, and society becomes a society of peers. The possibilities inherent in technology are already being strangled by attempts to shore up the notion that the few produce and reap most of the profit and that the many are their consumers only. The producers jealously guard the tools that would allow the consumers to create their own solutions and share them. It is another kind of needless fencing on the unbounded commons.
- samantha
This archive was generated by hypermail 2.1.5 : Wed Apr 09 2003 - 00:27:13 MDT