Re: Lanier essay of 2001.12.04

From: Samantha Atkins (samantha@objectent.com)
Date: Thu Dec 13 2001 - 00:20:05 MST


Anders Sandberg wrote:
>
> On Mon, Dec 10, 2001 at 04:11:14PM -0500, Dossy wrote:
> >
> > The fear is that through cloning/GE, there will be an artificial
> > imbalance in the way genetic populations evolve and self-regulate.
> > Population will be over-run by "designer humans", potentially
> > those who have genetics that are not naturally possible via
> > reproduction.
>

I share that fear, considering what will happen if current
general society determines in detail what kind of human beings
it wants for the next generation. Overall, given reasonably
democratic free choice, this will result in the proliferation of
a number of traits many today consider desirable and the
suppression of a number of others, including some that are
arguably part of high intelligence and creativity.

In short, we are in a sort of Catch-22. We can't seem to get
too much better without getting smarter and more capable and we
can't get smarter and more capable without using certain
technologies with more wisdom than we generally possess.
>
> 2) the assumption that things will get worse if humans are allowed to
> take responsibility for them. This is largely based on the ease we can
> come up with bad scenarios and examples of mistreatments in the past,
> while leaving out all the good things. One reason is of course that good
> news are not very exciting, so they do not get trumpeted about as much
> or written about in history books. But there is also an assumption here
> that mankind is always fallen, and increased freedom always must result
> in increased evil actions rather than increased evil and good actions.
>

I don't assume things would get worse. I simply assume, based
on a lot of observation, that some pretty screwed-up things will
be done with any powerful new technology, not just or even
predominantly good things. That is reason for some caution and
safeguards.
 
> 3) The assumption that the best way of handling this potential risk is
> to abstain from a technology that could be bad in a repressive society,
> rather than seek to deal with the repressive society. If we are worried
> about sexism, maybe we should see what we can do about sexism in our own
> society. If we are worried about cloning being used to create carbon
> copy people, maybe the right way to handle it is to strengthen
> individualism?
>

I am not so much worried about a repressive society as about
not-especially-wise, not-especially-rational or ethical human
beings and human organizations wielding powers increasingly able
to really screw us up, perhaps to a terminal degree.
 
> > Another (somewhat unrelated) scary thought experiment:
> >
> > People fear cloning/GE used to produce "designer humans" that
> > are better than organic/natural humans. Has anyone discussed
> > the fear of using cloning/GE as a weapon?
>

I do not worry about super-soldiers. I do worry about people
becoming obsolete with no consideration given to their
well-being. Whether they are made obsolete by genetic design of
superior new humans, by AI, by robotics, or by something else is
immaterial to the basic concern.

 - samantha



This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:26 MDT