Chris Fedeli wrote:
> I want to alert you to a nascent movement originating here in
> Berkeley opposed to "Techno-eugenics," i.e. human germ-line engineering
> with the intent of producing super-people, which presents some very
> serious threats to the future of humanity, social equality and the
> like. The technology is moving forward quite quickly, with little
> public awareness.
I would also agree that human genetic engineering poses a serious threat - to public relations; it's too slow to become a serious problem in its own right. Eugenics, especially, poses a problem - I don't even consider it Extropian; it doesn't respect the defining quality of ultratechnology, which is that ultratechnology requires pushing buttons. But "eugenics" has become something of a curse-word in this century, and for good reason - we should oppose any attempt to extend it to transhumanism in general.
But I would strongly advise that we all watch these people for a while before attempting to convert them, or even asking them about their positions on, say, neurohacking. Parts of their list charter imply that these people are technophobes, or at least less sane than their attempted image of justifiably concerned technophiles would suggest. Of course, there *is* in fact a small group trying to promote a transhuman, if not techno-eugenic, future... but just because they're out to get you doesn't mean you're not paranoid.
> I'll be doing some major lurking. I promise to forward more juicy
> tidbits as they arrive.
Good work. I agree, these people will take watching.
--
firstname.lastname@example.org
Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak        Programming with Patterns
Voting for Libertarians   Heading for Singularity There Is A Better Way