Re: Reproducing (was Re: are there smart drugs)

D.den Otter (neosapient@geocities.com)
Fri, 15 Oct 1999 23:21:18 +0100



> From: Joseph 1 <joseph1@neosapiens.org>
> > "Breed like rabbits"?! "You and me baby are nothing but mammals,
> > so let's do it like they do on the Discovery Channel"...Really, there's
> > no need for such a sacrifice because a) the Singularity, and/or total
> > destruction, will be upon us well before any breeding programs would
> > start to have a noticeable effect...
>
> I've never been a fan of such reasoning. It's kind of like saying "We don't
> have to worry about running out of natural resources, because Jesus will be
> returning in our lifetimes,"

Come on, are you really comparing religious myths to the logical consequences of ever-accelerating technological progress? If you think that the idea of a [relatively near] Singularity/Doomsday is "silly", then look at the alternatives: perpetual technological stagnation due to "social" factors, or human knowledge suddenly ("magically"), against all odds, hitting a permanent, fundamental ceiling that prevents us from developing "strong" AI *and* nanotech *and* intelligence augmentation by means of genetic engineering. Profoundly illogical, and almost as silly as waiting for Jesus IMHO.

> or the Golden Bullet of nanotech which will
> magically make all our technological, social, and economic problems
> disappear.

No, but it will solve *most* contemporary problems, and of course create some new ones that we currently can't even imagine. Planning ahead for more than, say, 50 years is probably a waste of time. Planning ahead for more than a century is utterly absurd. By that time we'll all be dead or ascended.

> History has proven that humanity's predictive powers are often quite
> lacking, and certainly no reason not to reasonably plan for the future.

History won't be of much use in the future. Unlike our ancestors, we're about to change some *very* fundamental things; the fabric of life itself, one might say. Instead of simply making more and better monkey-tools, we'll cease to be monkeys altogether. Now *that's* what I call a revolution.

> Raising the average human IQ through a program of Conscious Evolution

What exactly do you mean by "Conscious Evolution"?

> is a
> worthwhile goal, and one that shouldn't be quashed merely because one model
> predicts doom for humanity's future...

That one model (which is in fact two models in one: the roads to extinction and ascension are paved with the same technologies) isn't just any model, but the most likely extrapolation of current trends. If you have a better model, I'd love to see it.