Intelligence enhancement (was: Selfish Reason To Preserve Chimps)

From: Eliezer S. Yudkowsky
Date: Thu Nov 29 2001 - 19:37:36 MST

Mark Walker wrote:
> BTW, I think there are a couple of reasons genetic engineering ought to be
> given more prominence in Transhumanist thinking. First, it is something that
> we could start right after lunch--at least we have the technology, we just
> need the resources and the political will.

We do not have the technology. We have the technology to make arbitrary
alterations to DNA, in the same sense that we have the technology to make
arbitrary alterations from ones and zeroes. Going from DNA to
intelligence enhancement is not necessarily easier than going from ones
and zeroes to Artificial Intelligence. And your compile-build-debug cycle
is 14 years long, and illegal.

My guess as to the best course is to try implanting (in adults - still
illegal, but in no wise immoral) Calvinian hexagonal grids of Neural
Signals's neurotrophic electrodes, adapted for "write access" as well as
"read access", across two or more humans and locked into synchronization,
on the off-chance that the two brains will learn to decode each other's
cerebral codes and start talking to each other.
(Prerequisite: That William Calvin is right about the representation of
the cerebral code.) If this works you can expand to 64-human grids and so
on, if the 2-human pairs aren't transhuman enough.
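As a purely illustrative toy (not the proposal itself, and no claim about how real neural tissue adapts), the core bet - that one system can learn to decode another system's private signal code given shared experience - can be sketched in a few lines. Here each "brain" is stood in for by a fixed random linear encoding, and the second party learns a decoder by gradient descent on stimuli both sides observe; every name, dimension, and learning rate below is invented for illustration.

```python
import random

DIM = 3      # toy dimensionality of the "cerebral code"
STEPS = 2000
LR = 0.05

random.seed(0)

def rand_matrix(n):
    return [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

# Agent A's fixed, private encoding of a stimulus into electrode signals.
code_A = rand_matrix(DIM)

# Agent B's decoder for A's signals, learned from scratch.
decoder_B = [[0.0] * DIM for _ in range(DIM)]

def train_step():
    # Both agents perceive the same stimulus; A transmits its encoding of it,
    # and B nudges its decoder toward reproducing the stimulus it also saw
    # (one step of stochastic gradient descent on squared error).
    x = [random.uniform(-1, 1) for _ in range(DIM)]
    s = matvec(code_A, x)
    y = matvec(decoder_B, s)
    for i in range(DIM):
        err = y[i] - x[i]
        for j in range(DIM):
            decoder_B[i][j] -= LR * err * s[j]

def decoding_error():
    x = [random.uniform(-1, 1) for _ in range(DIM)]
    y = matvec(decoder_B, matvec(code_A, x))
    return sum((yi - xi) ** 2 for yi, xi in zip(y, x))

before = sum(decoding_error() for _ in range(100)) / 100
for _ in range(STEPS):
    train_step()
after = sum(decoding_error() for _ in range(100)) / 100
```

The shared stimulus is what makes the toy tractable; whether two synchronized cortices would find an analogous common referent is exactly the open question the off-chance bet is about.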

> Imagine if genetic engineering
> smarter persons were taken on with the same sort of political enthusiasm as
> landing men on the moon. I realize that around here genetic engineering is
> not as "sexy" as uploading but it would seem wise not to put all our
> posthuman eggs in one basket. (After all, the claim that humans can create
> an AI is an empirical and falsifiable proposition).

Uploading is post-Singularity technology.

> Second, my experience
> suggests that when it comes to presenting the Transhumanist case it is
> better in the first instance to discuss concepts like 'posthuman' in meat
> terms. Talk about SIAI requires two conceptual leaps for most people,
> namely, that we can create superintelligence and we can create AI.

A PR issue which requires, as prerequisite, that an unbiased line of
reasoning have previously established a reasonable probability of
meat-based transhumanity playing a part in the Singularity.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:23 MDT