Nanotechnology: Just the beginning

Eliezer Yudkowsky (sentience@pobox.com)
Thu, 28 Nov 1996 20:27:15 -0600


A group of transhuman technology advocates hight the Extropians, to
whose mailing list I am subscribed, read an interview with you (in
Fortune magazine) in which you spoke highly of nanotechnology. This
touched off a minor debate in which numerous individuals spoke of
introducing you to our happy group, but nobody contacted you because the
concept of actually DOING something is apparently foreign to these
people. I am therefore taking matters into my own hands.

Judging by your quotes in the nanotechnology article, you are familiar
with the materialist Utopia aspect of nanotechnology, but not with the
far more powerful intelligence enhancement technologies that lead to the
Singularity.

---Quick Intro To The Singularity---

1) Computers double in power every eighteen months.
2) Computers double in power every eighteen *subjective* months.
3) Three years after computers match human researchers, computing power
goes to infinity. (Once the researchers are themselves computers, each
doubling halves the time to the next: eighteen months, then nine, then
four and a half, and so on, a series that sums to thirty-six months.)

4) The human brain is thought to require 10^17 operations/second.
5) Current computing power is 10^12 ops/second.
6) The Singularity will occur in approximately 2025. (Closing the
five-orders-of-magnitude gap between (5) and (4) at one doubling per
eighteen months takes about seventeen doublings, or roughly twenty-five
years, plus the three years from (3); the arithmetic is sketched just
after this intro.)

---End of Quick Intro---
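
To make the arithmetic behind (1)-(6) explicit, here is a rough
back-of-the-envelope sketch in Python. The 10^12 and 10^17 figures are
the ones quoted above; the fixed eighteen-month doubling time and the
1996 start date are my own illustrative assumptions, so treat this as a
toy extrapolation rather than a forecast with any real precision.

    # Rough arithmetic behind points (1)-(6) above.  The 10^12 and
    # 10^17 figures are quoted in the intro; the fixed 18-month
    # doubling time and 1996 start date are illustrative assumptions.
    import math

    current_ops  = 1e12   # (5) computing power today, ops/second
    brain_ops    = 1e17   # (4) estimated human-brain requirement
    doubling_yrs = 1.5    # (1) one doubling every eighteen months
    start_year   = 1996

    # (4)-(6): years until computers match human researchers.
    doublings_needed = math.log2(brain_ops / current_ops)   # ~16.6
    years_to_parity  = doublings_needed * doubling_yrs      # ~25
    print("parity around %.0f" % (start_year + years_to_parity))

    # (3): once the researchers are computers, each doubling halves the
    # time to the next; 18 + 9 + 4.5 + ... months sums to 36 months.
    months_to_runaway = sum(18.0 / 2 ** k for k in range(60))
    print("runaway about %.0f months after parity" % months_to_runaway)

On those assumptions, parity arrives around 2021 and the runaway about
three years later, which is where the "approximately 2025" in (6) comes
from.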

Since you're interested in the far future, I just thought you should
know that human history ends in less than thirty years. The scenes
depicted in "The Diamond Age" are really quite unlikely. Before
nanotechnology became that familiar, the era of merely human
intelligences would be long gone.

One Web page - say, you actually invented those things, didn't you? - in
which you might be interested is "Staring Into The Singularity" at:
http://tezcat.com/~eliezer/singularity.html

This page expands the arguments given above, which, taken literally,
are quite implausible: too optimistic, because real exponentials don't
run unchecked (a dish of bacteria doesn't actually outweigh the world
in four days), and too pessimistic, because unlike bacteria, the
intelligences doing the doubling can think.
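
To put a number on the bacteria analogy (my own illustration, with
round textbook-style figures, not anything from the page): take a
bacterium of roughly 10^-15 kg doubling every twenty minutes and ask
how long unchecked doubling would nominally take to exceed the Earth's
mass of about 6 x 10^24 kg.

    # Why unchecked exponentials can't be taken literally.  Assumed
    # figures (mine, for illustration): ~1e-15 kg per bacterium, one
    # doubling every 20 minutes, Earth mass ~6e24 kg.
    import math

    cell_mass_kg  = 1e-15
    earth_mass_kg = 6e24
    doubling_min  = 20.0

    doublings = math.log2(earth_mass_kg / cell_mass_kg)     # ~132
    hours     = doublings * doubling_min / 60.0             # ~44
    print("doublings to outweigh the Earth: %.0f" % doublings)
    print("time at face value: about %.0f hours" % hours)

At face value the dish wins in under two days; in reality it runs out
of food long before that, which is the "too optimistic" half of the
point.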

The page also provides answers to some "very philosophical questions".

> "There's no reason death should happen," he rushes on. "There's no reason
> decay shouldn't be totally repairable. There's no reason you shouldn't be
> able to design exactly the body you want."
>
> "Well," I say, "there's no reason I should be unhappy."
>
> "That raises a very philosophical question," he says. He seems exhausted, or
> perhaps disappointed by my skepticism. He slows down for a second. "Well,
> at the very least the concept of manufacturing goes away."

[As you might put it:]
"Well, then, we could redesign our minds too. There's no reason why we
should feel pain. There's no reason why our short-term memories should
only hold seven items. There's no reason why we shouldn't think
millions of times faster. You could speed up your neurons, put in some
new ones, and rearrange them any way you wanted to. If you wanted to
feel happy, you could do that."

My page supplies a categorical answer to "Why?", as well as to
"What?", "How?", and "When?". Probably not the correct answers, as I
take pains to point out, but answers that at least show any lesser
answers to be wrong.

Up and Out,
Eliezer S. Yudkowsky.

-- 
         sentience@pobox.com      Eliezer S. Yudkowsky
          http://tezcat.com/~eliezer/singularity.html
           http://tezcat.com/~eliezer/algernon.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I know.