Re: extropians-digest V7 #22

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jan 23 2002 - 22:04:18 MST


Louis Newstrom wrote:
>
> From: "Wayne Hayes" <wayne@cs.toronto.edu>
>
> > "Louis Newstrom" <newsnewstrom@home.com> writes:
> > >I think most of you have missed the real problems with surviving
> > >[the Singularity]. Money.
> >
> > I doubt money will be a problem. If you can think a million times
> > faster than the average human
>
> I was thinking that you needed the money to get to that state.

No, *we* need money, "we" being the Singularity Institute for Artificial
Intelligence. We build a seed AI. The seed AI grows up into a
superintelligence. The superintelligence cracks the protein folding
problem and emails a DNA sequence to one of the many DNA-and-protein
synthesis labs now advertising online. The DNA sequence specifies
proteins that build a nanoreplicator, which builds more nanoreplicators.
Some number of hours later, you ask for whatever the heck it is you
wanted from the Singularity and get it. If you need a concrete model for
how the Singularity can be free of charge, there it is.
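
For concreteness on the "DNA sequence specifies proteins" step: that
mapping is just the standard genetic code, and a few lines of Python make
it explicit. (The sketch below is purely illustrative, not anything SIAI
has built; the codon subset and sample sequence are picked for brevity,
since the full table has 64 codons.)

# Toy sketch: translating a DNA coding sequence into a protein,
# i.e. the "DNA sequence specifies proteins" step. Hand-picked
# subset of the standard genetic code; the real table has 64 codons.

CODON_TABLE = {
    "ATG": "M",  # methionine (also the start codon)
    "GAA": "E",  # glutamate
    "AAA": "K",  # lysine
    "TTT": "F",  # phenylalanine
    "TAA": "*", "TAG": "*", "TGA": "*",  # stop codons
}

def translate(dna):
    """Read codons three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE[dna[i:i + 3]]
        if residue == "*":
            break
        protein.append(residue)
    return "".join(protein)

print(translate("ATGGAAAAATTTTGA"))  # prints "MEKF"

Going that direction is the easy part. The hard part, which the scenario
above hands to the superintelligence, is predicting how the resulting
proteins fold and choosing sequences whose folded shapes do what you
want.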

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


