---------------------
Date: Thu, 21 Nov 1996 01:27:08 -0800
From: Tim Freeman <tim@infoscreen.com>
I read your pages starting at
http://tezcat.com/~eliezer/beyond.shtml
following the citation you posted to the Extropians list.
I agree with your criticism of the story about the person getting his
hand fixed after it was run over with a car. Cars, biological people,
and hands are all obsoleted by other elements in the scenario.
I don't agree that getting to the Singularity ASAP should be the
interim goal. Instead, I think the goal should be to get to the
Singularity with as high a probability as possible. First priority is
that somebody gets to it; second priority is that I get to it (perhaps
repeatedly self-transcending along the way). The human suffering that
happens between now and then is small in the grand scheme of things,
and is insignificant compared to the suffering or lost experience that
would happen in any scenario that doesn't end in Singularity.
Drexler once said (I paraphrase) the sooner nanotech happens, the
slower it will happen. That is, the sooner we start work on trying to
achieve nanotech, the more primitive the supporting technology around
it will be, so a greater amount of time will pass between when it
starts to attract enough attention for people to start planning for
the consequences, and the time when those consequences happen. So
there is some overlap between the goals of doing it soon and doing it
with high probability.
It seems to me that there is a good chance it won't happen by 2035.
We haven't a clue about how to build a stable transhuman society, or
even a stable transhuman for that matter. The social practices of
biological humans are going to be hard to incorporate into our
creations, and there's a good chance that they wouldn't be the right
practices to incorporate anyway. The disequilibrium introduced by
technology will slow the progress of the technology. It is prudent to
position yourself so you will be able to wait. Life extension is
important. Calorie Restriction is the best technology available now,
IMO. Cryonics comes a close second; the question with it in my mind
is whether too much change will happen before the people are revived
for those people to be anything other than pets.
I agree with you that it might be worthwhile trying to form an
intentional community of people consciously striving for stability in
the presence of change. In The Diamond Age by Neal Stephenson, Nell
notices at one point that she needs to be in a tribe to accomplish
anything, since any individual can be overcome by a group. Algernons
would have a particularly intense need to be in a tribe of people who
would have capabilities complementary to their own.
I know three people who are competent to do life extension research
but aren't doing it now because the money isn't forthcoming. I don't
think the immediate obstacle is lack of vision or skill; I think it is
lack of money backing the right projects. In my mind this raises the
importance of ordinary success in the business world, since that is
ultimately where the money has to come from.
Tim Freeman
--------------------------------------
Date: Thu, 21 Nov 1996 19:07:03 -0600
From: Eliezer Yudkowsky <eliezer@pobox.com>
> I know three people who are competent to do life extension research
> but aren't doing it now because the money isn't forthcoming. I don't
> think the immediate obstacle is lack of vision or skill; I think it is
> lack of money backing the right projects. In my mind this raises the
> importance of ordinary success in the business world, since that is
> ultimately where the money has to come from.
Oh, aye, indeed, and precisely.
Lack of financial backing is probably the major obstacle at this time.
My hope is to solve that problem, personally, if it persists. If Bill
Gates - who is funding life-extension research - suddenly gets stricken
by a vision of the Singularity, that would probably accelerate it by 1-2
years.
> Algernons would have a particularly intense need to be in a tribe of
> people who would have capabilities complementary to their own.
The right "tribe", given me, could probably build up enough money to
make it to the Singularity in, say, five years max. The only reason I'm
not doing that right now is because I want another six months (or two,
or eight) to accumulate fame, through methods I shall not speak of,
before looking for companions and venture capital.
> It seems to me that there is a good chance it won't happen by 2035.
If it doesn't happen by 2035, it's not going to happen at all. The
minimum-time path to the Singularity is the maximum-probability path to
the Singularity; the problem at this point isn't lack of technology,
it's lack of time. How long do you think things are going to last?
> We haven't a clue about how to build a stable transhuman society, or
> even a stable transhuman for that matter.
Stable transhumans are a problem, yes, but even an unstable transhuman
could probably make it to the Singularity in three months. They don't
need to be *that* stable. Even if they don't survive, they will make the
sacrifice. A fatal Neural Growth Factor is better than none.
As for a stable transhuman society, don't worry. Just don't worry.
It will be stable. A society of Algernons won't be stable. A society of
Countersphexists would probably be stable, assuming it survived. A
society of transhumans with Countersphexist abilities plus the abilities
of a full human isn't going to have anything to worry about. They would
make it to the Singularity in between two weeks and six months.
> Cryonics comes a close second; the question with it in my mind
> is whether too much change will happen before the people are revived
> for those people to be anything other than pets.
Two words: Moravec Transfer.
The Powers will be ethical. I am more certain of that than I am of the
sun rising tomorrow. For the Powers to knock us off for our spare atoms
without even making backups, our survival would have to somehow threaten
the Powers AND human life would have to have no meaning.
> Drexler once said (I paraphrase) the sooner nanotech happens, the
> slower it will happen. That is, the sooner we start work on trying to
> achieve nanotech, the more primitive the supporting technology around
> it will be, so a greater amount of time will pass between when it
> starts to attract enough attention for people to start planning for
> the consequences, and the time when those consequences happen. So
> there is some overlap between the goals of doing it soon and doing it
> with high probability.
Like heck. It's entirely possible that the actual time to Singularity is
precisely 14 hours after I (or the right Algernons) walk into a
nanotechnology lab. They're completely right to get as much substrate as
possible in place *now*; one breakthrough and we'd need it all.
Seeing as how this discussion is more interesting than all other threads
on the Extropian list, I propose we move it there. Am I seconded?
Up and Out,
Eliezer S. Yudkowsky
-------------------------------------
Date: Thu, 21 Nov 1996 20:25:38 -0800
From: Tim Freeman <tim@infoscreen.com>
>If Bill Gates - who is funding life-extension research - suddenly gets
>stricken by a vision of the Singularity, that would probably
>accelerate it by 1-2 years.
I didn't know Bill Gates was funding life-extension research. Is there
evidence for this you can disclose? There is at least one relatively
small-budget experiment I'd like to make sure he considered funding:
Does aminoguanidine extend the lifespan of (put your rodent of choice
here)? It allegedly prevents non-enzymatic glycosylation, and it
seemed to prevent some signs of aging in a relatively short pilot
experiment.
I said:
> Cryonics comes a close second; the question with it in my mind
> is whether too much change will happen before the people are revived
> for those people to be anything other than pets.
and you said:
>Two words: Moravec Transfer.
>The Powers will be ethical. I am more certain of that than I am of the
>sun rising tomorrow. For the Powers to knock us off for our spare atoms
>without even making backups, our survival would have to somehow threaten
>the Powers AND human life would have to have no meaning.
I'm not too concerned about being revived by evil people; I don't
think evil people would take the trouble. I am concerned about being
revived into a situation that is far enough advanced that even with
the available improvements I have nothing to contribute, and am
therefore irrelevant. If you focus on the consequences, there is no
important difference between being irrelevant and being dead.
>Seeing as how this discussion is more interesting than all other threads
>on the Extropian list, I propose we move it there. Am I seconded?
Fine with me (that is, feel free to quote anything I said to you). Be
aware that I don't read that list except occasionally via the
archives. It ruins my focus.
Tim Freeman
----------------------------------
Date: Fri, 22 Nov 1996 02:27:52 -0600
From: Eliezer Yudkowsky <eliezer@pobox.com>
> I didn't know Bill Gates is funding life-extension research. Is there
> evidence for this you can disclose?
Um, not offhand. I read once - maybe in Great Mambo Chicken - that "Bill
Gates is pouring money into biotechnology companies, looking for the
secret of eternal youth." Dunno where to start checking it out.
> I'm not too concerned about being revived by evil people; I don't
> think evil people would take the trouble. I am concerned about being
> revived into a situation that is far enough advanced that even with
> the available improvements I have nothing to contribute, and am
> therefore irrelevant. If you focus on the consequences, there is no
> important difference between being irrelevant and being dead.
Uploading is generally followed by upgrading.
Do you think that's impossible or that nobody's going to bother?
It's not going to be impossible for a Power.
And with computer power doubling every two seconds, I think they'll have
the room sooner or later.
And why wouldn't they bother to direct a fraction of a fraction of a
subconscious sub-personality to put in the necessary effort?
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I know.