Re: Singularity or Holocaust

Eugene Leitl (eugene@liposome.genebee.msu.su)
Fri, 27 Feb 1998 14:33:42 +0300 (MSK)


On Thu, 26 Feb 1998 DOUG.BAILEY@EY.COM wrote:

> I do not believe anyone, regardless of their power base, will be able to
> effectively prepare in such a manner as to increase their survival chances at
> and after a technological singularity event.

This would seem to depend on the flavour of the Singularity, wouldn't it?

If it were a full-blown DIY Blight, we wouldn't have any chance at all. A
typical Blight scenario would seem to be the sudden emergence of one or
several malignant/indifferent Powers as a result of the Web transcending,
or of a supercritical ALife experiment. I'm just rehashing the FAQ here,
of course.

Otoh, the Singularity may be benign/slow, so that there are traversable,
continuous persona-space trajectories. Of course anybody not following is
endangered even in the short run. And of course many will choose not to
follow.

On the third nanomanipulator, there might be no Singularity at all.
Bummer :(

> If a singularity event were to occur, life would be extremely different. I am

By definition.

> not saying it would be necessarily incomprehensible to those who were around
> after the event. But my guess is we can not imagine the changes a singularity

If their reality representations were faulty, they wouldn't remain around
for long. Darwin would seem to be one of the very few constant factors
throughout the Singularity. Then again, maybe I'm using an invalid
extrapolation.

> would cause before it occurs and thus we would have a hard time effectively
> planning for such an event.

As we are on the outskirts of the Singularity already, keeping oneself
_very_ informed while accreting personal wealth (=a richer set of
possible future ego trajectories) would seem a pretty good strategy. Of
course being part of the team about to transcend would be even better ;)

> Vinge's idea of a technological singularity looming in the future is based on
> the exponential tendencies of technological progress over time. In mathematics,
> a singularity is a point where mathematic modelling no longer works. I think
> the best way to view a technological singularity is as a mathematical
> singularity. In this sense, I view a technological singularity not as a point

As I seem to recall, Vinge himself spoke against such an interpretation in
his recent interview with N. More.

> where technological progress becomes infinite but as a paradigm shift. Our
> conception of technological progress can not comprehend the environment at or
> after a singularity event. Thus when we attempt to graph it or manifest it in
> some way we come up with "infinity". The singularity event where a black hole

But physics itself knows no infinities. With the possible exception
of infinite spacetime curvature (where God divided by zero ;) -- which
probably tells us more about the shortcomings of current theories than
about infinities in the 'real' 'world'.

> is formed is equally incomprehensible to our current conception of the laws
> that govern the universe. Many theorists believe that to fully understand what
> takes place at the black hole singularity (and at the possible singularity that
> existed near or before the point where the universe was the Planck time in age)
> requires a dramatic shift (paradigm shift) in our understanding of the universe.

Aren't superstrings and M-theory weird enough for ya? There are
paradigm shifts occurring every second Monday.

> Would I survive a technological singularity event? Maybe, but not without
> being changed by it. Perhaps the litmus test will be whether people can adapt

Sounds very plausible ;)

> to the post-singularity environment after the singularity occurs. Perhaps
> we'll need drastically enhanced intelligence, mental resources, or other
> characteristics completely strange to us now.

There is no way flesh could persist even halfway through a Darwinian
Singularity (does anybody disagree?). I do not even see how uploads hope
to survive virtually unchanged (just as there are conservative cryonists,
there are conservative upload researchers) in a world rushing towards the
new equilibrium (which might well be a Red Queen equilibrium).

Several mails upstream, Randal Koene of MURG cited several buckytube
theorists who recently estimated (before you ask, I have no idea who it
was, which reference values for the human equivalent they used, and
intuitively their estimates would seem totally off) that about three
bottlefuls of buckytube nanocircuitry would eclipse the combined wetware
crunch of the whole planet. Even assuming one person per waterglass of
circuitry (certainly achievable with autoassembly-constructable molecular
circuits), one cubic mile of it would seem able to do a bit more than
just walk the dog.
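For what it's worth, the waterglass claim above can be turned into a rough back-of-envelope sketch. All the numbers here are my own assumptions for illustration, not anything from Koene's sources: a 250 ml "waterglass" per human-equivalent, and ~6e9 people as the planetary wetware baseline.

```python
# Back-of-envelope: how many person-equivalents fit in a cubic mile,
# assuming one human-equivalent of molecular circuitry per waterglass?
# All constants below are illustrative assumptions, not measured values.

GLASS_LITERS = 0.25          # assumed volume of one "waterglass"
WORLD_POPULATION = 6.0e9     # rough 1998 world population

MILE_M = 1609.344                           # meters per mile
cubic_mile_liters = (MILE_M ** 3) * 1000.0  # 1 m^3 = 1000 liters

person_equivalents = cubic_mile_liters / GLASS_LITERS
ratio_to_wetware = person_equivalents / WORLD_POPULATION

print(f"person-equivalents per cubic mile: {person_equivalents:.2e}")
print(f"multiple of planetary wetware:     {ratio_to_wetware:.0f}x")
```

Under those assumptions a cubic mile holds on the order of 1e13 person-equivalents, i.e. thousands of times the planet's combined wetware -- which is the only point the hand-waving above needs.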

> About concerns of a holocaust, I made a post a month or so ago about what
> meaning transhumanism had in a future where the Strong AI hypothesis ended up
> being true. I think a technological singularity could produce similar
> concerns. Every method Vinge proposed to get us to such an event involved

The Singularity is impossible unless the strong AI hypothesis is true.
Brains the size of a planet are a bit difficult to do with plain wetware.

> drastic changes in the way we are now. I am not saying this would necessarily
> be a "bad" thing. But it would serve as a virtual holocaust of the way we are
> now. Terms such as transhuman imply some residue of humanity surviving. It's
> possible the future will result in a posthuman era where whatever you define as
> "human" is no longer discernible in the resulting lifeforms. But all of this

Can there be an isomorphism between an embryo at the blastula stage and
an adult individual?

> hinges on factors we have no real control over. If a holocaust of humanity
> resides in the future then there is little we can do to avoid it. Such a
> holocaust, if it is to happen, will ride in on the apparently inexorable tide
> of technological progress.

Er, what was that thing, what was it called.. Dynamic Optimism, like?

ciao,
'gene