Re: The Great Filter

Forrest Bishop
Thu, 10 Oct 1996 13:03:40 -0700

[[In The Great Filter, 9-15-96(?) version, Robin Hanson writes:]]

..For example, while one can imagine predatory probes sent to search
out and destroy other life [Brin 83], it is harder to understand why such
probes would not also aggressively colonize the systems they visited,
if such colonization were cheap. Aggressive colonization would give
them all the more probes to work with, and deny resources to
competitors. If this colonization effort could hide its origins from
those who might retaliate, what would they have to lose?

[[Hiding its origins may require forgetting them. If this is the case,
colonization becomes a pointless exercise. Denying resources requires
a priori evaluation of the potential competitor’s capabilities, which
is not possible in practice.
From another side, one feature of Superintelligence is the ability to
re-synthesize itself/themselves. This feature alone may render
social theory inapplicable. ]]

Finally, we expect advanced life to substantially disturb the places
it colonizes. ... And unless ideal
structures always either closely mimic natural appearances or are
effectively invisible, we expect advanced life to make visible
changes.
[[Visible changes are an invitation for destruction.]]

If such advanced life had substantially colonized our planet, we would
know it by now.

[[Unless we are it, courtesy of panspermia.]]

One possibility is that fast space travel and colonization between
stars and galaxies is much harder than it looks, and effectively
impossible, even for nanotech-based machine intelligence. The
interstellar medium, for example, may be much harsher than we realize.

[[The least understood component of the medium seems to be the density
and distribution of dust grains, rocks, and planet-size objects (some
progress has been made on this very recently). One method of countering
this is to blast clear corridors ahead of the probes.]]

This would suggest we have good chances of surviving, but little
prospect of leaving our solar system at any substantial speed. The
slower the maximum speed, the smaller is the Great Filter that needs
to be explained.

[[I don’t think this changes the equation much: an interplanetary
civilization can still generate enough anomalous activity to be
noticeable. The time required for their early signals to reach us would
always be less than the time required for signals from their colonized
star systems, again assuming no FTL travel.
The maximum speed for atomically structured matter may be (depending on
interstellar medium density, self-repair speed, etc.) much less than
the maximum speed for relativistic bombs and beams. It is certainly
less than the speed of weapons such as electromagnetic pulses,
gravitational waves, other types of spacetime or fundamental-constant
distortion pulses, and such.
Nuclear holocaust is only the introduction to the concept of
intelligently guided, instant and total annihilation. One lesson has
been that offense (first strike) is always easier than defense; indeed,
a defense may be effectively impossible. If Mutually Assured
Destruction was, instead, Unilaterally Assured Destruction, as it would
be if your potential enemies lived on other worlds, then first strike
takes on a fresh appeal. For instance, a few kilograms of matter,
launched at our planet from anywhere in the galaxy at very near the
speed of light, would suffice to bomb us back to the Stone Age. The
early warning interval can be dialed down to a fraction of a second, if
so desired. It wasn’t mentioned in the Starseed/Launcher articles, but
a scaled-up version of this kind of device can serve as a low-profile
world-buster. The website is rather popular at the national labs.]]
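As a rough sanity check on the "few kilograms" claim, the relativistic kinetic energy can be computed directly (the 5 kg mass, 0.99c speed, and the yield used for comparison below are my own illustrative assumptions, not figures from the post):

```python
import math

C = 2.998e8  # speed of light in vacuum, m/s

def relativistic_ke_joules(mass_kg, v_over_c):
    """Kinetic energy E = (gamma - 1) * m * c^2 of a relativistic projectile."""
    gamma = 1.0 / math.sqrt(1.0 - v_over_c ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2

ke = relativistic_ke_joules(5.0, 0.99)   # 5 kg at 0.99c (assumed values)
tsar_bomba_j = 2.1e17                    # ~50 Mt, largest nuclear test, in joules
print(f"{ke:.2e} J, roughly {ke / tsar_bomba_j:.0f}x the largest nuclear test")
# → roughly 2.7e18 J, about 13 times the largest nuclear test
```

Even a small mass at 0.99c thus delivers an order of magnitude more energy than the largest bomb ever detonated, which supports the "first strike" worry in the comment above.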

..First, large-scale engineering such as orbiting solar collectors
from asteroids, Dyson spheres, and stellar disassembling might be
effectively impossible, explaining why nearby stars look so natural.

[[It does seem to be well within the realm of present day engineering
practice to build structures in space large enough to downshift the
spectrum of a star.]]

Second, structures that best use such resources might happen to almost
always preserve natural spectra and other appearances.

[[It might be wise to actively ensure that one’s constructs are well
camouflaged.]]

..Similarly, Papagiannis claims that "those that manage to overcome
their innate tendencies toward continuous material growth and replace
them with non-material goals will be the only ones to survive this
crisis," implying a galaxy "populated by stable highly ethical and
spiritual civilizations" [Papagiannis 84].

[[Whatever that means. Maybe simply attributing caution to these
imagined civilizations would suffice.]]

And Stephenson claims that
"for a truly advanced intelligence the drive for quality rather than
redundant quantity would be paramount" [Stephenson 82].

[[There may be practical upper limits on the size of a (self aware)
computer, beyond which a high power function of signal propagation
speed, OS complexity, etc. diminishes or even reverses the gains in
computational power. Gravitational limits also come into play, long
before a “Jupiter-size brain” is reached.
A superintelligence may find at some point it doesn’t want to get any
bigger. This point may be physically small. Sending clones of itself to
other star systems would not help its realtime crunching at all.]]
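The signal-propagation limit in the comment above can be illustrated with a quick light-crossing-time calculation (the Jupiter and chip diameters below are my own assumed figures for illustration):

```python
C = 2.998e8  # speed of light in vacuum, m/s

def light_crossing_time_s(diameter_m):
    """Minimum one-way signal delay across a computer of the given diameter."""
    return diameter_m / C

jupiter_m = 1.43e8   # Jupiter's diameter, ~143,000 km
chip_m = 0.02        # a 2 cm processor die (assumed)

print(f"Jupiter-size brain: {light_crossing_time_s(jupiter_m):.2f} s one-way")
print(f"Chip-size computer: {light_crossing_time_s(chip_m) * 1e9:.3f} ns one-way")
# → about 0.48 s versus about 0.067 ns
```

A half-second one-way latency across the whole machine means any computation spanning the full diameter is throttled to a few globally synchronized steps per second, no matter how fast the local components are, which is one reading of "this point may be physically small."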

.. The point is that in general the creatures whose purposes lead to
most reproduction end up dominating the future.

[[This is an argument for quantity over quality. Since viruses and
bacteria reproduce the fastest and the mostest, do they therefore
“dominate”?]]


No alien civilizations have substantially colonized our solar system
or systems nearby.


Thus among a billion trillion stars in our past
universe, none has reached the level of technology and growth that we
may soon reach.

[[But I am not sure this follows.]]

[[Forrest Bishop]]