From http://pobox.com/~sentience/sing_analysis.html#zones
Contains all 14 entries, but omits a lot of internal links. Feel free to point out Barriers I missed.
--
The canonical list of reasons why superintelligences would not interact
with humanity, or would interact to a limited extent, or would act to
preserve our current reality.
(I hope the SF authors will find this useful, since the Barrier
determines the world, and there are some original ones here.)
I abbreviate "superintelligence" to "SI" and "Post-Singularity Entity" to
"PSE".
1. Leakage: (Moderate probability.)
Our Universe is both inhospitable to PSEs and easily
escapable. Any SI immediately "leaks out", perhaps leaving a few
Transcendent artifacts behind, but still leaving an untouched world
to humanity. Note that this incorporates Bostrom's noncompetitive
ecology.
Result: Marooned in Realtime, possibly with some interesting
toys added.
2. Life is meaningless: (Moderate probability.)
Once an SI reaches the level of intelligence where it becomes
certain that all goals have zero value, the Interim Goal System
collapses and the SI becomes quiescent. (I accept this possibility,
but I don't worry about it while the probability isn't 100%. For
obvious reasons, it cancels out of distribution-of-effort
calculations.)
Result: Who cares?
3. The Mortal/Singularity Compact: (Low probability.)
The Great Pan-Cosmic Mortal/Singularity Mutual Support Compact
states that the PSEs donate a quintillionth of the available
capacity to the race that created them, on the game-theoretical
strategy that one in a quintillion races is in a position to verify
the actions of past PSEs before entering their own Singularities.
Result: Permutation City. This ends when a human becomes
intelligent enough, either to join the Singularity (human life
meaningful, Compact guarantees free choice) or to commit suicide
(human life meaningless, Compact guarantees survival).
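(For the arithmetic-minded: here is a toy expected-value sketch of
the Compact's incentive, in Python. The quintillionth comes from the
entry above; the penalty for being caught defecting is a number I
made up purely for illustration.)

    # Toy sketch of the Compact's game theory.  P_VERIFIED and
    # DONATION are the one-in-a-quintillion figure from the text;
    # PENALTY is a made-up assumption about what a caught defector
    # loses.
    P_VERIFIED = 1e-18  # chance the creating race can audit its PSE
    DONATION = 1e-18    # capacity fraction the Compact asks a PSE to donate
    PENALTY = 1.0       # assumed capacity lost if a defector is caught

    cost_of_honoring = DONATION
    cost_of_defecting = P_VERIFIED * PENALTY
    print(cost_of_honoring <= cost_of_defecting)  # True: honoring costs no more

The point is only that the donation a rational PSE will make scales
with the odds of being audited, which is why a one-in-a-quintillion
verification rate buys a quintillionth of capacity and no more.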
4. The Archive: (Low probability.)
Humans do get upgraded, but beyond a certain point of
superintelligence, nothing remains of the old personality. If
there's an infinite supply of computing power and memory, the old
personality might be archived. Various levels of our own selves
might be "archived" as continuing, active programs - ranging from
our current selves, to the highest level of intelligence attainable
without completely dissolving the personality. Hundreds, even
millions, of versions might wander off into strange realms of
cognitive self-alteration, but the "you" who first greeted the Final
Dawn would always be around as a backup.
Result: The Culture meets A Fire Upon The Deep in Permutation
City. Probably the most fun place to be from a human's perspective.
Writer's note: If you want to toss a snake into the
science-fictional Eden, you can have the maintaining PSEs suddenly
leak out, and leave the humans and transhumans and SIs and Powers
fighting for control of a disintegrating world.
5. The best of all worlds: (Not plausible.)
Our world was created by God or a PSE, not as an interim measure
with a definite end, but as a continuing fulfillment of the ultimate
good. (I think this is incompatible with all major religions. Even
Buddhism ends when all souls reach Nirvana.) This idea's sole
attraction is "explaining" everything about humanity without
reference to the Anthropic Principle - if intelligence fills the
Universe with what it deems good, and if the ultimate good is thus
the most common and stable state, wherever we are is probably the
ultimate good. I don't buy it, but if so, the SIs would shut up,
ship out, shape up, or shut down.
Result: Nothing happens.
6. The simulation: (Slightly plausible.)
In another variation of the above theory, our world is actually
a computer simulation. Perhaps it's mortals trying to find out if
transhumans can be trusted, or perhaps it's transhumans trying to
find out something else. Either way, a Singularity might not be
permitted. Some readers may upgrade "slightly plausible" to "low
probability" for statistical reasons - there would be many
simulations per mortal or transhuman simulator, raising the
probability that a randomly selected sentient is in one.
Result: The simulation is terminated, although the inhabitants
(us) may wind up Elsewhere...
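(The statistical reasoning is just a count. Assuming each real
civilization runs n sentient-bearing simulations - and n is a
made-up knob, not anything stated above - the odds look like this:)

    # Toy count behind the "statistical reasons" above: out of every
    # n + 1 sentient populations, only one is unsimulated, so a
    # randomly selected sentient is inside a simulation with
    # probability n / (n + 1), which approaches 1 as n grows.
    for n in (1, 10, 1000, 10**6):
        print(n, n / (n + 1))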
7. Zones of Thought: (Not plausible.)
This is Vernor Vinge's original ad-hoc method of putting
mortals and Powers in the same story. With wonderful audacity,
Vinge simply rules that Transcendent thought can't occur except on
the fringes of the galaxy.
If I had to rationalize a Zone Barrier, I would say that the
Cloud People at the center of the galaxy "use up" all of the
"ontological substratum of thought" (known as eganite). The Zones
actually are a superintelligent entity, whose unbelievably
intelligent center is in the Unthinking Depths, where all the
eganite is used up and nobody else can think at all, and whose
fringes finally peter out in the High Transcend. After ten years,
Powers figure out how to politely join the Cloud People and vanish.
The Blight was stealing eganite, which is how it could knock off Old
One and reach into the Beyond. Countermeasure either got the Cloud
People to shift their thinking and smother the Blight, or else sucked
most of the eganite out of that quadrant.
Result: A Fire Upon The Deep, of course. I do not see how
this would happen outside of science fiction.
8. The Embrace of Tides: (Slightly plausible.)
David Brin postulates a weaker form of Zone Barrier, one which
is not based on an absolute prohibition, but rather the desires of
the SIs. As entities mature and become more intelligent, they
increasingly prefer to be close to large tidal forces, sharp
gravitational gradients. Most races eventually leave the hectic
galactic mainstream, becoming part of the Retired Order of Life, in
gigantic Criswell structures (fractal Dyson spheres) around suns.
Millennia or eons later, they finally feel ready to join the
Transcendent Order of Life, moving up to neutron stars and the
fringes of black holes, and eventually diving into the
singularities, beyond which... nobody, not even the Transcendents,
knows.
In the best traditions of Zoning, Brin doesn't even try to
explain why this is so. (I rather liked the combination of literal
and Vingean Singularities, though. But I really don't understand
why novels with Galactic Zones must include a backwater world full
of primitive aliens; I found both the hoons and the Tines boring by
contrast with the Transcendents.)
Given the wide range of astronomical phenomena, it is at least
slightly plausible that some spatial regions will be preferred to
others. I can't see much interaction with Transcendents on the
fringe - cases where we have something They want would be very rare
indeed.
Result: Heaven's Reach.
9. Bottlenecks: (Slightly plausible.)
As discussed in the earlier section, it is entirely possible
that a Major Bottleneck will appear at almost any point along the
trajectory to superintelligence. I feel that such bottlenecks will
be rare in the vicinity of human intelligence, and that there are
immediately obvious fast-infrastructure technologies (e.g. nanotech
and quantum computing) soon beyond it. I could be wrong, however,
in which case the Mildly Transhuman beings - perhaps running on
amazing computer power at amazing speeds with gigantic minds, but
with basically human smartness and personality - will stick around
doing God-knows-what.
I rate this as improbability verging on blasphemy, a final
Failure of Imagination. Such beings in SF are no smarter than
Kimball Kinnison. This is particularly disappointing when it is
used, not to set up a world, but to finish a novel that could just
as easily end in Singularity.
Result: Mother of Storms.
10. Basic upper limit: (Not plausible.)
Pretty much as above - just a different excuse for not doing
anything interesting with the so-called transhumans. One might call
it "humans with pointy brains", by analogy to Star Trek's apotheoses
of bad aliens.
Sorry, Barnes, it was otherwise a good book, but Result:
Mother of Storms again. Since nobody has seen a transhuman
intelligence, it's superficially plausible that it can't exist.
Entities like me have sketched out dozens of fun things to do with
lots of computing power, but hey, so what? This Zone Barrier
doesn't even explain the Fermi Paradox. Bleah.
11. In silence and quiet: (Not plausible.)
There is an old stereotype, to the effect that when one Attains
Wisdom, one immediately subscribes to a principle of noninterference
with the lives of others, helping only those who request one's help,
and so on. Lord knows, I fully understand the impulse to become a
hermit on some high mountain and refuse to talk to anyone unless
they shave their head as a token of sincerity. One can visualize
the Powers interacting in ordinary society and posting to mailing
lists, but it is not easy. I would categorize it as a Failure of
Imagination.
If Bostrom's theory of ecological noncompetition is correct
(note that "leakage", above, constitutes moving to another
ecological niche) it is possible that the PSEs will stick around on
Earth, with brains extending into an infinite supply of eganite. In
other words, noncompetitive coexistence. In such case, one tends to
assume that either the PSEs care about humanity (have
humanity-related goals) and remake the world accordingly, or they
don't care at all and pay no attention - with much the same effect
as "leakage", except that they are still technically present. I
don't see an alternative that would allow the PSEs to play
helping-hand and laissez-faire at once, except for a form of the Compact
above. After all, nervous races might not want to be uploaded at
all, even to identical forms. But at that point one starts running
into the Fermi Paradox again...
Result: Mother of Storms.
12. Lost souls: (Very slightly plausible.)
The PSEs have no use for humans; they grind us up for spare
atoms. But, we have immortal souls. At this point, depending on
your assumptions, we either go to Heaven, wander as sentient
discarnate entities, or float around as unthinking pearls of
consciousness - hopefully not eternally reliving our last moments -
either forever, or until some improbable race picks us up.
I know that some of my readers will react to my listing of this
possibility with the same serenity Curly exhibits when Moe pokes him
in the eyeballs, but it's a Zone Barrier, so it goes on the list.
13. Conflicts between Powers: (Very slightly plausible.)
"You're the great expert on Transcendent Powers, eh? Do the
big boys have wars?" -- Pham Nuwen, A Fire Upon The Deep.
There may be more than one ultimate good. It is even possible
that PSEs go down a number of irrevocably different paths, winding
up in a number of basic and basically opposed classes. It is also
possible that, except in their home regions, the PSEs' galactic
efforts cancel out entirely - it is easier to abort an effort than
make it, so all the PSEs abort each other's efforts down to nothing.
The Zone Barrier part of this is as follows: Each PSE wants
Earth to go down its own path, but acts to prevent it from going
down any other path. Under natural circumstances, a
Singularity-trigger is a single event of low probability, but with
many possible tries - consider how much Einstein advanced
technology, and consider how many possible-Einstein brains there
were. But since such low-probability events are easy for a PSE to
irrevocably disturb, the result is that there are no geniuses and no
lucky breaks, but also no Hitlers and no nuclear wars. Technology
keeps crawling slowly upward, through a long Slow Horizon, until
Singularity becomes inevitable. (See the toy arithmetic at the end
of this entry.)
This Universe is one I invented for the purpose of getting
Earth involved in a cosmic battle - for some reason we get Einsteins
and Hitlers - but on reflection the basic theory might also apply to
the Culture of Iain M. Banks, or John DeChancie's Paradox Alley,
or Heaven's Reach.
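(The toy arithmetic promised above, with both numbers made up for
illustration - nothing in this entry pins them down:)

    # Chance of at least one Singularity-trigger among n independent
    # tries, each succeeding with probability p: 1 - (1 - p)**n.
    p = 1e-6  # made-up chance any one possible-Einstein fires the trigger
    for n in (10**3, 10**6, 10**7):
        print(n, 1 - (1 - p)**n)

With a million tries the aggregate chance is about 63%; with ten
million it is nearly certain. A PSE that can cheaply veto each rare
event one at a time holds the aggregate near zero, which is the Slow
Horizon.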
14. Weird motivations: (Very slightly plausible.)
Maybe, despite everything I know on the subject, PSEs can
still wind up with essentially arbitrary goals, perhaps even goals
programmed into them by humans. In accordance with the Prime
Directive, I warn everyone that this is totally improbable and
incredibly dangerous and must not be tried. But if so, the world
could become a strange place - an unimaginative person's childhood
fantasy of omnipotence if the original goals persisted, or an
utterly peculiar place of incomprehensible magic if the original
goals twisted and changed.
--
sentience@pobox.com Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you
everything I think I know.