As far as I know, the bobble is physically impossible. However, it may be
possible to simulate its effects with other technologies. Here I am
especially interested in the possibility of tunneling through the
Singularity.
Why would anyone want to do that, you ask? Some people may have long-term
goals that might be disrupted by the Singularity, for example, maintaining
Danny Hillis's clock or keeping a record of humanity. Others may want to
do it if the Singularity is approaching in an unacceptable manner and they
are powerless to stop or alter it. For example, an anarchist may want to
escape a Singularity that is dominated by a single consciousness. A
pacifist may want to escape a Singularity that is highly adversarial.
Perhaps just the possibility of tunneling through the Singularity can ease
people's fears about advanced technology in general.
Singularity tunneling seems to require a technology that can defend its
comparatively powerless users against extremely, perhaps even
unimaginably, powerful adversaries. The bobble of course is one such
technology, but it is not practical. The only realistic technology that I
am aware of that is even close to meeting this requirement is
cryptography. In particular, given some complexity-theoretic assumptions
it is possible to achieve exponential security in certain restricted
security models. Unfortunately these security models are not suitable for
my purpose. While adversaries are allowed computational power that is
exponential in that of the users, they can
only interact with the users in very restricted ways, such as reading or
modifying the messages they send to each other. It is unclear how to use
cryptography to protect the users themselves instead of just their
messages. Perhaps some sort of encrypted computation can hide their
thought processes and internal states from passive monitors. But how does
one protect against active physical attacks?
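To make the cost asymmetry behind "exponential security" concrete, here is a toy sketch of my own (not from any particular cryptosystem): with a key of n bits, the defender encrypts and decrypts in time linear in the message, while a brute-force attacker must try up to 2**n keys. The cipher below (a hash-derived XOR pad) and the tiny 16-bit key are purely illustrative, not a secure construction.

```python
# Illustrative only: linear-cost defense vs. exponential-cost attack.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom pad from the key by hashing key + counter.
    # (A toy construction for illustration, not a vetted cipher.)
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, msg: bytes) -> bytes:
    # XOR with the keystream; the same call also decrypts.
    pad = keystream(key, len(msg))
    return bytes(m ^ p for m, p in zip(msg, pad))

def brute_force(ciphertext: bytes, known_plaintext: bytes, key_bits: int) -> bytes:
    # The attacker, knowing a plaintext/ciphertext pair, must search
    # up to 2**key_bits candidate keys -- exponential in the key size.
    for i in range(2 ** key_bits):
        key = i.to_bytes((key_bits + 7) // 8, "big")
        if encrypt(key, ciphertext) == known_plaintext:
            return key
    raise ValueError("key not found")

msg = b"meet at dawn"
key = (13).to_bytes(2, "big")          # deliberately tiny 16-bit key
ct = encrypt(key, msg)
assert encrypt(key, ct) == msg         # defender: one linear-time pass
recovered = brute_force(ct, msg, 16)   # attacker: up to 2**16 trials
assert recovered == key
```

Each extra key bit costs the defender almost nothing but doubles the attacker's work, which is the sense in which cryptography lets comparatively powerless users resist vastly better-resourced adversaries, within its security model.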
The reason I bring up cryptography, however, is to show that it IS
possible to defend against adversaries with enormous resources at
comparatively little cost, at least in certain situations. The Singularity
tunneling problem should not be dismissed out of hand as being unsolvable,
but rather deserves to be studied seriously. There is a very realistic
chance that the Singularity may turn out to be undesirable to many of us.
Perhaps it will be unstable and destroy all closely coupled intelligence.
Or maybe the only entity that emerges from it will have the "personality"
of the Blight. It is important to be able to try again if the first
Singularity turns out badly.