Security and Longevity [was: some U.S. observations and notes]

From: Robert J. Bradbury (
Date: Sat Dec 22 2001 - 15:05:19 MST

On Sat, 22 Dec 2001, Amara Graps commenting on my comments wrote:

> Arghhhh! Robert: does it really seem to you that this security stuff
> is appropriate ?? From my view, it's blown way out of proportion. [snip]

> How in the world is this 'security' frenzy going to help humans on this
> planet be better (live longer, healthier, smarter, stronger, etc.) ?

Amara, don't pull your hair out over this. I was trying to point
out the middle ground. I'll try to explain why.

Let's assume you've spent the last decade or so of your life studying
biotechnology & nanotechnology and are trying to start a company
to accelerate the approaching wave of those technologies (having
already started one such company somewhat prematurely and so
not being particularly enthusiastic about being too far ahead
of the curve again). Let's say that you've read all the material
I've read such that you can really "see" the future -- such that
you can point out exactly where people who consider themselves
"future" prognosticators, even the moderately good ones such
as Kaku & Kurzweil, have failed to do their research. Let's say
you sit down with someone like Robert Freitas (which I've done)
and look closely at the situation and ask -- to what extent can
the wave be accelerated or decelerated? -- and you come to the
conclusion that the most you could do is change the arrival
time by +/- 3-5 years (depending on whether you had $10 billion
to work with or were deanimated and cryonically suspended).

Only on my most pessimistic days do I think that the inertia of
our current path will fail to carry us forward fast enough
for me to benefit from it (whether I'm working
for it or against it). So "live longer, healthier,
smarter, stronger, etc." *will* happen (no matter what I
personally do). It is also likely to happen no matter what
the U.S. government does (within reason). It will even happen
despite the efforts of the luddites, greens, etc. (no matter
what they or we do). [Yes, I *know* others on the list feel
differently, but this is about what *I* think.]

So, knowing that it is going to happen -- no matter what you do --
then an interesting dilemma appears. How much is one willing
to increase one's personal risk of not making it in order to accelerate
the curve? Or, looking at it another way -- how many of one's
non-essential freedoms is one willing to sacrifice to ensure
that you personally make it?

I've flown over half a million miles in the last 15-20 years.
When people ask why I'm not signed up for cryonics, I tell
them it's because I view my risk of death from an airplane
accident as being higher than my risk of death from any
other cause I can think of. Generally speaking one does
not survive airplane accidents in a form suitable for cryonic
suspension so a suspension contract makes little sense (for me).
If you were to walk around "knowing" that the chances you
were going to live thousands of years are probably 99.9%
(as I've done for most of the last decade) and you also "knew"
that was going to be true for most of the people alive today
on the planet (under the age of ~50 or so) -- then the needless,
irrevocable deaths of individuals who would otherwise have been able
to surf the wave become quite significant.

Much of my time during the last month was devoted to finishing
a paper on how existing and near-future technologies could
be used by bioterrorists if they had the knowledge base that
I have. I am sure that after reading it most people would
feel a great deal less secure than they currently do --
which is presumably less secure than they did before 9/11.
So I take safety and security *very* seriously -- particularly
when vehicles and technologies that I consider essential components
of accelerating the wave are being used as weapons of mass destruction.

There are tradeoffs to make between being free and surviving.
The higher the trust and security level within a society the
greater probability there seems to be that one will survive
until such a time that even greater freedoms can be granted.
(I can grant you huge freedoms to do whatever you want when you
are located on the other side of the galaxy and little you do is
going to affect me.)

Until those times come, I will be asking myself questions like:
Is flying safe?; Is flying worth the time it takes to get through
the security checkpoints in Seattle?; Is eating out worth the
risk of food poisoning (natural or terrorist)? Is the loss
of freedom involved in asking ~5000 people in America to talk with law
enforcement officials about whether they have potential knowledge
of terrorist activities a small price to pay for 1 life? What
about for 1000 lives?; Is spending $1 billion a year on the
development of good treatments for agents that might be used
by bioterrorists useful? What about $10 billion? Etc.

I think whether you view it as "way out of proportion" has a lot
to do with one's frame of reference. My frame of reference is
that it is *only* the unanticipated or unprepared-for that is
likely to prevent me from living thousands of years. So I
spend a lot of time trying to anticipate, and to be better prepared
for whatever it might be. If I personally feel secure then I'm going
to be pushing in directions to accelerate the wave. If I personally feel
insecure then I'm going to be attempting to find someplace safe to park
my life until the tornado blows past. One doesn't get to live
thousands of years by thinking any other way.


This archive was generated by hypermail 2b30 : Sat May 11 2002 - 17:44:29 MDT