> Reason wrote:
> > For me, this whole business is all cold and calculating risk
> analysis. Any
> > and all risk associated with my ceasing to exist is unacceptable risk.
> > Unknowns are risk. Therefore the only logical conclusion is to
> do all I can
> > to reduce that risk.
> All risk is undesirable, but not all risk is unacceptable. Some risks are
> necessary - for example, risks that can act as tools to reduce other
> risks. If you regard all risk as unacceptable, without being able to
> choose between large risks and small risks, you wind up in a fantasy world
> where you don't have to take any risks. Shortly thereafter, you are
> crushed by a still-hostile universe.
> I'm pretty sure you meant to say "undesirable" instead of "unacceptable",
> especially since you went on to correctly use the term "reduce" rather
> than "eliminate".
Pedantry accepted, more or less. Actually, I did mean "unacceptable" -- I
just didn't go on to clarify the obvious point of focusing on reducing the
risk as close to zero as possible, in as logical a manner as possible given
the resources to hand. I'm quite fanatical about the not-ceasing-to-exist
thing, but recognize the limitations...so getting rid of aging comes first,
then upgrading the hardware by degrees, so as to reduce that risk with every
upgrade.
In theory, I'll eventually find out what my level of acceptable risk is, but
I imagine I'll be -- at the very least -- a large dispersed collection of
self-powered computing devices in the outer solar system by that time.
Although that's probably still going to be vulnerable to a close supernova.
But anyway, I digress. None of that is going to come about if a cure for
aging doesn't arrive relatively soon, and sitting down and waiting isn't
going to make that happen any sooner.
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:39:43 MDT