[optimism on longevity tech curve snipped]
--> Brian Atkins
> > I really don't see this; really, really not. It just doesn't
> sound plausible
> > to be so absolute about this; it begins to seem like an issue
> of faith in
> > technology, in everything working out ok. Consider even just
> the trial and
> > error is-this-working timescales...
> > Anyone want to roll out some numerical- and factual- based
> arguments on one
> > side or the other?
> Well for starters what do you think of the Singularity concept and
> the hard data and graphs Kurzweilai.net has?
> P.S. Did you see that news I posted about sequencing your whole DNA in
> about 2 hours? What's interesting is that in many cases recently we
> seem to be actually exceeding the Kurzweil graph predictions. So I
> consider them to be conservative.
Yes, I did. This is an advance in speed of obtaining raw data, not speed of
processing it. I'm not sure it belongs in those particular arguments of
progress...but that may just be semantics.
[Now the idea of a certain few people I can think of being passed in
entirety at high speed through a single nanopore...;) ]
For me, this whole business is all cold and calculating risk analysis. Any
and all risk associated with my ceasing to exist is unacceptable risk.
Unknowns are risk. Therefore the only logical conclusion is to do all I can
to reduce that risk.
Insofar as the Kurzweil analysis of the problem goes, what are the
statistics on futurists -- even damn smart futurists doing the analysis in a
serious, thoughtful way -- being correct in their predictions? That is to
say: a) correct in the time sense, and b) correct in the technology sense.
Now I realise that broad enough predictions are fairly safe (very, very
broad predictions, like "we will have a channel tunnel one day", or
"everyone will be able to talk to everyone with little effort within 20
years"). However, "aging being defeated soon enough for us not to die" is a
pretty specific prediction in time and technology. I, for one, look at the
past performance of such predictions and worry. A lot. Nothing fundamental
in the art of making predictions of human societal achievement is different
from 100 years ago. This is enormous risk, right there.
So: either I sit back and accept the risk, or I get out there and reduce my
risk. This is a no-brainer; it should be for everyone who doesn't believe in
resurrection and pattern identity. Just because you think an analysis of the
singularity in time and technology is good doesn't mean you should discard
an appreciation of just how risky such an analysis is.
[My take on the analysis? If you want to take the axioms as a given, it's
fine. However, I don't think it has much real-world application. We're not
very good at modelling societies at all -- we can barely model stock charts,
the most simplistic societal output of all.]
This archive was generated by hypermail 2b30 : Fri Oct 12 2001 - 14:39:43 MDT