[CRYONICS][BOOK] "The First Immortal"

Yak Wax (yakwax@yahoo.com)
Tue, 3 Feb 1998 13:33:04 -0800 (PST)


Hal Finney <hal@rain.org> wrote:

> Halperin has a couple of tricks to slow the rate of technology advance.
> One is to put restrictions on AI research. Some early AI machines become
> aggressive and kill people, and after that they are not supposed to be
> programmed with emotions or survival instincts. They're just a bunch of
> Vulcan types. They do become much smarter than humans over the course
> of the 21st century, but presumably these restrictions do limit their
> rate of advance.

The ironic thing is that we do the same thing in real life - we
appoint authority so we can quietly 'ignore' the real complexity of a
situation. I've said it before and I'll say it again - authority is
ignorance.

> The other magic which Halperin has up his sleeve is his Truth Machine.
> This was the subject of his first novel, which I haven't read. The device
> is a foolproof lie detector, and it allows laws to be enforced with a
> certainty far beyond anything which would be reasonable today.

The problem with (most) sci-fi is the way it's written - first you
find a problem (crime), then you find a solution (the truth machine).
However, real life works differently - breakthrough technology is
created before any 'real-world' application has been thought of.

I wrote a paper on how the web could evolve into a "truth machine" of
sorts, but it would not be abusable in the same way. One of the
factors people are most worried about on the internet is the lack of
privacy, but it could turn out to be its greatest advantage. When
everything you do is recorded and stored in a distributed/open
environment and then linked to everything everyone else does, it's
impossible to lie or cheat. This doesn't require a revolutionary new
technology but the continuation of a trend (so it's more likely to
happen).
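
To give a rough idea of what I mean, here's a minimal sketch in
Python. The names and the hash-chain scheme here are just made up for
illustration, not taken from the paper: an append-only, openly
readable log where each record links to the one before it, so a later
claim can be cross-checked against what everyone has already
recorded.

import hashlib
import json

class PublicLog:
    """Toy append-only, hash-chained record of events (illustration only)."""

    def __init__(self):
        self.entries = []  # each entry stores the hash of the previous one

    def record(self, actor, event):
        prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
        body = {"actor": actor, "event": event, "prev": prev_hash}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def consistent_with(self, actor, claim):
        # A claim holds up only if a matching event was actually recorded;
        # because entries are chained, the past can't be quietly rewritten.
        return any(e["actor"] == actor and e["event"] == claim
                   for e in self.entries)

log = PublicLog()
log.record("alice", "sold car to bob")
log.record("bob", "paid alice for car")
print(log.consistent_with("alice", "sold car to bob"))  # True
print(log.consistent_with("alice", "never met bob"))    # False

A real web-scale version would spread the log across many sites and
link records between people, but the point is the same: it's the
openness and the linking, not some new detector, that makes lying and
cheating hard.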

> In effect this kind of technology is a power amplifier. Whoever has the
> bulk of power in society can use this to enforce their will. If the mass
> of people ultimately has the power, then in many ways the truth machine
> will be beneficial, as they can prevent evil people from seizing power
> away from them. On the other hand, where the masses misuse their power,
> there won't be the limitations on their abilities that we have today.
> The majority of people in the United States believes in conventional
> religious morality, but if it wants to legislate on that basis, in
> practice it can't control people's private lives very well. A truth
> machine would change that.

Of course, if there were absolute truth (and I mean "absolute", not
just the detection of crime) then it would create a truly 'open'
society. We would not only see the crime, but also the reasons for
the crime, and we might well decide it is not a crime after all. We
would also see that *everyone* is breaking the law. The idea of this
kind of "open society" appeals to me (not because it brings peace and
love, but because it's one of those things that changes
*everything*). Hey, and finally others may see the world the same way
I do (-->shocking<--)

> I thought Halperin's novel was most effective in depicting a 21st century
> where we'd like to live, and where technology really does advance to
> the point where it solves many of our most difficult problems. I was
> uncomfortable when it took on a scolding tone, shaking its head over how
> blind people were in the late 20th century not to sign up for cryonics.
> This reminds me too much of a temperance tract regaling us with the
> evils of Demon Alcohol.

I wouldn't like to live there - government!

> Science fiction authors often succumb to the temptation to have their
> future characters talk about the mistakes of the 20th century - if
> only they'd taken care of the environment, or if only they'd been more
> socialist, no, if only they'd been more capitalist, etc. I never find
> this realistic (how much time do we spend talking about the mistakes of
> the 1890's?), and even when it does fit, I don't necessarily think the
> future characters are right (people today disagree over the morality
> of the "robber barons" of the 19th century, even with 100 years of
> hindsight).

This is something I don't get - people (writers especially) seem to
think that in their future (unlike their parents, or grandparents
before them) they'll look back and think, "I'm so ashamed by all that
pain and suffering we caused, things are so much better now." The
problem is, things aren't going to be better from our perspective,
but they will be better. 'Good' is subjective, and future 'good' is
subject to future citizens (i.e. kids). The only way
potential-immortals could 'survive' tomorrow is by doing away with
that kind of subjectivity (hence the reason I don't believe in
'morals', 'ethics', etc). Here's a clue: if you think there's
something wrong with the world today, you're not going to fit in
tomorrow.

> Halperin doesn't overdo this, fortunately. If his novel is a success
> then it probably will help to make cryonics more acceptable. But I
> suspect that it will be many years before it becomes truly mainstream.

The only obvious advantage of mainstream acceptance is having cheaper,
more ubiquitous cryonics centers. Which would be nice (I want a
portable one.)

--Wax

Rise - Future proof your thinking.
On-Line Q4 1998
Preview Q2 1998
