Re: Why would AI want to be friendly?

From: Emlyn (emlyn@one.net.au)
Date: Fri Sep 29 2000 - 08:10:38 MDT


> Eugene Leitl writes,
>
> > Your reasoning is based on a slow, soft Singularity, where both the
> > machines and humans converge, eventually resulting in an amalgamation,
> > advancing slowly enough so that virtually everybody can follow. While
> > it may happen that way, I don't think it to be likely. I would like to
> > hear some convincing arguments as to why you think I'm mistaken.
>
> I can't think you're mistaken, since this thread entails nothing that can
> presently be tested. My current readings in robotics persuade me that the
> most successful engineers will build machines which can do useful things,
> rather than try to build humanoid robots, just to prove it can be done.
> Similarly, I don't find a fast technological singularity as useful as a
> convergence of augmented humans with their Mind Children. IOW, the only
> usefulness of a fast TS would be to escape or leave behind Homo sapiens
> (those nasty war mongers).

I think I missed a bunch of posts, no idea why, damned ISP (they'll be first
against the wall when the singularity comes, don't worry about that).
Anyway, this slow singularity thing has got me beat. Doesn't the idea
contradict the definition? Exponential growth (or double exponential, or
whatever people are calling it these days, where the rate of acceleration
itself increases) doesn't really have a slow option.

Also, I am intrigued by the idea of questioning the usefulness of a fast
singularity. Pretty much, the idea of a technological singularity is that it
is fast by definition after a certain point (and most likely fairly invisible
before that point), and it's not the result of committee decision-making;
it's just going to happen, as an emergent property of decentralised
technological and scientific research, the world economy, and some other
stuff I've missed, all in place in the world right now.

The options appear to be
1 - Draconian crackdown on everything technological across the entire world
(the tourniquet-on-the-neck option), or
2 - Put on your flippers and flying goggles, it's party time!

However, 2 isn't quite as out of control as it might appear; there is some
hope that we can figure out at least the initial conditions we would favour
going into the spike (whatever good that does us), then guide things in that
direction. That's one of the basic reasons for the transhumanist movement,
is it not? Caveat: some THers don't believe in the mighty deity
that is the Spike; they have other goals.

Emlyn
(Talking of Spike, he really did disappear, cold turkey no less! Scary!)


