> The Powers are the AIs. Since I don't see a mechanism by which
> we should be able to keep up with the super-joneses, and I do not
> see any incentive on the part of AIs (until they're constructed
> very, very carefully from a human biological template -- which is
> about the most complicated and expensive way to build an AI, so
> most likely a different, easier but more dangerous path will be
> taken), the AIs are the sole players on the field. As soon as
> they arrive, we become a part of the environment, to be manipulated
> at will.
> This is not a nice state to be in, so I suggest we handle
> this dangerous transition period with extreme care.
Great suggestion! But, given that all of what you said is true, exactly
what would such "extreme care" look like? I hardly think the AIs, if
you are right, are going to be any more or less annoyed with us if we
prattle on about taxing them. Should we all just take the view I've
heard Hans Moravec state: that we should be happy to build our
evolutionary successors but not expect to stay around? Seriously, what
does being careful look like, given your view of the situation, and how
much does it matter beyond ensuring we are around long enough to get the
AIs started? I'm not saying this is (or isn't) my view. I'm just
attempting to understand more fully what yours is.
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 09:56:24 MDT