"Peter C. McCluskey" wrote:
>
> rhanson@gmu.edu (Robin Hanson) writes:
> >I think you're missing the point. Some AIs can tax other AIs,
> >just as some humans now tax other humans. The relative
> >abilities of humans and AIs are irrelevant.
>
> I think people are missing your point because it is irrelevant to
> the concerns they are raising.
> Who will control whether humans benefit from taxation? Moravec
> assumes without much justification that humans can remain "the powers
> that be" as far as this question is concerned without having many of
> the abilities that people expect are needed to remain "the powers that be".
> While not absurd, it is sufficiently far from any extrapolation from
> how existing societies work that it seems appropriate to assume it is
> improbable.
My take on Moravec's position in "Robot" is that in the beginning humans
will be able to control the production and use of advanced robotics and
AI well enough to set an early precedent. By incorporating the extended
abilities of AIs and robots into enforcing those policies, the policies
will persist long enough to give humans a period in which they benefit,
while sentients that don't play by the local rules leave. But it seems
to me that Moravec also pointed out that this situation is unstable, and
that these artificial sentients and highly augmented humans would likely
decide to change it, perhaps drastically.
- samantha