Robin Hanson wrote:
> Mike Lorrey writes:
Well, for example, if the average compensation level for a machine
intelligence were $0.00, while the average compensation level for a biological
human-level intelligence were $30,000.00, you could state accurately that the
average wage level would drop to $15,000.00 if half the work force were AI.
However, no legally human entity's wage would drop. Moreover, since the
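
To make the arithmetic concrete, here is a minimal Python sketch of that
blended average; the $0 and $30,000 wage figures and the 50/50 work-force
split are just the assumed numbers from the example above, not data:

    # Blended average wage across the whole work force (a sketch of the
    # arithmetic above; the wage figures and the 50/50 split are the
    # assumptions stated in the example, not measurements).
    ai_wage = 0.0          # assumed average compensation, machine intelligences
    human_wage = 30000.0   # assumed average compensation, biological humans
    ai_share = 0.5         # assumed fraction of the work force that is AI

    average = ai_share * ai_wage + (1 - ai_share) * human_wage
    print(average)         # 15000.0 -- the statistical average halves,
                           # though no individual human's wage changed
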
> >> >Why have you assumed a state of slavery for the sentient beings? How
> >> >would self ownership and a right to ones own wages, for the machine
> >> >intelligences change your results?
> >>
> >> I did not assume machine slavery, and so my results are unaffected by
> >> slavery vs. not.
> >
> >Uh, I don't think so. If all AI machines were treated with a slave
> >existence, then biological humanity as a whole would develop an
> >aristocratic level of wealth based on the uncompensated profits from the
> >productivity of the AI machines. I think that this scenario is the
> >preferable one at least at first, as we are not JUST talking about
> >completely human level AI. There will be a whole taxonomy of various AI
> >entities, some of which we may want to grant personal sovereignty to, but
> >hardly all of them. ...
>
> Perhaps, but I don't at all see how this is contrary to what I said.
I would say that this is the proper way to look at it, for the most part. I mean, why would an AI have material wants and desires unless we program them into it? Our own materialistic desires are based on our evolved nest-building instincts, something which an AI will not possess. What does an AI need, anyway? More disk space? More peripherals? Since such assets will be networked, AIs should be able to operate in a much more socialistic economic environment. There would be no need for an AI to have permanent possessions outside of its own hardcopy, and no ongoing consumption outside of a few watts of power and rent for variable temporary processing space.
Mike Lorrey