Hara Ra wrote:
>
> >Actually, the consensus position of the radical singulatarian
> >community (consisting of me and perhaps 'gene ;-) ) is that the
> >motivations and actions of the SI are intrinsically unpredictable.
> >There is no reason to predict that the SI will be inimical, benevolent
> >or indifferent to humans. I'm personally hoping for benevolence, and
> >I think the potential benefit is worth the risk.
> >
> This all assumes that SIs are separate from us. One way to avoid this is
> to form a symbiosis with them. This brings up other interesting issues,
> such as bandwidth, and whether the symbiotic system would feel that the
> human component is worth keeping around.
>
My personal favorite scenario is that the SI results from a human/computer collaboration. Thus, the SI starts in symbiosis with at least one human. I don't know what happens shortly thereafter. The most likely scenario IMO is an explosive growth phase based on adding computer capacity (via the net) without adding additional humans. Note that this is SI singular, not SI plural, even if multiple humans are involved. At some point, the SI may choose to add more humans, either volunteers or draftees, and grant them some level of autonomous identity, less than, equal to, or greater than what we have now. However, it's IMO impossible to assign a probability to any action the SI may choose to take that isn't precluded by the laws of physics. That's why I'm very interested in prognostications up to the advent of the SI, and relatively uninterested in the post-SI era.