> One concept I developed in my mind's eye upon reading much of Vinge's
> work is that a 'singularity' is not a destination, but the horizon of any
> man's vision of the future. Extrapolating from Moore's Law, we think
> that the event horizon of the 'singularity' will come closer and
> closer as we approach a point that now seems incomprehensible.
> However, given that amplification of our own intelligence is an effect of
> approaching the singularity, our ability to imagine future
> possibilities should also expand along with our intelligence.
> The singularity is ALWAYS in the future. It will NEVER be reached.
If you consider the emergence of SI (superintelligence) to be [the beginning of] the Singularity, then it will be a real event at a certain (fixed) point in time, for example 30 years from now. This is IMO a more useful definition.
> For example, I and others on this list can imagine greater things in store
> for humanity in the future than the average broom pusher or the average
> tribal bushman can. I am sure that to at least some humans on this planet,
> the internet civilization we are building here is superhumanly
> incomprehensible, while we would look at those people as savages living a
> near-animal existence. To them, we are already beyond the singularity
> event horizon as they see it.
Maybe, but we are all still the same species. Though there is a large knowledge gap, we still share all the basic traits that make us human. The SIs, on the other hand, will be like a completely alien life form.
> Do not feel bad, though, as there will always be niches and reservations
> and uses for individuals of all levels, just as there are niches for
> savages within our own culture.
Hmm, I'm not so sure.