Robin Hanson wrote:
> Vernor Vinge writes:
> >Notions of great change raise the vision of all sorts of things that
> >might be called a singularity. In the past, discussions of what I've
> >written about have more often than not spread out to things quite
> >different.
> >(snip)
>
> O.K. Uncle. It seems I was mistaken in my attempt to create a
> focused discussion on singularity by focusing on Vinge's concept
> and analysis. I incorrectly assumed that Vinge had in mind a specific
> enough concept of and analysis of singularity to hold discussants'
> attention. In fact, by "singularity" Vinge seems to just mean
> "big changes will come when we get superhumans." And while Vinge
> has dramatic opinions about how soon this will happen and how fast
> those changes will come afterward, these opinions are not part of
> his concept of "singularity", and he is not willing to elaborate
> on or defend them.
One concept that took shape in my mind's eye upon reading much of Vinge's work is that a 'singularity' is not a destination, but the horizon of any man's vision of the future. Extrapolating from Moore's Law, we expect the event horizon of the 'singularity' to draw closer and closer as we approach a point that now seems incomprehensible. However, since amplification of our own intelligence is itself an effect of approaching the singularity, our ability to imagine future possibilities should expand along with that intelligence.
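To make that receding-horizon picture concrete, here is a toy model; the numbers and names in it are entirely my own invention, not anything Vinge has committed to. Assume capability doubles once per Moore period, and assume an observer can only comprehend developments up to some fixed multiple of his own current capability. Under those two assumptions, the "incomprehensible" point stays a constant number of doublings ahead of the observer, no matter how long you run the clock:

```python
# Toy model of a receding singularity horizon.
# Assumptions (mine, purely illustrative):
#   - capability doubles once per "Moore period"
#   - an observer can comprehend anything up to HORIZON_FACTOR times
#     his own current capability; beyond that lies the "singularity"
import math

HORIZON_FACTOR = 1000.0   # hypothetical comprehension limit (1000x oneself)

def capability(t):
    """Capability after t Moore periods, normalized to 1.0 at t = 0."""
    return 2.0 ** t

def horizon_time(t):
    """Earliest future time whose capability exceeds the comprehension
    limit of an observer living at time t."""
    # solve 2**h = HORIZON_FACTOR * 2**t  ->  h = t + log2(HORIZON_FACTOR)
    return t + math.log2(HORIZON_FACTOR)

for t in range(0, 50, 10):
    print(f"observer at t={t:2d}: horizon at t={horizon_time(t):.1f} "
          f"({horizon_time(t) - t:.1f} periods ahead)")
```

The point of the exercise is simply that the gap never shrinks: the horizon advances exactly as fast as the observer does.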
To take an everyday example, I and others on this list can imagine greater things in store for humanity than the average broom pusher or the average tribal bushman can. I am sure that to at least some humans on this planet, the internet civilization we are building here is superhumanly incomprehensible, while we would look at those people as savages living a near-animal existence. To them, we are already beyond the singularity event horizon.
To those participating in the bleeding-edge techno-culture, anyone less technological is a savage, while the 'superhuman' never actually exists, because it is always today's ideal for improvement. I'll make a stab at a chart to illustrate this:
[ASCII chart: the word 'singularity' runs down the left margin; the x axis is time relative to the observer's present, with 0 marked 'THE NOW'; the y axis is advancement. 'Savage', 'human', and 'animal' sit on the past side of the present, 'superhuman' on the future side, and the curve climbs away toward the upper right.]
The singularity is ALWAYS in the future. It will NEVER be reached. And just as today there are savages and spacemen living on the same planet, there will also be a whole panoply of civilizations within each nation or ethnic group that will at some point be incomprehensible to one another. This is the 'generation gap' in the extreme, though it is not necessarily a matter of the participants' ages but of the version numbers of their operating systems. As the Moore's Law doubling period contracts, anyone more than x software generations behind the curve will find themselves in a state of career and cultural obsolescence, as far as the bleeding edge is concerned.
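A second toy sketch of that last claim, again with purely illustrative numbers of my own (the two-year starting period and the 10% contraction rate are made up): if the length of each generation keeps shrinking, then a fixed lag in calendar years translates into more and more software generations behind the curve as time goes on:

```python
# Toy model: a fixed lag in calendar years, measured in "generations",
# when the length of each generation keeps contracting.
# All numbers are invented for illustration.

LAG_YEARS = 5.0              # hypothetical: observer trails the edge by 5 years

def period_length(g):
    """Length in years of generation g, contracting 10% per generation."""
    return 2.0 * (0.9 ** g)

elapsed = 0.0
for g in range(31):
    elapsed += period_length(g)
    # count how many of the most recent generations fit inside the lag
    lag_left, behind = LAG_YEARS, 0
    for back in range(g, -1, -1):
        lag_left -= period_length(back)
        if lag_left < 0:
            break
        behind += 1
    if g % 10 == 0:
        print(f"year {elapsed:5.1f}: a {LAG_YEARS:.0f}-year lag is "
              f"{behind} generations behind the edge")
```

Under these assumptions the same five-year lag goes from roughly one generation behind at the start to a couple of dozen generations behind a few decades in, which is the obsolescence effect I mean.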
Mike Lorrey