Re: Darwinian Extropy

Robin Hanson (hanson@hss.caltech.edu)
Tue, 24 Sep 96 10:51:23 PDT


On Mon, 23 Sep 1996, Robin Hanson wrote:
>> Not all information can be computed if one doesn't have the right inputs.
>
>Information? Computed? Obviously, you are not referring to the
>information theory definition of information...
>
>> clear there is some maximum computational "depth" (compute cycles over
>> output + input length). It would be very interesting if you could prove
>> that a computer has some universal minimum discount rate. That would
>
>I don't know whether you refer to a minimum discount rate in terms of
>complexity theory.

I don't think I understand your comments here. But rather than get
distracted by this, let me rephrase my position on the main point.

Dan claims that any agent faced with a choice between sending out
probes to colonize the universe and staying home to use the probe
mass for computation would rather stay home. The idea is that there
is no information the colonized universe could return that couldn't
be computed more quickly by that probe mass staying home. This seems
a remarkable claim to me.

>> an explosive improvement. Good changes are hard to find, and each one
>
>Robin, positive intelligence autofeedback loops are unprecedented. It
>does no good looking for comparisons, because there are none.

Of course it is precedented. That is exactly what research feeding back
into education is. As humanity learns more, we get better at learning
more, and our understanding expands. It just doesn't explode as fast
as you think possible.
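
To be concrete about what I mean by "explode," here is a toy sketch
(purely my own illustration; the function, constants, and exponent are
invented for the example, not estimates of anything) contrasting a
learning feedback that merely compounds with one that diverges:

    # Toy model only: knowledge K feeds back into how fast K grows,
    # dK/dt = rate * K**p, stepped with a crude Euler integration.
    # Every constant here is arbitrary; only the shape of the two
    # trajectories matters.
    def grow(p, steps=30, dt=0.1, rate=0.5):
        K = 1.0
        for _ in range(steps):
            K = K + dt * rate * K ** p
        return K

    print(grow(1.0))  # p = 1: steady exponential compounding (about 4x)
    print(grow(2.0))  # p = 2: superlinear feedback; blows up toward infinity

With p = 1 the feedback just compounds, which is roughly what research
feeding back into education has looked like; a genuine finite-time
explosion requires the stronger superlinear assumption, and that is
the extra step I think needs an argument.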

>Human IQ has an (asymmetrically) bell-shaped distribution. It spans the
>entire spectrum, from moron to Einstein. Assuming everybody is an
>Einstein equivalent? Surely this must have some impact upon the world?

Of course it will. And human IQ *does* seem to have been increasing over
the last century. Hal posted on this a few months (or was it a year) ago.

Robin D. Hanson hanson@hss.caltech.edu http://hss.caltech.edu/~hanson/