From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jun 01 2003 - 18:28:21 MDT
Robin Hanson wrote:
> Hal Finney wrote:
>
>> This is a commonly discussed question on the everything-list.
>> The consensus resolution is that simpler universes must be inherently
>> more probable. To provide an overly concise explanation, this can be
>> justified by imagining that universes correspond to computer programs that
>> describe the "laws of physics" and initial conditions for that universe.
>> Finite length programs can be thought of as prefixes of infinite length
>> strings. The measure of a program will be greater if it is shorter,
>> because then it is a prefix of proportionately more infinite strings.
>> Therefore shorter programs are of higher probability, and therefore
>> more lawful universes are inherently more probable. ...
>
> Why simpler things should be more likely is a deep and difficult topic,
> which I've read a bit on. But I just don't understand this argument.
> Why must universe A have a higher probability than universe B if the
> string describing A is a prefix for the string describing B?
I too have been pondering this for some time. One thought that occurs to
me is that if process A is shorter than process B, then in a world of all
possible Level IV computations, processes that are isomorphic to A, or
unidentifiably close to A, will occur more often than those corresponding to B.
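To make the prefix-measure point concrete, here is a quick toy sketch (my
own illustration, not part of the quoted argument; finite bit strings of a
fixed length stand in for the infinite ones, and the function name is just
something made up for the occasion). A program of length n is a prefix of
exactly a 2^-n fraction of all bit strings, so every bit shaved off a
program doubles its measure:

    from itertools import product

    def prefix_fraction(program_bits, total_length=16):
        # Fraction of all bit strings of length total_length that begin
        # with program_bits -- the measure the prefix argument assigns
        # to a program of that length.
        matches = sum(1 for s in product("01", repeat=total_length)
                      if "".join(s).startswith(program_bits))
        return matches / 2 ** total_length

    print(prefix_fraction("0110"))      # 0.0625     = 2**-4
    print(prefix_fraction("01101101"))  # 0.00390625 = 2**-8

A program four bits shorter claims sixteen times the measure, which is the
sense in which simpler universes would come out more probable.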
Rumor has it that there is actually a theorem establishing that, for
Turing machines, the probability that a string S will be produced by a
Turing machine specified by random coinflips is 2^-K(S), where K(S) is
the Kolmogorov complexity, i.e., the length of the shortest Turing machine
that produces S. This is a nonobvious result, in that it could be that,
beyond the *first* (shortest) machine that produces S, the more complex
machines - for example, those of length greater than K(S)+100 - collectively
produce S with total probability much greater than 2^-K(S). Unfortunately I
do not know the name of this theorem, or even whether it is more than a mere
rumor. It would seem to be required for Solomonoff induction, though.
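One way to poke at this numerically (a toy sketch of my own, using a
made-up prefix-free machine rather than a real universal Turing machine):
feed fair coinflips to the machine and compare the empirical probability of
getting a target string S against 2^-K(S), with K(S) computed by brute
force on the same machine. Note that, read literally, the probability
cannot equal 2^-K(S), since the shortest program alone already contributes
exactly that much and every longer program producing S adds more; so the
theorem, if it exists, presumably says the total stays within a
multiplicative constant of 2^-K(S). On this toy machine it does - about a
factor of ten, countable by hand:

    import itertools, random

    def run(bits, max_pairs=64):
        # Toy prefix-free machine: read bits two at a time.
        #   00 -> emit 0   01 -> emit 1
        #   11 -> repeat the last emitted bit (no-op if nothing emitted yet)
        #   10 -> halt; return (output, number of bits consumed)
        out = []
        for i in range(max_pairs):
            pair = bits[2 * i : 2 * i + 2]
            if len(pair) < 2:
                return None                 # ran out of bits before halting
            if pair == "10":
                return "".join(out), 2 * i + 2
            if pair == "00":
                out.append("0")
            elif pair == "01":
                out.append("1")
            elif out:                       # "11"
                out.append(out[-1])
        return None                         # no halt within max_pairs

    def shortest_program_length(target, max_len=20):
        # Brute-force K(target) on the toy machine: length of the shortest
        # self-delimiting program whose output is exactly target.
        for n in range(2, max_len + 1, 2):
            for prog in itertools.product("01", repeat=n):
                if run("".join(prog)) == (target, n):
                    return n
        return None

    target = "0000"
    K = shortest_program_length(target)     # 10 bits, e.g. 00 00 00 00 10
    trials, hits = 200000, 0
    for _ in range(trials):
        coinflips = "".join(random.choice("01") for _ in range(40))
        result = run(coinflips)
        if result is not None and result[0] == target:
            hits += 1
    # Counting by hand: 8 programs of length 10 produce "0000", 8 of
    # length 12 (one leading no-op), and so on, for a total probability
    # of (32/3) * 2**-10, roughly 0.0104 -- more than 2**-K(S), but only
    # by a bounded factor.
    print("empirical:", hits / trials, "  2**-K(S):", 2.0 ** -K)

A toy like this proves nothing about real universal machines, of course,
but it does show the kind of bookkeeping such a theorem would have to get
right.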
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence