"Eliezer S. Yudkowsky" <firstname.lastname@example.org> writes:
> Anders Sandberg wrote:
> > unlikely is that it cannot evolve, since evolution includes an
> > irreversible selection step where the unfit are weeded out. In a
> > reversible world individuals could un-die and there would be no
> > evolution. In the same way memory and learning would be impossible,
> > since forgetting would be just as likely as learning.
> Let me try an analogy. Suppose that your computer had all RAM
> randomized. After interpreting these random instructions, eventually a
> sort of order emerges where 1s clump together and 0s clump together and
> most computational transactions occur when two areas of 1s and 0s mix.
> Anders Sandberg wrote:
This is not an ideal example, since the thermodynamics of this system is rather iffy (in addition, whether complex structures appear depends on the evolution rule). It has an arrow of time (if the rule is irreversible) but the amount of computation it can do is finite.
The conditions near the Big Bang were closer to having the memory constantly randomized; in this system no order can emerge other than in the functionalist "stones implementing windows" sense.
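The reversibility point can be made concrete: a rule has an arrow of time exactly when the global update map is not injective, so distinct pasts merge and the dynamics cannot be run backwards. A toy check (the 5-cell ring and the majority rule are my own illustration, not anything from the discussion above):

```python
from itertools import product

# Toy 1-D cellular automaton on a ring of 5 cells: each cell becomes the
# majority of itself and its two neighbours.
def step(state):
    n = len(state)
    return tuple(
        1 if state[i - 1] + state[i] + state[(i + 1) % n] >= 2 else 0
        for i in range(n)
    )

states = list(product([0, 1], repeat=5))
images = {step(s) for s in states}

# Fewer distinct images than states: the map is not injective, several
# pasts lead to the same present, and the rule is irreversible.
print(len(states), len(images))
```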
> From the perspective of any systems that may evolve in the 1s-and-0s
> world, our computers are so much random noise. No useful
> information-processing could possibly occur with 1s and 0s so mixed
> together! There's no arrow of time, either.
Now I'm confused. This seems to be more like the second paragraph above, but your example is a deterministic dynamical system with a fairly standard dynamics.
> > > In fact, information is directly proportional to
> > > entropy. The more information it takes to encode a system, the more
> > > mathematically random ("normal") that system is, the more entropy that
> > > system has. Ergo, maximally efficient information-processing takes
> > > place at uniform temperature.
> > Be careful about the word "information", it has different meanings in
> > different fields. The information in information theory is rather
> > different from the information concept used in daily life and not
> > necessarily even identical to negentropy.
> I am using "information" in the information-theoretical sense of "size
> of Turing machine needed to produce output".
That is the algorithmic complexity measure (sometimes called Kolmogorov complexity); it is measured in bits, but it is not what is used in standard information theory. The definition of the bit as a unit of information is based on Shannon information: "the amount of uncertainty reduced by the answer to a yes/no question whose outcomes have 50% a priori probability".
Suppose you have the string "0000000000...0001000000...." with a single 1 at position 4711. The algorithmic complexity is different from the Shannon entropy (which sees it as a sequence of symbols to be compressed rather than as the output of a program). Note that as I move the single 1 further and further into the string, the algorithmic complexity grows, since the program has to encode the position where to put it.
This is a truly messy area.
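To make the contrast concrete, here is a rough sketch (the string length of 10000 and the little entropy helper are my own illustration; position 4711 is from the example above):

```python
import math

def shannon_entropy_per_symbol(s):
    """Empirical Shannon entropy (bits per symbol) of a string."""
    counts = {c: s.count(c) for c in set(s)}
    n = len(s)
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

n = 10000
s = "0" * 4710 + "1" + "0" * (n - 4711)  # single 1 at position 4711

# Shannon view: an almost-pure symbol distribution, entropy near zero,
# and moving the 1 elsewhere does not change it at all.
h = shannon_entropy_per_symbol(s)

# Algorithmic view: a program must say WHERE the 1 sits, so its length
# grows roughly like log2(position).
bits_for_position = math.ceil(math.log2(4711))

print(h, bits_for_position)
```

The Shannon figure stays tiny wherever the 1 goes, while the position term keeps growing as the 1 moves deeper into the string, which is exactly the divergence between the two measures.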
> > But remember that there were particle horizons; at time t only
> > particles within each other's past lightcone could have affected each
> > other.
> Yeah, hence the caveats. I'm afraid I don't know. I got the impression
> that during the very early Big Bang, everything was within everything
> else's light cone. Of course, that doesn't mean an infinite number of
> actions could be performed. What's the function for the radius of the
> Big Bang as a function of time? Anyone know?
Nick gave it to you, but I would like to add that you should really integrate the light-cones to make sure about the horizons; I think they get tricky near t=0.
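For what it's worth, the integral does converge: in a flat radiation-dominated toy model with a(t) proportional to t^(1/2) (my assumption for the early universe, in units with c = 1), the particle horizon is d_H(t) = a(t) * integral from 0 to t of dt'/a(t') = 2t, finite despite the singular integrand at t' = 0. A quick numerical check:

```python
# Particle horizon in a toy radiation-dominated model, a(t) = t**0.5, c = 1.
# Analytically d_H(t) = a(t) * integral_0^t dt'/a(t') = 2*t.
def horizon(t, steps=100_000):
    dt = t / steps
    # midpoint rule handles the integrable singularity at t' = 0
    comoving = sum(dt / ((i + 0.5) * dt) ** 0.5 for i in range(steps))
    return (t ** 0.5) * comoving

print(horizon(1.0))  # close to 2*t = 2.0
```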
> > > Anyway, the question isn't whether there are macroscopic computations
> > > (of the sort we're used to) taking place, but whether arbitrary
> > > computations can be encoded in the physical process of the Big Bang.
> > That is a different question. There are arbitrary computations
> > encoded in the thermal vibrations in my desk. Somewhere it is running
> > my old ZX81 fractal program...
> All computations are recorded/encoded, but not instantiated, in the
> digits of pi. Let me define "instantiated" a bit more narrowly, to
> distinguish it from "recorded". For one thing, it's obvious that
> recording can't be enough, or all computations would be real and
> everything would be true, which could possibly be the case and
> certainly explains everything, but is not useful for making choices.
(Actually, Max Tegmark's paper "Is the TOE really the ultimate ensemble theory?" (available on the net; look on xxx.lanl.gov) suggests that this can be useful and actually has explanatory power.)
> Your desk is almost certainly NOT instantiating the fractal program.
> Justifying this would take far more time than I'm willing to spend; my
> apologies and feel free not to believe me; by my best guess the answer
> is entirely subjective.
Huh? If the answer is subjective you can't justify it. What I meant is that there is a function f such that f(desk-state(t)) = program-state(t) over time. This is fairly trivial.
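A toy version of why it is trivial (the state sequences are made up; any injective trajectory of desk states can be mapped onto any program trajectory by a plain lookup table):

```python
# Two arbitrary state trajectories: stand-ins for thermal desk states
# and the states of the old ZX81 fractal program.
desk_states    = ["d0", "d1", "d2", "d3"]   # hypothetical, all distinct
program_states = [0, 1, 1, 2]               # hypothetical program states

# As long as the desk states are distinct, f is just a lookup table:
f = {d: p for d, p in zip(desk_states, program_states)}

# f(desk_state(t)) == program_state(t) for every t, by construction.
ok = all(f[desk_states[t]] == program_states[t]
         for t in range(len(desk_states)))
print(ok)
```

Which is also why such an f carries no explanatory weight: it exists for *any* pair of trajectories.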
> For the purpose of Alpha Point behavior, the question is not really a
> mathematical point of defining instantiation, but whether the Alpha
> "Powers" are powerless digits of pi or active entities (like us) capable
> of controlling their environment, and particularly of surviving or
> escaping the Great Freezing.
> The question is this: If you were the Singularity, could you set up the
> Big Bang in such a way as to encode your consciousness AND your
> technology into the initial conditions? If this is possible without
> introducing large-scale anomalies, then it would happen as a result of
> pure chance in some region. Remember, we're talking about infinity
> here. Larger than 3^^^^3 or Graham's Number.
I'm not quite buying that infinity factor, but even if I accept it, the Alpha would have a problem with encoding initial conditions due to the infinity of *other*, different, Alphas co-existing and doing the exact same thing, as well as the even larger infinity of noise processes corrupting the data.
(If Alpha has X bits, it could randomly appear with a good chance in a field of 2^X random bits. Which suggests that for every Alpha there are 2^(X-1) bits of noise - the bigger the gods, the noisier :-)
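The back-of-the-envelope behind that: a fixed X-bit pattern matches any given offset of a random bit field with probability 2^-X, so you need roughly 2^X bits of field per expected copy. In miniature (X = 8 and the particular pattern are just illustrative numbers small enough to simulate):

```python
import random

random.seed(0)

X = 8                                  # "Alpha" is an 8-bit pattern here
pattern = [1, 0, 1, 1, 0, 0, 1, 0]     # some fixed X-bit pattern
N = 2 ** X * 50                        # a field much larger than 2^X bits

field = [random.randint(0, 1) for _ in range(N)]
hits = sum(field[i:i + X] == pattern for i in range(N - X + 1))

# Each offset matches with probability 2^-X, so on average about
# 2^X bits of random field are needed per copy of the pattern.
expected = (N - X + 1) / 2 ** X
print(hits, round(expected, 2))
```

So a pattern of X bits buys you about 2^X bits of surrounding noise, which is the "bigger the gods, the noisier" scaling.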
> * = Mitchell Porter has suggested this is due to the inflationary
> scenario, which could have made computing power unusable due to
> lightspeed problems. Our Universe may be optimized so that no Alpha
> Powers could exist!
Have you told Vinge? Maybe his zones of thought are reasonable after all! :-)
-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
email@example.com            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y