RE: Constraints on the "singularity"

Ramez Naam (ramezn@EXCHANGE.MICROSOFT.com)
Mon, 13 Oct 1997 10:52:55 -0700


Let me reiterate that I'm not positing /hard/ or unbreakable
constraints on future entities or technologies. They may exist. They
may not. No one is in a position to say definitively.

In my observation of the world, though, I note that the rate of
advancement of /technology/ often far eclipses the rate of advancement
of /theory/. Technology grows at an amazing rate for some time, until
it approaches the boundaries of our theoretical models of the world. At
that point its rate of growth slows. Eventually new theoretical models
are put forth that facilitate a new round of amazing technological
growth. Thus advancement of theory becomes a /bottleneck/ on the
advancement of technology.

That being the case, let me rephrase the question of the thread
somewhat. What are the most significant theoretical bottlenecks that
we're approaching? How far can we get before hitting those bottlenecks?

(I'm an adherent of the philosophy of asking questions for a purpose,
so let me explain this one: these theoretical bottlenecks present
likely targets for our research and contemplation. Knowing how far off
they are helps us prioritize our allocation of resources to the various
areas.)

> From: Eliezer S. Yudkowsky [SMTP:sentience@pobox.com]
> What about negative matter? You can have an arbitrary amount of
> computing material in a given volume, with net mass zero.

Interesting. I'm not familiar with negative matter; any recommendations
on a primer?

> > Given the likely mass, age, and size of the universe, and the
> > constraints listed above, what is the maximum achievable
> > computational power of the universe?
>
> Infinite. There exist physical processes which are not
> simulable-to-arbitrary-accuracy by Turing machines. Even if all
> physical processes *are* simulable, they still use real numbers.
> Perhaps a clever Being could exploit a chaotic Mandelbrot-like
> boundary to perform calculations of arbitrary complexity in constant
> time.

Hmm. Not sure I agree with your premise. It seems that to really
exploit this you run up against Planck space/time again, and what you
really get is quantum computing. Which, while it provides truly
mind-boggling computational power for certain classes of problems,
certainly does not provide infinite computing power.
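
As a rough illustration of "mind-boggling but bounded" (my own sketch,
not part of the thread): Grover's 1996 quantum search algorithm speeds
up unstructured search quadratically, from roughly N/2 classical oracle
queries to roughly (pi/4)*sqrt(N), which is enormous for large N but a
far cry from infinite:

```python
import math

def classical_queries(n):
    """Expected oracle queries to find one marked item among n, classically."""
    return n / 2

def grover_queries(n):
    """Approximate oracle queries for Grover's quantum search: (pi/4)*sqrt(n)."""
    return (math.pi / 4) * math.sqrt(n)

# The speedup grows only as sqrt(N) -- large, but never unbounded for fixed N.
for n in (10**6, 10**12, 10**18):
    speedup = classical_queries(n) / grover_queries(n)
    print(f"N = {n:.0e}: classical ~{classical_queries(n):.2e} queries, "
          f"Grover ~{grover_queries(n):.2e} queries, speedup ~{speedup:.1e}x")
```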

This is what I mean by a theoretical bottleneck. Perhaps some future
model of the fundamentals of space/time/matter/energy will provide us a
way to manipulate physical processes of arbitrary complexity in
constant volume. However, our current models do not.

> > Given c, the age, size, and rate of expansion of the universe, how
> > long would it take an earth-spawned power to infest the galaxy?
> > 1/10e6 of the universe? 1% of the universe? 10% of the universe?
>
> General relativity makes the speed of light fundamentally arbitrary.
> They can infest the entire Universe in zero time, and finish before
> they started.

This is totally at odds with my understanding of GR. Relative to an
observer anywhere else in the universe, c is a very real constraint.
Certainly relative to the action here on earth (or whatever starting
ground the power has), c appears to be quite a real constraint. Please
explain.
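
Some back-of-the-envelope arithmetic on that constraint (my own
illustration, using the rough ~100,000 light-year figure usually quoted
for the Milky Way's diameter): in the galactic rest frame, even an
expansion front moving at c needs on the order of 10^5 years just to
cross the galaxy, and slower fronts proportionally longer.

```python
# Earth-frame (galactic rest frame) time to cross the galaxy at various
# speeds. GALAXY_DIAMETER_LY is an assumed round figure, not a precise one.
GALAXY_DIAMETER_LY = 1.0e5  # assumption: rough Milky Way diameter, light-years

for v_fraction in (1.0, 0.5, 0.1):  # speed as a fraction of c
    years = GALAXY_DIAMETER_LY / v_fraction
    print(f"at {v_fraction:.1f}c: ~{years:,.0f} years in the galactic rest frame")
```

(Time dilation can shrink the travelers' /proper/ time arbitrarily, but
not the elapsed time seen by the outside observers Ramez is describing.)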