If you're in thermal equilibrium with a negentropy source, the amount of
negentropy you can extract from it depends on your free energy and its
temperature. (Assuming the change in the source's temperature as you dump
energy into it is negligible, the maximum amount of negentropy you can
extract is F/T.) So while negentropy is what you ultimately care about,
free energy is certainly a real resource.
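To make the F/T bound concrete, here's a toy calculation (my own framing,
not anything from the original exchange), converting entropy to bits via
k*T*ln 2 per bit:

```python
# Maximum bits of negentropy extractable from free energy F at bath
# temperature T: F/T in entropy units, i.e. F/(k*T*ln 2) in bits.
import math

K_B = 1.381e-23  # Boltzmann constant, J/K

def max_bits(free_energy_joules, temperature_kelvin):
    """Upper bound on bits of negentropy extractable from free energy F
    while in equilibrium with a bath at temperature T."""
    return free_energy_joules / (K_B * temperature_kelvin * math.log(2))

print(max_bits(1.0, 300.0))  # 1 J at room temperature: ~3.5e20 bits
```

Note that the bound scales as 1/T: the same joule of free energy is worth
more bits at a colder temperature.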
I'm not sure what you mean by "free energy costs more per unit entropy at
low temperatures." Would you please explain?
> Advanced civs would want to use black holes though, as they are by far the
> largest source of negentropy. The entropy of a black hole goes as the square
> of the mass, while the entropy is linear for more familiar uses of mass.
> The big problem is that at their low temps radiation flow is very low, so
> it seems a very slow process to get negentropy out of the hole by sending
> radiation in.
There is going to be a tradeoff between the number of bits of negentropy
extracted per second and the energy expenditure per second. Plugging the
black hole temperature and surface area formulas into the
Stefan-Boltzmann law, and assuming a complete shell around the black
hole, we get (1.57e-60*M^2*Th^4 - 3.56e32/M^2) joules per second of net
energy flow into the black hole, where M is the mass of the black hole in
kilograms and Th is the temperature of the shell in kelvins. Dividing
this by k*Th*ln 2 gives the number of bits of negentropy extracted per
second. With a solar mass black hole and a shell temperature of 1 K,
you'd be expending 6.2 watts and extracting 6.5e23 bits per second. With
a Milky Way mass black hole and a shell temperature of 0.1 K, you'd be
expending 6.2e20 watts and extracting 6.5e44 bits per second.
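The calculation above can be sketched as follows (my own code, working
from the standard Hawking temperature and Schwarzschild radius formulas
rather than the pre-baked coefficients):

```python
# Net radiative power from a shell at temperature Th into a black hole of
# mass M, and the resulting negentropy extraction rate in bits/second.
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
K_B   = 1.381e-23  # Boltzmann constant, J/K
G     = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
C     = 2.998e8    # speed of light, m/s
HBAR  = 1.055e-34  # reduced Planck constant, J s

def hole_temperature(M):
    """Hawking temperature (K) of a black hole of mass M in kg,
    roughly 1.23e23/M."""
    return HBAR * C**3 / (8 * math.pi * G * M * K_B)

def net_power_in(M, Th):
    """Net power (W) radiated from a complete shell at Th into the hole,
    minus the hole's Hawking radiation back out."""
    r_s = 2 * G * M / C**2        # Schwarzschild radius
    area = 4 * math.pi * r_s**2   # horizon area
    return SIGMA * area * (Th**4 - hole_temperature(M)**4)

def bits_per_second(M, Th):
    """Negentropy extraction rate: net power divided by k*Th*ln 2."""
    return net_power_in(M, Th) / (K_B * Th * math.log(2))

M_SUN = 1.99e30  # kg
print(net_power_in(M_SUN, 1.0))     # ~6.2 W
print(bits_per_second(M_SUN, 1.0))  # ~6.5e23 bits/s
```

Plugging in a Milky Way-scale mass (~2e42 kg) and Th = 0.1 K reproduces
the 6.2e20 W and 6.5e44 bits/s figures.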
Another problem with the idea is that if the universe turns out to be
open, and these advanced civs have a sufficiently low discount rate for
fresh bits, they would be better off hoarding their mass-energy until the
universe gets cooler, instead of turning it into black holes now.
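The hoarding intuition can be made quantitative with the same k*T*ln 2
conversion (my numbers, for illustration only): the colder the
background, the more bits each joule of mass-energy buys.

```python
# Bits of negentropy per joule dumped into a background at temperature T,
# i.e. 1/(k*T*ln 2). Waiting for the background to cool raises the payoff
# in direct proportion.
import math

K_B = 1.381e-23  # Boltzmann constant, J/K

def bits_per_joule(T):
    """Bits of negentropy obtainable per joule at background temperature T."""
    return 1.0 / (K_B * T * math.log(2))

# Waiting for the background to cool from ~2.7 K to 0.01 K multiplies
# the bits-per-joule payoff by 2.7/0.01 = 270.
print(bits_per_joule(0.01) / bits_per_joule(2.7))
```

Whether waiting wins then depends on how that linear payoff growth trades
off against the civ's discount rate.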