Anders Sandberg gives a calculation of the thermodynamic efficiency of a
dyson shell in his (draft) Jupiter Brain paper, and concludes that a
smaller dyson shell is more efficient than a larger one. But I think his
reasoning is slightly flawed. I'll show that the reverse is true, and that
dyson shells may have a much lower temperature than previously thought
(see for example
http://www.student.nada.kth.se/~nv91-asa/dysonFAQ.html#LOOK), which may
explain why they have not yet been observed.

Previous temperature estimates assume that the energy captured by dyson
shells is also used up (dissipated) by them. However, it seems more
reasonable to assume that dyson shells only serve as energy collectors,
and that the energy stored by them is eventually used elsewhere. For
example, really large computers may be placed in interstellar space for
more efficient cooling, physical security, and access to multiple energy
sources.

A heat engine works between two heat reservoirs with a temperature
difference. A dyson shell works between the Sun, which has a surface
temperature of 5800 K, and its external radiators, which start out at 3 K,
the temperature of the cosmic background radiation. However, even with a
temperature difference this large it cannot be 100% efficient, and
therefore it must radiate away waste heat. Its ability to radiate waste
heat is limited by its external surface area, so its radiators soon grow
hotter as the waste heat builds up. This has two effects. First, it allows
the radiators to eliminate heat more quickly; second, it makes the dyson
shell less efficient, because the temperature difference is smaller,
thereby causing it to produce more waste heat. Fortunately the first
effect dominates, so the external temperature eventually reaches an
equilibrium. The temperature at this equilibrium can be found by applying
Carnot's Theorem and Stefan-Boltzmann's Law.

Carnot's theorem says an ideal heat engine working between temperatures
Th and Tl has an efficiency of (Th-Tl)/Th. Here Th is the temperature of
the Sun, and Tl is the temperature of the dyson sphere's external
radiators. It follows that the dyson sphere must radiate away a fraction
Tl/Th of the Sun's energy output (S) as waste heat. The amount of waste
heat that can be radiated away by a blackbody sphere of radius r and
temperature Tl follows from Stefan-Boltzmann's Law as
4*pi*r^2*sigma*(Tl^4-Tb^4), where Tb is the cosmic background temperature
and sigma is Stefan-Boltzmann's constant. So to find the equilibrium
temperature we set that equal to S*Tl/Th and solve for Tl.
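
The equilibrium condition can also be checked numerically instead of
solving the quartic symbolically. Here is a minimal sketch; the constant
values for the solar luminosity S, Th, and the AU are my assumptions (the
text only specifies Th = 5800 K and Tb = 3 K), so the exact figures may
differ slightly from Sandberg's:

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 3.828e26       # solar luminosity, W (assumed value)
TH = 5800.0        # solar surface temperature, K (from the text)
TB = 3.0           # cosmic background temperature, K (from the text)
AU = 1.496e11      # astronomical unit, m

def equilibrium_temp(r):
    """Find Tl such that 4*pi*r^2*sigma*(Tl^4 - Tb^4) = S*Tl/Th.

    The left side (heat the radiators can shed) grows as Tl^4 while the
    right side (waste heat produced) grows only linearly in Tl, so the
    sign of their difference changes exactly once between Tb and Th and
    bisection converges to the equilibrium.
    """
    area = 4.0 * math.pi * r * r

    def excess(tl):
        return area * SIGMA * (tl**4 - TB**4) - S * tl / TH

    lo, hi = TB, TH
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if excess(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

for r_au in (1, 10, 100):
    tl = equilibrium_temp(r_au * AU)
    print(f"{r_au:4d} AU: Tl = {tl:7.2f} K, efficiency = {1 - tl/TH:.4f}")
```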
Doing this in Mathematica produces a complex-looking solution to the
quartic equation, which I will not bother to reproduce here. However, it
turns out that the effect of the cosmic background radiation is not
noticeable until we get to a radius of about 100 AU. Ignoring the cosmic
background radiation gives us Tl ~= (S/(4*pi*r^2*sigma*Th))^(1/3). At 1 AU
this is 161 K with an efficiency of 97%, and at 10 AU it is 35 K with an
efficiency of 99.4%. At 100 AU, taking the cosmic background radiation
into account, Tl is 7.6 K with an efficiency of 99.87%.
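
The closed-form approximation scales as r^(-2/3), so each factor of 10 in
radius lowers Tl by about 4.6x. A sketch evaluating it at the three radii
above, using assumed constants (S = 3.828e26 W for the solar luminosity;
Th = 5800 K as in the text):

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 3.828e26       # solar luminosity, W (assumed value)
TH = 5800.0        # solar surface temperature, K (from the text)
AU = 1.496e11      # astronomical unit, m

def approx_temp(r):
    """Tl ~= (S/(4*pi*r^2*sigma*Th))^(1/3), valid where Tl >> Tb."""
    return (S / (4.0 * math.pi * r * r * SIGMA * TH)) ** (1.0 / 3.0)

for r_au in (1, 10, 100):
    tl = approx_temp(r_au * AU)
    print(f"{r_au:4d} AU: Tl ~= {tl:6.1f} K, efficiency ~= {100 * (1 - tl/TH):.2f}%")
```

At 1 and 10 AU this reproduces the 161 K and 35 K figures. At 100 AU the
approximation gives roughly 7.5 K, a little below the 7.6 K obtained when
the cosmic background term is kept, which is consistent with the claim
that the background only starts to matter around that radius.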