Re: Gravity calculations and dark matter clarifications

From: Amara Graps (amara@amara.com)
Date: Tue Jun 06 2000 - 16:23:15 MDT


Jeff Davis (jdavis@socketscience.com), Date: Sun, 04 Jun 2000 writes:

>Extropes,
>Check out
>http://abcnews.go.com/sections/tech/CuttingEdge/cuttingedge.html

This article is about the GRAPE-6. The last one I had any knowledge
of was the GRAPE-4. It's a special-purpose computer for only one
type of calculation: the Particle-Particle method for computing the
forces in an N-body problem. I've seen this computer used mostly
for simulating globular clusters and for following the evolution
and dynamics of galaxies. They can now evolve a spiral galaxy for 3
billion years, following the spiral patterns in the arms. Makino
also discovered, with his N-body calculations, a phenomenon called
"gravothermal oscillations".

A description of the "Particle-Particle" method that Makino put into
his computer hardware can be found here:

http://www.amara.com/papers/nbody.html

Some of the above material appeared in an article I published a
year ago in the British trade journal _Scientific Computing World_:
"N-Body Simulations Push Hardware and Software Limits," A.L. Graps,
SCW, April/May 1999.

The following is from my N-body methods page, which says a little
more on this topic, along with a photo that I took of a GRAPE-4
computer located at a research institute nearby.

{begin quote}
GRAPE/HARP computer

To simulate tens of thousands of particles with a highly accurate
particle-particle method, the computational physicist can expect
his simulation to perform at least 10^{14} floating-point
operations. Such considerations inspired the building of the GRAPE
computer by Junichiro Makino and colleagues in Tokyo. The GRAPE
(acronym for "GRAvity PipE") computer is built around specialized
pipeline processors for computing the gravitational force between
particles.

Makino et al. have refined the technology further by including
chips that perform a Hermite integration, in order to reduce memory
requirements and to allow somewhat larger time steps. They call
their specialized GRAPE computer HARP, an acronym for "Hermite
AcceleratoR Pipeline."

http://galileo.mpi-hd.mpg.de/~graps/scw/grapepic.jpg

This figure shows a specially designed GRAPE-4 system, the HARP
computer, which gives hardware support for a Hermite integration
scheme. Photo taken by A. Graps at the Astronomisches
Rechen-Institut in Heidelberg, Germany, in December 1998. A
colleague's watch helps convey the scale of this small,
innocent-looking computer.

Jun Makino's group
http://grape.c.u-tokyo.ac.jp/
{end quote}
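
To make the Hermite scheme above concrete: the pipeline returns
not just the acceleration but also its time derivative (the
"jerk"), and with both in hand one can build a fourth-order
predictor-corrector that tolerates larger time steps than a plain
leapfrog. Below is a rough Python sketch of one such step; this is
the textbook Hermite scheme, not Makino's exact hardware
formulation, and the function name accel_jerk is my own
placeholder:

def hermite_step(x, v, a, j, dt, accel_jerk):
    # One 4th-order Hermite predictor-corrector step.
    # accel_jerk(x, v) must return (a, j) from a force
    # evaluation, e.g. a particle-particle sum like the one
    # sketched earlier, extended to return the jerk as well.
    # x, v, a, j are numpy arrays of shape (N, 3).
    xp = x + v*dt + a*dt**2/2 + j*dt**3/6    # predict positions
    vp = v + a*dt + j*dt**2/2                # predict velocities
    a1, j1 = accel_jerk(xp, vp)              # re-evaluate forces
    # corrector: the Hermite interpolation buys the extra order
    v1 = v + (a + a1)*dt/2 + (j - j1)*dt**2/12
    x1 = x + (v + v1)*dt/2 + (a - a1)*dt**2/12
    return x1, v1, a1, j1

On the GRAPE/HARP the expensive accel_jerk evaluation is what the
custom pipelines compute; the host computer does the cheap
predictor and corrector arithmetic.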

>Robert B (as in Blunt) and Amara (the extropian dust bunny?
>(insert smily here)),

hee hee. good one. I had not thought of that use for the term dust
bunny. "Dust bunnies under the bed" is the usual dust joke.

>While they talk about the usual stars and galaxies, they make
>no mention of the ever elusive dark matter. If they input data based
>on "bright" matter, and then run their calculations, do they
>get "junk in gives junk out", or do they have an opportunity to
>"solve for" the distribution and role of dark matter.

Do you mean solving for dark matter using N-body simulations? I
will assume that is what you mean. There are a lot of different
candidates for dark matter. The place I've seen N-body simulations
used to study the existence of dark matter is in studies where one
evolves galaxies with their associated halos and then looks for
MACHOs: Massive Compact Halo Objects. These are the compact
objects responsible for the gravitational lensing of stars in
nearby galaxies (and, I think, in our own).

Simulation.

Create a small galaxy, like a dwarf spheroidal galaxy ("dSph").
The dSphs are known to orbit the Milky Way at distances ranging
from a few tens to a few hundred kpc. The physical model includes
the stars and gas of the disk and central bulge (the "visible"
matter) plus a distribution function for the "dark matter". The
dark matter resides in a spheroidal halo extending far beyond the
visible part of the system.

Some details: Assume that the galaxy has a smooth gravitational
potential, solved for self-consistently from the various matter
distributions. A collisionless Boltzmann equation governs the
stellar distribution function f_s and the dark matter distribution
function f_d. The interstellar medium is modelled with
conservation laws for an ideal gas plus gravity and radiative
cooling. The gravitational potential is governed by Poisson's
equation, where the density from the dark matter function f_d is
one of the source terms. So stars, gas and dark matter all move in
their own self-consistent potential, and the system is evolved
over time.
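
In symbols (my own schematic shorthand, written as LaTeX, not the
notation of any particular paper), the coupled system is:

  \frac{\partial f}{\partial t} + \vec{v}\cdot\nabla_{\vec{x}} f
    - \nabla\Phi\cdot\nabla_{\vec{v}} f = 0
    \quad \text{(collisionless Boltzmann, for } f = f_s, f_d )

  \nabla^2 \Phi = 4\pi G\,(\rho_s + \rho_{gas} + \rho_d),
    \qquad \rho = \int f \, d^3v

so the dark matter enters the potential only through its density,
and the potential in turn steers all three components.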

Note: Real galaxies have energy sources such as turbulent motion,
magnetic fields, and cosmic rays, and I am not sure how far along
the simulations are in including all of these things. There's some
extremely advanced hydrodynamics code (like ZEUS-3D) running,
though.

Then this simulated object is projected onto a "sky", and its
brightness profile, line-of-sight velocity dispersion and
_apparent_ M/L ratio are determined. These can be directly
compared to the observed values for Galactic dSph satellites.
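
For instance, the line-of-sight velocity dispersion drops straight
out of the particle velocities once you pick a viewing axis. A
minimal Python sketch, taking z as the line of sight (my own
convention here, nothing standard about it):

import numpy as np

def los_dispersion(vel, axis=2):
    # Line-of-sight velocity dispersion of the simulated
    # particles, taking one coordinate axis (default z) as
    # the line of sight. vel has shape (N, 3).
    v_los = vel[:, axis]
    return np.sqrt(np.mean(v_los**2) - np.mean(v_los)**2)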

Compare to observations.

Observations of these dSph galaxies can determine the mass from
the galaxy's velocity dispersion, and then, after noting the
luminosity, one derives a mass-to-light ratio.
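
As a rough worked example of that estimate (using the crude virial
scaling M ~ sigma^2 r / G; the numbers below are illustrative
values I am inventing for the arithmetic, not measurements of any
particular dSph):

G     = 6.67e-11          # m^3 kg^-1 s^-2
sigma = 10.0e3            # velocity dispersion: 10 km/s (illustrative)
r     = 300 * 3.086e16    # radius: 300 pc in meters (illustrative)
M_sun = 1.99e30           # kg

M = sigma**2 * r / G      # ~1.4e37 kg, about 7e6 solar masses
L = 3.0e5                 # luminosity in L_sun (again illustrative)
print((M / M_sun) / L)    # M/L ~ 23 in solar units

A mass-to-light ratio of order tens, against an M/L of a few for
an ordinary stellar population, is the kind of discrepancy that
makes people invoke a dark halo in the first place.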

Then one sees what f_d, that is, what dark matter density, is
necessary to fit the observations.

So, the above description is my understanding of how one solves
for the dark matter using N-body methods. I've not worked in this
field myself, so I'm just going by my cursory reading of the
astrophysics literature.

[W.H., if you're reading this, correct me if I'm wrong.]

Amara

********************************************************************
Amara Graps email: amara@amara.com
Computational Physics vita: finger agraps@shell5.ba.best.com
Multiplex Answers URL: http://www.amara.com/
********************************************************************
"It works better if you plug it in." -- Sattinger's Law


