
Gianfranco Bertone: Identifying Dark Matter

March 14, 2013

Dark matter is a pillar of modern cosmology: it is impossible to understand the universe and its observations without it. The picture that emerges from observations is that, of all the matter in the universe, only 15% is in the form of baryons: about 1% in stars, 7% in gas, and 7% in a diffuse component. The remaining 85% is dark.

What properties does a particle need in order to be a dark matter candidate? One has to impose a number of conditions: whether it reproduces the required abundance, whether it is cold and neutral, whether it is consistent with big-bang nucleosynthesis, whether it spoils star formation, whether it is collisionless, whether its couplings are acceptable, and whether it can be probed at accelerators.

The list of thinkable dark matter candidates is very long; it is a zoo. Natural candidates, however, are WIMPs, which arise in theories addressing the stability of the electroweak scale: for instance a SUSY neutralino, but not only that. Other candidates are axions or sterile neutrinos.

Dark matter searches proceed by direct and indirect detection, and by direct production at colliders. Indirect detection has become popular because there is a natural mechanism to produce WIMPs: thermal production in the early phase of the universe. Two dark matter particles annihilate into particles of the Standard Model; the reaction can go both ways, and the number density of these particles is set by the interplay between the expansion of the universe and the annihilation cross section. In a simple calculation their abundance and cross section are therefore linked, and electroweak-scale cross sections reproduce the correct relic density.
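As a rough illustration of that last statement, the standard freeze-out rule of thumb relates the relic abundance to the thermally averaged annihilation cross section as Omega h² ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩. This is a textbook approximation, not a result quoted in the talk:

```python
# Freeze-out rule of thumb (textbook approximation, not from the talk):
# Omega * h^2 ~ 3e-27 cm^3/s divided by the thermally averaged cross section.

def relic_density(sigma_v_cm3_s):
    """Approximate relic density Omega*h^2 for a thermal WIMP."""
    return 3e-27 / sigma_v_cm3_s

# An electroweak-scale cross section, <sigma v> ~ 3e-26 cm^3/s,
# gives Omega h^2 ~ 0.1, close to the observed dark matter density.
print(f"Omega h^2 ≈ {relic_density(3e-26):.2f}")
```

Larger cross sections deplete the WIMPs more efficiently before freeze-out, which is why the abundance scales inversely with ⟨σv⟩.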

If one looks at regions where the number density is very high, one can get an appreciable annihilation flux, because the rate of annihilation is high. If the annihilations produce gauge bosons and so on, one can ask what the final spectrum of Standard Model particles coming out is. One can calculate, e.g., the photon flux from the annihilation cross section and the density.
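A minimal sketch of that flux estimate, assuming the standard expression for self-conjugate WIMPs, Φ = ⟨σv⟩ Nγ J / (8π m²), where J is the line-of-sight integral of the squared density (the "J-factor"). All numbers below are invented placeholders for illustration, not values from the talk:

```python
import math

def photon_flux(sigma_v, m_gev, n_gamma, j_factor):
    """Integrated gamma-ray flux (photons cm^-2 s^-1) from annihilating DM.

    sigma_v  : thermally averaged cross section in cm^3/s
    m_gev    : WIMP mass in GeV
    n_gamma  : photons per annihilation above threshold (assumed)
    j_factor : line-of-sight integral of rho^2, in GeV^2 cm^-5 (assumed)
    """
    # particle-physics factor times the astrophysical J-factor
    return sigma_v / (8 * math.pi * m_gev**2) * n_gamma * j_factor

# Toy numbers: a 100 GeV WIMP with a thermal cross section and a
# J-factor of order what is often quoted for bright dwarf galaxies.
flux = photon_flux(3e-26, 100.0, 10.0, 1e19)
```

The flux falls with the square of the mass, which is one reason limits weaken at high WIMP masses.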

One can use simulations to compute the flux of gamma rays due to these processes, and then compare it to the maps made by Fermi. By dividing the simulated fluxes by the square root of the fluxes measured by Fermi, pixel by pixel, one obtains a sensitivity map. The galactic center has the largest sensitivity, but there are a number of other places to look. Since the galactic center is an extremely complicated region, the current constraints are obtained using dwarf galaxies. An excess of photons would be a signature; its absence places limits on the annihilation cross section as a function of the WIMP mass (e.g. arXiv:1108.3546), so one obtains constraints on the WIMP mass as well.
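The pixel-by-pixel construction can be sketched as follows, with random toy arrays standing in for the simulated signal map and the measured sky map (no real Fermi data is used here):

```python
import numpy as np

# Toy sensitivity map: divide a simulated annihilation-signal map by the
# square root of the measured flux in each pixel (Poisson-noise weighting).
# Both arrays are invented placeholders, not real Fermi maps.
rng = np.random.default_rng(0)
simulated_signal = rng.random((4, 4))       # predicted DM flux per pixel
measured_flux = rng.random((4, 4)) + 0.1    # measured flux per pixel (toy)

sensitivity = simulated_signal / np.sqrt(measured_flux)

# The pixel with the highest ratio is the most promising place to look.
best_pixel = np.unravel_index(np.argmax(sensitivity), sensitivity.shape)
```

In the real analysis the brightest-signal pixels sit toward the galactic center, which is why it dominates such maps despite its astrophysical complexity.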

A line has been claimed in Fermi data, toward the observed center of the galaxy. The evidence can be as high as 4 sigma, at 130 GeV. Historically such a line was considered a smoking gun. The jury is still out on this issue, but the interesting thing to note is that H.E.S.S. II can give an answer on this line, since it could reach a 5-sigma detection with a few hours of observation.

Direct detection proceeds by direct scattering of dark matter off an atom of the detector. The speaker had no time to discuss this in detail; instead he discussed direct searches at the LHC. The standard signature is that one has the Standard Model plus some extension: SUSY partners, excited states, or others. There can also be a partner of the gauge bosons which can play the role of a dark matter candidate. What one can do at the LHC, for example, is produce a squark that decays all the way down to the lightest particle of the theory, producing SM particles along the way. Many particles of the new theory could be discovered and their masses determined. One could then determine the parameters of the theory that match the experimental data, and then infer the relic density in a realistic scenario.

One can simulate the response of the LHC detectors to a particle like that. Doing so, one finds a different answer for the relic density: one does not get enough information to nail down the relic density of the dark matter particle. The problem is that there are big islands in parameter space that are compatible with the LHC data, corresponding to different solutions in the SUSY setup, and there is not enough information to tell in which region of parameter space the neutralino lies.

To break the degeneracy, one may look at the plane of relic density versus scattering cross section, under one assumption: that the same benchmark point that produces a signal at the LHC is also detected in a detector like XENON. So there is true complementarity and mutual symbiosis among these searches for dark matter. One just has to assume that the local abundance of dark matter is the same as its average over the universe.
