The big question is what Planck will provide us: is the data compatible with inflation? We will also soon know the effective number of neutrinos, and we will soon map the dark matter sky. So this workshop came four days too early.
But we also have the LHC. Is the Higgs all there is? Essentially, if we change the top and Higgs masses a little, we can go up to the Planck scale without the need for new physics. But we have evidence for new physics from neutrinos, and neutrinos point to high scales. The question of where these extra energy scales lie is open.
Astroparticle physics is thus going up and down the cosmic ladder. There are two fundamental questions: are there any intermediate scales between the EW and inflation scales? And how do the particles, fields, and laws of the intermediate scales shape the genesis, evolution, and destruction of cosmic structures?
The European Astroparticle Physics Coordination (APPEC) started in 2001. ASPERA was launched and for six years gave 6 MEuro of funding to 19 countries with 3000 researchers and 220 MEuro of consolidated funds. APPEC is a fusion with ASPERA. They managed to embed the work in a more global effort, and they made roadmaps. The first one was in 2008, pointing at dark matter and dark energy; neutrino mass and properties; gravitational waves; high-energy photons and neutrinos; and UHECR. In total seven magnificent topics.
In 2011 there was an update. A prioritization was introduced, as a time ordering. They also funded all projects that had been prioritized in the roadmap, with common goals for R&D, and developed interdisciplinary connections to geophysics, underwater science, and underground science, as well as contacts with industry for innovation.
In dark matter searches APPEC supports a program to extend the target mass of noble liquids to a few tons (e.g. DARWIN). There are 30 projects worldwide with different techniques. The leader is XENON, but in two years' time bolometers will reach the same sensitivity.
In neutrino physics, APPEC supports a vigorous R&D program on the liquid argon technique and beam design studies in anticipation of a critical decision in 2015. LAGUNA is being planned. Rapid progress in neutrino oscillation physics has established a strong scientific case for a long-baseline (LB) neutrino programme exploring CP violation and the mass hierarchy. CERN should develop a neutrino programme to pave the way for a substantial European role in future LB experiments.
For dark energy, the communities and agencies have converged on large sky surveys with telescopes both on the ground and in space. The field finally has a clear long-term program.
For high-energy neutrino telescopes, an interesting thing from the agency point of view is the joint effort towards a km^3-scale detector. The collaboration of PINGU and ORCA is also important, due to their complementary nature. In cosmic-ray observatories we might be seeing the GZK cutoff, or in large part the exhaustion of the acceleration mechanism. Auger submitted a proposal to better distinguish muons from the EM component.
In summary, we have seen coordination in gravitational waves and dark energy, as well as in high-energy universe observatories. In other domains more work is needed.
A first detection of gravitational waves tests Einstein's prediction. Another motivation for looking for gravitational waves is that one can look beyond the visible universe, to understand black holes, supernovae, and GRBs. Third, one can look as far back in time as a theorist can conceive, gaining information from the beginning of the universe.
The main features are two transverse polarization states, associated with massless, spin-2 particles (gravitons). The radiation is emitted by a time-varying quadrupole moment.
There are no laboratories able to produce detectable gravitational waves. A 1000-ton steel rotor at 4 Hz would give a luminosity of 10^-30 W: Einstein's comment was "a practically vanishing value". A collapse to a neutron star of mass 1.4 solar masses gives 10^52 W; if this happens in our galaxy it yields a strain of 10^-18 at Earth, and from the Virgo cluster 10^-21. This is a challenge to contemporary experimental physics.
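The quoted laboratory number can be checked against the quadrupole formula. A minimal sketch: the mass and spin rate are from the text, while the 20 m rod length is an assumption needed to fix the moment of inertia.

```python
import math

# Quadrupole luminosity of a rigid rod spinning about a perpendicular
# axis through its centre: P = (32/5) * (G/c^5) * I^2 * omega^6,
# with I = M L^2 / 12; the GW frequency is twice the spin frequency.
G = 6.674e-11   # m^3 kg^-1 s^-2
c = 2.998e8     # m/s

M = 1.0e6       # kg  (1000 tons, from the text)
L = 20.0        # m   (assumed rod length)
f = 4.0         # Hz  (spin frequency, from the text)

I = M * L**2 / 12.0
omega = 2.0 * math.pi * f
P = (32.0 / 5.0) * (G / c**5) * I**2 * omega**6
print(f"GW luminosity of the rotor: {P:.1e} W")
```

With these assumed dimensions the luminosity comes out in the 10^-29 to 10^-30 W ballpark, consistent with the "practically vanishing value" of the text.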
Gravitational waves are emitted by the coherent acceleration of a large portion of matter. They cannot be shielded, and they arrive at the detector in pristine condition. They can reveal features of their sources that other messengers cannot provide.
A binary pulsar in the sky has been shown to lose energy exactly as general relativity predicts: its orbital period decreases. So we are confident that GWs exist.
The detection methods at very low frequency involve cosmic microwave background polarization. When the wave period is of the order of years, we go to pulsar timing. At higher frequencies there are space interferometers, and the present detectors already taking data operate at the highest frequencies.
We can learn a lot by detecting the waveforms of gravitational waves. For a supernova the information is the detailed inner dynamics of the SN. For spinning neutron stars, one gets the locations of neutron stars near the Earth, and pulsar evolution. For coalescing binaries (two compact objects spiraling into one another) one can determine the masses of the objects and the distance to the system, measuring the Hubble constant. The stochastic background, a relic of the early universe, provides a confirmation of the big bang and of inflation. It is also a unique probe of the Planck epoch, and might even provide proof of the existence of cosmic strings.
The speaker mentioned that every newly opened astronomical window has found unexpected results. The optical window was opened by Galilei, and the first surprise was Jupiter's moons. Cosmic rays, discovered in 1912, gave us the muon. And so on. So the point is clear: if we open a window to study gravitational waves, we can be confident of new discoveries and surprises.
In the world there are several interferometric and resonant-mass detectors. Auriga and Nautilus are the only resonant-mass detectors in operation now.
Forty years of detection attempts have passed. There is an international committee for gravitational waves (GWIC). There is a lot of collaboration rather than competition, because of the need to join forces and share data.
There was a phase change in the outlook. From 2005 on, we have known that there are no very strong sources in the sky, so one needs to be more conservative and try to detect the sources we currently imagine to exist. A world-wide network of interferometers is composed of LIGO in Hanford and Livingston, VIRGO in Italy, GEO600, and KAGRA, a 3-km interferometer in preparation underground in the Kamioka mine.
Detecting such a tiny signal in the presence of noise of all kinds means isolating the mirrors from seismic and acoustic sources, and using materials that reduce thermal noise. The sensitivity is limited at low frequency by seismic noise and in the middle band by thermal noise. The progress in reducing the noise in LIGO can be illustrated with a graph of the strain as a function of frequency: from 2001 to 2006 the noise was brought down by over three orders of magnitude.
What were the results of LIGO and VIRGO? We do not yet have a detection in our hands, but we think it is not far in the future. Data are analyzed jointly, and several year-long science data runs have taken place. Limits have been set on gravitational waves from known millisecond pulsars, on compact binary coalescence rates in our local neighborhood, and on the stochastic background of cosmological origin.
One can also think of searching for coincidence between neutrino telescopes and gravitational waves. A paper (Phys.Rev.Lett. 103 (2009) 031102) describes the concept.
After these first data runs, the interferometers are off because a second generation is in preparation: Advanced VIRGO and Advanced LIGO. By increasing the sensitivity by a factor of 10, one could detect tens or hundreds of events per year. This is possible with more powerful lasers, different interferometer topologies, optimization, and putting all that has been learned to work. 2015 is the present goal.
The present array can be completed with a detector in India, LIGO-India. This is a 2-km interferometer that could be transported from the US to India. It improves the localization capability for a signal and would also remove some blind spots in the sky.
So we are on the threshold of a new era of gravitational-wave astrophysics. But a third generation is already being thought of: underground detectors, to remove seismic noise, and cryogenic ones, to reduce thermal noise. A 10-km triangle underground could detect sources in a wide portion of the universe. This requires pushing the technology forward.
Going to space allows one to escape the backgrounds at low frequencies. The LISA detector originally had arms of 5 million km; this has been reduced because of costs, with savings in weight and launch cost. A demonstrator will be launched next year.
The Earth is opaque to UHE neutrinos: the cross section becomes large, since it grows with energy. This means the signal only comes from the top hemisphere. For through-going muons one runs out of target on top, so one is mostly looking at the horizon. Through-going muons and taus are the favored channels for probing the GZK flux. Effective areas are 5 times larger at 10^18 eV relative to electrons, but only 25% of the neutrinos produce muons. Also, high-energy muons lose about a third of their energy per kilometer, so the energy measurement is degraded.
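Taking the quoted loss rate at face value, the degradation of the muon energy measurement is easy to quantify. A crude deterministic sketch (real losses at these energies are stochastic and fluctuate strongly):

```python
def muon_energy_after(E0_eV, distance_km, loss_fraction_per_km=1.0/3.0):
    """Energy left after propagating through ice, assuming a fixed
    fractional loss per kilometre (a crude model of dE/dx ~ b*E)."""
    return E0_eV * (1.0 - loss_fraction_per_km) ** distance_km

E0 = 1e18  # eV, a GZK-scale muon (illustrative value)
for d in (1, 3, 5):
    print(f"after {d} km: {muon_energy_after(E0, d):.2e} eV")
```

After a few kilometres of ice the muon retains well under a third of its original energy, which is why the reconstructed energy only loosely constrains the neutrino energy.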
Vetoing the cosmic-ray background with a surface array could regain coverage for through-going muons, but the IceTop coverage is limited. One can think of an array of detectors on top of IceCube as a veto.
One can put acoustic detectors on some of the IceCube strings. One gets acoustic emission from the energy deposited in the ice over a short time. The unfortunate story is that the attenuation length is much shorter than expected. There was the promise of cheap detectors for large volumes, but now the general feeling is that acoustic detection in ice has too high a threshold (E > 1 EeV) to be useful.
One can think of using the Askaryan effect: coherent radiation from the excess negative charge developed in an EM shower (Compton scattering of electrons into the shower, with a small contribution from positron annihilation). A radio pulse is emitted in ice at a Cherenkov angle of about 55 degrees. The signal is broadband up to GHz frequencies, in a cone. Askaryan radiation was observed in a SLAC test beam on an ice block (2007). For neutrino detection in ice, Snell's law says that rays incident on the surface at angles shallower than 44 degrees from horizontal will be reflected back into the ice, so the geometry of the interaction is important. The solution is to dig into the ice, 200 m below the surface being an acceptable compromise.
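The 44-degree figure can be reproduced from Snell's law. A sketch assuming a refractive index of about 1.4 for the near-surface firn (an assumed value; deep ice, with n close to 1.78, would give a different angle):

```python
import math

n_firn = 1.4  # assumed refractive index of the near-surface firn

# Total internal reflection sets in at the critical angle measured
# from the surface normal (vertical): sin(theta_c) = 1/n.
theta_c_from_vertical = math.degrees(math.asin(1.0 / n_firn))
theta_c_from_horizontal = 90.0 - theta_c_from_vertical
print(f"rays shallower than {theta_c_from_horizontal:.1f} deg from "
      "horizontal are reflected back into the ice")
```

With n = 1.4 the critical direction comes out at about 44 degrees from horizontal, matching the number quoted in the talk.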
Holes of 6-inch diameter were drilled to 200 m for a test. Even at 200 meters, the distribution of vertices that one would get for 10^16 eV energy is shallow, and one would only catch a small part of the signal.
ARA37 will be a huge array of holes next to the South Pole site, with the electronics connected to antennas at 200 m depth. Each station is an autonomous GZK-neutrino detector. The electronics needs a fast digitizer: >3 GHz sampling of RF transients from all antennas. The status of this experiment is that a testbed was deployed in 2010-2011 and has been running since then. ARA1 was deployed last year; ARA2 and ARA3 are being deployed this year. Deployment of future stations depends on funding.
ARIANNA is a similar concept: a shallow detector some meters below the surface of the ice, located at Moore's Bay. The idea is that you can get bounces off the ice-water interface, increasing the acceptance to high-energy neutrino signals. Right now four stations have been deployed.
Finally, ANITA is a balloon with an RF detector launched from McMurdo station. It flies above Antarctica at 37 km and views more than a million cubic kilometers of ice. They recently removed the horizontal-polarization trigger but still saw 5 ultra-high-energy cosmic-ray events. ANITA III (2013-2014) will continue the endeavour.
In summary, optical detection is on the verge of discovering a non-GZK neutrino flux, but IceCube will not see many GZK events even after many years. Acoustic development is coming to a close due to the unfortunate situation with the attenuation length. Radio is the technology of choice for GZK neutrinos.
Very high energy neutrinos have always been predicted to come from the sources of UHE cosmic rays, as first stated by Waxman and Bahcall. The limit has been challenged, but nobody doubts the relation between cosmic-ray sources and astrophysical neutrinos. The upper limits of W&B are stronger than those of Mannheim, Protheroe et al.
Cosmogenic neutrinos, however, do not come directly from the sources, but are diffuse. They are produced by the interaction of UHECR with the photons in the universe. The suggestion was made in 1969. Often called GZK neutrinos, these should instead be called BZS neutrinos (Berezinsky, Zatsepin, Stecker (1973)). What happens is that there is a minimum proton energy for interaction with the photon background: taking the average energy of the microwave background, the threshold is about 3*10^19 eV. There is also production on the infrared and optical backgrounds.
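The threshold follows from the kinematics of photopion production. A sketch for a head-on collision p + gamma -> p + pi0 in the ultrarelativistic limit (the photon energies below are illustrative assumptions, not from the talk):

```python
m_pi = 134.98e6   # eV, neutral pion mass
m_p  = 938.27e6   # eV, proton mass

def gzk_threshold(E_gamma_eV):
    """Minimum proton energy for p + gamma -> p + pi0 in a head-on
    collision with a photon of energy E_gamma: the invariant mass
    s = m_p^2 + 4 E_p E_gamma must reach (m_p + m_pi)^2."""
    return m_pi * (m_p + m_pi / 2.0) / (2.0 * E_gamma_eV)

E_avg  = 6.3e-4   # eV, roughly the mean CMB photon energy (assumed)
E_tail = 2.3e-3   # eV, a photon from the Wien tail (assumed)
print(f"threshold on a mean photon: {gzk_threshold(E_avg):.1e} eV")
print(f"threshold on a tail photon: {gzk_threshold(E_tail):.1e} eV")
```

A head-on collision with a mean CMB photon requires about 10^20 eV; the more energetic photons of the Wien tail bring the effective threshold down to the few-times-10^19 eV quoted above.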
Of course, as always, when protons interact and produce pions, we get gamma rays and neutrinos. The expected spectra still have quite high energies: these are cosmogenic neutrinos and cosmogenic gamma rays. The pi-zero decays into two gammas; a charged pion decays producing three neutrinos and an electron, and the energy of these peaks is lower.
The fluxes of neutrinos depend on flavor. There is a two-peak structure in the flux-versus-energy plot: electron neutrinos and antineutrinos peak at much lower energy, slightly above 10^16 eV. For the shape and flux of cosmogenic neutrinos it is very important to establish their emission in UHECR sources, the maximum acceleration energy, the energy spectrum at acceleration, the composition, and the distribution of the sources in the Universe. These parameters are of course correlated with one another.
The maximum acceleration energy of UHE protons is one of the most important astrophysical parameters for the estimate of the cosmogenic neutrino flux. If this is below 10^20 eV, the total flux is very low. What matters is the maximum energy per nucleon: if a nucleus has a larger total energy, what counts is still the energy of each proton individually, of course.
Using the current measurements of the UHECR flux, particularly the Auger fits, one obtains predictions for neutrinos. Different neutrino flavours in a standard cosmogenic-neutrino calculation show different peak values; if they come from neutron decay (electron antineutrinos), their energy is smaller.
The CMB is not the only universal photon field in the universe. The infrared and optical background is much more energetic, but its number density is much smaller. This means that steeper proton spectra generate more neutrinos in interactions with this background than flatter ones.
The highest cross section of processes useful to detect these neutrinos is at the Glashow resonance: an electron antineutrino interacts with an electron to produce a W-. Some of the decay products of the W- would produce a visible signature in the detector. Detection of the Glashow resonance would support a heavy composition of the cosmic rays.
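The resonance energy is fixed by the W and electron masses: the centre-of-mass energy of the antineutrino on an atomic electron at rest must equal the W mass. A quick check:

```python
m_W = 80.38e9   # eV, W boson mass
m_e = 0.511e6   # eV, electron mass

# Resonance condition on an electron at rest: s = 2 m_e E_nu = m_W^2
E_res = m_W**2 / (2.0 * m_e)
print(f"Glashow resonance energy: {E_res / 1e15:.2f} PeV")
```

The well-known result is about 6.3 PeV, which sets the energy scale at which this channel becomes relevant.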
Two neutrino events of very high energy were detected by IceCube. Could these events be cosmogenic neutrinos? There are several processes to look at. The Glashow resonance is one. One must also think of CC interactions of electron neutrinos above 1 PeV, or of tau neutrinos at the same energy; it is difficult to see the tau track because the decay length is about 50 meters. Then there are neutral-current interactions of any neutrino type. So the responsible neutrinos could be either extraterrestrial or atmospheric, with a strong prompt (charm) contribution. For atmospheric background neutrinos, the calculated flux is 0.086 events in two years of IceCube, so these are not likely atmospheric.
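The strength of that last statement is simple Poisson arithmetic on the quoted expectation:

```python
import math

lam = 0.086  # expected atmospheric-neutrino events in two years (from the text)

# Poisson probability of seeing 2 or more events when 0.086 are expected:
# P(>=2) = 1 - P(0) - P(1) = 1 - exp(-lam) * (1 + lam)
p_ge_2 = 1.0 - math.exp(-lam) * (1.0 + lam)
print(f"P(>=2 events | {lam} expected) = {p_ge_2:.4f}")
```

The chance of the atmospheric background fluctuating up to two events is only a few per mille, which is why an atmospheric origin is disfavored.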
The current limits on the flux of high-energy cosmogenic neutrinos are higher than the W&B limit. In conclusion, it is likely that the two neutrinos come from sources rather than from UHECR propagation.
Juan Antonio Aguilar: Interpretation of Current IceCube Results on Galactic and Extragalactic Sources
Neutrinos are not deflected by magnetic fields, so they can tell us where they come from. This is important for understanding the origin of cosmic rays. For galactic cosmic rays the candidate sources are SN remnants; for extra-galactic ones, active galactic nuclei, gamma-ray bursts, and magnetars.
IceCube was already described in a previous talk this morning. The most recent point-source results come from three years of running, and Juan showed a nice plot of point sources in the sky. The two parts of the planisphere are made up of up-going and down-going events. There is no evidence for neutrino point sources in the data. Upper limits on the flux are calculated using the Neyman construction; the limit is the best available for most declination angles.
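For reference, for a plain counting experiment the Neyman construction reduces, in the background-free zero-observed-events case, to the familiar 2.3-event upper limit. A minimal sketch of the generic textbook construction (the actual IceCube analysis uses an unbinned likelihood, not this simple counting version):

```python
import math

def poisson_upper_limit(n_obs, cl=0.90):
    """Classical (Neyman) upper limit on a Poisson mean: the smallest mu
    for which P(N <= n_obs | mu) <= 1 - cl, found by bisection."""
    def p_le(mu):
        return sum(math.exp(-mu) * mu**k / math.factorial(k)
                   for k in range(n_obs + 1))
    lo, hi = 0.0, 50.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if p_le(mid) > 1.0 - cl:
            lo = mid   # mu too small: still too likely to see <= n_obs
        else:
            hi = mid
    return hi

print(f"90% CL limit for 0 observed: {poisson_upper_limit(0):.2f} events")
```

Dividing the event limit by the exposure then turns it into a flux limit at each declination.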
For Markarian 421, most of the sensitivity comes from the region between 10 and 100 TeV. For southern sources the sensitivity peak moves to higher energies.
Searches for galactic sources rely on the "standard candle", the Crab nebula. This is mainly regarded as a leptonic source, but past gamma-ray flares up to TeV energies challenge that view. IceCube may soon constrain the existing models. Other supernova remnants are far from being constrained by IceCube data, with limits still a factor of four above expectations.
There are six TeV associations with supernova remnants based on Milagro. Stacking the signal, one may test the model of Halzen et al.
Another galactic source class is the microquasars: X-ray binaries with a characteristic radio emission believed to come from relativistic jets that may point at us. When they do, we expect neutrinos to also be visible, so the flux will show a periodicity. The hottest such source is Cygnus X-3. The collaboration is about to unblind the results of three years of running on this source.
For extra-galactic sources, the GRB analysis is twofold: a model-independent "catch-all" one, and a model-dependent one. For the latter they expected 5.2 events from the model of Guetta et al. but observed 0, which excludes it. The model-independent analysis found two events, with a less clear-cut interpretation.
For active galactic nuclei signals, protons are accelerated together with electrons and lose energy in proton-proton and proton-gamma interactions or synchrotron emission. A paper by Halzen and J. Becker (arXiv:1302.1015) assumes only pp interactions. The resulting neutrino flux can be normalized to the IceCube limits, which can be converted into a limit on the target density.
A similar approach was done for pp interactions in blazars. One may calculate the neutrino flux, normalize it to the Fermi total energy flux, scan the parameter space, and identify those parameters that result in fluxes above the IceCube limits. The resulting limits rest on some assumptions.
In conclusion, there is no evidence for point sources of neutrinos in three years of IceCube exposure. Individual SNRs are still difficult to detect with IceCube, but different analysis techniques such as stacking should bring the sensitivity down to the level of the flux predictions. The absence of a signal from GRBs challenges the idea of GRBs as the sole source of ultra-high-energy cosmic rays.
The most extreme window for astronomy can be opened based on multi-messenger observations: charged cosmic rays, gammas, and neutrinos. UHE neutrinos can be produced at the sources or during propagation; the latter we call cosmogenic neutrinos. Neutrinos could reveal the acceleration of nuclear primaries in astrophysical objects, and can test alternative models of the origin of ultra-high-energy cosmic rays.
The Pierre Auger Observatory is made up of 500 members from 18 countries. The detector is an array of 1660 Cherenkov stations on a 1.5-km triangular grid, covering 3000 square kilometers. The detectors sample the lateral distribution of shower particles. There are four sites equipped with fluorescence detectors, which only work during clear moonless nights; these are important to calibrate the energy measurement of the full data sample, although they have only a 15% duty cycle.
There are two low-energy extensions of this array, AMIGA and HEAT. The first is a denser array instrumented with muon detectors; the second consists of three further detectors looking at higher elevations.
Each surface detector is a plastic tank filled with 12 cubic meters of purified water and equipped with three PMTs which detect the Cherenkov light. The stations have independent power supplies, and communication is provided by radio.
When an ultra-high-energy neutrino comes in and interacts near the detector, one sees a signal spread in time over several microseconds, typical of a shower with a large EM component. In contrast, after traversing a lot of atmosphere a shower gets old and its EM component is absorbed; what one then sees is a peak contained within a couple of hundred nanoseconds. This can be used to distinguish hadronic-induced showers from neutrino-induced ones: the neutrino signature is a young shower with a significant EM component and a broad signal.
There are two detection channels. Earth-skimming neutrinos are almost horizontal or slightly upward-going; this gives a high interaction probability and sensitivity to tau decays occurring just above the detector. This channel has no background but little signal, and is exposure-limited. The other channel is the downward-going one, restricted to inclined events (zenith angles of 75 to 90 degrees) to reduce the hadronic background.
The identification of neutrino candidates involves selecting inclined events with a significant EM component: many stations must record a broad signal, and at minimum 60% of the stations must satisfy a specific trigger of time bins over threshold. Quality cuts for neutrino identification are applied on the elongated footprint, on the speed along the footprint being consistent with the speed of light, and on a small RMS of that speed. A further discriminating observable, sensitive to the shape of the recorded signal, is the ratio of the signal integral to its peak value, again targeting the width of the signal.
The analysis strategy for down-going candidates uses all the above observables, combined in a Fisher linear discriminant. One wants no background events in the selected data, and the cut can be tuned accordingly: it is set so that fewer than 1 background event is expected in 20 years of running.
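A Fisher discriminant of this kind can be sketched in a few lines. The observables and numbers below are invented stand-ins for the real ones (footprint elongation, speed RMS, area-over-peak), just to show the construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for three discriminating observables; the means and
# widths are invented for illustration, not Auger values.
background = rng.normal([1.0, 0.2, 1.5], 0.3, size=(500, 3))
signal     = rng.normal([2.0, 0.8, 3.0], 0.3, size=(500, 3))

# Fisher linear discriminant: w is proportional to Sw^-1 (mu_s - mu_b),
# where Sw is the within-class covariance.
mu_b, mu_s = background.mean(axis=0), signal.mean(axis=0)
Sw = np.cov(background.T) + np.cov(signal.T)
w = np.linalg.solve(Sw, mu_s - mu_b)

# Project both samples onto w; the selection cut is then placed on
# this single scalar instead of on three correlated observables.
t_b, t_s = background @ w, signal @ w
print(f"mean Fisher output: background {t_b.mean():.2f}, signal {t_s.mean():.2f}")
```

The appeal of this approach is that the background-rejection cut ("<1 event in 20 years") is tuned on one well-behaved scalar rather than on several correlated observables at once.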
The result is that no candidates are found in the 5.5 years of data taken since 2004. With this result they set limits on the flux as a function of neutrino energy, one order of magnitude above the flux predicted by cosmogenic models. The maximum sensitivity is at energies around 10^18 eV, where they are competitive with IceCube.