Friday, December 8, 2017

Stopping a neutrino beam: measuring the neutrino interaction cross-section

Neutrinos are popularly known as the particles that go through anything and everything.  Neutrinos from beta decay can escape from the best shielded nuclear reactor, and neutrinos from nuclear fusion escape from the center of the sun.  Neutrinos interact only via the weak interaction, which is indeed weak.  But, that doesn't mean that they can go through anything - the IceCube Neutrino Observatory recently demonstrated experimentally that it is possible to stop a beam of neutrinos, in a paper published in Nature (also freely available on the arXiv).

To do this, IceCube used two tricks. 

First, it used extremely energetic neutrinos, with energies above 1 TeV (1 tera-electron volt, or 10^12 electron volts), extending up to 1 PeV (1 peta-electron volt, or 10^15 eV) - millions of times more energetic than neutrinos from nuclear fusion or radioactive decay.  The cross-section (interaction probability) for neutrinos rises with energy: linearly at first, then moderating to scale roughly as Energy^0.3.  So, at an energy of 30 TeV (the rough mid-point of the measurement), the cross-section is several million times higher than it is for neutrinos from radioactive decay.  Of course, there aren't that many neutrinos this energetic, but, at 1 cubic kilometer in volume, IceCube is big enough to collect a good sample.  The analysis used 10,784 energetic muons from neutrinos that passed through at least some of the Earth.
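
To get a feel for the numbers, here is a toy version of that energy scaling, in Python (my own illustration, not the analysis code).  The 3 TeV transition energy, roughly where the W-boson propagator starts to moderate the linear growth, is an assumed round number:

```python
# Toy model of the neutrino-nucleon cross-section energy dependence
# described above: linear growth with energy up to an assumed
# transition energy of ~3 TeV, then roughly Energy^0.3 beyond it.

E_TRANSITION = 3.0e12   # eV; assumed onset of the softer scaling

def sigma_relative(e_nu_ev):
    """Cross-section relative to its value at 1 MeV (arbitrary units)."""
    e_ref = 1.0e6  # 1 MeV, typical of beta-decay neutrinos
    if e_nu_ev <= E_TRANSITION:
        return e_nu_ev / e_ref
    return (E_TRANSITION / e_ref) * (e_nu_ev / E_TRANSITION) ** 0.3

# 30 TeV is the rough mid-point of the IceCube measurement
print(f"enhancement at 30 TeV: {sigma_relative(30e12):.1e}")
# -> ~6e6, i.e. 'several million times' the radioactive-decay value
```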

Second, it used a very thick absorber - the Earth.  With this, the measurement was conceptually simple: compared to a baseline of near-horizontal neutrinos, which traverse only a relatively small amount of matter, energetic near-vertical neutrinos were absorbed going through the Earth.  The figure above shows the predicted transmission probability (= 1 - absorption probability) as a function of neutrino energy and zenith angle; the latter determines how much Earth matter is traversed.
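
To make the geometry concrete, here is a minimal sketch of that transmission probability, assuming a uniform-density Earth (the real analysis uses the layered PREM density model) and a single, assumed, roughly-Standard-Model cross-section value:

```python
import math

R_EARTH = 6.371e8   # cm
RHO_AVG = 5.5       # g/cm^3; mean Earth density (uniform-Earth toy model)
N_A = 6.022e23      # nucleons per gram of material (Avogadro's number)

def transmission(sigma_cm2, zenith_deg):
    """Toy transmission probability, exp(-sigma * column density), for a
    neutrino arriving from below the horizon (zenith angle > 90 deg)."""
    cos_z = math.cos(math.radians(zenith_deg))
    chord = max(0.0, -2.0 * R_EARTH * cos_z)   # path length in Earth, cm
    column = RHO_AVG * chord * N_A             # nucleons / cm^2
    return math.exp(-sigma_cm2 * column)

# ~2.5e-34 cm^2 is an assumed, rough cross-section near 40 TeV
for z in (95, 120, 150, 180):
    print(f"zenith {z:3d} deg: T = {transmission(2.5e-34, z):.2f}")
# -> T drops from ~0.9 near the horizon to ~0.35 straight up through Earth
```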

There are of course many complications - experimental uncertainties on the neutrino energy; neutral current interactions, where a neutrino may emerge from the Earth with a lower energy than it entered; modelling the material within the Earth; etc. - but the result clearly showed that neutrinos are absorbed at about the expected rate.  More precisely, the best-fit cross-section was 1.3 +/- 0.5 times the prediction of the Standard Model, where I have combined the statistical and systematic uncertainties.  It was not trivial to find a good definition for the neutrino energy range to which this measurement applies, because different methods give somewhat different energy ranges, but we settled on a method that returned a range from 6.3 TeV to 980 TeV.  For comparison, the highest-energy measurements at an accelerator laboratory only reached 0.37 TeV - our measurement reaches energies more than an order of magnitude higher.  The figure below puts this in perspective, comparing our measurement with the previous accelerator work.  The cross-sections (y axis) are divided by the neutrino energy so that everything fits on the graph better; otherwise, it would span many orders of magnitude.


I have to mention that this was the dissertation work of my (now graduated) graduate student, Sandra Miarecki.  Sandy had a very interesting preparation for graduate school - she was a career US Air Force pilot, serving in many roles, including as a test pilot, before retiring from the Air Force and coming to graduate school at Berkeley.  After graduate school, she became an assistant professor at the US Air Force Academy.  The LBNL news center has a very nice article about her.

The Nature article also received a fair amount of press coverage.  I will just mention one article, in Symmetry magazine, which goes into more detail about the analysis than the other press writeups.



Thursday, November 16, 2017

Gravitational waves, gamma-rays and gold jewelry

It has been a bumper month for astrophysicists.

On October 16th, the combined LIGO/VIRGO collaborations announced the observation of gravitational waves from an event that occurred on August 17th.  Unlike the previous observations, these waves came from relatively 'light' objects: the collision of two presumed neutron stars, with masses around 1.1 to 1.6 times the mass of the sun, forming a black hole with a mass around 2.74 times the mass of the sun.  Previous gravitational wave events had come from the collisions of much heavier objects.

But, that's not all.  Two seconds later, the FERMI observatory, a satellite containing a large gamma-ray detector, and the INTEGRAL satellite both observed pulses of gamma-rays coming from the same direction.  This is the classic signature of a 'gamma-ray burst' (GRB).  GRBs were first observed in the 1960s by the VELA satellites, built to monitor gamma-rays from possible atmospheric or space-based nuclear weapons tests.  VELA did not observe any weapons tests, but it did find mysterious bursts of gamma-rays coming from space.  These bursts have been the subject of scientific speculation for decades, and the conventional wisdom was that some GRBs came from neutron star-neutron star or neutron star-black hole mergers.  That theory has now been amply confirmed by the LIGO/VIRGO/FERMI/INTEGRAL observation.  The graphic above, from the LIGO collaboration, shows the process.

Of course, this collision site was studied by many, many other astronomical instruments.  IceCube looked, but we didn't see anything.  However, the optical studies were very fruitful.  Multiple telescopes observed an optical signal that lasted for a few days, plus an infrared signal that lasted for nearly two weeks.  These signals were consistent with predictions made by my LBNL colleague Dan Kasen and his collaborators.  Kasen made a detailed model of the gravitational, nuclear and atomic processes that would occur in a collision of two neutron stars, and, from that, predicted the optical and infrared light emission.  His model predicts considerable production of heavy elements (heavier than iron) via rapid neutron capture (the 'r-process').  The shorter-lived broadband optical emission comes from an initial ejection of lighter nuclei.  The longer-lived infrared component comes from a secondary emission, powered by the radioactive decay of heavy elements, which heats the plasma that surrounds the newly formed black hole.  Heavy elements (Z between 58 and 90) scatter the light strongly, so it takes longer to escape from the plasma.
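
For readers who like numbers: a rough way to see why the lanthanide-rich ('red') component shines longer is that the light curve peaks on the photon diffusion timescale of the ejecta, roughly t ~ sqrt(kappa*M/(beta*c*v)) in the classic Arnett scaling.  The parameter values below are my own illustrative choices, not Kasen's actual model values:

```python
import math

M_SUN = 1.989e33   # g
C = 2.998e10       # cm/s

def peak_time_days(kappa, m_ej_msun, v_over_c, beta=13.7):
    """Rough light-curve peak time from the photon diffusion timescale,
    t ~ sqrt(kappa * M / (beta * c * v))  (Arnett-style scaling)."""
    m_ej = m_ej_msun * M_SUN
    v = v_over_c * C
    return math.sqrt(kappa * m_ej / (beta * C * v)) / 86400.0

# Illustrative (assumed) parameters for the two ejecta components:
# 'blue': lighter elements, low opacity; 'red': lanthanide-rich, ~10x opacity
print(f"blue (optical): {peak_time_days(1.0, 0.02, 0.2):.1f} days")
print(f"red (infrared): {peak_time_days(10.0, 0.04, 0.1):.1f} days")
# -> roughly a few days vs. over a week, as observed
```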

This agreement is of great interest to nuclear physicists, since it may provide a new answer to the question: where do the heavy elements in the universe come from?  Previously, it was thought that they were mostly produced in supernovae, explosions that occur when heavy stars reach the end of their lives and collapse.  However, Dan's simulations show that GRBs produce heavy elements, and could account for much or all of the gold used in our jewelry, along with all of the other heavy elements.




Wednesday, July 19, 2017

Where have all the sunspots gone?

One use for IceCube is to study solar flares or, more generally, solar weather.  Solar weather is important; solar flares are often accompanied by coronal mass ejections (CMEs), ejections of plasma that sometimes hit the Earth.  This plasma can disrupt radio communications, disable satellites, and endanger astronauts.  In 1859, a large CME, the Carrington event, disrupted telegraph communications in the U.S. and Europe, and produced enormous auroras which were visible over much of the globe.  Modern electronics is far more susceptible to damage than simple telegraph systems were.  A similar event occurring today would cause incredible ($1 trillion?) damage.  With appropriate warning, we could reduce the damage significantly by turning off and/or shielding as much electronics as possible, grounding airplanes, etc.  For this very practical reason, it is important to understand solar weather in more detail.

In addition, sunspots and solar flares contribute to the variation in the Sun's total output: sunspots reduce the solar irradiance directly, while solar flares increase it.  So, the lack of sunspots in the current 11-year solar cycle (cycle 24) might be thought to reduce solar irradiance, and hence help combat global warming.  However, the story is a bit more complicated than that.  Both sunspots and solar flares are governed by the Sun's magnetic fields; there is a more direct correlation between the Sun's magnetic activity and its irradiance, as can be seen from this plot from the Bartol Institute at the University of Delaware, which compares solar magnetic activity and the rate of neutrons from cosmic-rays reaching Earth.  Before ~1950 (i.e., before global warming became significant), these magnetic field variations likely accounted for much of the observed climate variation.  The evidence for this comes from comparing carbon-14 dating curves (carbon-14 production measures cosmic-ray activity) with climatic data from tree rings and ice cores.

Most of the particles emitted in a solar flare or CME are, by IceCube standards, low energy: protons, neutrons (which mostly decay before reaching Earth) and photons with energies of at most a few billion electron volts (GeV).  When the individual particles reach Earth, they leave relatively few direct traces at ground level; only a small fraction of them lead (via a small air shower) to particles reaching the Earth's surface.  However, a CME contains a very large number of particles, and, if the density is high enough, it can raise the counting rate of terrestrial particle detectors.  This is known as a ground level event (GLE).  Space scientists have deployed neutron detectors at many sites around the globe to monitor these signals.  Because of the total sensitive area and low background rates of the 162 IceTop surface array tanks, IceTop is a very sensitive detector for GLEs, and we had expected to detect of order 1 GLE per year.
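
A back-of-the-envelope sketch of why the array helps: the significance of a counting-rate excess grows as the square root of the collected counts, so summing 162 tanks buys a factor of sqrt(162) ~ 12.7 in sensitivity.  All the rates below are invented, purely for illustration:

```python
import math

def excess_significance(bg_rate_hz, excess_rate_hz, n_tanks, duration_s):
    """Approximate Gaussian significance of a counting-rate excess,
    z = signal / sqrt(background), summed over identical tanks."""
    n_bg = bg_rate_hz * n_tanks * duration_s
    n_sig = excess_rate_hz * n_tanks * duration_s
    return n_sig / math.sqrt(n_bg)

# Invented example: a 0.05% rate increase (1 Hz on a 2 kHz per-tank
# background) lasting 10 minutes:
print(f"one tank : {excess_significance(2000.0, 1.0, 1, 600):.1f} sigma")
print(f"162 tanks: {excess_significance(2000.0, 1.0, 162, 600):.1f} sigma")
# -> ~0.5 sigma vs ~7 sigma: invisible per tank, clear for the array
```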

Unfortunately, Nature has not cooperated.  At this week's International Cosmic Ray Conference, IceCube presented an analysis of GLEs from 2011 to 2016.  Only three GLEs were observed, and they were 'quite small by historical standards.'  The reason for the low rate is unknown, but it may be connected with the paucity of sunspots during the current solar cycle.  This might be expected to lead to a reduction in solar irradiance.  Data from the SORCE satellite presents a mixed picture, showing a small (~0.07%) increase in solar irradiance from 2009 to 2015, with a similar-sized decrease over the past two years.

We do not understand what is causing these changes; clearly the Sun still has many secrets.

Many thanks to Paul Evenson (Bartol Institute, Delaware) for useful discussions on the relationship between sunspots, flares, CMEs and magnetic fields.  Of course, any errors here are my own.







Monday, December 19, 2016

Searching for something new - beyond the "Standard Model"

Particle and nuclear physicists face a real dilemma.  Our "Standard Model" explains most of what we observe at accelerator and non-accelerator experiments, IceCube included.  The Standard Model has been around for about 40 years.  Its three generations of quarks and leptons, four forces, and the Higgs boson come together to provide a good description of the processes we observe at the Large Hadron Collider (LHC) and other accelerators, like Brookhaven's Relativistic Heavy Ion Collider, not to mention underground neutrino detectors.  The only clear crack in the Standard Model is the fact that neutrinos oscillate between the different flavors, and therefore must have mass.  But, most of us don't feel that this is a huge crack.

So, we have been looking for holes in the Standard Model for the past 40 years.  With the discovery of the Higgs boson in the bag, this is now the main rationale for the LHC.  Each year, the four LHC experiments put out hundreds of new results; the search for "new" (beyond the Standard Model) physics is a major focus.  Unfortunately, they have not found any clear evidence for new physics.

There are some good reasons to believe that there must be physics beyond the Standard Model.  The evidence for both dark matter and dark energy is clear and convincing.  Many theories of dark matter model it as a new particle that could very well be discovered at the LHC.  Dark energy is even more mysterious.  It is beyond the reach of any as-yet proposed laboratory-scale experiments, but it is a clear reminder that the universe still has some deep secrets.

Although it is not our primary focus, IceCube is also searching for new physics, mostly physics involving neutrinos.  As part of this search, we continue to study neutrino oscillations (see my previous post here) in more detail.  One of the things that we are looking for is a new type of neutrino that does not interact; these are called 'sterile neutrinos.'  If regular neutrinos oscillated into sterile neutrinos, it would look just as if those neutrinos disappeared.  We search for sterile neutrinos by looking at how likely neutrinos produced in cosmic-ray air showers are to appear in IceCube; if sterile neutrinos exist, neutrinos traveling long distances through the Earth might disappear.  In a recent study, published in Physical Review Letters (here, and also available here through the Cornell arXiv), we set strict limits on sterile neutrinos.  We put limits on the possible existence of sterile neutrinos with certain characteristics; the main characteristics are the mass difference between sterile neutrinos and regular neutrinos, and the mixing angle (strength of coupling) between sterile and regular neutrinos.
The IceCube limits are of particular interest because they rule out a region of parameter space (mass difference and mixing angle) that had been suggested by a couple of earlier experiments.  These earlier results had attracted great attention, but we now know that they are unlikely to be correct.
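
For the curious, the disappearance probability in a simple two-flavor picture follows the textbook vacuum oscillation formula sketched below.  Note that the actual IceCube search relies heavily on matter effects - a resonant enhancement for TeV antineutrinos crossing the Earth's core - which this sketch omits; the parameter values are assumptions, near the region suggested by the earlier anomalies:

```python
import math

def survival_prob(dm2_ev2, sin2_2theta, length_km, energy_gev):
    """Two-flavor vacuum survival probability,
    P = 1 - sin^2(2 theta) * sin^2(1.27 * dm^2 * L / E),
    with dm^2 in eV^2, L in km, and E in GeV.  (The real analysis adds
    matter effects, which dominate for TeV neutrinos crossing the Earth.)"""
    phase = 1.27 * dm2_ev2 * length_km / energy_gev
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Assumed parameters: dm^2 = 1 eV^2, sin^2(2 theta) = 0.1,
# path = full Earth diameter (~12,700 km)
for e_gev in (100.0, 1000.0, 10000.0):
    p = survival_prob(1.0, 0.1, 12700.0, e_gev)
    print(f"E = {e_gev/1000:5.1f} TeV: P(survive) = {p:.3f}")
```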

So, we need to keep looking to find a different crack in the Standard Model, possibly including sterile neutrinos with different masses and couplings. 







Monday, September 26, 2016

Make IceCube Big Again



As a scientific experiment, IceCube is approaching maturity.  We have collected 6 years of data with the complete detector.  In one way, this is a lot.  For most analyses, the statistical precision increases as the square root of the amount of data collected.  So, to make a measurement with half the error requires four times as much data.  It will take IceCube another 18 years to halve the statistical errors that are possible with the current data.  18 years is a long time, so for many of our studies, future increases in precision will be relatively slow.
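
The arithmetic, as a two-line sketch:

```python
# Statistical error scales as 1/sqrt(data): halving it needs 4x the data.
years_now = 6
years_for_half_error = 4 * years_now      # 24 years of data in total
print(f"additional years needed: {years_for_half_error - years_now}")  # 18
```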

This is not true for everything.  For many analyses, it is a matter of waiting for the right astrophysical event.  Assuming that high-energy neutrinos are produced by episodic (non-constant) sources, one nearby gamma-ray burst, or supernova, or whatever is producing astrophysical neutrinos, would be a huge discovery.   This is worth waiting for. 

We continue to improve our analysis techniques, and we will be able to continue to make progress here for some time.  And, there are some analyses that are only now becoming possible, either because they require software that is only now becoming available, or because they require a lot of data.  So, we are still productively busy.

But, we are also thinking more intensively about follow-on experiments.  There are several possibilities on the table.  

PINGU would be a dense infill array, with a threshold of a few GeV, able to determine the neutrino mass ordering (which of the neutrino mass states is the lightest).

Gen2 (above) is a comprehensive IceCube upgrade, probably including PINGU, but focused on an array 10 times larger than IceCube.  It would have a similar number of strings to IceCube, but would be built with more sensitive optical sensors.  Because the strings would be more widely spaced than in IceCube, it would have a higher energy threshold, well matched to studies of astrophysical neutrinos.  We (both the collaboration and the broader neutrino astronomy community) think, but cannot completely demonstrate, that Gen2 will be able to find the sources of our cosmic neutrinos.

Gen2 will likely also include a large (10 square kilometer) surface air-shower array.  One main purpose of the array will be to reject downward-going atmospheric neutrinos, improving our sensitivity to astrophysical sources which are above the horizon; the center of our galaxy is of prime interest.

There are several efforts to build a large radio-detection array, either as part of Gen2, or as a stand-alone project.  Here, the main possibilities are ARIANNA, which I have discussed multiple times before, or ARA, a radio-detection project at the South Pole. 

In Europe, there is also a large effort to build an optical detector array, KM3NeT, in the Mediterranean Sea.  KM3NeT will eventually include a large (~5 cubic kilometers?) astrophysical array, and a smaller array, ORCA, which will have physics goals similar to PINGU.  KM3NeT is starting to deploy test strings now, and ORCA might be complete in the next ~3 years.  Construction of the astrophysical array is also starting, although the 5 km^3 array will not be complete until the mid-2020s.

On the U.S. side, these projects are perfectly aligned with National Science Foundation priorities.  NSF director France Cordova recently unveiled a 9-point R&D agenda; one of its six science themes was "multimessenger astronomy."  Despite this, these U.S. projects seem stalled, due to lack of funding now and in the near-term future.  From my perspective, this is very unfortunate; if an excellent science case and a good match to the funding agency director's priorities aren't enough to move a project forward, then what is needed?  Although the full Gen2 would not be inexpensive (comparable in cost to IceCube), one could build either of the radio-detection arrays or a moderate-sized surface array for under $50 million - not much by current particle/nuclear physics standards.

Some of the ideas presented here were presented in a comment I wrote for Nature: Invest in Neutrino Astronomy; it appeared in the May 25 issue.


Tuesday, August 9, 2016

Nailing down the astrophysical neutrinos - a new analysis

IceCube has just released a new analysis of the astrophysical neutrino flux, using 6 years of through-going neutrino data.  The results are fully compatible with our previous studies of through-going events, but with much smaller uncertainties, leading to a much more statistically significant result.  The increased data set also allows us to look at some new questions.

Previously, the strongest (statistically) IceCube astrophysical neutrino analysis used four years of contained event data.  The significance of the result (compared to the null hypothesis of no astrophysical neutrinos) was 6.5 sigma (standard deviations), which is very strong, and well above the 5-sigma threshold widely used in particle physics to claim a discovery.  Based on raw numbers, the probability of a 5-sigma result arising by chance is tiny - about 1 in 3.5 million.  But, there are two things to keep in mind.  First, with modern computers, it is easy to do a lot of experiments by changing parameters.  One 'good' example is the number of different places in the sky where we could look for a signal; this large number of trials must be accounted for when we search for point sources.  A more problematic example is to change analysis parameters to try to make the signal larger.  We try very hard to avoid this - it is one reason that we use blind analyses where possible - but it can sometimes be hard to avoid unconscious bias.  I had previously discussed an earlier version of this analysis - almost identical, but with only 3 years of data.
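
To make the trials issue concrete, here is a small sketch of how a 'local' 5-sigma p-value gets diluted by the number of places one looks; the 10,000 independent sky positions are a made-up number, just for illustration:

```python
import math

def local_p_value(n_sigma):
    """One-sided Gaussian tail probability for an n-sigma excess."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

# 5 sigma: about 1 in 3.5 million, as quoted above
p5 = local_p_value(5.0)
print(f"p(5 sigma) = {p5:.2e}  (~1 in {1/p5:,.0f})")

# The look-elsewhere effect: with many independent places to look
# (here, a made-up 10,000 sky positions), the chance that *somewhere*
# fluctuates up is much larger:
n_trials = 10_000
p_global = 1.0 - (1.0 - p5) ** n_trials
print(f"global p after {n_trials} trials = {p_global:.2e}")
```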

The second thing to remember is that this was a single result, based on a single analysis.  We were very, very careful before making the contained event analysis public, but still, discoveries need confirmation.  Unfortunately, IceCube is the only experiment large enough to study these neutrinos.  So, we developed a complementary analysis that studies through-going muons from neutrino interactions outside the detector.  The original version (alternate link to freely available version) of this analysis used 2 years of data, and found an excess, consistent with the contained event analysis, with a statistical significance of 3.7 sigma.

Now, we have released a new analysis, using 6 years of data.  It finds an astrophysical flux, with a significance of 5.6 sigma - enough for a discovery on its own.  The spectrum is shown at the top of this post.  This data also allows us to say more about the characteristics of the astrophysical neutrinos.  The measured spectral index (the gamma in dN/dE_nu = A * (E_nu/1 TeV)^-gamma) is 2.13 +/- 0.13.  This is in some tension with the findings from the contained event analysis, which finds gamma much closer to 2.5.  This tension could be from statistical fluctuations (it is about a 2-sigma difference, so not too improbable), or it could have something to do with the different event samples.  The plot below shows the tension, with the different enclosed regions showing the range of astrophysical neutrino spectral index (gamma, x axis) with the corresponding signal strength (flux, y axis).  The solid red curve is from the current analysis, while the blue curve shows the combined result from previous studies.  If the two measurements were in good agreement, the curves would overlap.  But, they don't; besides statistics, there are several possible explanations.



The through-going neutrino analysis samples, on average, more energetic neutrinos than the contained event study, so one simple explanation might be that a single power law, dN/dE_nu = A * (E_nu/1 TeV)^-gamma, is too simple a model; the spectral index gamma might change with energy.  There is no reason to expect a single power law.  Alternatively, there could be some difference between the muon-neutrino sample (through-going events) and the showers, which come from a mixture of all three flavors.  The latter is not expected, so statistical fluctuations or a more complex energy spectrum seem like the most likely possibilities.
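
To see how quickly the two indices diverge, here is a quick sketch comparing the two power laws, arbitrarily normalized to agree at 10 TeV (my own illustration; the real analyses fit the normalization and index together):

```python
def flux(e_tev, gamma, norm=1.0):
    """Power-law flux dN/dE = norm * (E / 1 TeV)^-gamma (arbitrary units)."""
    return norm * e_tev ** (-gamma)

# Normalize the two spectra (arbitrarily) to agree at 10 TeV:
norm_hard = 1.0 / flux(10.0, 2.13)
norm_soft = 1.0 / flux(10.0, 2.50)

for e in (10.0, 100.0, 1000.0):
    ratio = flux(e, 2.13, norm_hard) / flux(e, 2.50, norm_soft)
    print(f"E = {e:6.0f} TeV: hard/soft flux ratio = {ratio:.1f}")
# -> 1.0, 2.3, 5.5: the harder spectrum has relatively more
#    high-energy events, where the through-going sample lives
```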

The fact that two different analyses get the same answer is very encouraging.  There is, by both design and result, zero overlap between the two event samples.  Further, the systematic uncertainties for the two analyses are very different, so the analyses are almost completely independent.  So, for anyone who was waiting for this signal to go away, that looks increasingly unlikely.

I should mention that this is the analysis that first found the 2.2 PeV neutrino that I have previously discussed here and here.   So, if you were waiting for a more detailed publication, this new paper is it.

Monday, July 25, 2016

The highest energy neutrinos

It has been a while since I last discussed the so-called 'GZK' neutrinos, neutrinos which are produced when ultra-high energy cosmic-rays interact with cosmic microwave background (CMB) photons.  As you may recall, CMB photons were produced during the big bang; they have cooled off during the roughly 14 billion years since then.  They are now mostly at microwave frequencies, with a spectral peak near 160 GHz and an average photon energy of 0.00024 electron volts, corresponding to a temperature of 2.725 kelvin (i.e., above absolute zero).  Although this is not much energy, it can be enough to excite ultra-high energy protons into a state called the Delta-plus (basically an excited proton).  When the Delta-plus decays, it produces a proton and a neutral pion, or a neutron and a positively charged pion.  When a positively charged pion decays, it produces a neutrino and a muon; when the muon decays, it produces two more neutrinos and an electron.
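
The threshold for this process makes a nice back-of-the-envelope calculation: requiring that the center-of-mass energy of a head-on proton-photon collision reach the Delta mass gives the famous GZK energy scale.  A minimal sketch:

```python
# Threshold for p + gamma_CMB -> Delta+ in a head-on collision:
# setting s = m_Delta^2, with s ~ m_p^2 + 4 * E_p * E_gamma, gives
# E_p ~ (m_Delta^2 - m_p^2) / (4 * E_gamma).
M_DELTA = 1.232e9    # eV
M_PROTON = 0.9383e9  # eV
E_GAMMA = 2.4e-4     # eV; a typical CMB photon energy (~kT at 2.725 K)

e_threshold = (M_DELTA**2 - M_PROTON**2) / (4.0 * E_GAMMA)
print(f"proton threshold energy ~ {e_threshold:.1e} eV")
# -> ~7e20 eV; photons in the high-energy (Wien) tail of the CMB pull
#    the effective cutoff down to the canonical ~5e19 eV GZK energy
```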

We know that ultra-high energy cosmic rays exist, and we know that CMB photons exist, so GZK neutrinos are often considered to be a 'guaranteed' source.  These neutrinos are the main goal of radio-detection experiments like ARIANNA, ARA and, of course, ANITA.  However, there are a few caveats.  If the highest energy cosmic rays are mostly iron, rather than protons, then the flux of these GZK neutrinos will be drastically reduced, below the point where these experiments can see them.  Also, GZK neutrinos have been produced continuously since the early universe (and are almost never absorbed), so the number of GZK neutrinos existing today depends on how many ultra-energetic cosmic-rays there were in the early universe; however, this is a much smaller uncertainty than the one due to the proton vs. iron (or something in between) question.

Until recently, all searches have come up empty.  The one possible exception is an anomalous event observed by the ANITA experiment - the fourth, 'anomalous' event in their recent paper.  I have previously discussed ANITA; this event emerged from a reanalysis of data from their first flight.  The ANITA Collaboration describes the event as consistent with a primary particle that emerged from the Earth; this might be a neutrino or a long-lived tau lepton.  The tau lepton could have been produced in an air shower, and travelled through the Earth, before emerging to produce this shower.  The event could also be a mis-reconstructed downward-going shower.  Although the event is very interesting, we see here the difficulty of trying to draw conclusions based on one event.  It is also clear that the ANITA collaboration feels this difficulty; the event is one of four presented in a paper on downward-going cosmic-rays, rather than being highlighted on its own.

Of course, IceCube is also looking for GZK neutrinos.  Our latest search, based on 6 years of data, has recently appeared here.  To cut to the chase, we didn't find any GZK neutrinos; the analysis did find two lower energy (by GZK standards) events, including the previously announced energy champ.  From this non-detection, we set limits that are finally reaching the 'interesting' region.  The plot below shows our upper limits as a function of energy, compared with several models.

One needs to be careful in interpreting the curves on the figure; one needs to understand how the curves were made to understand the implications.  The limit curve is a 'quasi-differential' limit, in decades of energy.  Basically, this means that, at each energy, the solid-line limit is produced by assuming a neutrino flux with an E^-1 energy spectrum spanning one decade of energy; the E^-1 is chosen to roughly approximate the GZK neutrino flux.  More detailed analyses, also given in the paper, use the entire spectrum to calculate 'model rejection factors' to rule out (or not) the different calculations of GZK neutrinos.  We are now starting to rule out some models.
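
As a sketch of the model rejection factor logic (with invented event counts, not the numbers from our paper): the expected number of signal events from each model is compared with the upper limit on the number of signal events consistent with the observation.

```python
# Model rejection factor (MRF): the ratio of the 90% CL upper limit on
# the number of signal events to a model's predicted event count.
# MRF < 1 means that model is ruled out at 90% CL.

N_90 = 2.44   # Feldman-Cousins 90% CL upper limit for zero observed
              # signal events and zero expected background

# Hypothetical model predictions (invented for illustration):
predictions = {
    "optimistic proton-dominated model": 4.0,
    "pessimistic iron-dominated model": 0.3,
}

for model, n_expected in predictions.items():
    mrf = N_90 / n_expected
    verdict = "ruled out" if mrf < 1.0 else "not yet testable"
    print(f"{model:35s}: MRF = {mrf:5.2f} -> {verdict}")
```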