Thursday, June 24, 2010
Neutrino 2010 was an interesting conference. There were no earthshaking new results, but there was steady progress on many fronts.
The most interesting new results came from the MINOS and MiniBooNE experiments. These are both detectors that observe neutrinos produced by an accelerator at Fermilab, near Chicago. Both experiments are studying neutrino oscillations, whereby a neutrino produced with one flavor (electron, muon, or tau neutrino) oscillates into another flavor as it travels from the accelerator to the detector.
MINOS has observed a possible difference between how neutrinos and antineutrinos oscillate. If correct, this would be very surprising, signalling a big difference between matter and antimatter. Although this result got significant publicity, apparently due to a Fermilab press release, the difference was not statistically significant, and almost everyone at the conference was happy to treat it as a likely statistical fluctuation, pending more data. The other anomaly, from MiniBooNE, is harder to characterize, but is also likely a statistical fluctuation.
Two other popular topics were searches for neutrinoless double beta decay, and progress toward enormous (100,000-500,000 ton) next-generation detectors.
In neutrinoless double beta decay, a nucleus changes its atomic number by two (e.g., germanium decays to selenium, or xenon to barium), emitting two electrons and no neutrinos. This is only possible if a neutrino can act as its own antiparticle, so this would be a major discovery. If this process occurs, it is very rare, with a half-life of well over 10^20 (10 to the 20th power) years. So, these experiments must monitor large quantities (typically 100 pounds to 1 ton) of material for long periods, with the sensitivity to observe even a handful of decays. This is not easy. We heard six talks on neutrinoless double beta decay, discussing a wide variety of possible methods.
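To get a feel for the numbers, here is a little back-of-the-envelope sketch in Python. The one-ton mass and the 10^25-year half-life are illustrative assumptions for the sake of the arithmetic, not any experiment's actual values:

```python
import math

# Back-of-the-envelope decay rate. The one-ton mass and the 1e25-year
# half-life are illustrative assumptions (the text only says the
# half-life is well over 10^20 years).
AVOGADRO = 6.022e23        # atoms per mole
mass_g = 1.0e6             # one metric ton of material, in grams
molar_mass = 76.0          # g/mol, for germanium-76
half_life_yr = 1.0e25      # assumed half-life

n_atoms = (mass_g / molar_mass) * AVOGADRO

# For a half-life much longer than the observation time, the expected
# number of decays in time t is N * ln(2) * t / T_half.
decays_per_year = n_atoms * math.log(2) / half_life_yr

print(f"{n_atoms:.2e} atoms -> about {decays_per_year:.0f} decays per year")
```

Even a few hundred expected decays per year among roughly 10^28 atoms means the background from ordinary natural radioactivity must be kept extraordinarily low.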
Over the past two years, there has been considerable progress toward a very large detector to make precision measurements of neutrino oscillations. The U.S. version would be located in DUSEL, the Deep Underground Science and Engineering Laboratory, which is proposed to be built in an old gold mine in South Dakota. The Japanese are also pursuing a similar project on an island between Japan and South Korea (the location is chosen to be the optimal distance from the Japan Hadron Facility accelerator), and the Europeans are considering several projects at diverse sites.
My talk, on radio-detection of neutrinos, went well, and seemed well received. It was a tough talk to prepare, since I had to introduce the concept, and also cover experiments looking for neutrino interactions in the moon, and two types of experiments looking for neutrino interactions in Antarctic ice (including, of course, ARIANNA). I also had a chance to talk to a number of people who are interested in ARIANNA.
Although Athens is a very interesting city, June is not the optimal time for a visit. They were having a heat wave during the conference, with temperatures in the high 90s or low 100s Fahrenheit (depending on which source you looked at), and it was also fairly humid. Worse, there was a 3-day metro (subway) strike during the conference. This was quite disruptive, since many of us were taking the metro between our hotels and the conference center. Of course, during the strike, the buses were jammed past capacity, and taxis were hard to get.
This strike was not an isolated incident; more strikes are planned to protest government cutbacks due to the budget deficit and the economic conditions imposed by the European/IMF bailout. The threat of strikes has trimmed the tourist trade (down about 15%, according to what I've read), and Athens seemed less crowded than usual. My hotel was not overly full, and a fair fraction of the guests were neutrino physicists. My flight to Greece was half empty, and there seemed to be a number of parked Olympic Air planes at the Athens airport.
Friday, June 11, 2010
Neutrino 2010
It is now June; school is getting out, and the summer conference season is starting. The big conference for neutrino physicists, Neutrino 2010 (it's held every 2 years), is next week in Athens, Greece. About 530 neutrino physicists will gather for a week to hear the latest results on everything neutrino. Talks will cover results from accelerators (Fermilab, CERN...) and non-accelerator experiments, along with the latest theory.
One hot topic is neutrino oscillations, whereby a neutrino produced with one flavor (like an electron neutrino) oscillates, over time turning into another flavor, like a muon neutrino. There are three different flavors, connected by three different mixing angles, which give a neutrino's propensity to turn into a different flavor. The three flavors have slightly different masses; the mass differences control how long the conversion takes. There is also a phase angle which, if non-zero, would allow charge-parity (CP) violation in neutrinos. This might help explain why the universe is all matter, with no visible antimatter. One way to study this is to shoot a beam of neutrinos from an accelerator to a distant detector, and measure the oscillation probability. Another way is to use naturally occurring neutrinos. Neutrinos produced by nuclear reactions in the sun have plenty of time to oscillate before arriving at the earth; this is how neutrino oscillations were initially discovered. Or, one can use neutrinos produced in cosmic-ray air showers, which may oscillate as they pass through the earth on their way to a detector like IceCube.
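In the simplest (two-flavor) approximation, the oscillation probability has a compact standard form, and it is easy to evaluate. Here is a short Python sketch; the mixing angle, mass-squared difference, energy, and baselines are illustrative numbers in the right general range, not any experiment's measured values:

```python
import math

def oscillation_probability(delta_m2_ev2, theta_rad, length_km, energy_gev):
    """Two-flavor oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
    return (math.sin(2.0 * theta_rad) ** 2
            * math.sin(1.27 * delta_m2_ev2 * length_km / energy_gev) ** 2)

# Illustrative parameters, roughly the atmospheric-oscillation scale
# (assumptions for this sketch, not measured values):
dm2 = 2.4e-3                # mass-squared difference, eV^2
theta = math.radians(45.0)  # near-maximal mixing
energy = 2.0                # neutrino energy, GeV

for baseline in (295.0, 735.0, 1300.0):  # km; typical accelerator baselines
    p = oscillation_probability(dm2, theta, baseline, energy)
    print(f"L = {baseline:6.0f} km: P(oscillation) = {p:.2f}")
```

The interplay between the baseline L and the energy E is why detector placement matters so much: each experiment tries to sit near an oscillation maximum for its beam energy.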
A number of non-accelerator experiments are looking for a process called neutrinoless double beta decay, whereby an atomic nucleus decays, producing two electrons and no neutrinos; the nuclear charge changes by two. For example, germanium-76 decays into selenium-76, plus two electrons. This process can only happen if a neutrino is something called a "Majorana particle," which means that it is its own antiparticle. In any case, the half-life for this process must be very long, well over 10^22 years, so one needs a very large chunk of germanium to study this.
Neutrino astrophysics is also represented at the conference, with a couple of sessions including talks on high-energy astrophysical neutrinos. I will be giving an overview talk on radio-detection of neutrinos, covering ~ half a dozen experiments, including ARIANNA. It was a challenge to squeeze this all into a 15 + 5 minute (15 to speak, 5 for questions) talk.
I'm not looking forward to the long plane flight to Athens; this will occupy a good chunk of the weekend.
I will try to post more frequently during the conference, both on conference life, and on new results.
Wednesday, June 2, 2010
A paper!
It's been a while since I've posted. The main reason is that I've been busy. One of the things that I've been doing is writing a scientific paper describing the prototype hardware, and the results of our season.
The paper is finally done, and submitted to "Nuclear Instruments and Methods," a journal that publishes papers on instrumentation for nuclear, particle, and astrophysics. There it will be peer reviewed by anonymous referees (most likely two), who will recommend whether or not it should be published. They will most likely recommend acceptance, but will also suggest some ways to improve the paper. At the same time, the preprint version was posted to the Cornell preprint server (arXiv), as number 1005.5193; click on 'PDF' on the upper right to get the paper text.
It may not be obvious, but writing any scientific paper is a lot of work, even a relatively 'simple' paper like this one. Besides the text, there are figures (graphs, plots, etc.) which can take a lot of work to make. There are also many numbers to be checked. When you actually sit down to write a paper, you are forcibly confronted with all of the pesky details that you've successfully avoided over the past few months.
This was true even though we made some specific decisions to speed things up. For example, we do not discuss our ongoing data analyses. These are interesting, but there is a lot to do before we're ready to publish them.
The major 'result' in this paper is a measurement of the ice thickness at our site: 572 +/- 6 m. In principle, this is a simple calculation - take the round-trip travel time for the radio waves we bounced off the ice-water interface, multiply by the speed of the waves, and divide by two (since the waves travel down and back). Unfortunately, nothing is that simple. In any material, radio waves travel slower than Einstein's famous unchanging speed of light - that invariance only holds in a vacuum. The reason is that the radio waves interact with the medium. One way to think about this is to imagine the radio waves scattering off the atoms in the ice, so they don't travel in a direct, straight line. The speed of the waves depends on the snow/ice density throughout the trip. The top 75 meters of snow/ice (the 'firn') changes gradually from snow (density 40% of that of ice) at the top, to pure ice about 75 m down. After trying to model this myself, I consulted with colleagues and poked around in the library (now mostly on the internet), and finally found a review written by two real professionals who bounce radio waves through Antarctic ice sheets for a living. Suddenly, it was simple. Their article even included density profiles for different places in Antarctica, along with estimates of the consequent uncertainty.
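To illustrate the kind of correction involved, here is a minimal Python sketch. It assumes a simple exponential firn density profile and a commonly used linear relation between density and radio index of refraction, n ≈ 1 + 0.86·ρ (ρ in g/cm³); the profile parameters and the echo time below are illustrative assumptions, not the values from our paper:

```python
import math

C = 299792458.0  # speed of light in vacuum, m/s

def density(z):
    """Assumed exponential firn profile, in g/cm^3: about 40% of the
    density of ice at the surface, compacting to nearly solid ice
    (0.917 g/cm^3) by roughly 75 m depth. Illustrative only."""
    rho_ice = 0.917
    return rho_ice - 0.6 * rho_ice * math.exp(-z / 20.0)

def index_of_refraction(z):
    # Commonly used linear density-to-index relation for radio waves.
    return 1.0 + 0.86 * density(z)

def round_trip_time(thickness_m, dz=0.25):
    """Two-way travel time for a vertical radio pulse: integrate the
    local wave speed c/n(z) down through the ice, then double it."""
    t, z = 0.0, 0.0
    while z < thickness_m:
        step = min(dz, thickness_m - z)
        t += step * index_of_refraction(z + 0.5 * step) / C
        z += step
    return 2.0 * t

def thickness_from_time(t_measured):
    """Invert round_trip_time() for the thickness, by bisection."""
    lo, hi = 0.0, 2000.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if round_trip_time(mid) < t_measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical echo time, for illustration only:
print(round(thickness_from_time(6.8e-6), 1), "m")
```

In this toy model, pretending the firn is solid ice all the way down shifts the answer by several meters - comparable to the 6 m uncertainty we quote, which is why the density profile matters.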
Another thing that took some time was getting comments from the other authors, and responding to them. Our paper has 7 authors from 3 institutions; everyone who contributed to the experiment and wanted to be an author. We also got some comments from non-authors, mostly people who were involved less directly.
Most of the comments were good, and strengthened the paper. Of course, a good suggestion (e.g. improve this figure by...) takes longer to implement than a bad idea that can be rejected. And, there is always discussion about expanding the scope of the paper.
With larger groups, dealing with comments can get sticky, especially if people disagree on what should or should not go into the paper (that wasn't the case for us). Large collaborations, like IceCube, have formal policies and procedures for internally reviewing papers, collecting and mediating comments from collaborators, etc.
In the end, I think that the paper came out pretty well. But I may be biased.