Tuesday, February 3, 2015

The 2014/2015 ARIANNA field season - part 2

This is part 2 of the post by Joulien Tatar about the 2014-2015 (really, 2014) field season, with more photos by Chris Persichilli.





Including a few stormy days that kept us in our tents, we installed four new stations in about a week.  There were three stations already installed from previous years, so technically we were done with the hexagonal station array construction.  However, given that we were ahead of schedule, we decided to unbury two of the three pre-existing stations in order to upgrade their hardware and conduct some performance studies.  One of the stations was left completely untouched for a third consecutive season to continue studying long-term system performance.  Then disaster struck - we ran out of good coffee!  For an avid coffee drinker like myself, this was a nightmare.  The panic subsided after a desperate scramble turned up a can of instant coffee stashed away in one of the food boxes...phew...

Most of our remaining time on site was devoted to station calibration and ice studies.  Both involved a pulser, which generates a very short pulse, connected to an LPDA that transmits it.  For the station calibrations, the stations' own antennas and electronics were used to digitize the transmitted pulse.  We managed to collect a large amount of station calibration data for a multitude of different transmitting antenna locations and tine orientations relative to the configuration of the stations' receiving antennas.  Since the received signal propagated through the full electronics chain, just as a neutrino signal would, these calibration studies will ultimately help quantify how well we can reconstruct the point in the sky the neutrino arrived from.
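To make the direction-reconstruction idea a bit more concrete, here is a minimal toy sketch (my own illustration, not ARIANNA's actual reconstruction code): for a plane wave crossing a station, the relative arrival times at the antennas are linear in the propagation direction, so a simple least-squares fit recovers it.  The antenna layout, index of refraction, and timing jitter below are all made-up numbers.

```python
import numpy as np

# Toy plane-wave direction fit (illustrative only, not ARIANNA's reconstruction).
# Arrival time at antenna i: t_i = (r_i . n)/v + t0, which is linear in the
# direction n, so a least-squares solve on [r_i/v, 1] recovers n and t0.
v_ice = 3e8 / 1.78                                    # assumed signal speed in firn/ice
ants = np.array([[0., 0., -2.0], [6., 0., -2.5],      # made-up, non-coplanar antenna
                 [0., 6., -3.0], [6., 6., -2.0],      # positions in meters
                 [3., 3., -4.0]])

rng = np.random.default_rng(0)
n_true = np.array([0.3, -0.5, -0.8])
n_true /= np.linalg.norm(n_true)
t_meas = ants @ n_true / v_ice + rng.normal(0, 0.05e-9, len(ants))   # ~50 ps jitter

A = np.hstack([ants / v_ice, np.ones((len(ants), 1))])
sol, *_ = np.linalg.lstsq(A, t_meas, rcond=None)
n_fit = sol[:3] / np.linalg.norm(sol[:3])
print("true direction:  ", n_true)
print("fitted direction:", n_fit)
```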
 


ARIANNA is in Antarctica because it needs vast amounts of high quality ice to act as a neutrino target and signal propagation medium.  When using the ice nature has pre-made, we are in a sense stuck with what we've got.  We can't change the ice properties, so we have to understand them fully in order to accurately predict the shape of the neutrino signal we should expect.  That is the main purpose of the ice studies.  The idea is simple.  We transmit a well understood and reproducible pulse (with a pulser and an LPDA) and study what we receive (with an oscilloscope and another LPDA).  The received signal has propagated down through the ice-sheet, reflected off the ice-water interface and then traveled back up to the surface.  Thus, if we deconvolve the effect of the electronics and antennas from the received signal, any differences with respect to the transmitted signal are due to properties of the ice.  To study the uniformity of the ice properties in the vicinity of the stations, we repeated the same procedure of transmitting and receiving the signal but changed the location.  The received signals at the different locations can then be compared and contrasted to look for differences in the ice.  Since we made the setup portable, we were able to move it to a number of different locations away from camp.  These studies will confirm previous ice measurements and establish the degree of uniformity of the ice properties at the ARIANNA site.
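As a rough illustration of the deconvolution step, here is a sketch with made-up numbers (not the actual analysis chain), in which the "ice" is modeled as nothing more than a delay plus attenuation; dividing the spectra of the received and transmitted pulses, with a little regularization, recovers both.

```python
import numpy as np

# Toy frequency-domain deconvolution: received = transmitted convolved with a
# response, so dividing the spectra recovers the response we attribute to the ice.
fs = 1e9                                        # assumed 1 GS/s sampling
t = np.arange(1024) / fs
transmitted = np.exp(-((t - 100e-9) / 5e-9) ** 2) * np.sin(2 * np.pi * 300e6 * t)

delay_samples, attenuation = 200, 0.3           # pretend ice: delayed, attenuated copy
received = attenuation * np.roll(transmitted, delay_samples)
received += np.random.normal(0, 1e-3, received.size)          # measurement noise

T, R = np.fft.rfft(transmitted), np.fft.rfft(received)
eps = 1e-3 * np.max(np.abs(T))                  # regularization against near-zero bins
response = np.fft.irfft(R * np.conj(T) / (np.abs(T) ** 2 + eps ** 2), n=t.size)

print("recovered delay: %.0f ns" % (np.argmax(np.abs(response)) / fs * 1e9))
print("recovered attenuation: %.2f" % np.abs(response).max())
```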



With our hard drives full of interesting data and the stations running great, it was time to break down camp and get back to McMurdo for the hot shower we had all been desperate for.  Two days before our scheduled helo pick-up, a wolfer flew in to help us take the camp down.  With her help we made a berm of wooden crates full of equipment that is wintering over.  Everything was cleaned, packed and ready for the day of our departure, except our personal sleep tents.  On the day of the flight we took down the personal tents in time for a 3pm departure.  Unfortunately, the helo that was supposed to pick us up had mechanical issues and was grounded for the day.  That caused the helo schedule to fall behind, and we ended up being picked up around 9pm.  Luckily, that day was almost windless and sunny, so spending the time outside wasn't a problem.  It was a good opportunity to reflect on our season, enjoy the scenic Transantarctic Mountains, and take lots of pictures.


Many showers later, all of our stations are running great and sending high-quality data that we are in the process of analyzing.  The race to get ready for the next season has started, and preparations for the 2015-2016 season are well under way.  While preparing for another deployment, we are also hard at work analyzing the data we collected and summarizing the physics results in a number of papers that the ARIANNA collaboration will publish in the not-so-distant future, so stay tuned!
 



The 2014/2015 ARIANNA field season

The ARIANNA 7-station hexagonal array is now complete, thanks to an outstanding field season!  I didn't get to go on this, but here is a guest post from Joulien Tatar, who did.  The photos are by Chris Persichilli:





This was the last season for ARIANNA R&D work in Antarctica and it was a great one.  We successfully accomplished all of the tasks we had hoped to do this year and more.  ARIANNA is now ready to transition from the R&D phase by scaling up to the full array of ~1500 stations in an effort to detect cosmogenic neutrinos.

The deployment team this season (2014-2015) consisted of 5 physicists from UC Irvine:  Steve Barwick (PI), Corey Reed (Project Scientist: i.e. glorified postdoc :), James Walker (grad student), Chris Persichilli (grad student) and myself, Joulien Tatar (postdoc).  Needless to say, all of us were really excited to be going to Antarctica.  Chris and James were even more so, since this was their first trip down.


Even though preparation for each season begins as soon as we get back from the previous deployment, actually getting to Antarctica is much simpler than one would think.  NSF has excellent subcontractors who take care of the whole process (plane tickets, luggage, hotels, clothing, etc.) of getting us safely to McMurdo, the main USAP base, and then back home.

Before flying with a USAP cargo plane (actually a US Air Force C-17) to McMurdo from Christchurch, New Zealand, we spent a couple of days going through safety briefings and getting appropriate clothing for the weather in Antarctica.  In between our scheduled tasks, we had time to explore Christchurch.  Steve was heartbroken.  He had last been in Christchurch before the earthquake, and most of the historic structures, charming coffee shops, lively bars and restaurants he used to frequent are gone.  However, I have seen Christchurch come a long way from the rubble it was in when I first visited in Nov. 2011, eight months after the major tremor.  Back then, downtown, where most of the structural damage took place, was completely fenced off and guarded by military personnel.  There was destruction everywhere you looked.  Now most of downtown has been rebuilt.  New businesses are opening throughout the city.  People have come back and things seem to be back to normal.  There is rarely any chatter about the earthquake any more.

After a couple of days in Christchurch, we left for McMurdo.  Sometimes the head winds are too high or the weather in McMurdo is too poor for landing, so the plane has to turn back.  This time we were fortunate to make it to McMurdo on our first try.  We spent about a week in McMurdo.  We had more briefings and training to go through while waiting for all of our cargo, shipped by boat, to make its way to McMurdo so we could leave for our field camp.  The station this year was at maximum capacity.  Many projects that were supposed to take place during the 2013-2014 season were delayed a year because of the government shutdown.  As a result, this year the USAP had to catch up and provide support for more projects than is typical.



It took 6 helicopter flights to fly in all of the components for the new stations and the rest of our electronics equipment.  Two wilderness first responders (wolfers) went to the ARIANNA site (in Moore's Bay on the Ross Ice Shelf) a day ahead of us to set up camp.  This was great because it allowed us to make every day count by starting work as soon as we arrived at camp.



Not long after we flew to the field camp, the wolfers went back to McMurdo.  The five of us physicists were alone on a remote Antarctic ice-sheet, fully prepared to fend for ourselves and each other.  This was the first season we camped without a wolfer, so the important help we were accustomed to getting from them was now all on us.  We had to make sure we always had melted snow for water, clean the two common tents (the kitchen tent and the science tent), check in with McMurdo daily, cook, etc. etc...



Cooking was one of the most time consuming and difficult chores we had.  We would each rotate through cooking and cleaning for a day, so one person would cook for everyone once every five days, which was not too bad.  Cooking in a small tent with a very limited amount of spices and ingredients (all provided to us by McMurdo) requires a certain amount of ingenuity, which it turns out we all possess.  The food we made was delectable and we managed not to burn down the tent.  We wrote down our recipes so they can be used by deployment teams for years to come. :)



Our primary science objective was to have seven stations up and running and collecting high quality data.  That effort began by assembling the station components that we did not ship pre-assembled.  The most time-consuming part of the assembly process was the power tower.  It consisted of putting together two ~10' triangular metal segments, mounting a 100W solar panel, and attaching two communication antennas.  It is as simple as it sounds and took less than an hour to do.  Everything else (battery, antennas, electronics box with DAQ) came pre-assembled.  Once we had all of the components laid out and thoroughly tested, we were ready to take them to their final installation locations.  Each station was placed at a corner of a hexagon, ~1 km from the center of the hexagon, where the seventh station (and our base camp) was located.  Transporting a station was almost effortless, since we were given a sled we could load everything into and pull with a snowmobile.  At a station's site we would first install the power tower and then dig four vertically oriented triangular slots to place the antennas in.  Digging these ~6' deep and 2' wide holes with a shovel was the most labor intensive and time consuming (about 1 hour) part of the station installation.  Once the holes were dug, we placed the log-periodic dipole antennas (LPDAs) in them, connected the various cables (power, communication, and LPDA) to the electronics box we placed at the base of the power tower, and had data streaming all the way back to our UC Irvine data server!  It took us less than 4 hours to install a station.  If we started installation a bit after breakfast, the station would be up and running before it was time for lunch.

To be continued in part 2...

Sunday, January 11, 2015

ANITA flies again



The ANITA balloon-borne neutrino detection experiment has just finished its third flight.  It flew for 22 days, 9 hours, during which time it circled Antarctica about 1 1/3 times.

ANITA floats high in the atmosphere (usually more than 100,000 feet), while its 32 horn antennas look for radio waves from neutrino interactions in the Antarctic ice.  Because of its height, it can scan an enormous volume of ice, out to the horizon, up to 600 km away.  However, because of the distance to the interactions, it has a pretty high energy threshold, above 10^19 eV (roughly 100 times higher than ARIANNA).  In its previous two flights, it did not see any neutrino interactions, but it did set some of the best current limits on ultra-high energy cosmic neutrinos.  It also observed pulses which are attributed to cosmic-ray air showers.
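For a back-of-the-envelope feel for that horizon number (my own arithmetic, not the collaboration's), the geometric distance to the horizon from a float altitude h is roughly sqrt(2 R_earth h):

```python
import math

# Rough geometric horizon from ~100,000 ft, ignoring refraction and ice elevation.
R_earth = 6.371e6            # meters
h = 100_000 * 0.3048         # float altitude in meters
print(f"horizon ~ {math.sqrt(2 * R_earth * h) / 1e3:.0f} km")   # ~620 km
```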

For the third (and what was planned to be the last) flight, the collaboration made a number of improvements to increase the experiment's sensitivity, including the addition of a large, lower-frequency antenna, which was stowed for take-off and then released in flight to hang below the balloon.

This flight was shorter than the previous flights; the second (ANITA-II) flight lasted 31 days, while the first flight was 35 days.  NASA has a nice web-page showing the ANITA flight track.   So, although the detector may have been more sensitive, this flight is unlikely to dramatically improve the overall ANITA sensitivity.

Katie Mulrey has written a couple of nice blog posts about the ANITA pre-flight preparations.  They are here and here.

I'm eagerly looking forward to hearing more about how the flight went, and how the data looks. 


Tuesday, January 6, 2015

IceCube Year(+) in Review


This has been another good year for IceCube. 

Big Bird made the American Physical Society's list of the top ten physics news stories of 2014.  The details were published in our article extending the high-energy contained event search to 3 years of data (as discussed in a previous post).

We also published a number of interesting articles on other subjects: cosmic neutrinos at lower energies, dark matter, neutrino oscillations, the search for magnetic monopoles, and other subjects.

Francis Halzen and I have written an article for the CERN Courier summarizing what IceCube has learned so far, and discussing future plans with HEX and PINGU.  I'm happy to report that the article became the cover story of the December 2014 issue.  It is available here; I may be biased, but I think it's a nice article, well worth reading.


Monday, December 22, 2014

Computing in Science, including a walk down memory lane


One of the bigger changes in science over the past 20+ years has been the increasing importance of computing.  That's probably not a surprise to anyone who has been paying attention, but some of the changes go beyond what one might expect.

Computers have been used to collect and analyze data for far more than 20 years.  When I was in graduate school, our experiment (the Mark II detector at PEP) was controlled by a computer, as were all large experiments.  The data was written on 6250 bpi [bits per inch] 9-track magnetic tapes.  These tapes were about the size and weight of a small laptop, and were written or read by a drive the size of a small refrigerator.  Each tape could hold a whopping 140 megabytes.  This data was transferred to the central computing facility via a fast network - a motor scooter with a hand basket.  It was a short drive, and, if you packed the basket tightly enough, the data rate was quite high.  The data was then analyzed on a central cluster of a few large IBM mainframes.
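Just for fun, here is a back-of-the-envelope estimate of what that "network" bandwidth works out to; the basket capacity and trip time are my guesses, not historical record.

```python
# Sneakernet bandwidth of the tape-scooter link (basket size and trip time assumed).
tapes_per_basket = 20
mb_per_tape = 140            # 6250 bpi 9-track tape capacity, from the post
trip_minutes = 10

bits_moved = tapes_per_basket * mb_per_tape * 8e6
print(f"effective bandwidth ~ {bits_moved / (trip_minutes * 60) / 1e6:.0f} Mbit/s")
```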

At the time, theorists were also starting to use computers, and the first programs for symbolic manipulation (computer algebra) were becoming available.   Their use has exploded over the past 20 years, and now most theoretical calculations require computers.  For complex theories like quantum chromodynamics (QCD), the calculations are way too complicated for pencil and paper.  In fact, the only way that we know to calculate QCD parameters manifestly requires large-scale computing.  It is called lattice gauge theory.  It divides space-time into cells, and uses sophisticated relaxation methods to calculate configurations of quarks and gluons on that lattice.  Lattice gauge theory has been around since the 1970's, but only in the last decade or so have computers become powerful enough to complete calculations using physically realistic parameters.
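To give a flavor of what "cells plus relaxation" means in practice, here is a tiny toy in the same spirit - a compact U(1) gauge theory on a 2D lattice updated with the Metropolis algorithm.  It is nothing like a production QCD code (those use SU(3) links in four dimensions and far cleverer algorithms), but the basic structure of link variables, plaquettes, and stochastic updates is the same.

```python
import numpy as np

# Toy 2D compact U(1) lattice gauge theory with Metropolis updates.
# Link angles theta live on lattice edges; the action is a sum over plaquettes,
# S = -beta * sum_P cos(theta_P).
L, beta, sweeps = 8, 2.0, 200
rng = np.random.default_rng(0)
theta = np.zeros((2, L, L))          # theta[mu, x, y]; mu=0: x-link, mu=1: y-link

def plaq(x, y):
    """Plaquette angle with lower-left corner at (x, y), periodic boundaries."""
    return (theta[0, x, y] + theta[1, (x + 1) % L, y]
            - theta[0, x, (y + 1) % L] - theta[1, x, y])

def link_action(mu, x, y):
    """Action contribution from the two plaquettes containing link (mu, x, y)."""
    if mu == 0:
        return -beta * (np.cos(plaq(x, y)) + np.cos(plaq(x, (y - 1) % L)))
    return -beta * (np.cos(plaq(x, y)) + np.cos(plaq((x - 1) % L, y)))

for _ in range(sweeps):
    for mu in range(2):
        for x in range(L):
            for y in range(L):
                old, s_old = theta[mu, x, y], link_action(mu, x, y)
                theta[mu, x, y] = old + rng.uniform(-0.5, 0.5)
                # Metropolis step: accept with probability min(1, exp(-dS))
                if rng.random() >= np.exp(-(link_action(mu, x, y) - s_old)):
                    theta[mu, x, y] = old

avg = np.mean([np.cos(plaq(x, y)) for x in range(L) for y in range(L)])
print(f"average plaquette after {sweeps} sweeps: {avg:.3f}")
```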

Lattice gauge theory is really the tip of the iceberg, though.  Computer simulations have become so powerful and so ubiquitous that they have become a third approach to physics, alongside theory and experiment.  For example, one major area of nuclear/particle physics research is developing codes that can simulate high-energy collisions.  These collisions are so complicated, with so many competing phenomena present, that they cannot be understood from first principles.  In heavy-ion studies at the Relativistic Heavy Ion Collider (RHIC) and CERN's Large Hadron Collider (LHC), for example, when two ions collide, the number of nucleon (proton or neutron) on nucleon collisions depends on the geometry of the collision.  The nucleon-nucleon collisions produce a number of quarks and gluons, which then interact with each other.  Eventually, the quarks and gluons form hadrons (mesons, composed of a quark and an antiquark, and baryons, made of three quarks).  These hadrons then interact for a while, and, eventually, fly apart.  During both interaction phases, both short and long-ranged forces are in play, and it appears that a certain component of the interactions may be better described using hydrodynamics (treating the interacting matter as a liquid) than in the more traditional particle-interaction picture.  A large number of computer codes have been developed to simulate these interactions, each emphasizing different aspects of the physics.  A cottage industry has developed to compare the codes, use them to make predictions for new types of observables, and expand their range of applicability.  Similar comments apply, broadly, to many other areas of physics, particularly solid state physics: semiconductors, superconductivity, and nano-physics.
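As one concrete example of the geometry point, here is a bare-bones Monte Carlo Glauber sketch (a toy with hard-sphere nuclei and round numbers, not one of the production codes): throw nucleons into two nuclei, offset them by an impact parameter, and count how many nucleon pairs pass close enough to collide.

```python
import numpy as np

# Toy Monte Carlo Glauber calculation (illustrative only): count binary
# nucleon-nucleon collisions versus impact parameter b for a gold-like nucleus.
rng = np.random.default_rng(1)
A, R_nuc = 197, 7.0          # nucleons and radius in fm (hard sphere stand-in for Woods-Saxon)
sigma_nn = 4.2               # fm^2, rough inelastic nucleon-nucleon cross section (~42 mb)
d2_max = sigma_nn / np.pi    # two nucleons "collide" if transverse distance^2 < sigma/pi

def nucleus_xy():
    """Transverse (x, y) positions of A nucleons drawn uniformly inside a sphere."""
    r = R_nuc * rng.random(A) ** (1 / 3)
    costh = rng.uniform(-1, 1, A)
    phi = rng.uniform(0, 2 * np.pi, A)
    sinth = np.sqrt(1 - costh ** 2)
    return np.column_stack([r * sinth * np.cos(phi), r * sinth * np.sin(phi)])

for b in (0.0, 4.0, 8.0, 12.0):
    xy1 = nucleus_xy()
    xy2 = nucleus_xy() + np.array([b, 0.0])
    d2 = ((xy1[:, None, :] - xy2[None, :, :]) ** 2).sum(axis=2)
    print(f"b = {b:4.1f} fm  ->  N_coll ~ {(d2 < d2_max).sum()}")
```

In this toy, head-on (b ~ 0) collisions give on the order of a thousand binary collisions while grazing ones give only a handful, which is why the collision geometry matters so much.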

Of course, computing is also important for Antarctic neutrino experiments.  IceCube uses computers for data collection (each Digital Optical Module (DOM) contains an embedded CPU), event selection and reconstruction, and, particularly, detector simulation.  The hardest part of the simulation involves propagating photons through the ice, from the charged particle that creates them to the DOMs.  Here, we need good models of how light scatters and is absorbed in the ice; developing and improving these models has been an ongoing endeavor over the past decade.  Lisa Gerhardt, Juan Carlos Diaz Velez and I have written an article discussing IceCube's ongoing challenges, "Adventures in Antarctic Computing, or How I Learned to Stop Worrying and Love the Neutrino," which appeared in the September issue of IEEE Computer.  I may be biased, but I think it's a really interesting article, which discusses some of the diverse computing challenges that IceCube faces.  In addition to the algorithmic challenges, it also discusses the challenges inherent in the scale of IceCube computing, where we use computer clusters all over the world - hence the diagram at the top of the post.
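As a stripped-down sketch of what the photon propagation involves, here is a toy random walk with made-up scattering and absorption lengths and isotropic scattering; IceCube's real propagators use measured, depth-dependent ice properties and strongly forward-peaked scattering, but the bookkeeping is similar in spirit.

```python
import numpy as np

# Toy photon random walk (illustrative only): exponential scattering lengths,
# isotropic re-direction at each scatter, and absorption after an exponentially
# distributed total path length.
rng = np.random.default_rng(2)
scat_len, abs_len = 25.0, 100.0        # meters; made-up effective lengths
n_photons = 2000
final_radii = []

for _ in range(n_photons):
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])
    path, max_path = 0.0, rng.exponential(abs_len)   # absorbed after max_path meters
    while path < max_path:
        step = min(rng.exponential(scat_len), max_path - path)
        pos += step * direction
        path += step
        costh, phi = rng.uniform(-1, 1), rng.uniform(0, 2 * np.pi)
        sinth = np.sqrt(1 - costh ** 2)
        direction = np.array([sinth * np.cos(phi), sinth * np.sin(phi), costh])
    final_radii.append(np.linalg.norm(pos))

print("mean distance from emitter at absorption: %.1f m" % np.mean(final_radii))
```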







Monday, November 17, 2014

PINGU - which neutrino is heaviest?


To continue with a theme from an earlier post, neutrino oscillations: we have learned much about neutrino physics over the past decade.  Neutrinos have mass, and can oscillate from one flavor into another.  We know the mass differences, and we know the relative oscillation probabilities (mixing angles).  But there are still a few very important open questions:

1) Are neutrinos their own antiparticles?  This would mean that they are a type of particle known as "Majorana particles."

2) Do neutrinos violate CP (charge-parity) conservation?

3) What is the mass of the lightest neutrino?  We know the mass differences, but much less about the absolute scale - it is less than of order 1 electron volt, but it could be 0.1 electron volts or 1 nano-electron volt.

Experiments to answer these questions are large and difficult, with some proposed efforts costing more than a billion dollars, taking more than a decade to build, and requiring another decade to collect data.

However, IceCube has pointed the way to answering the third question relatively cheaply.  We can do this by building an infill array, called PINGU (Precision IceCube Next Generation Upgrade), which will include a much denser array of photosensors, and so be sensitive to neutrinos with energies of a few GeV - an order of magnitude lower than in IceCube's existing DeepCore infill array.

PINGU will precisely measure the oscillations of atmospheric neutrinos as they travel through the Earth.  Oscillations happen in a vacuum as well as in dense material.  However, low-energy electron-flavored neutrinos (with energies of a few GeV) will also coherently forward-scatter, via the weak interaction, from the electrons in the Earth as they travel through it.  This shifts how they oscillate.  Importantly, the shift depends on whether the lightest neutrino is mostly the electron neutrino, or mostly other flavors.  I say 'mostly' here because, although neutrinos are produced as electron, muon or tau neutrinos, they travel through the Earth as mass states, each of which is a mixture of these flavors.  One of the mass states is mostly electron neutrino, and we'd like to know whether or not it is the lightest.  The graphic (top) shows the probability of oscillation vs. neutrino energy and zenith angle (the angle at which it emerges from the Earth) for the two possible hierarchies.  "Normal" means that the mostly-electron neutrino is the lightest, while "Inverted" means that it's not.  The oscillogram is taken from the PINGU Letter of Intent; similar plots have been made by a number of people.
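A toy two-flavor version of that statement (a sketch with a constant-density Earth and round numbers, not the full three-flavor PINGU calculation): in matter, the effective mixing angle and mass splitting depend on the sign of the mass-squared difference, so flipping the ordering changes the oscillation probability at a fixed energy and baseline.

```python
import numpy as np

# Two-flavor oscillation probability in matter of constant density (toy sketch;
# PINGU's sensitivity comes from a full three-flavor calculation through the
# Earth's actual density profile).
def osc_prob(E_GeV, L_km, dm2_eV2, sin2_2theta, rho_gcc=5.0, Y_e=0.5):
    A = 7.63e-5 * rho_gcc * Y_e * E_GeV                 # matter potential, in eV^2
    s2, c2 = np.sqrt(sin2_2theta), np.sqrt(1 - sin2_2theta)
    dm2_m = np.sqrt((dm2_eV2 * c2 - A) ** 2 + (dm2_eV2 * s2) ** 2)   # effective splitting
    sin2_2theta_m = (dm2_eV2 * s2) ** 2 / dm2_m ** 2                 # effective mixing
    return sin2_2theta_m * np.sin(1.267 * dm2_m * L_km / E_GeV) ** 2

E, L = 5.0, 6000.0                  # a few-GeV neutrino crossing much of the Earth
for dm2 in (+2.5e-3, -2.5e-3):      # opposite signs stand in for the two orderings
    print(f"dm2 = {dm2:+.1e} eV^2  ->  P ~ {osc_prob(E, L, dm2, 0.09):.3f}")
```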

We think that we can make this measurement with another 40 strings of digital optical modules, spread over an area smaller than DeepCore.  Standalone, this will cost roughly $100M (less if it is built in tandem with a high-energy extension for IceCube), and take about 3 years to install.  Data taking will take another 3 years or so.

It is worth pointing out that this measurement will also boost efforts to study question 1 via neutrinoless double beta decay, a reaction in which an atomic nucleus decays by converting two neutrons into protons, while emitting two electrons and no neutrinos.  The comparable process, two-neutrino double beta decay, involves the emission of two electrons and two neutrinos.  It has been observed, but the neutrinoless version has not.  The lack of neutrino emission can be inferred from the absence of missing energy.  The expected rate of neutrinoless double beta decays depends on which neutrino is heaviest, among other things, so knowing the answer to question 3 will help provide a definitive answer to question 1.

Thursday, October 23, 2014

Looking ahead - is bigger better?

Now that IceCube has a few years of data under our belt, it is natural to size up where we are, and where we want to go.  

Where we are is pretty clear.  We have observed a clear astrophysical neutrino signal, with a spectral index most likely in the 2.3-2.5 range, so the neutrino flux goes as dN_nu/dE_nu ~ E_nu^-2.3.  However, we have seen no sign of a point source signal (either continuous or transient), so we do not yet know of any specific neutrino sources.  Dedicated searches for neutrinos in coincidence with gamma-ray bursts (GRBs) have largely (but not completely) eliminated GRBs as the source of the observed neutrinos.  Although we continue to collect statistics, we have enough data now that the only way that we could observe a statistically significant point source signal in the near future would be if a powerful transient source appears.   It is likely that IceCube is too small to observe point sources.

The observation of point sources is a natural focus for a next-generation instrument.  ARIANNA (or another radio-neutrino detector) is clearly of high interest, but probably has an energy threshold too high to study the class of cosmic neutrinos seen by IceCube.  So, there is growing interest in an optical Cherenkov detector like IceCube, but 10 times bigger.  It wouldn't be possible to deploy 800 strings in a reasonable time (or at a reasonable cost), so most of the preliminary designs involve of order 100 strings of optical sensors with spacings larger than the 125 m prevalent in IceCube - 254 to 360 m are popular separations.  It would also have better optical sensors than were available when we built IceCube.

This detector would necessarily have a higher energy threshold than IceCube, but would be well matched to the astrophysical neutrino signal spectrum.  Clearly, it would do a much better job of measuring the neutrino energy spectrum than IceCube does.

But, the key question is whether it would be sensitive enough to have a good chance of seeing point sources.  There are many calculations which attempt to answer this question.  Unfortunately, the results depend on assumptions about the distribution of sources in the universe, and on their strengths and energy spectra, so, at least so far, we cannot answer it convincingly.  So, stay tuned.

Of course, an alternative direction is toward an infill array with even higher density than PINGU.  This is at a more advanced stage of design, and will be covered in my next post.