As a scientific experiment, IceCube is approaching maturity. We have collected 6 years of data with the complete detector. In one sense, this is a lot. For most analyses, the statistical precision increases as the square root of the amount of data collected, so making a measurement with half the error requires four times as much data. It would take IceCube another 18 years to halve the statistical errors possible with the current data. 18 years is a long time, so for many of our studies, future increases in precision will come relatively slowly.
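The arithmetic behind the square-root scaling is easy to check; here is a minimal sketch (the 6-year baseline is the figure from the text):

```python
import math

years_now = 6.0  # years of data with the complete detector

# The statistical error scales as 1/sqrt(t), so halving the error
# requires quadrupling the total exposure: 4 * 6 = 24 years in all.
years_needed = years_now * 2 ** 2
additional_years = years_needed - years_now

def relative_error(t, t0=years_now):
    """Statistical error after t years, relative to today's error."""
    return math.sqrt(t0 / t)

print(additional_years)        # 18.0 more years of running
print(relative_error(24.0))    # 0.5, i.e. half of today's error
```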
This is not true for everything. For many analyses, it is a matter of waiting for the right astrophysical event. Assuming that high-energy neutrinos are produced by episodic (non-constant) sources, one nearby gamma-ray burst, supernova, or whatever else is producing astrophysical neutrinos would be a huge discovery. This is worth waiting for.
We continue to improve our analysis techniques, and we will be able to keep making progress here for some time. And there are some analyses that are only now becoming possible, either because they require software that is only now becoming available, or because they require a lot of data. So we are still productively busy.
But, we are also thinking more intensively about follow-on experiments. There are several possibilities on the table.
PINGU would be a dense infill array, with a threshold of a few GeV, able to determine the neutrino mass ordering (which neutrino mass state is the lightest).
Gen2 (above) is a comprehensive IceCube upgrade, probably including PINGU, but focused on an array 10 times larger than IceCube. It would have a similar number of strings to IceCube, but would be built with more sensitive optical sensors. Because the strings would be more widely spaced than in IceCube, it would have a higher energy threshold, well matched to studies of astrophysical neutrinos. We (both the collaboration and the broader neutrino astronomy community) think, but cannot completely demonstrate, that Gen2 will be able to find the sources of our cosmic neutrinos.
Gen2 will likely also include a large (10 square kilometer) surface air-shower array. One main purpose of the array will be to reject downward-going atmospheric neutrinos, improving our sensitivity to astrophysical sources which are above the horizon; the center of our galaxy is of prime interest.
There are several efforts to build a large radio-detection array, either as part of Gen2 or as a stand-alone project. Here, the main possibilities are ARIANNA, which I have discussed multiple times before, and ARA, a radio-detection project at the South Pole.
In Europe, there is also a large effort to build an optical detector array, KM3NeT, in the Mediterranean Sea. KM3NeT will eventually include a large (~ 5 cubic kilometers?) astrophysical array and a smaller array, ORCA, which will have physics goals similar to PINGU's. KM3NeT is starting to deploy test strings now, and ORCA might be complete in the next ~ 3 years. Construction of the astrophysical array is also starting, although the full 5 km^3 array will not be complete until the mid-2020s.
On the U.S. side, these projects are perfectly aligned with National Science Foundation priorities. NSF director France Cordova recently unveiled a 9-point R&D agenda; one of its 6 science themes was "multimessenger astronomy." Despite this, these U.S. projects seem stalled, due to a lack of funding now and in the near-term future. From my perspective, this is very unfortunate; if an excellent science case and a good match to the funding agency director's priorities aren't enough to move a project forward, then what is? Although the full Gen2 would not be inexpensive (comparable in cost to IceCube), one could build either of the radio-detection arrays, or a moderate-sized surface array, for under $50 million - not much by current particle/nuclear physics standards.
Some of the ideas presented here also appear in a comment I wrote for Nature, "Invest in Neutrino Astronomy," which appeared in the May 25 issue.