There’s a recent paper proposing strategies for dealing with neutrinos in dark matter searches. A detector can’t be shielded from neutrinos, since their mean free path in matter is enormous, so neutrinos represent an irreducible background in dark matter searches. A neutrino can scatter in a way that produces exactly the same signal as a dark matter particle, so this background can only be removed statistically. This is a big problem for dark matter searches: once the neutrino background is reached, an experiment that simply counts events will report a cross section limit that improves with the square root of the exposure (mass times time) rather than linearly with the exposure. You would have to run four times as long to improve the result by a factor of two, or one hundred times as long to improve it by a factor of ten. Since we don’t even know the order of magnitude of the expected number of signal events, this is a major problem for very large detectors running with low detection thresholds.
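The square-root scaling can be illustrated with a toy counting argument (this is a generic statistical sketch, not a calculation from the paper; the `background_rate` parameter is purely illustrative):

```python
import math

def limit_scaling(exposure, background_rate=1.0):
    """Toy cross-section limit in arbitrary units.

    In a background-free search the limit improves as 1/exposure.
    Once an irreducible background dominates, the statistical
    fluctuation on the background count grows as sqrt(B), so the
    smallest signal distinguishable from that fluctuation scales as
    sqrt(B)/exposure, i.e. only as 1/sqrt(exposure).
    """
    expected_background = background_rate * exposure
    return math.sqrt(expected_background) / exposure

base = limit_scaling(1.0)
# Quadrupling the exposure only halves the limit:
print(limit_scaling(4.0) / base)    # 0.5
# A factor-of-ten improvement needs 100x the exposure:
print(limit_scaling(100.0) / base)  # 0.1
```

The same arithmetic is behind the "four times as long for a factor of two" statement above.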
This paper proposes mitigating this effect by looking at how the number of events changes throughout the year. Both the neutrino and dark matter rates are expected to show a sinusoidal annual modulation. If the neutrino signal is well understood, evidence for dark matter can be extracted from deviations from the expected neutrino-only result: dark matter changes the phase and amplitude of the modulating signal, and this information can be used to strengthen the statistical power of the result. The drawback of these methods is that they require several years of data in order to see the signal (it’s an annual modulation, so not much can be done about that). Still, experiments regularly run for years without major issues, so this is not an intractable problem.
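Why adding a dark matter component shifts both the amplitude and the phase can be seen by summing two sinusoids of the same (annual) frequency as phasors. The amplitudes and peak days below are placeholders, not values from the paper; solar-neutrino rates peak near perihelion in early January, while the standard dark-matter rate peaks in early June:

```python
import cmath
import math

# Toy model: each event rate modulates annually as
#   R(t) = R0 * (1 + a * cos(2*pi*(t - t_peak)/365.25))
# with t in days. Two same-frequency sinusoids sum to another
# sinusoid whose amplitude and peak day both differ from either input.

def combined_modulation(a1, t1, a2, t2):
    """Add two annual modulations via phasors; return the amplitude
    and peak day of the summed modulation."""
    omega = 2 * math.pi / 365.25
    phasor = a1 * cmath.exp(1j * omega * t1) + a2 * cmath.exp(1j * omega * t2)
    return abs(phasor), cmath.phase(phasor) / omega

# Neutrinos alone (illustrative amplitude 0.03, peaking on day 3) vs.
# neutrinos plus a small dark-matter term (amplitude 0.01, day 152):
amp, peak = combined_modulation(0.03, 3.0, 0.01, 152.0)
print(amp, peak)  # amplitude and peak day both shift away from 0.03 and day 3
```

A fit measuring the summed amplitude and phase against the neutrino-only prediction is the kind of statistical handle the paragraph describes.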