Tag Archives: Experiments

New ATLAS Search for New Physics

ATLAS released a new preprint yesterday on a search for new physics using monophoton events with missing transverse momentum. The paper mentions that this type of search is sensitive to various models of new physics, including some variations of large extra dimensions, dark matter, and supersymmetry.

In particular, these events could indicate the collision of two quarks leading to invisible particles, with a photon coming from initial state radiation. Final state radiation generally won’t be allowed in many of these models, since an invisible particle won’t interact with photons at tree level. The colliding particles have essentially no net transverse momentum, so a high energy photon without anything else to balance the transverse momentum is a strong indication that some invisible particle was also present in the final state. This channel is not free of backgrounds. As the paper notes, one of the most important backgrounds is Z production where the Z decays to neutrinos. If a photon is emitted in conjunction with this, then the event will look identical to a signal event from new physics. This represents an irreducible background, since even a perfect reconstruction process can’t eliminate it. Instead, new physics must be found on top of this background (as well as others).

As with more or less every other paper from the LHC so far, no significant deviation from the Standard Model is found, so ATLAS is able to set exclusion limits for a number of different models.


New Results from T2K and MINOS

There are a couple of neutrino papers out today.

T2K has a new result looking for short-baseline electron neutrino disappearance in the near detector. This kind of search is looking for neutrinos oscillating into a non-interacting, or sterile, neutrino. There are various “anomalies” in certain oscillation results that can potentially be resolved by introducing sterile neutrinos, so this is a way to search for them or exclude their existence.

MINOS has a new result on charged-current quasielastic (CCQE) scattering of muon neutrinos on iron using their near detector. They find some differences between their data and current models at low Q².

DarkSide Releases First Dark Matter Results

The liquid-argon time projection chamber experiment DarkSide-50 has released the first results from their 50 kg dark matter detector. Using atmospheric argon rather than a highly purified source of argon, they are able to obtain the best current dark matter limit using argon as the target material. The result isn’t as strong as the current leading limits, which use xenon, but this was expected. Encouragingly, DarkSide ended up with a background-free measurement, which bodes well for the continued use of argon-based dark matter detectors. DarkSide is currently operating in the Gran Sasso laboratory (LNGS) in Italy, a deep underground facility where many low-background particle physics experiments are performed.

New Minerva Pion Production Result

Minerva has a new preprint out today on coherent charged pion production in carbon. They measure both types of charged pions in neutrino and antineutrino beams. The measurement looks at energies up to 20 GeV and finds that agreement with the GENIE Monte Carlo generator is not very good. Minerva is a neutrino experiment using the NuMI beamline at Fermilab.

SuperK Proton Decay Result

The SuperK collaboration has released a new proton decay search, looking for protons decaying to a neutrino and a kaon. SuperK (Super-Kamiokande) is a large water Cherenkov detector in the Kamioka mine near Toyama, Japan that has already been running for many years. In addition to results like this, SuperK is also used as the far detector for T2K.

Proton decay is not allowed in the Standard Model due to conservation of baryon number. Many extensions of the Standard Model, such as Grand Unified Theories (GUTs) that combine the electroweak and strong interactions at very high energy scales, do allow proton decay. GUTs typically propose that the unified force has a larger symmetry group that breaks down through some process into the SU(3)×SU(2)×U(1) symmetry of the Standard Model. This is similar to how the SU(2) (weak isospin) × U(1) (hypercharge) symmetry of the Standard Model breaks into two clearly distinct forces at energies well below the electroweak scale of around 100 GeV. The W and Z bosons acquire masses (80.4 and 91.2 GeV, respectively) via the Higgs mechanism, while the Higgs vacuum expectation value (vev) and the mixing angle between the weak isospin and hypercharge fields give us a massless photon, a massive neutral Higgs, and massive W and Z bosons with different masses and couplings to other particles. The fact that the W and Z have mass while the photon does not ensures that, below the electroweak scale, the weak force is much weaker than electromagnetism.

One of the simplest GUTs is an SU(5) symmetry, although this has been ruled out for some time. This theory has multiplets that include both leptons and quarks, and interactions conserve the difference between lepton number and baryon number rather than both separately. This allows a proton to decay to final states including leptons. Other GUTs similarly include various channels for proton decay.

This result looks for a kaon in the final state and finds a lower limit on the proton lifetime through this channel of 5.9×10³³ years: many orders of magnitude longer than the age of the universe.
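To put that limit in perspective, a quick back-of-the-envelope comparison with the approximate age of the universe (about 1.38×10¹⁰ years, a standard value not from the paper itself):

```python
import math

proton_lifetime_limit = 5.9e33  # years, lower limit from this SuperK channel
age_of_universe = 1.38e10       # years, approximate

# How many orders of magnitude separate the two? Roughly 23.6.
orders = math.log10(proton_lifetime_limit / age_of_universe)
```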

Physics for Non-Physicists: Decay Widths

While cross sections tell us how strong the interactions are between different types of particles, there is another quantity – the decay width – which tells us how quickly the decay of an unstable particle is likely to happen.

As in particle collisions, decays proceed in a probabilistic manner. If we start with an unstable particle, the probability that it has not yet decayed after a time interval t is

P(t) = \left(\frac{1}{2}\right)^{t/\tau_{1/2}} = \exp\left(-\frac{t}{\tau}\right) = \exp\left(-\Gamma t\right)

(exponential decay is memoryless: if we check and find the particle has survived, the clock effectively resets) where τ₁/₂ is the half-life, τ is the mean lifetime, and Γ is the decay width. You can easily work out the relationships between these three values: τ₁/₂ = τ ln 2, and Γ = 1/τ in natural units. They are just three different ways to parameterize the same quantity.
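As a concrete sketch of those conversions, here is a few lines of Python using the Z boson’s measured total width of about 2.5 GeV and the natural-unit conversion ħ ≈ 6.58×10⁻²⁵ GeV·s (the numbers are approximate PDG values, used only for illustration):

```python
import math

GAMMA_Z = 2.4952        # example: Z boson total width in GeV (approximate)
HBAR_GEV_S = 6.582e-25  # hbar in GeV*s, converting a width to a time

def lifetime(gamma_gev):
    """Mean lifetime tau (seconds) from a width Gamma (GeV): tau = hbar / Gamma."""
    return HBAR_GEV_S / gamma_gev

def half_life(gamma_gev):
    """Half-life tau_1/2 = tau * ln 2."""
    return lifetime(gamma_gev) * math.log(2)

def survival_probability(t, gamma_gev):
    """Probability the particle has not yet decayed after a time t (seconds)."""
    return math.exp(-t / lifetime(gamma_gev))

tau = lifetime(GAMMA_Z)  # roughly 2.6e-25 s for the Z
```

At t = τ the survival probability is 1/e, and at t = τ₁/₂ it is exactly 1/2, which is the content of the equation above.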

Half-lives are the most commonly used value for a general audience because reducing a quantity by factors of 2 is easier to visualize than reducing by factors of e. In high energy physics, there are several reasons to use the decay width rather than a lifetime or half-life.

In calculations, the decay width is directly analogous to the cross section. It includes factors for the kinematics of the initial and final state and a squared matrix element (or amplitude). If a decay cannot happen, the decay width will end up being 0, which is more mathematically useful than the corresponding infinite lifetime.

When calculating decay widths, there can be many different channels. Each decay channel (final state) has its own width (a partial width), while the sum of the partial widths gives the total width. So, if there are a series of decay channels labeled i, the total width is

\Gamma = \frac{1}{\tau} = \sum\limits_{i} \Gamma_i.

The different types of decays all occur with the same lifetime. The ratio of a partial width to the total width gives us the fraction of all decays that go into the channel(s) described by the partial width. This ratio is known as the branching fraction.
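As a minimal sketch of this bookkeeping (the channel labels and partial widths here are made up purely for illustration):

```python
# Hypothetical partial widths (GeV) for three decay channels of some particle.
partial_widths = {"ee": 0.084, "mumu": 0.084, "hadrons": 1.744}

# The total width is the sum of the partial widths.
total_width = sum(partial_widths.values())

# Each branching fraction is a partial width divided by the total width.
branching_fractions = {ch: g / total_width for ch, g in partial_widths.items()}

# Whatever the partial widths are, the branching fractions sum to 1.
assert abs(sum(branching_fractions.values()) - 1.0) < 1e-12
```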

The decay width has effects beyond just telling us the lifetime. It turns out that the decay width also forces us to modify the mass of the decaying particle in calculations by adding an imaginary part. One interesting consequence is that unstable particles do not have a well-defined mass. For example, if you create a Z boson in a collision and it then decays to particles that you measure, even if you perfectly measure the momenta of the decay products you won’t necessarily be able to reconstruct the Z mass of 91.2 GeV. Rather, you’ll see a distribution of masses centered near 91.2 GeV with a characteristic width equal to Γ for the Z boson. Typically, the production and subsequent decay of an unstable particle results in us seeing a peak in the invariant mass (center of mass energy) distribution of the final state particles that we measure.
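That mass distribution is described, to a good approximation (ignoring relativistic corrections and detector effects), by a Breit-Wigner line shape. A minimal sketch for the Z, using approximate PDG values for the mass and width:

```python
def breit_wigner(m, m0, gamma):
    """Non-relativistic Breit-Wigner line shape (unnormalized).

    Peaks at m = m0; gamma is the full width at half maximum (FWHM).
    """
    return 1.0 / ((m - m0) ** 2 + (gamma / 2.0) ** 2)

M_Z, GAMMA_Z = 91.1876, 2.4952  # GeV, approximate PDG values

peak = breit_wigner(M_Z, M_Z, GAMMA_Z)
# At m0 +/- gamma/2 the shape falls to half its peak value,
# which is why gamma is directly readable off the peak as its width.
half = breit_wigner(M_Z + GAMMA_Z / 2.0, M_Z, GAMMA_Z)
```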

This example for the Z boson was used to definitively determine that there can only be 3 flavors of Standard Model neutrinos. Collider detectors don’t actually detect neutrinos directly. Rather, decays to neutrinos contribute to the total width and so affect the shape of the Z peak, so by measuring the peak in visible channels, the number of neutrinos could be extracted from the shape. The shape of the Z peak as measured at the LEP electron-positron collider at CERN (LEP was replaced by the LHC, which sits in the same tunnel) almost perfectly matched the expected result for three neutrino flavors. From this result, we know that any additional neutrinos must either not interact with the Z (these would be “sterile” neutrinos if they don’t interact directly with anything in the Standard Model) or be heavy enough that the Z cannot decay to them and that no hints of their existence pop up in other measurements.
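The arithmetic behind this neutrino counting is simple once the visible partial widths are measured. A rough sketch with approximate PDG numbers, assuming the three charged-lepton channels have equal widths (lepton universality):

```python
# Approximate measured Z widths, in GeV (illustrative values, not LEP's fit).
GAMMA_TOTAL = 2.4952   # total Z width
GAMMA_EE    = 0.08392  # partial width to e+e- (same for mu+mu-, tau+tau-)
GAMMA_HAD   = 1.7444   # partial width to hadrons

# Whatever is left after the visible channels is the "invisible" width.
gamma_invisible = GAMMA_TOTAL - GAMMA_HAD - 3 * GAMMA_EE

# Standard Model prediction for a single nu-nubar channel, in GeV.
GAMMA_NUNU_SM = 0.16612

# Number of light neutrino flavors: comes out very close to 3.
n_nu = gamma_invisible / GAMMA_NUNU_SM
```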

Hadrons that undergo strong decays, such as the Δ (delta) particles (spin-3/2 particles made of up and down quarks, like excited states of the proton or neutron), are often called “resonances” rather than “particles” because their lifetimes are so short that they cannot feasibly be measured directly. However, those short lifetimes mean large widths (clear resonant peaks) that can be measured, so we can find the lifetime indirectly by looking at an energy spectrum.

Finally, measurements of quantities like branching fractions provide a very nice way to test different models that predict their values. In some cases, the width can make a measurement more difficult. We are lucky that the particle we think is the Higgs has a fairly light mass of 125 GeV. At this mass, its width is of order 1 GeV or less, so the experiments are actually largely limited by the detector resolution. If the mass were more like 1 TeV or more, the width would expand dramatically, eventually becoming about as large as the mass. In this case, the peak, if you could still call it that for such a large width, would not be easy to find above background levels, and a discovery would be extremely difficult if not impossible.