CERN explores opportunities for physics beyond colliders

Our understanding of nature’s fundamental constituents owes much to particle colliders. Notable discoveries include the W and Z bosons at CERN’s Super Proton Synchrotron in the 1980s, the top quark at Fermilab’s Tevatron collider in the 1990s and the Higgs boson at CERN’s LHC in 2012. While colliding particles at ever higher energies is still one of the best ways to search for new phenomena, experiments at lower energies can also address fundamental-physics questions.

The Physics Beyond Colliders kick-off workshop, which was held at CERN on 6–7 September, brought together a wide range of physicists from the theory, experiment and accelerator communities to explore the full range of research opportunities presented by the CERN complex. The considered timescale for such activities reaches as far as 2040, corresponding roughly to the operational lifetime of the LHC and its high-luminosity upgrade. The study group has been charged with pulling together interested parties and exploring the options in appropriate depth, with the aim of providing input to the next update to the European Strategy for Particle Physics towards the end of the decade.

As the name of the workshop and study group suggests, a lot of interesting physics can be tested in experiments that are complementary to colliders. Ideas discussed at the September event ranged from searching for particles with masses far below an eV up to more than 10¹⁵ eV, to prospects for dark matter and even dark-energy studies.

Theoretical motivation

Searches for electric and magnetic dipole moments in elementary particles are a rich experimental playground, and the enormous precision of such experiments allows a wide range of new physics to be tested. The long-standing deviation of the muon magnetic moment (g-2) from the Standard Model prediction could indicate the presence of relatively heavy supersymmetric particles, but also the presence of relatively light “dark photons”, which are also a possible messenger to the dark-matter sector. A confirmation, or not, of the original g-2 measurement and experimental tests of other models will provide important input to this issue.

Electric dipole moments are inherently linked to the violation of charge–parity (CP) symmetry, which is a necessary ingredient to explain the origin of the baryon asymmetry of the universe. While CP violation has been observed in weak interactions, it is notably absent in strong interactions. For example, no electric dipole moment of the neutron has been observed so far. Eliminating this so-called strong-CP problem gives significant motivation for hypothesising the existence of a new elementary particle called the axion. Indeed, axion-like particles are not only natural dark-matter candidates but are also abundant in well-motivated extensions of the Standard Model, such as string theory. Axions could help to explain a number of astrophysical puzzles, may be connected to inflation in the very early universe and to the generation of neutrino masses, and may even play a role in the hierarchy problem.

Neutrinos are the source of a large range of puzzles, but also of opportunities. Interestingly, essentially all current experiments and observations – including that of dark matter – can be explained by a very minimal extension of the Standard Model: the addition of three right-handed neutrinos. In fact, theorists’ ideas range far beyond that, motivating the existence of whole sectors of weakly coupled particles below the Fermi scale.

Ambitions may even lead to tackling one of the most challenging of questions: dark energy. While the effective couplings between ordinary matter and dark energy must be quite small, there is still significant room for observable effects in low-energy experiments, for example using atom interferometry.

Experimental opportunities

It is clear that CERN’s priority over the coming years is the full exploitation of the LHC – first in its present guise and then, from 2026, as the High-Luminosity LHC (HL-LHC). The HL-LHC places stringent demands on beam intensity and related characteristics, and a major upgrade of the LHC injectors is planned during Long Shutdown 2 (LS2), beginning in 2019, to provide beams in the HL-LHC era. Despite this, the LHC itself consumes only a small fraction of the protons the injector complex can deliver, leaving the other facilities at CERN free to exploit the considerable beam-production capabilities of the accelerator complex.

CERN already has a diverse and far-sighted experimental programme based on the LHC injectors. This spans the ISOLDE radioactive beam facility, the neutron time-of-flight facility (nTOF), the Antiproton Decelerator (AD), the High-Radiation to Materials (HiRadMat) facility, the plasma-wakefield experiment AWAKE, and the North and East experimental areas. CERN’s proton-production capabilities are already heavily used and will remain in high demand in the coming years. A preliminary forecast shows that there is potential capacity to support one more major SPS experiment after the injector upgrade.

The AD is a classic example of CERN’s existing non-collider-based facilities. This unique antimatter factory has several experiments studying the properties of antiprotons and anti-hydrogen atoms in detail. Here, in the experimental domain, the time constant for technological evolution is much shorter than it is for large high-energy detectors. The AD is currently being upgraded with the ELENA ring, which will increase by two orders of magnitude the trapping efficiency of anti-hydrogen atoms and will allow different experiments to operate in parallel. After LS2, ELENA will serve all AD experiments and will secure CERN’s antimatter research into the next decade. The ISOLDE and nTOF facilities also offer opportunities to investigate fundamental questions such as the unitarity of the quark-mixing matrix, parity violation or the masses of the neutrinos.

The three main experiments of the North Area – NA61, COMPASS and NA62 – have well-defined programmes until the time of LS2 and all have longer term plans. After completion of its search for a QCD critical point, NA61 plans to further study QCD deconfinement with emphasis on charm signals. It will also remain a unique facility to constrain hadron production in primary proton targets for future neutrino beams in the US and Japan. The Common Muon and Proton Apparatus for Structure and Spectroscopy (COMPASS) experiment, meanwhile, intends to continue its studies of hadron structure and spectroscopy with higher-intensity RF-separated beams, probing fundamental physics linked to quantum chromodynamics.

An independent proposal submitted to the workshop involved using muon beams from the SPS to make precision measurements of μ–e elastic scattering, which could reduce by a factor of two the present theoretical hadronic uncertainty on g-2 for future precision experiments. Once NA62 reaches its intended precision on its measurement of the rare decay K⁺ → π⁺νν̄, the collaboration plans comprehensive measurements in the K sector in addition to one year of operation in beam-dump mode to search for heavy neutral leptons such as massive right-handed neutrinos. In the longer term, NA62 aims to study the rare decay K⁰ → π⁰νν̄, which would require a similar but expanded apparatus and a high-intensity K⁰ beam. In general, rare decays might reveal deviations from the Standard Model that indicate the presence of new heavy particles that alter the decay rate.

Fixed ambitions

The September workshop heard proposals for new ambitious fixed-target facilities that would complement existing experiments at CERN. A completely new development at CERN’s North Area is the proposed SPS beam-dump facility (BDF). Beam dump in this context implies a target that absorbs all incident protons and contains most of the cascade generated by the primary-beam interaction. The aim is for a general-purpose fixed-target facility, which in the initial phase will enable a broad search for weakly interacting “hidden” particles. The Search for HIdden Particles (SHiP) experiment plans to exploit the unique high-energy, high-intensity features of the SPS beam to perform a comprehensive investigation of the dark sector in the few-GeV mass range (CERN Courier March 2016 p25). A complementary approach, based on observing missing energy in the products of high-energy interactions, is currently being explored by NA64 on an electron test beam, and the experiment team has proposed to extend its programme to muon and hadron beams in the future.

From an accelerator perspective, the BDF is a challenging undertaking and will involve the development of a new extraction line and a sophisticated target and target complex with due regard to radiation-protection issues. More generally, the foreseen North Area programme requires high intensity and slow extraction from the SPS, and this poses some serious accelerator challenges. A closer look at these reveals the need for a concerted programme of studies and improvements to minimise extraction beam loss and associated activation of hardware with its attendant risks.

Fixed-target experiments with LHC beams could be carried out using either crystal extraction or an internal gas jet, and initially these might operate in parasitic mode upstream from existing detectors (LHCb or ALICE). Combined with the high LHC beam energy, an internal gas target would open up a new kinematic range to hadron and heavy-ion measurements, while beam extraction using crystals was proposed to measure the magnetic moments of short-lived baryons.

New facilities to complement fixed-target experiments are also under consideration. A small all-electric storage ring would provide a precision measurement of the proton electric dipole moment (EDM) and could test for new physics at the 100 TeV scale, while a mixed electric/magnetic ring would extend such measurements to the deuteron EDM. The physics motivation for these facilities is strong, and from an accelerator standpoint such storage rings are an interesting challenge in their own right (CERN Courier September 2016 p27).

A dedicated gamma factory is another exciting option being explored. Partially stripped ions interacting with photons from a laser have the potential to provide a powerful source of gamma rays. Driven by the LHC, such a facility would increase by seven orders of magnitude the intensity currently achievable in electron-driven gamma-ray beams. The proposed nuSTORM project, meanwhile, would provide well-defined neutrino beams for precise measurements of the neutrino cross-sections and represent an intermediate step towards a neutrino factory or a muon collider.

Last but not least, there are several non-accelerator projects that stand to benefit from CERN’s technological expertise and infrastructure, in line with the existing CAST and OSQAR experiments. CAST (CERN Axion Solar Telescope) uses one of the LHC dipole magnets to search for axions produced in the Sun, while OSQAR attempts to produce axions in the laboratory. Researchers working on IAXO, the next-generation axion helioscope foreseen as a significantly more powerful successor to CAST, have expressed great interest in co-operating with CERN on the design and running of the experiment’s large toroidal magnet. The high-field magnets developed at CERN would also increase the reach of future axion searches in the laboratory as a follow-up of OSQAR at CERN or ALPS at DESY. DARKSIDE, a flagship dark-matter search to be sited in Gran Sasso, also has technological synergies with CERN in the cryogenics, liquid-argon and silicon-photomultiplier domains.

Next steps

Working groups are now being set up to assess the physics case of the proposed projects in a global context, and also their feasibility and possible implementation at CERN or elsewhere. A follow-up Physics Beyond Colliders workshop is foreseen in 2017, and the final deliverable is due towards the end of 2018. It will consist of a summary document that will help the European Strategy update group to define its orientations for non-collider fundamental-particle-physics research in the next decade.

Secrets of discovery

Did you expect that gravitational waves would be discovered during your lifetime?

Yes, and I thought it quite likely it would come from two colliding black holes of just the sort that we did see. I wrote a popular book called Black Holes and Time Warps: Einstein’s Outrageous Legacy, published in 1994, and I wrote a prologue to this book during my honeymoon in Chile in 1984. In that prologue, I described the observation of two black holes, both weighing 25 solar masses, spiralling together and merging and producing three solar masses of energy and gravitational waves, and that’s very close to what we’ve seen. So I was already in the 1980s targeting black holes as the most likely kind of source; for me this was not a surprise, it was a great satisfaction that everything came out the way I thought it probably would.

Can you summarise how an instrument such as LIGO could observe such a weak and rare phenomenon?

The primary inventor of this kind of gravitational-wave detector is Rainer Weiss at MIT. He not only conceived the idea, in parallel with several other people, but he, unlike anybody else, identified all of the major sources of noise that would have to be dealt with in the initial detector and he invented ways to deal with each of those. He estimated how much noise would remain after the experiment did what he proposed to limit each noise source, and concluded that the sensitivity that could be reached would be good enough. There was a real possibility of seeing the waves that I as a theorist and colleagues were predicting. Weiss wrote a paper in 1972 describing all of this and it is one of the most powerful papers I’ve ever read, perhaps the most powerful experiment-related paper. Before I read it, I had heard about his idea and concluded it was very unlikely to succeed because the required sensitivities were so great. I didn’t have time to really study it in depth, but it turned out I was wrong. I was sceptical until I had discussions with Weiss and others in Moscow. I then became convinced, and decided that I should devote most of the rest of my career to helping them succeed in the detection of gravitational waves.

How will the new tool of “multi-messenger astronomy” impact on our understanding of the universe?

Concerning the colliding black holes that we’ve seen so far, astronomers who rely on electromagnetic signals have not seen anything coming from them. It’s conceivable that in the future something may be seen because disturbances caused when two black holes collide and merge can lead to X-ray or perhaps optical emissions. We also expect to see many other sources of gravitational waves. Neutron stars orbiting each other are expected to collide and merge, which is thought to be a source of gamma-ray bursts that have already been seen. We will see black holes tear apart and destroy a companion neutron star, again producing a very strong electromagnetic emission as well as neutrino emission. So the co-ordinated gravitational and electromagnetic observation and neutrino observations will be very powerful. With all of these working together in “multi-messenger” astronomy, there’s a great richness of information. That really is the future of a large portion of this field. But part of this field will be things like black holes, where we see only gravitational waves.

Do gravitational waves give us a bearing on gravitons?

Although we are quite sure gravitational waves are carried by gravitons, there is no chance to see individual gravitons based on the known laws of physics. Just as we do not see individual photons in a radio wave because there are so many photons working together to produce the radio wave, there are even more gravitons working together to produce gravitational waves. In technical terms, the mean occupation number of the gravitational-wave field that is seen is absolutely enormous, close to 10⁴⁰. With so many gravitons there is no hope, unfortunately, to see individual gravitons.

Will we ever reconcile gravity with the three other forces?

I am quite sure gravity will be reconciled with the other three forces. I think it is quite likely this will be done through some version of string theory or M theory, which many theorists are now working on. When it does happen, the resulting laws of quantum gravity will allow us to address questions related to the nature of the birth of the universe. They would also tell us whether or not it is possible to build time machines to go backward in time, reveal the nature of the interior of a black hole, and address many other interesting questions. This is a tremendously important effort, by far the most important research direction in theoretical physics today and in recent decades. There’s no way I could contribute very much there.

Regarding future large-scale research infrastructures, such as those proposed within CERN’s Future Circular Collider programme, what are the lessons to be learnt from LIGO?

Maybe the most important lesson is the need for superb management of large physics budgets, which is essential for a project to succeed. We’ve had excellent management, particularly with Barry Barish, who transformed LIGO and took over as director when we were just about ready to begin construction (Robbie Vogt, who had helped us write a proposal to get the funding from the NSF and Congress, also got two research teams at Caltech and MIT to work together in an effective manner). Barry created the modern LIGO and he is an absolutely fantastic project director. Having him lead us through that transition into the modern LIGO was absolutely essential to our success, plus a very good experiment idea and a superb team, of course.

You were an adviser to the blockbuster film Interstellar. Do you have any more science and arts projects ahead?

I am 76. I was a conventional professor for almost 50 years, and I decided for my next 50 years that I want to do something different. So I have several different collaborations: one on a second film; collaborations in a multimedia concert about sources of gravitational waves with Hans Zimmer and Paul Franckman, who did the music and visual effects for Interstellar; and collaborations with Chapman University art professor Lia Halloran on a book with her paintings and my poetry about the warped side of the universe. I am having great fun entering collaborations between scientists and artists and I think, at this point of my life, if I have a total failure with trying to write poetry, well that’s alright: I’ve had enough success elsewhere.

The usefulness of ‘useless’ knowledge

As far back as 1939, the US educator Abraham Flexner penned a stirring paean to basic research in Harper’s Magazine under the title “The usefulness of useless knowledge.” Flexner, perhaps being intentionally provocative, pointed out that Marconi’s contribution to the radio and wireless had been practically negligible. He went on to argue that the 1865 work of James Clerk Maxwell on the theoretical underpinnings of electricity and magnetism, and the subsequent experimental work of Heinrich Hertz on the detection of electromagnetic waves, was done with no concern about the practical utility of the work. The knowledge they sought, in other words, was never targeted to a specific application. Without it, however, there could have been no radio, no television and no mobile phones.

The history of innovation is full of such examples. It is practically impossible to find a piece of technology that cannot be traced back to the work of scientists motivated purely by a desire to understand the world. But basic research goes further. There is something primordial about it. Every child is a natural scientist imbued with curiosity, vivid imagination and a desire to learn. It is what sets us apart from any other species, and it is what has provided the wellspring of innovation since the harnessing of fire and the invention of the wheel. Children are always asking questions: why is the sky blue? What are we made of? It is by investigating questions like these that science has advanced, and it is this same curiosity that can inspire children to grow up into future scientists or scientifically aware citizens.

Education and training are among CERN’s core missions. Over the years we have developed programmes that reach everyone from primary-school children to professional physicists, accelerator scientists and computer scientists. We also keep tabs on the whereabouts of young people passing through CERN, and it is enriching to follow their progress. Around 1000 people per year receive higher degrees from universities around the world for work carried out at CERN. Basic research therefore not only inspires young people to study science, it also provides a steady stream of qualified people for business and industry, where their high-tech, international experience allows them to make a positive impact.

Turning to the UN’s admirably ambitious Global Goals for Sustainable Development, which officially came into force on 1 January 2016 and will last for 15 years as part of the Agenda 2030 programme, the focus on science and technology is positive and encouraging. It testifies to a deeper understanding of the importance of science in driving progress that benefits all peoples and helps to overcome today’s most pressing development challenges. But Agenda 2030’s potential can only be fulfilled through sustained commitment and funding by governments. If we are to tackle issues ranging from eliminating poverty and hunger to providing clean and affordable energy, we need science and we need people to be scientifically aware.

Places like CERN are a vitally important ingredient in the innovation chain. We contribute to the kind of knowledge that not only enriches humanity, but also provides ideas that become the technologies of the future. Some of CERN’s technology has immediate impact on society, such as the World Wide Web and the application of particle accelerators to cancer therapy and many other fields. We also train young people. All this is possible because governments support science, technology, engineering and mathematics (STEM) education and basic research, but more should be done. The scientific community, including CERN, urged the drafters of Agenda 2030 to consider a minimum percentage of GDP that every nation should devote to STEM education and basic research. This is particularly important in times of economic downturn, when private funding naturally concentrates on short-term payback and governments focus on domains that offer immediate economic return, at the expense of longer-term investment in fundamental science.

Useless knowledge, as Flexner called it, is at the basis of human development. Humankind’s continuing pursuit of it will make the UN’s development goals achievable.

Cosmology with MATLAB

By Dan Green

World Scientific

The aim of this book is to show how software packages such as MATLAB can be extremely useful for studying cosmology problems by means of complex simulations. Thanks to the greatly improved accuracy of cosmological data and the increased computing power available, the calculation and graphic tools offered by this software can be profitably employed to study physics problems and compare different models.

A theory that successfully describes the universe and its evolution in terms of only six fundamental parameters has been developed. It accounts for the Big Bang (BB), cosmic microwave background (CMB) radiation and the evolution of matter to the present day. However, the model cannot explain some experimental results. The inflation hypothesis, which postulates the existence of a scalar field that caused an exponential expansion of the very early universe, can solve some of these open problems.

This book provides a basic exposition of BB cosmology and the inflationary model using MATLAB tools for visualisation and to develop the reader’s understanding of the parametric dependence of the observables. Different models are compared, including one that assumes the Higgs field as the scalar inflationary field. In this way, readers can gain experience in using various MATLAB tools (including symbolic mathematics, numerical-solution methods and plots) and also apply them to other problems.
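As a flavour of the kind of exercise the book encourages, a similar calculation can be sketched in Python (rather than the book’s MATLAB): numerically integrating the Friedmann equation for a flat ΛCDM model to estimate the age of the universe. The parameter values below are illustrative assumptions, not taken from the book.

```python
from math import sqrt

# Illustrative flat-LambdaCDM parameters (assumed, not from the book)
H0 = 67.7            # Hubble constant in km/s/Mpc
Om, OL = 0.31, 0.69  # matter and dark-energy density fractions

# Convert H0 to inverse gigayears: 1 Mpc = 3.0857e19 km, 1 Gyr = 3.156e16 s
H0_per_Gyr = H0 / 3.0857e19 * 3.156e16

# Age of the universe: t0 = integral over a from 0 to 1 of da / (a * H(a)),
# with H(a) = H0 * sqrt(Om/a**3 + OL) for a flat universe.
# Simple midpoint integration over the scale factor a:
N = 100_000
t0 = sum(1.0 / (a * sqrt(Om / a**3 + OL))
         for a in ((i + 0.5) / N for i in range(N))) / N / H0_per_Gyr
print(f"Age of a flat LambdaCDM universe: {t0:.1f} Gyr")  # ~13.8 Gyr
```

Varying Om and OL and re-running gives a direct feel for the parametric dependence of the observables, which is exactly the style of exploration the book advocates.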

Proton-radius puzzle deepens

The international CREMA (Charge Radius Experiment with Muonic Atoms) collaboration has measured the radius of the deuteron more accurately than ever before, finding that it is significantly smaller than previously thought. The result, which was obtained using laser spectroscopy of muonic deuterium at the Paul Scherrer Institute (PSI) in Switzerland, is consistent with a 2010 measurement of the proton radius by the same group, which also showed a significantly smaller value than expected.

The 2010 result, which found a proton radius of 0.84087±0.00039 fm versus the CODATA value of 0.8751±0.0061 fm, formed the basis of what has been dubbed the proton-radius puzzle. The new measurement of the deuteron’s size gives rise to an analogous mystery. If the results hold firm, they could force physicists to adjust the Rydberg constant, which is currently known to the eleventh decimal place, and perhaps imply the existence of an as-yet-unknown force beyond the Standard Model.

Consisting of one proton and one neutron, the deuteron is the simplest compound nucleus. Its properties, such as the root-mean-square charge radius and polarisability, therefore serve as important benchmarks for understanding nuclear forces and structure. Using the most intense source of muons available, provided by the PSI proton accelerator, the CREMA team injected around 300 low-energy muons per second into an experimental chamber filled with gaseous deuterium molecules. Here, muons eject electrons from the molecules, which break up to form muonic deuterium. A complex pulsed laser system was then used to raise muonic-deuterium atoms from the metastable 2s state into the next excited state, 2p, after which the muons fall back to the ground state and emit an X-ray photon. Because the energy levels of the muonic atom strongly depend on the size of the nucleus, measuring the 2s–2p energy splitting in muonic deuterium by means of laser spectroscopy reveals the size of the deuteron with unprecedented precision.

Based on measurements of three 2s–2p transitions, the team found a value of 2.12562±0.00078 fm for the deuteron radius. This is 2.7 times more accurate but 7.5σ smaller than the CODATA-2010 value of 2.1424±0.0021 fm. The value is also 3.5σ smaller than the radius obtained by electronic deuterium spectroscopy. When combined with the electronic isotope shift, says the team, this yields a proton radius similar to the one measured from muonic hydrogen and thereby amplifies the proton-radius puzzle.
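The quoted figures can be cross-checked directly from the numbers above. A short Python sketch, assuming the two uncertainties are combined in quadrature (the standard procedure, though the collaboration’s exact treatment may differ):

```python
from math import hypot

# Deuteron charge radii from the article, in femtometres
mu_d, mu_err = 2.12562, 0.00078    # muonic-deuterium result
cod_d, cod_err = 2.1424, 0.0021    # CODATA-2010 value

# Improvement in precision, and tension in standard deviations
improvement = cod_err / mu_err
tension = (cod_d - mu_d) / hypot(mu_err, cod_err)
print(f"{improvement:.1f}x more accurate, {tension:.1f} sigma smaller")
# -> 2.7x more accurate, 7.5 sigma smaller
```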

“You could say that the mystery has now doubly confirmed itself,” says lead-author Randolf Pohl of the University of Mainz, Germany. “After our first study came out in 2010, I was afraid some veteran physicist would get in touch with us and point out our great blunder. But the years have passed, and so far nothing of the kind has happened.”

As to the possible cause of the discrepancy, physicists remain cautious. “Naturally, it can’t be that the deuteron – any more than the proton – has two different sizes,” says CREMA-member Aldo Antognini of the PSI. The most likely explanation would be experimental imprecision, he says. For example, there could be an error with the hydrogen spectroscopy, which was used in some of the earlier measurements of both the proton and deuteron’s size. “If it should actually turn out that the hydrogen spectroscopy is giving a false – that is, minimally shifted – value, that would mean that the Rydberg constant must be minimally changed,” he says.

Currently, research groups in Munich, Paris, Toronto and Amsterdam are working to obtain more accurate measurements via hydrogen spectroscopy, and their results are expected in the coming years. The CREMA collaboration has also recently studied muonic helium-3 and helium-4 ions, and expects at least a five-fold reduction in uncertainties in their charge radii compared with the electron-scattering results. Next, the team plans to target the magnetic properties of the proton by measuring the so-called Zemach radius, which is the limiting quantity when comparing experiment and theory of the 1s hyperfine splitting in regular hydrogen.

“If all of the relevant experiments are correct, there must be some physics beyond the Standard Model going on,” says Gerald A Miller of the University of Washington, who was not involved in the PSI study. “In particular, the muon–proton and electron–proton interactions differ in ways that cannot be accounted for by the electron–muon mass difference, and that statement is strengthened by the newly published result.”

Further experiments should show whether the proton-radius measurements based on hydrogen atoms are less accurate than originally stated, he adds. One is CREMA’s measurement of the helium-4 radius using muonic atoms, while another is the MUon proton Scattering Experiment (MUSE) at PSI, which compares muon– to electron–proton scattering for both charges. “Given enough experiments, the proton-radius puzzle will be solved in a few years,” says Miller. “We can all speculate about the final result, but it’s more scientific to wait for results.”

DIRAC experiment observes new exotic atom

Inclusive πK production via the interaction p + Ni → π⁻K⁺ + X. The ionisation or break-up of the A_Kπ atom leads to so-called atomic pairs.
Image credit: arXiv:1605.06103v1.


The DIRAC (DImeson Relativistic Atom Complex) experiment at CERN has discovered a new type of exotic atom made up of a π and a K meson. The “strange dimesonic” state provides an ideal laboratory for testing quantum chromodynamics (QCD) in the low-energy region and joins a long list of non-standard atoms (among them positronium, muonic atoms and antihydrogen) that help physicists study in detail how particles interact.

The πK system is a type of hadronic atom in which meson pairs are bound electromagnetically, similar to pionium (a π⁺π⁻ atom), which was studied previously by the DIRAC experiment. The new atom was produced by firing 24 GeV/c protons from CERN’s Proton Synchrotron (PS) into platinum or nickel foil targets. Here, relativistic dimesonic bound states formed by Coulomb final-state interactions move inside the target and can break up, resulting in particle pairs characterised by a small relative momentum in the centre-of-mass system of the pair. The recently upgraded DIRAC experiment observed 349±62 such atomic pairs, corresponding to a signal of 5.6σ.
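The quoted significance is consistent with the simple ratio of the observed signal to its uncertainty, as a quick Python check shows (the collaboration’s full statistical analysis is of course more involved than this):

```python
# Atomic-pair counts quoted by DIRAC
signal, uncertainty = 349, 62

# Naive significance: signal divided by its uncertainty
print(f"{signal / uncertainty:.1f} sigma")  # -> 5.6 sigma
```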

The observation is part of an effort that began almost a decade ago, when the DIRAC collaboration reported a 3.2σ enhancement of πK pairs at low relative momentum based on a platinum target. This was followed in 2014 by 3.6σ evidence using a nickel target. The latest result is based on data obtained with both platinum and nickel targets, using information from all subdetectors and an enhanced background description based on Monte Carlo simulations, and represents the first statistically significant observation of the strange dimesonic πK atom.

The team is now working towards a measurement of the πK atom lifetime, which is predicted to be 3.5±0.4 fs. This will allow DIRAC to measure for the first time a parameter of low-energy πK interactions called the scattering length. With an expected precision of around 35%, the result can then be compared with precise predictions from lattice QCD and chiral perturbation theory. The latter provides a way to predict scattering lengths in the low-energy sector of QCD and to study a potential flavour dependence of the quark condensate responsible for chiral-symmetry breaking.

“A recent study has shown that the production rate of πK atoms from the proton beam of CERN’s Super Proton Synchrotron will be 25 times higher than from the PS,” explains DIRAC spokesperson Leonid Nemenov. “This will allow us to measure the πK scattering lengths with a precision better than 5% and to check the precise predictions of QCD for these values based on a Lagrangian describing u, d and s quarks. The DIRAC collaboration is now planning to prepare a dedicated Letter of Intent for such an experiment.”

AMS reports unexpected result in antiproton data

Researchers working on the AMS (Alpha Magnetic Spectrometer) experiment, which is attached to the International Space Station, have reported precision measurements of antiprotons in primary cosmic rays at energies never before attained. Based on 3.49 × 10⁵ antiproton events and 2.42 × 10⁹ proton events, the AMS data represent new and unexpected observations of the properties of elementary particles in the cosmos.

Assembled at CERN and launched in May 2011, AMS is a 7.5 tonne detector module that measures the type, energy and direction of particles. The goals of AMS are to use its unique position in space to search for dark matter and antimatter, and to study the origin and propagation of charged cosmic rays: electrons, positrons, protons, antiprotons and nuclei. So far, the collaboration has published several key measurements of energetic cosmic-ray electrons, positrons, protons and helium, for example finding an excess in the positron flux (CERN Courier November 2014 p6). The latter measurement placed constraints on existing models and gave rise to new ones – invoking collisions of dark-matter particles, astrophysical sources and collisions of cosmic rays – some of which make specific predictions about the antiproton flux and the antiproton-to-proton flux ratio in cosmic rays.

With its latest antiproton results, AMS has now simultaneously measured all of the charged-elementary-particle cosmic-ray fluxes and flux ratios. Because antiprotons are scarce in space – they are outnumbered by protons by a factor of about 10,000 – experimental data on antiprotons are limited. Using the first four years of data, AMS has now measured the antiproton flux and the antiproton-to-proton flux ratio in primary cosmic rays with unprecedented precision. The measurements, which required AMS to provide a separation power of approximately 10⁶, provide precise experimental information over an extended energy range in the study of elementary particles travelling through space.
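The quoted scarcity can be checked against the raw event counts given above. Note this is a simple ratio of event counts, not the published flux ratio, which involves full acceptance and efficiency corrections:

```python
# Antiproton-to-proton event ratio from the AMS counts quoted above.
# Raw event counts only -- not a flux-corrected measurement.
n_antiprotons = 3.49e5
n_protons = 2.42e9

ratio = n_antiprotons / n_protons
print(f"pbar/p event ratio: {ratio:.2e}")  # roughly 1 in 10,000
```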


In the absolute-rigidity (the magnitude of momentum divided by charge) range 60–500 GV, the antiproton (p̄), proton (p) and positron (e+) fluxes are found to have nearly identical rigidity dependence, while the electron (e−) flux exhibits a markedly different rigidity dependence. In the absolute-rigidity range below 60 GV, the p̄/p, p̄/e+ and p/e+ flux ratios each reach a maximum, while in the range 60–500 GV these ratios unexpectedly show no rigidity dependence.
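Rigidity, the quantity in which these fluxes are compared, is simply momentum divided by charge; a minimal sketch with illustrative values (not AMS data):

```python
# Magnetic rigidity R = p / |Z| (in GV for p in GeV/c and Z in units of
# the elementary charge). Particles with equal rigidity bend identically
# in a magnetic field, which is why magnetic spectrometers such as AMS
# naturally bin their measurements in rigidity.
def rigidity_gv(momentum_gev: float, charge: int) -> float:
    return momentum_gev / abs(charge)

print(rigidity_gv(100.0, 1))   # proton or antiproton at 100 GeV/c -> 100.0 GV
print(rigidity_gv(100.0, 2))   # helium nucleus at the same momentum -> 50.0 GV
```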

“These are precise and completely unexpected results. It is difficult to imagine why the fluxes of positrons, protons and antiprotons have exactly the same rigidity dependence while the electron flux is so different,” says AMS spokesperson Samuel Ting. “AMS will be on the Space Station for its lifetime. With more statistics at higher energies, we will probe further into these mysteries.”

LHC hits 2016 luminosity target

At the end of August, two months ahead of schedule, the integrated luminosity delivered by the LHC reached the 2016 target value of 25 fb⁻¹ in both the ATLAS and CMS experiments. The milestone is the result of the efforts of a large group of scientists and technical experts who work behind the scenes to keep the 27 km-circumference machine operating at the highest possible performance.

Following a push to produce as many proton–proton collisions as possible before the summer conferences, several new ideas, such as a novel beam-production technique in the injectors, have been incorporated to boost the LHC performance. Thanks to these improvements, over the summer the LHC was routinely operating with peak luminosities 10–15% above the design value of 10³⁴ cm⁻² s⁻¹.
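To put those numbers in perspective, a back-of-envelope conversion shows what a peak luminosity 15% above design would deliver per day if it could be sustained. This ignores the natural luminosity decay during a fill and the turnaround time between fills, so it is an upper bound rather than an operational figure:

```python
# Idealised integrated luminosity per day at 15% above the LHC design
# peak luminosity. Uses 1 fb^-1 = 1e39 cm^-2; ignores fill decay and
# turnaround time, so real daily delivery is lower.
peak_lumi = 1.15e34          # cm^-2 s^-1 (design value + 15%)
seconds_per_day = 86400.0
cm2_per_inverse_fb = 1e39    # cm^-2 per fb^-1

daily = peak_lumi * seconds_per_day / cm2_per_inverse_fb
print(f"ideal integrated luminosity: {daily:.2f} fb^-1 per day")  # -> 0.99
```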

This is a notable success, especially considering that a temporary limitation in the Super Proton Synchrotron only allows the injection of 2220 bunches per beam instead of the foreseen 2750, and that the LHC energy is currently limited to 6.5 TeV instead of the nominal 7 TeV. The excellent availability of all the key systems of the LHC is one of the main reasons behind these achievements.

The accelerator team is now gearing up for the season finale. Following a technical stop, a forward proton–proton physics run took place in mid-September. Proton–proton physics is scheduled to continue until the last week in October, after which proton–lead physics will take over for a period of one month. The LHC and its experiments can look forward to the completion of what is already a very successful year.

Beamline competition calls all schools

CERN has announced the 4th edition of its Beamline for Schools Competition, which will see two winning teams of students undertake an experiment of their own design at a fully equipped CERN beamline next year. The 2017 competition, which is made possible thanks to the Alcoa Foundation, is open to teams of high-school students aged 16 or older. A maximum of nine students per winning team will be invited to CERN, and teams can be composed of pupils from a single school or a number of schools working together.

Previous winners have tested webcams and classroom-grown crystals in the beamline, and also studied how particles decay and investigated high-energy gamma rays. Interested schools can pre-register their team to receive the latest updates, and further information about how to apply can be found at cern.ch/bl4s. The deadline for submissions is 31 March 2017.

ATLAS observes single top-quarks at 13 TeV

The neural-network discriminant for the positive lepton channel.

The ATLAS collaboration is exploiting the window of opportunity opened by the LHC’s 13 TeV run to search directly for unknown particles. Complementary to this approach, the collaboration is also looking for deviations in the cross-sections and kinematic distributions of Standard Model processes, which could be caused by energy-dependent couplings that become accessible at the higher collision energy.

Using data recorded in 2015 corresponding to an integrated luminosity of 3.2 fb⁻¹, ATLAS has recently measured the total cross-sections of single top-quark and top-antiquark production via the t-channel exchange of virtual W bosons. This channel has exciting kinematic features such as polarised top quarks and forward spectator jets. Compared to the dominant top-quark–top-antiquark (tt̄) pair-production process, however, single production is experimentally more challenging due to a higher background level. To suppress the two major background processes, W+jets and tt̄ pair production, the selection of candidate events requires one charged lepton, missing transverse momentum and two hadronic jets (exactly one of which has to be identified as containing b hadrons).

To measure the cross-sections of top-quark and top-antiquark production separately, the events are split into two channels according to the sign of the lepton charge. ATLAS uses neural networks to exploit the kinematic differences between the signal and background processes as much as possible, thereby optimising the statistical power of the data set. Ten different kinematic variables were combined into a discriminant, which is trained to be close to zero for background-like events and close to unity for signal-like events (see figure).
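The idea can be illustrated with a toy discriminant – a minimal sketch, not the ATLAS network: the weights, bias and input values below are invented for illustration, and the real analysis combines ten trained variables:

```python
import math

# Toy neural-network discriminant: a single sigmoid neuron combining a
# few kinematic variables into one number in (0, 1). All weights and
# inputs here are invented for illustration only.
def discriminant(variables, weights, bias):
    z = sum(w * x for w, x in zip(weights, variables)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid maps any z into (0, 1)

weights = [0.8, -0.5, 1.2]
bias = -0.3
signal_like = discriminant([2.1, 0.4, 1.8], weights, bias)        # near 1
background_like = discriminant([-1.5, 2.0, -0.7], weights, bias)  # near 0
print(signal_like, background_like)
```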

The cross-sections were measured to be 156±28 pb for top-quark production and 91±19 pb for top-antiquark production. These are slightly higher than expected (+15% and +12%, respectively), but still in good agreement with the predictions. The largest uncertainties are related to the Monte Carlo generators used to model the t-channel single top-quark and tt̄ pair-production processes, the b-jet identification efficiency and the jet energy scale. Future measurements of the single top-quark process will focus on reducing these uncertainties, exploiting improved calibrations and extending studies of the Monte Carlo generators.
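Why a +15% excess still counts as good agreement follows from the quoted uncertainties. A quick estimate – inferring the Standard Model predictions from the quoted percentage deviations and treating the uncertainties as symmetric Gaussians, which is a simplification of the real comparison – puts both deviations well below one standard deviation:

```python
# Back-of-envelope deviation estimate for the single-top cross-sections.
# Predictions are inferred here from the quoted +15% / +12% excesses;
# the published comparison uses full theory and experimental errors.
measured = {"top": (156.0, 28.0), "antitop": (91.0, 19.0)}  # (pb, uncertainty)
excess = {"top": 0.15, "antitop": 0.12}

for name, (value, sigma) in measured.items():
    prediction = value / (1.0 + excess[name])
    pull = (value - prediction) / sigma
    print(f"{name}: prediction ~{prediction:.0f} pb, deviation ~{pull:.1f} sigma")
```

Both pulls come out below one sigma, consistent with the article’s conclusion of good agreement.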
