
Anomaly! Collider Physics and the Quest for New Phenomena at Fermilab

By Tommaso Dorigo
World Scientific

Also available at the CERN bookshop


Anomaly! is a captivating story of supposed discoveries that turned out not to be. The book provides an honest and not always flattering description of how large high-energy physics collaborations work, of what makes experimental physicists excited, and of the occasional interference between scientific goals and personal factors such as ambition, career issues, personality clashes and fear of being scooped. Dorigo, who complements his recollections with many interviews and archival searches, proves to be a highly skilled communicator of science to the general public, as readers of his often controversial blog A Quantum Diaries Survivor already know. Thanks to a well-chosen alternation of narration and explanation, several sections of the book read like a novel.

The main theme, as indicated by the title, is the anomalies (or outliers) that tantalised members of the CDF collaboration at Fermilab – and sometimes the external world – but ultimately turned out to be red herrings. The author uses these stories to show how cautious experimental particle physicists have to be when applying statistics in their data analysis. He also makes a point about the arbitrariness of the conventional 3σ and 5σ thresholds for claiming “evidence” and “discovery” of a new phenomenon.
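
To put numbers to those thresholds, a minimal sketch – in Python with scipy, assuming that library is available; the book itself contains no code – converts the conventional significances into the one-sided tail probabilities used in particle physics:

# Convert n-sigma significance thresholds to one-sided p-values.
from scipy.stats import norm

for n_sigma in (3, 5):
    p = norm.sf(n_sigma)  # upper-tail probability beyond n_sigma standard deviations
    print(f"{n_sigma} sigma -> p = {p:.2e}")

# Output: 3 sigma -> p = 1.35e-03 (the "evidence" threshold)
#         5 sigma -> p = 2.87e-07 (the "discovery" threshold)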

Slightly off topic, given the title of the book, three chapters are devoted to the ultimately successful search for the top quark, the first evidence of which was very far from being an “anomaly”: its existence was expected in the mainstream and the “global fits” of other collider data were already pointing at the right mass range. Here Dorigo is interested in the opposite lesson: the conventional thresholds on p-values, originally motivated by the principle “extraordinary claims demand extraordinary proofs”, are hard to justify when a discovery is actually a confirmation of the dominant paradigm. (The author explicitly comments on the similarity with the Higgs boson discovery two decades later.) The saga of the top-quark hunt, which contains many funny and even heroic moments, is also an occasion for the author to elaborate on what he describes as over-conservative attitudes dominating in large teams when stakes are high.

In general, the book’s topics have clearly been chosen more for the importance of the lesson they teach than for their ultimate impact on science. Almost an entire chapter is devoted to a measurement of the Z boson mass at Fermilab, which was known in advance to be doomed to obsolescence very soon, as the experiments at the upcoming LEP accelerator were better suited to that kind of measurement. Still, the chapter turns out to be an enthralling story, ending with a mysterious attempt by an unsporting competitor from another US laboratory to sabotage the first CDF report of this measurement at an international conference. In other cases, the choice of topics is driven by their entertainment value, as in the episode of the “Sacred Sword”, a radioactive-contamination incident that luckily ended well for its protagonists.

The author’s role in the book is at the same time that of an insider and of a neutral observer, attending crucial meetings and observing events unfold as a collaboration member among many others, with the remarkable exception of the final story where he plays the role of internal reviewer of one of the eponymous anomalies. In spirit and form, Anomaly! reminds me of Gary Taubes’ celebrated Nobel Dreams, but with more humour and explicit subjectivity. Although far from being scholarly, Anomaly! may also appeal to readers interested in the sociology of science or in the epistemological problem of how a scientific community finally settles on a single consensus, in the vein of Andrew Pickering’s Constructing Quarks, Peter Galison’s How Experiments End and Kent Staley’s The Evidence for the Top Quark: Objectivity and Bias in Collaborative Experimentation. The latter, in particular, is interesting to compare with the chapters of Anomaly! that narrate the same story.

Supersymmetry, Supergravity, and Unification

By Pran Nath
Cambridge


This book discusses the role played by supersymmetry, and especially supergravity, in the quest for a unified theory of fundamental interactions. These are vast subjects, which not only embrace particle physics but also have ramifications in many other fields, such as modern mathematics, statistical physics and condensed-matter systems.

The author focuses on a rather specific subject: supergravity as a plausible scenario (perhaps more convincing than supersymmetry itself) for physics beyond the Standard Model. This justifies the way the author has chosen to distribute the material over the 24 chapters, for a total of 500 pages.

The first seven chapters introduce the field theories and symmetry principles on which a framework for the unification of particle forces would be based. After a short history of force unification, the author covers general relativity, Yang–Mills theories, spontaneous symmetry breaking, the basics of the Standard Model, the theory of gauge anomalies, effective Lagrangians and current algebra.

Supersymmetry is introduced next, with a short mathematical formulation including the concepts of graded Lie algebras, superfields and the basic tools needed to construct (rigid) supersymmetric field theories, their multiplets and invariant Lagrangians. Non-supersymmetric grand unified theories and their supersymmetric extensions are also reviewed, investigating in particular the potential role they play in gauge-coupling unification. It is surprising that the author does not discuss the original motivation for advocating supersymmetry in this context, which is related to the hierarchy problem and to the issue of naturalness of scales. No such discussion appears either in this chapter or in the following one, devoted to the minimal supersymmetric Standard Model. The theory of supergravity and its mathematical structure, including matter couplings, is briefly presented as well.

The second half of the book includes five chapters dedicated to the phenomenology of supergravity, covering in detail supergravity unification, CP violation, proton decay and supergravity in cosmology and astroparticle physics. In particular, supergravity inflation and supersymmetric candidates for dark matter are discussed at length. Further theories of supergravity and their connection to string theories in diverse dimensions are only briefly touched upon.

The last part of the book provides some tools, such as anti-commuting variables and the spinor formalism, which are needed to write supersymmetric Lagrangians and to extract physical consequences. Notations, conventions and other miscellaneous topics, including further references, conclude the volume.

The book can be considered a valuable and more up-to-date companion to the third volume, on supersymmetry, of Steven Weinberg’s The Quantum Theory of Fields series (2000, Cambridge University Press).

The author is a world expert on supersymmetry and supergravity phenomenology, who has contributed many original and outstanding works to the field.

Certainly useful to graduate students in physics, the book could also prove to be a resource for advanced graduate courses in experimental high-energy physics.

A quarter century of DIS workshops

With a total of 304 talks, Deep Inelastic Scattering 2017 (DIS17) demonstrated how deep inelastic scattering (DIS) and related topics permeate most aspects of high-energy physics, and how much we still have to learn about strong interactions. Held at the University of Birmingham in the UK from 3–7 April, the workshop brought together more than 300 participants from 41 countries for a week of lively scientific discussion and largely unanticipated sunshine.

The first of this series of annual international workshops on DIS and related topics took place in Durham, UK, in the spring of 1993, when the first results from the world’s only lepton–hadron collider, HERA at DESY, were discussed by around 80 participants. A quarter of a century later, the workshop series has toured the globe, digested data from the full lifetime of HERA and numerous fixed-target DIS experiments, and played a major role in the development and understanding of hadron-collider physics.

The dominant theme of DIS17 was the relevance of strong interactions, parton densities (PDFs) and DIS to the LHC. But a wide and eclectic range of other topics was also included, notably new results from experiments at the Relativistic Heavy Ion Collider (RHIC), JLab and HERA, as well as theoretical advances and future plans for the field.

Following plenary review talks covering the latest news from the field, two and a half days were devoted to seven working groups operating in up to six simultaneous parallel sessions, covering: PDFs; low proton momentum fraction (Bjorken-x) physics; Higgs and beyond-the-Standard Model (BSM) studies in hadron collisions; hadronic, electroweak and heavy-flavour observables; spin and 3D hadron structure; and future facilities. The Birmingham event included a topical lecture on probing ultra-low-x QCD with cosmic neutrinos at IceCube and Auger, and a special session was devoted to the status and scientific opportunities offered by proposed future DIS facilities at CERN (the Large Hadron electron Collider, LHeC) and at BNL or JLab in the US (the Electron Ion Collider, EIC).

All aspects of proton–proton collisions at the LHC featured during this year’s DIS event, from the role of parton densities and perturbative QCD dynamics in beyond-the-Standard-Model searches and Higgs-boson studies, through the measurement and interpretation of processes that are sensitive to parton densities (such as electroweak gauge-boson production), to topics that challenge our understanding of strong-interaction dynamics in the semi- and non-perturbative regimes. Ten years after HERA completed data-taking, the collider still featured strongly. The final round of combined inclusive DIS data, published in 2016 by the H1 and ZEUS experiments, has been integrated into global PDF fits, and HERA also featured in a handful of new measurements and combinations. Heavy-ion collision results from RHIC and the LHC were also well represented, as were insights into 3D proton structure and hadron spin from semi-inclusive DIS and polarised proton–proton collisions at COMPASS, JLab and RHIC, and current and future DIS measurements with neutrinos.

Data from HERA and the LHC have brought a new level of precision to the parton densities of the proton, with associated theoretical advances including the push towards higher-order (next-to-next-to-next-to-leading-order) descriptions. Taming the “pathological” rise of the proton’s gluon density at low x in the perturbative domain remains a major topic, which is now being addressed experimentally in ultra-peripheral collisions and forward measurements at the LHC, as well as through theoretical modelling of low-x, low-Q² HERA data with nonlinear parton dynamics and resummation techniques. The related topic of diffractive electron–proton scattering and the heavily gluon-dominated diffractive PDFs is benefiting from the full HERA statistics. New insights into elastic and total cross-sections, such as TOTEM’s observation of a non-exponential term in the four-momentum-transfer dependence of the elastic cross-section, are emerging from the LHC data. Uncertainties in PDFs remain large at high x, and intense work is ongoing to understand LHC observables, such as top-quark pair production, that are sensitive in this region. New data and theoretical work are revealing the transverse structure of the proton for the first time in terms of transverse-momentum-dependent parton densities. The LHC’s proton–lead collision data are also constraining nuclear PDFs in an unprecedented low-x kinematic region.

Concerning the future of DIS, it became abundantly clear that potential revolutions in our understanding could come from polarised proton and heavy-ion targets and from step changes in energy and luminosity. The EIC offers 3D hadron tomography and an unprecedented window on the spin and flavour structure of protons and ions. Its eA scattering programme would probe low-x parton dynamics in a region where collective effects ultimately leading to gluon saturation are expected to become important. The LHeC offers a standalone Higgs production programme complementary to that of the LHC, as well as a new level of precision in PDFs that could be applied to extend the sensitivity to new physics at the LHC. Its ep and eA scattering programmes would also probe low-x parton dynamics in the region where gluon saturation is expected to be firmly established. Together, the proposed facilities open up an exciting set of new windows on hadronic matter, with relevance to major questions such as quark confinement and hadronic mass generation.

The next instalment of DIS in April 2018, to be held in Kobe, Japan, is eagerly awaited.

Venice EPS event showcases the best of HEP

Major scientific gatherings such as the European Physical Society (EPS) biennial international conference on High Energy Physics offer a valuable opportunity to reflect on the immense work and progress taking place in our field, including the growing connections between particle physics and the universe at large. This year’s EPS conference, held in Venice, Italy, from 5–12 July, was also the first large conference where the results from the 2015 and 2016 runs of the Large Hadron Collider (LHC) at 13 TeV were presented.

Setting the bar just a day into the Venice event, LHCb announced the discovery of a new doubly charmed baryon, with heavy-flavour analyses continuing to offer a rich seam of understanding. LHCb also presented the intriguing anomalies seen in precision measurements of the ratios of certain B-meson decays, which hint at deviations from lepton universality, with further data from LHC Run 2 hotly anticipated.

The LHC is firmly in the precision business these days. In the last two years, the machine has delivered large amounts of collision data to the experiments and striking progress has been made in analysis techniques. These have enabled measurements of rare electroweak processes such as the associated production of a top quark, a Z boson and a quark (tZq) by ATLAS, for example, and the definitive observation of WW scattering by CMS. Top physics is another booming topic, with new top-mass and single-top production measurements and many other results, including “legacy” measurements from the Tevatron experiments, on show.

At the core of the LHC’s analysis programme is the exploration of the Higgs boson, now entering its sixth year. Particularly relevant is how the Higgs interacts with other particles, since this could be altered by physics beyond the Standard Model. While the Higgs was first spotted decaying into other bosons (W, Z, γ), ATLAS reported the first evidence for the decay of the Higgs boson to a pair of bottom quarks, with a significance of 3.6σ, while CMS presented the first observation by a single experiment of its decay to a pair of τ leptons, with a significance of 5.9σ. The measured Higgs mass is also converging on 125 GeV, while the fundamental scalar nature of the new particle continues to raise hopes that it will lead to new insights.

The lack of direct signs of new physics at the LHC is an increasing topic of discussion, and underlines the importance of precision measurements. Direct searches are pushing the mass limits for new particles well into the TeV range, but new physics could be hiding in small and subtle effects. It is clear that there is physics beyond the Standard Model; what is not clear is what it is, and one issue is how to communicate this scientifically fascinating but non-headline-worthy aspect of today’s particle-physics landscape.

High precision is also being attained in studies of the strong interaction. ALICE, for example, reported an increase in strangeness production with charged multiplicity that seems to connect the regimes seen in pp, pPb and PbPb collisions smoothly. Overall, and increasingly with complementary results from the other LHC experiments, ALICE is closing in on the evolution of the quark–gluon plasma, and thus on understanding the very early universe.

Particle physics, astrophysics and cosmology are closer today than ever, as several sessions at the Venice event demonstrated. One clear area of interplay is dark matter: if dark matter interacts only through gravity, then finding it will be very difficult for accelerator-based studies, but if it has a residual interaction with some known particles, then accelerators will be leading the hunt for direct detection. Cosmology’s transformation to a precision science continues with the recent detection of gravitational waves, with LIGO’s results already placing the first limits on the mass of the graviton at less than 7.7 × 10⁻²³ eV/c². There were also updates from dark-energy studies, and about precision CMB explorers beyond Planck.

Neutrino physics is also an extremely vibrant field, with neutrino oscillations continuing to offer chances for discovery. The various neutrino-mixing angles are starting to be well measured, and NOvA and T2K are zooming in on the value of the CP-violating phase, which seems to be large, given tantalising hints from T2K. The hunt for sterile neutrinos continues, as does that for neutrinoless double-beta decay, with several searches ongoing worldwide.

In summary, the 2017 EPS-HEP conference clearly demonstrated how we are progressing towards a full understanding both of the vastness of the universe and of the tiniest constituents of matter. There are many more results to look forward to, many of which will be ready for the next EPS-HEP event in Ghent, Belgium, in 2019. As summed up by the conference highlights: the field is advancing on all fronts – and it’s impressive.

ITER’s massive magnets enter production


It is 14 m high, 9 m wide and weighs 110 tonnes. Fresh off a production line at ASG in Italy, and coated in epoxy Kapton-glass panels, it is the first superconducting toroidal-field coil for the ITER fusion experiment under construction in Cadarache, southern France. The giant D-shaped ring contains 4.5 km of niobium-tin cable (each length containing around 1000 individual superconducting wires) wound into a coil that will carry a current of 68,000 A, generating a peak magnetic field of 11.8 T to confine a plasma at a temperature of 150 million degrees. The coil will soon be joined by 18 others like it, with 10 manufactured in Europe and nine in Japan. After completion at ASG, the European coils will be shipped to SIMIC in Italy, where they will be cooled to 78 K, tested and welded shut in a 180-tonne stainless-steel armour. They will then be impregnated with special resin and machined using one of the largest machines in Europe, before being transported to the ITER site.

Science doesn’t get much bigger than this, even by particle-physics standards. ITER’s goal is to demonstrate the feasibility of fusion power by maintaining a plasma in a self-sustaining “ignition” phase; the project was established by an international agreement ratified in 2007 by China, the European Union (EU), Euratom, India, Japan, Korea, Russia and the US. Following years of delay relating to the preferred site and project costs, ITER entered construction a decade ago and is scheduled to produce first plasma by December 2025. The EU contribution to ITER, corresponding to roughly half the total cost, amounts to €6.6 billion for construction up to 2020.

Fusion for energy

The scale of ITER’s components is staggering. The vacuum vessel that will sit inside the field coils is 10 times bigger than anything before it, measuring 19.4 m across, 11.4 m high and requiring new welding technology to be invented. The final ITER experiment will weigh 23,000 tonnes, almost twice that of the LHC’s CMS experiment. The new toroidal-field coil is the first major magnetic element of ITER to be completed. A series of six further poloidal coils, a central solenoid and a number of correction coils will complete ITER’s complex magnetic configuration. The central solenoid (a 1000 tonne superconducting electromagnet in the centre of the machine) must be strong enough to contain a force of 60 MN – twice the thrust of the Space Shuttle at take-off.


Fusion for Energy (F4E), the EU organisation managing Europe’s contribution to ITER, has been collaborating with industrial partners such as ASG Superconductors, Iberdrola Ingeniería y Construcción, Elytt Energy, CNIM, SIMIC, the ICAS consortium and Airbus CASA to deliver Europe’s share of components in the field of magnets. At least 600 people from 26 companies have been involved in the toroidal-field coil production, and the first coil is the result of almost a decade of work. This involved, among other things, developing new ways to jacket superconducting cables based on materials that are brittle and much more difficult to handle than niobium-titanium. In total, 100,000 km of niobium-tin strands are necessary for ITER’s toroidal-field magnets, increasing worldwide production by a factor of 10.

Since 2008, F4E has signed ITER-related contracts reaching approximately €5 billion, with the magnets amounting to €0.5 billion. Firms that are involved, such as SIMIC where the coils will be tested and Elytt, which has developed some of the necessary tooling, have much to gain from collaborating in ITER. According to Philippe Lazare, CEO of CNIM Industrial Systems Division: “In order to manufacture our share of ITER components, we had to upgrade our industrial facilities, establish new working methods and train new talent. In return, we have become a French reference in high-precision manufacturing for large components.”

CERN connection

Cooling the toroidal-field magnets requires about 5.8 tonnes of helium at a temperature of 4.5 K and a pressure of 6 bar, putting helium in a supercritical phase slightly warmer than it is in the LHC. But ITER’s operating environment is totally different to that of an accelerator, explains head of F4E’s magnets project team Alessandro Bonito-Oliva: “The magnets have to operate subject to lots of heat generated by neutron irradiation from the plasma and AC losses generated inside the cable, which has to be removed, whereas at CERN you don’t have this problem. So the ITER coolant has to be fairly close to the wire – this is why we used forced-flow of helium inside the cable.” A lot of ITER’s superconductor technology work was driven by CERN in improving the characteristics of superconductors, says Bonito-Oliva: “High-energy physics mainly looks for very high current performance, while in fusion it is also important to minimise the AC losses, which generally brings a reduction of current performance. This is why Nb₃Sn strands for fusion and accelerators are slightly different.”

CERN entered a formal collaboration with ITER in March 2008 via a co-operation agreement concerning the design of high-temperature superconducting current leads and other magnet technologies, with CERN’s superconducting laboratory in building 163 becoming one of the “reference” laboratories for testing ITER’s superconducting strands. Niobium-tin is the same material that CERN is pursuing for the high-field magnets of the High-Luminosity LHC and also a possible future circular collider, although the performance demands of accelerator magnets require significant further R&D. Head of CERN’s technology department, Jose Miguel Jimenez, who co-ordinates the collaboration between CERN and ITER, says that in addition to helping with the design of the cable, CERN played a big role in advising on high-voltage testing of the cable insulation and, in particular, on the metallurgical aspect. “Metallurgy is one of the key areas of technology transfer from CERN to ITER. Another is the HTS current leads, which CERN has helped to design in collaboration with the Chinese group working on the ITER tokamak, and in simulating the heat transfer under real conditions,” he explains. “We also helped with the cryoplants, magnetic-field quality, and on central interlocks and safety systems based on our experience with the LHC.”

Slovenia accedes to associate membership

On 4 July, the Republic of Slovenia became an associate member of CERN in the pre-stage to membership. This follows official notification to CERN that Slovenia has completed its internal approval procedures, bringing into force an agreement signed in December 2016. “It is a great pleasure to welcome Slovenia into our ever-growing CERN family as an associate Member State in the pre-stage to membership,” said CERN Director-General Fabiola Gianotti. “This now moves CERN’s relationship with Slovenia to a higher level.”

Slovenian physicists contributed to CERN’s programme long before Slovenia became an independent state in 1991, participating in an experiment at LEAR (the Low Energy Antiproton Ring) and in the DELPHI experiment at CERN’s previous large accelerator, the Large Electron–Positron collider (LEP). In 1991, CERN and Slovenia concluded a co-operation agreement concerning the further development of scientific and technical co-operation in the research projects of CERN. In 2009, Slovenia applied to become a Member State of CERN. For the past 20 years, Slovenian physicists have participated in the ATLAS experiment at the Large Hadron Collider. Their focus has been on silicon tracking, protection devices and computing at the Slovenian Tier-2 data centre, and on the tracker upgrade, making use of the research reactor in Ljubljana for neutron-irradiation studies.

“Slovenia’s membership in CERN will on the one hand facilitate, strengthen and broaden the participation and activities of Slovenian scientists (especially in the field of experimental physics); on the other, it will bring full access of Slovenian industry to CERN orders, which will help it to break through in demanding markets with products with a high degree of embedded knowledge,” said Maja Makovec Brenčič, Slovenian minister of education, science and sport.

Slovenia joins Cyprus and Serbia as an associate Member State in the pre-stage to membership. After a period of five years, the CERN Council will decide on the admission of Slovenia to full membership.

Revamped HIE-ISOLDE serves experiments

CERN’s long-running radioactive-ion-beam facility ISOLDE, which produces beams for a wide range of scientific communities, has recently been upgraded to allow higher-energy beams.

In July, the second phase of the High-Intensity and Energy upgrade (HIE-ISOLDE) saw its first user experiments get under way using the high-resolution Miniball germanium detector, which is specially designed for studying nuclear reactions with low-intensity radioactive ion beams. One of the first experiments looked at electromagnetic interactions between selenium-70 and a platinum target, which allow researchers to determine the shape of this radioactive nucleus. It was performed by a team from the University of the Western Cape in South Africa, marking the first African-led experiment at CERN.

Although HIE-ISOLDE’s first physics experiments began in late 2016, earlier this year the facility added a further cryomodule that had to be calibrated, aligned and tested. Each cryomodule contains five superconducting radio-frequency cavities to accelerate the beam to higher energies, and the facility is now able to accelerate nuclei up to an average energy of 7.5 MeV per nucleon, compared with 5.5 MeV per nucleon last year. The higher energy allows physicists to study the properties of heavier isotopes, and in 2018 a fourth cryomodule will be added to the HIE-ISOLDE linac to reach the final design energy of 10 MeV per nucleon.

The HIE-ISOLDE beams will be available until the end of November, with 13 experiments hoping to use the facility during that time – more than double the number that took data last year.

Precision study reveals proton to be lighter

A team in Germany has made the most precise measurement to date of the mass of a single proton, achieving a precision of 32 parts per trillion (ppt). The result not only improves on the precision of the accepted CODATA value by a factor of three but also disagrees with its central value at a level of 3.3 standard deviations, potentially shedding light on other mysteries surrounding the proton.

The proton mass is a fundamental parameter in atomic and particle physics, influencing atomic spectra and allowing tests of ultra-precise QED calculations. In particular, a detailed comparison between the masses of the proton and the antiproton offers a stringent test of the fundamental CPT invariance of the Standard Model.

The team at the Max Planck Institute for Nuclear Physics (MPIK) in Heidelberg and collaborators from RIKEN in Japan used a bespoke electromagnetic Penning trap cooled to 4 K to store individual protons and highly charged carbon ions. By measuring the characteristic cyclotron frequencies of the trapped particles with ultra-sensitive image-current detectors, the team could determine the mass of the proton in atomic units directly.

For the new measurement, the team stored one proton and one highly charged carbon ion in separate compartments of the apparatus and then transported them alternately into the central measurement compartment. Purpose-built electronics allowed the proton to be interrogated under conditions identical to those of the carbon ion, despite its 12-fold lower mass and six-fold smaller charge, and the ratio of the two measured values yields the proton mass in atomic units directly: 1.007276466583(15)(29) u, where the bracketed figures give the statistical and systematic uncertainties on the last digits.
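
Schematically, the measurement rests on the free-cyclotron relation – given here as a sketch, since the actual analysis includes corrections (for instance for the ions’ axial and magnetron motions) that are not shown:

\nu_c = \frac{qB}{2\pi m}
\quad\Longrightarrow\quad
\frac{m_p}{m(^{12}\mathrm{C}^{6+})} = \frac{q_p}{q_{\mathrm{C}}}\,\frac{\nu_c(^{12}\mathrm{C}^{6+})}{\nu_c(p)} = \frac{1}{6}\,\frac{\nu_c(^{12}\mathrm{C}^{6+})}{\nu_c(p)}

The magnetic field B cancels in the frequency ratio, and because the mass of the ¹²C⁶⁺ ion is known essentially exactly in atomic units (12 u minus six electron masses, corrected for their binding energies), the proton mass follows directly.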

The sensitive single-particle detectors were partly developed by the RIKEN group, drawing on experience gained with similar traps for antimatter research at CERN’s Antiproton Decelerator (AD) – specifically the BASE experiment. “The group around Sven Sturm and Klaus Blaum from MPIK Heidelberg, which did the measurement, has great expertise with carbon, whereas the BASE group contributed proton expertise based on 12 years dealing with protons and antiprotons,” explains RIKEN group leader and BASE spokesperson Stefan Ulmer. “We shared knowledge such as know-how on ultra-sensitive proton detectors and the ‘fast-shuttling’ method developed by BASE to perform the proton–antiproton charge-to-mass ratio measurement.”

Interestingly, the new value of the proton mass is significantly smaller than the accepted one and could therefore be linked to well-known discrepancies in the mass of the heaviest hydrogen isotope, tritium. “Our result contributes to solving this puzzle, since it corrects the proton’s mass in the proper direction,” says Blaum. The result also improves the proton–electron mass ratio by a factor of two, achieving a relative precision of 43 ppt, where the uncertainty arises nearly equally from the proton and the electron mass.

Although carefully conducted cross-check measurements confirmed a series of previously published values of the proton mass and showed that no unexpected systematic effects were imposed by the new method, such a striking departure from the accepted value will likely challenge other teams to revisit the proton mass. The discrepancy has already inspired the MPIK-RIKEN team to further improve the precision of its measurement, for instance by storing a third ion in the trap and measuring it simultaneously to eliminate uncertainties originating from magnetic-field fluctuations, which are the main source of the systematic error using the new technique.

“It is also planned to tune the magnetic field to even higher homogeneity, which will reduce additional sources of systematic error,” explains BASE member Andreas Mooser. “The methods that will be pioneered in the next step of this experiment will have immediate positive feedback to future BASE measurements, for example to improve the precision in the antiproton-to-proton charge-to-mass ratio.”

KEDR pins down R at low energies

The KEDR collaboration has used the VEPP-4M electron–positron collider at the Budker Institute in Russia to make the most precise measurement of the quantity “R” in the low-energy range. R is defined as the ratio of the radiatively corrected total hadronic cross-section in electron–positron annihilation to the Born cross-section of muon-pair production. The dependence of R on the centre-of-mass energy is critical for determining the running strong coupling constant and heavy-quark masses, the anomalous magnetic moment of the muon and the value of the electromagnetic fine-structure constant at the Z peak. A substantial contribution to the uncertainties on these quantities comes from the energy region below charm threshold, where the KEDR measurements were made.
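
In symbols – a schematic statement of the standard definition and the lowest-order expectation, with the radiative corrections applied in the actual analysis not shown:

R(s) = \frac{\sigma(e^+e^- \to \mathrm{hadrons})}{\sigma_{\mathrm{Born}}(e^+e^- \to \mu^+\mu^-)},
\qquad
R \simeq N_c \sum_f Q_f^2 \left(1 + \frac{\alpha_s(s)}{\pi} + \dots\right)

where the sum runs over the quark flavours accessible at the centre-of-mass energy. Below charm threshold only u, d and s contribute, giving the parton-model value 3(4/9 + 1/9 + 1/9) = 2, which the QCD correction raises towards the measured values near 2.2.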

The KEDR team performed a precise measurement of R at 20 points: in the energy ranges 1.84–3.05 and 3.12–3.72 GeV the weighted averages of R are 2.225±0.051 and 2.189±0.047, respectively, in good agreement with perturbative QCD. At present, it is the most accurate measurement of R in this energy range, to which more than 10 experiments have contributed. It involved a challenging analysis in which the hadronisation of light quarks at low energies was modelled by tuning distributions of parameters essential for the event selection in the various generator codes.

The collaboration now plans to measure R in the range 5–7 GeV, where the last similar experiment was carried out more than a quarter of a century ago.

SKA and CERN co-operate on extreme computing

On 14 July, the Square Kilometre Array (SKA) organisation signed an agreement with CERN to formalise their collaboration in the area of extreme-scale computing. The agreement addresses the challenges of “exascale” computing and data storage, as both the SKA and the Large Hadron Collider (LHC) will generate overwhelming volumes of data in the coming years.

When completed, the SKA will be the world’s largest radio telescope, with a total collecting area of more than 1 km² provided by thousands of high-frequency dishes and many more low- and mid-frequency aperture-array telescopes distributed across Africa and Australia (the organisation is headquartered in the UK). Phase 1 of the project, representing approximately 10% of the final array, will generate around 300 PB of data every year – 50% more than has been collected by the LHC experiments in the last seven years. As is the case at CERN, SKA data will be analysed by scientific collaborations distributed across the planet. The acquisition, storage, management, distribution and analysis of such volumes of scientific data is a major technological challenge.

“Both CERN and SKA are and will be pushing the limits of what is possible technologically, and by working together and with industry, we are ensuring that we are ready to make the most of this upcoming data and computing surge,” says SKA director-general Philip Diamond.

CERN and SKA have agreed to hold regular meetings to discuss the strategic direction of their collaborations, and develop demonstrator projects or prototypes to investigate concepts for managing and analysing exascale data sets in a globally distributed environment. “The LHC computing demands are tackled by the Worldwide LHC Computing Grid, which employs more than half a million computing cores around the globe interconnected by a powerful network,” says CERN’s director of research and computing Eckhard Elsen. “As our demands increase with the planned intensity upgrade of the LHC, we want to expand this concept by using common ideas and infrastructure into a scientific cloud. SKA will be an ideal partner in this endeavour.”
