Dimensional analysis is a mathematical technique that allows one to deduce the relationship between different physical quantities from the dimensions of the variables involved in the system under study. It provides a method to simplify – when possible – the resolution of complex physical problems.
This short book provides an introduction to dimensional analysis, covering its history, methods and formalisation, and shows its application to a number of physics and engineering problems. As the author explains, the founding principle of dimensional analysis is essentially a more precise version of the well-known rule against “adding apples and oranges”; nevertheless, the successful application of this technique requires physical intuition and some experience. Most of the time it does not lead to the solution of the problem by itself, but it can provide important hints about the direction to take, constraints on the relationships between physical variables and constants, or a confirmation of the correctness of calculations.
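To give a flavour of the kind of reasoning involved – a standard textbook illustration, not necessarily one drawn from this book – suppose the period $T$ of a simple pendulum depends only on its length $\ell$, the mass $m$ of the bob and the gravitational acceleration $g$. Writing
$$T = C\,\ell^{a} m^{b} g^{c}, \qquad [T]=\mathrm{s},\ [\ell]=\mathrm{m},\ [m]=\mathrm{kg},\ [g]=\mathrm{m\,s^{-2}},$$
and matching the powers of each base unit gives $b = 0$, $a + c = 0$ and $-2c = 1$, hence
$$T = C\sqrt{\ell/g},$$
where the dimensionless constant $C$ (equal to $2\pi$ for small oscillations) cannot be fixed by dimensional analysis alone.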
After a chapter covering the basics of the method and some historical notions about it, the book offers examples of the application of dimensional analysis in several areas: mechanics, hydrodynamics, thermal physics, electrodynamics and quantum physics. Through the solution of these real problems, the author shows the possibilities and limitations of this technique. In the final chapter, dimensional analysis is used to take a few steps in the direction of uncovering the dimensional structure of the universe.
Aimed primarily at physics and engineering students in their first university courses, the book can also be useful to more experienced students and professionals. Being concise and providing problems with solutions at the end of each chapter, it is ideal for self-study.
This textbook aims to provide a concise introduction to string theory for undergraduate and graduate students.
String theory was first proposed in the 1960s and has become one of the main candidates for a possible quantum theory of gravity. While going through alternating phases of highs and lows, it has influenced numerous areas of physics and mathematics, and many theoretical developments have sprung from it.
The author’s intention was to include in the book just the fundamental concepts and tools of string theory, rather than to be exhaustive. As Schomerus states, various textbooks are already available that cover this field in detail, from its roots to its most modern developments, but these can be sprawling and overwhelming for students approaching the topic for the first time.
The volume is composed of a brief historical introduction and two parts, each including various chapters. The first part is dedicated to the dynamics of strings moving in a flat Minkowski space. While these string theories do not describe nature, their study is helpful for understanding many basic concepts and constructions, and for exploring the relation between string theory and field theory on a two-dimensional “world”.
The second part deals with string theories for four-dimensional physics, which can be relevant to the description of our universe. In particular, it studies the motion of superstrings on backgrounds in which some of the dimensions are curled up (a scenario known as compactification). This part, in turn, is divided into three sections, each devoted to a different subtopic.
First, the author discusses conformal field theory, also dealing with the SU(2) Wess–Zumino–Novikov–Witten model. He then moves on to Calabi–Yau spaces and the associated string compactifications. Finally, he focuses on string dualities, giving special emphasis to the AdS/CFT correspondence and its application to gauge theory.
This book, the 27th volume in the “Advanced Series on Directions in High Energy Physics”, presents a robust and accessible summary of 60 years of technological development at CERN. Over this period, the foundations of today’s understanding of matter, its fundamental constituents and the forces that govern its behaviour were laid and, piece by piece, the Standard Model of particle physics was established. All this was possible thanks to spectacular advances in the field of particle accelerators and detectors, which are the focus of this volume. Each of the 12 chapters is built using contributions from the physicists and engineers who played key roles in this great scientific endeavour.
After a brief historical introduction, the story starts with the Synchrocyclotron (SC), CERN’s first accelerator, which allowed – among other things – innovative experiments on pion decay and a measurement of the anomalous magnetic dipole moment of the muon. While the SC was a development of techniques employed elsewhere, the Proton Synchrotron (PS), the second accelerator constructed at CERN and now the cornerstone of the laboratory’s accelerator complex, was built using the new and “disruptive” strong-focusing technique. Fast extraction from the PS combined with the van der Meer focussing horn were key to the success of a number of experiments with bubble chambers and, in particular, to the discovery of the weak neutral current using the large heavy-liquid bubble chamber Gargamelle.
The book goes on to present the technological developments that led to the discovery of the Higgs boson by the ATLAS and CMS collaborations at the LHC, and the study of heavy-quark physics as a means to understand the dynamics of flavour and the search for phenomena not described by the SM. The taut framework that the SM provides is evident in the concise reviews of the experimental programme of LEP: the exquisitely precise measurements of the properties of the W and Z bosons, as well as of the quarks and the leptons – made by the ALEPH, DELPHI, OPAL and L3 experiments – were used to demonstrate the internal consistency of the SM and to correctly predict the mass of the Higgs boson. An intriguing insight into the breadth of expertise required to deliver this programme is given by the discussion of the construction of the LEP/LHC tunnel, where the alignment requirements were such that the geodesy needed to account for local variations in the gravitational potential and measurements were verified by observations of the stars.
The rich scientific programme of the LHC, and of LEP before it, has its roots in the systematic development of accelerator and detector techniques. The accelerator complex at CERN has grown out of the SC.
The book concisely presents the painstaking work required to deliver the PS, the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS). Experimentation at these facilities established the quark-parton model and quantum chromodynamics (QCD), demonstrated the existence of charged and neutral weak currents, and pointed out weaknesses in our understanding of the structure of the nucleon and the nucleus. The building of the SPS was expedited by the decision to use single-function magnets that enabled a staged approach to its construction. The description of the technological innovations that were required to realise the SPS includes the need for a distributed, user-friendly control-and-monitoring system. A novel solution was adopted that exploited an early implementation of a local-area network and for which a new, interpretative programming language was developed.
The book also describes the introduction of the new isotope separation on-line technique, which allows highly unstable nuclei to be studied, and its evolution into research on nuclear matter in extreme conditions at ISOLDE and its upgrades. The study of heavy-ion collisions, in fixed-target experiments at the SPS and now in the ALICE experiment at the LHC, also has its roots in the early nuclear-physics programme. The SC, and later the PS, were ideal tools to create the intense low-energy beams used to test fundamental symmetries, to search for rare decays of hadrons and leptons, and to measure the parameters of the SM.
Reading this chronicle of CERN’s outstanding record, I was struck by its extraordinary pedigree of innovation in accelerator and detector technology. Among the many examples of groundbreaking innovation discussed in the book is the construction of the ISR which, by colliding beams head on, opened the path to today’s energy frontier. The ISR programme created the conditions for pioneering developments such as the multi-wire proportional chamber and the transition-radiation detector, as well as large-acceptance magnetic spectrometers for colliding-beam experiments. Many of the technologies that underpin the success of the proton–antiproton (SppS) collider, LEP and the LHC were innovations pioneered at the ISR. For example, the discovery of the W and Z bosons at the SppS relied on the demonstration of stochastic cooling and antiproton accumulation. The development of these techniques allowed CERN to establish its antiproton programme, which encompassed the search for new phenomena at the energy frontier, as well as the study of discrete symmetries using neutral kaons at CPLEAR and the detailed study of the properties of antimatter.
The volume includes contributions on the development of the computing, data-handling and networking systems necessary to maximise the scientific output of the accelerator and detector facilities. From the digitisation and handling of bubble- and spark-chamber images in the SC era, to the distributed processing possible on the Worldwide LHC Computing Grid, the CERN community has always developed imaginative solutions to its data-processing needs.
The book concludes with thoughtful chapters that describe the impact on society of the technological innovations driven by the CERN programme, the science and art of managing large, technologically challenging and internationally collaborative projects, and a discussion of the R&D programme required to secure the next 60 years of discovery.
The contributions from leading scientists of the day collected in this relatively slim book document CERN’s 60-year voyage of innovation and discovery, the repercussions of which vindicate the vision of those who drove the foundation of the laboratory – European in constitution, but global in impact. The spirit of inclusive collaboration, which was a key element of the original vision for the laboratory, together with the aim of technical innovation and scientific excellence, are reflected in each of the articles in this unique volume.
Superconductivity underpins large particle accelerators such as the LHC. It is also a key enabling technology for a future circular proton–proton collider reaching energies of 100 TeV, as is currently being explored by the Future Circular Collider (FCC) study. To address the considerable challenges of this project, a conductor development workshop was held at CERN on 5 and 6 March to create momentum for the FCC study and bring together industrial and academic partners.
The alloy niobium titanium is the most successful practical superconductor to date, and has been used in all superconducting particle accelerators and detectors. But the higher magnetic fields required for the high-luminosity LHC (11 T) and FCC (16 T) call for new materials. A potential superconducting technology suitable for accelerator magnets beyond fields of 10 T is the compound niobium tin (Nb3Sn), which is the workhorse of the 16 T magnet-development programme at CERN.
The FCC conductor programme aims to develop Nb3Sn multi-filamentary wires with a critical current density of at least 1500 A/mm² at 16 T and a temperature of 4.2 K. This is 30 to 50% higher than that of the conductor for the HL-LHC, and a significant R&D effort – including fundamental research on superconductors – is needed to meet the magnet requirements of future higher-energy accelerators. The FCC magnets will also require thousands of tonnes of superconductor, calling for a wire design suitable for industrial-scale production at a considerably lower cost than current high-field conductors.
CERN is engaged in collaborative conductor development activities with a number of industrial and academic partners to achieve these challenging targets, and the initial phase of the programme will last for four years. Representatives from five research institutes and seven companies from the US, Japan, Korea, Russia, China and Europe attended the March meeting to discuss progress and opportunities. Firms already producing Nb3Sn superconducting wire for the FCC programme are Kiswire Advanced Technology (KAT); the TVEL Fuel Company working with the Bochvar Institute (JSC VNIINM); and, from Japan, Furukawa Electric and Japan Superconductor Technology (JASTEC), both coordinated by the KEK laboratory. Columbus Superconductor SpA is participating in the programme for other superconducting materials, while two additional companies – Luvata and Western Superconducting Technologies (WST) – expressed their interest in the CERN conductor programme and attended the workshop.
The early involvement of industry is crucial, and the event provided an environment in which industrial partners were free to discuss their proposed technical solutions openly. In the past, most companies produced a bronze-route Nb3Sn superconductor, which has no potential to reach the target for the FCC. Thanks to their commitment to the programme, and with CERN’s support, companies are now investing in a transition to internal-tin processes. Innovative approaches for characterising superconducting wires are also coming out of academia. Developments include the correlation of microstructures, compositional variations and superconducting properties at TU Wien, research into promising internal-oxidation routes at the University of Geneva, phase-transformation studies at TU Bergakademie Freiberg and research into novel superconductors for high fields at SPIN in Genoa.
The FCC initiative is of key importance for future high-energy accelerators. Participants agreed that this could result in a new class of high-performance Nb3Sn material suitable not only for accelerator magnets, but also for other large-scale applications such as high-field NMR and laboratory solenoids.
The 2018 Moriond sessions took place in La Thuile, Italy, from 10 to 24 March. The annual conference is an opportunity to review the progress taking place over the breadth of particle physics, from B physics to gravitational waves and from advances in electroweak precision tests to exploratory searches for dark matter. The quest for new particles covers an impressive 40 orders of magnitude, from the 10⁻²² eV region explored via neutron-spin precession to the 13 TeV energy of the LHC and the highest-energy phenomena in cosmic rays.
Anomalies in the decays of beauty quarks found by the LHCb and B-factory experiments continue to entice theorists to look for explanations for these possible hints of lepton non-universality, and experimental updates are eagerly awaited (CERN Courier April 2018 p23). Progress continues in the field of CP violation in B and D mesons, while quantitative tests of the CKM matrix are being helped by precise calculations in lattice QCD. Progress on leptonic and semi-leptonic D-meson decays was reported from BES-III, while Belle showed hints of the decay B⁺ → μ⁺ν and evidence of isospin violation. In the classic field of rare kaon decays, the CERN SPS experiment NA62 showed its first results, presenting one candidate event for the elusive decay K⁺ → π⁺νν̄, obtained using a novel in-flight technique.
Fundamental parameters of the Standard Model (SM), such as the masses of the top quark and W boson, are being measured with increasing precision. The SM is in very good shape, apart from the long-standing exception of forward–backward asymmetries. These asymmetries are also being studied at the LHC, and precise results continue to be produced at the Tevatron.
Results on top-quark production and properties are constantly being improved, while hadron spectroscopy is as lively as ever, both in the light-meson sector (BESIII) and in heavy quarks (BaBar, Belle and LHCb). Data from HERA are still providing new inputs into structure functions, with c and b quarks now being included. Heavy-ion collisions at the LHC and RHIC continue to explore the behaviour of the hot, dense quark–gluon plasma, while proton–ion collisions at fixed-target experiments (LHCb) provide useful inputs to constrain Monte Carlo event generators.
The news on the Brout–Englert–Higgs mechanism is good, with progress on many fronts. The wealth of new results presented by ATLAS and CMS, including evidence for ttH production and global combinations of production and decay channels, shows that the precision on the couplings between the Higgs boson and other particles is improving fast. The study of rare decays of the Higgs boson is advancing rapidly, with the H → μ⁺μ⁻ decay within reach.
The search for heavy resonances is continuing at full speed, with CMS presenting a Z′ analysis employing the full available LHC dataset (77.3 fb⁻¹), including 2017 data. Is supersymmetry hiding somewhere? Several analyses at ATLAS and CMS are now being recast to include more elusive signatures with various amounts of R-parity violation and degenerate spectra, and there is an emerging interest in performing searches beyond narrow-width approximations.
The search for dark matter is on, with WIMP direct searches maturing rapidly (XENON1T) and including novel experiments like DARKSIDE which, with just 20 l of very pure liquid argon, presented a new best limit at low masses. This field shows that, with ingenuity, there is still room to have an impact. Bounds on extremely light axion-like particles were presented, including results from ADMX on QCD axions, alongside new limits on neutron electric dipole moments. These dedicated experiments and the search for directly produced dark matter at the LHC are highly complementary.
The field of neutrinos continues to offer steady progress, with new and old puzzles being addressed. The latest results from T2K disfavour CP conservation at the level of two sigma, while NOvA disfavours the inverted hierarchy at a similar level. A revival of decay-at-rest techniques and the measurement of coherent elastic neutrino–nucleus scattering by COHERENT (CERN Courier October 2017 p8) were noticeable. The search for heavy neutral leptons is taking place at both fixed-target and collider experiments, while reactor experiments (like Daya Bay and STEREO) aim to clarify the reactor antineutrino anomaly. The puzzle of sterile neutrinos remains unresolved after 20 years. Deep-sea (ANTARES) and South Pole (IceCube) experiments are now mature, with ANTARES showing, among other things, searches for point-like sources. IceCube presented a brand-new analysis looking for tau-neutrino appearance that is competitive with existing results. Neutrinoless double-beta decay experiments are now biting into the sensitivity region of the inverted mass hierarchy (CUORE and EXO-200), with promising developments in the pipeline (CUPID).
Completing the programme of the electroweak session was a glimpse into the physics of cosmic rays and gravitation. The sensitivity of AUGER is now such that mapping the origins of cosmic rays on the sky is becoming feasible. With the observation of a binary neutron-star merger by LIGO and VIRGO, 2017 saw the birth of multi-messenger astronomy.
On the theory side, one continues to learn from the abundance of experimental results, and there is still so much to be learned by the study of the Higgs and further high-energy exploration. SM computations are breaking records in terms of the numbers of loops and legs involved. Electroweak and flavour physics can indicate the way to new physics scales and extend the motivation to search for dark matter at very low energies. The case to study neutrinos remains as compelling as ever, with many outstanding questions still waiting for answers.
Shortly after midday on 30 March, protons circulated in the Large Hadron Collider (LHC) for the first time in 2018. Following its annual winter shutdown for maintenance and upgrades, the machine now enters its seventh year of data taking and its fourth year operating at a centre-of-mass energy of 13 TeV.
The LHC restart, which involves numerous other links in the CERN accelerator chain, went smoothly. At the beginning of March, the first protons were injected into Linac2, and then into the Proton Synchrotron (PS) Booster. On 8 March the PS received beams, followed by the Super Proton Synchrotron (SPS) one week later. In parallel, the teams had been checking all the LHC hardware and safety installations. No fewer than 1560 electrical circuits had to be powered and about 10,000 tests performed before the LHC was deemed ready to accept protons.
The first circulating beams contained just a single bunch each, with around 20 times fewer protons than in normal operation; the beam energy was also limited to the SPS injection energy of 450 GeV. Further adjustments and tests were undertaken in early April to allow the energy and density of the bunches to be increased.
Bunching up
As the Courier went to press, a few bunches had been injected and accelerated to full energy for optics and collimator commissioning. The first stable beams with only a few bunches are scheduled for 23 April, but could take place earlier thanks to the good progress made so far. This will be followed by a period of gradual intensity ramp-up, during which the number of bunches will be increased stepwise. Between each step, a formal check and validation will take place. The target is to fill each ring with 2556 bunches, and the experiments will be able to undertake serious data collection as soon as the number rises above 1200 bunches – which is expected in early May.
Since early December 2017, when the CERN accelerator complex entered its end-of-year technical stop, numerous important activities have been completed on the LHC and other accelerators. Alongside standard maintenance, the LHC injectors underwent significant preparatory work for the LHC Injector Upgrade (LIU) project foreseen for 2019 and 2020 (CERN Courier October 2017 p32). In the LHC, an important activity was the partial warm-up of sector 1-2 to solve the so-called 16L2 issue, wherein frozen air from an accidental ingress caused beam instabilities and losses during last year’s run: a total of 7 l of frozen air was removed from each beam vacuum chamber during the warm-up.
The objective for the 2018 run is to accumulate more data than was collected last year, targeting an integrated luminosity of 60 fb⁻¹ (as opposed to the 50 fb⁻¹ recorded in 2017). While the intensity of collisions is being ramped up in the LHC, data taking is already under way at various fixed-target experiments at CERN that are served by beams from the PS Booster, PS and SPS. The first physics beams were delivered to the n_TOF experiment and the PS East Area on 30 March. The nuclear-physics programme at ISOLDE restarted on 9 April, followed closely by that of the SPS North Area and, later, the Antiproton Decelerator.
2018 is an important year for the main LHC experiments (ALICE, ATLAS, CMS and LHCb) because it marks the last year of Run 2. In December, the accelerator complex will be shut down for a period of two years to allow significant upgrade work for the High-Luminosity LHC (HL-LHC), with the deployment of the LIU project and the start of civil-engineering work. Operation of the HL-LHC will begin in earnest in the mid-2020s, promising an integrated luminosity of 3000 fb⁻¹ by circa 2035.
On 20 March, the DESY laboratory in Germany presented its strategy for the coming decade, outlining the areas of science and innovation it intends to focus on. DESY is a member of the Helmholtz Association, a union of 18 scientific-technical and medical-biological research centres in Germany with a workforce of 39,000 and annual budget of €4.5 billion. The laboratory’s plans for the 2020s include building the world’s most powerful X-ray microscope (PETRA IV), expanding the European X-ray free-electron laser (XFEL), and constructing a new centre for data and computing science.
Founded in 1959, DESY became a leading high-energy-physics laboratory and today remains among the world’s top accelerator centres. Since the closure of the HERA collider in 2007, the lab’s main accelerators have been used to generate synchrotron radiation for research into the structure of matter, while DESY’s particle-physics division carries out experiments at other labs such as those at CERN’s Large Hadron Collider.
Together with other facilities on the Hamburg campus, DESY aims to strengthen its role as a leading international centre for research into the structure, dynamics and function of matter using X-rays. PETRA IV is a major upgrade to the existing light source at DESY that will allow users to study materials and other samples in 100 times more detail than currently achievable, approaching the limit of what is physically possible with X-rays. A technical design report will be submitted in 2021 and first experiments could be carried out in 2026.
Together with the international partners and operating company of the European XFEL, DESY is planning to comprehensively expand this advanced X-ray facility (which starts at the DESY campus and extends 3.4 km northwest). This includes developing the technology to increase the number of X-ray pulses from 27,000 to one million per second (CERN Courier July/August 2017 p18).
As Germany’s most important centre for particle physics, DESY will continue to be a key partner in international projects and to set up an attractive research and development programme. DESY’s Zeuthen site, located near Berlin, is being expanded to become an international centre for astroparticle physics, focusing on gamma-ray and neutrino astronomy as well as on theoretical astroparticle physics. A key contribution to this effort is a new science data-management centre for the planned Cherenkov Telescope Array (CTA), the next-generation gamma-ray observatory. DESY is also responsible for building CTA’s medium-sized telescopes and, as Europe’s biggest partner in the neutrino telescope IceCube located in the Antarctic, is playing an important role in upgrades to the facility.
The centre for data and computing science will be established at the Hamburg campus to meet the increasing demands of data-intensive research. It will start working as a virtual centre this year and there are plans to accommodate up to six scientific groups by 2025. The centre is being planned together with universities to integrate computer science and applied mathematics.
Finally, the DESY 2030 report lists plans to substantially increase technology transfer to allow further start-ups in the Hamburg and Brandenburg regions. DESY will also continue to develop and test new concepts for building compact accelerators in the future, and is developing a new generation of high-resolution detector systems.
“We are developing the campus in Hamburg together with partners at all levels to become an international port for science. This could involve investments worth billions over the next 15 years, to set up new research centres and facilities,” said Helmut Dosch, chairman of DESY’s board of directors, at the launch event. “The Zeuthen site, which we are expanding to become an international centre for astroparticle physics, is undergoing a similarly spectacular development.”
Following severe damage caused by flooding on 9 November, the INFN-CNAF Tier-1 data centre of the Worldwide LHC Computing Grid (WLCG) in Bologna, Italy, has been fully repaired and is back in business crunching LHC data. The incident was caused by the burst of a large water pipe at high pressure in a nearby street, which rapidly flooded the area where the data centre is located. Although the centre was designed to be protected against flooding from natural events, the volume of water was overwhelming: some 500 m³ of water and mud entered the various rooms, seriously damaging electronic appliances, computing servers, network and storage equipment. A room hosting four 1.4 MW electrical-power panels was filled first, leaving the centre without electricity.
The Bologna centre, which is one of 14 Tier-1 WLCG centres located around the world, hosts a good fraction of LHC data and associated computing resources. It is equipped with around 20,000 CPU cores, 25 PB of disk storage, and a tape library presently filled with about 50 PB of data. Offline computing activities for the LHC experiments were immediately affected. About 10% of the servers, disks, tape cartridges and computing nodes were reached by floodwater, and the mechanics of the tape library were also affected.
Despite the scale of the damage, INFN-CNAF personnel were not discouraged, quickly defining a roadmap to recovery and then tackling the affected subsystems one by one. First, the rooms at the centre had to be dried and then meticulously cleaned to remove residual mud. Then, within a few weeks, new electrical panels were installed to allow subsystems to be turned back on.
Although all LHC disk-storage systems were reached by the water, the INFN-CNAF personnel were able to recover the data in their entirety, without losing a single bit. This was thanks in part to the available level of redundancy of the disk arrays and to their vertical layout. Wet tape cartridges hosting critical LHC data had to be sent to a specialised laboratory for data recovery.
A dedicated computing farm was set up very quickly at the nearby Cineca computing centre and connected to INFN-CNAF via a high-speed 400 Gbps link to enable the centre to reach the required LHC capacity for 2018. During March, three months after the incident, all LHC experiments were progressively brought back online. Following the successful recovery, INFN is planning to move the centre to a new site in the coming years.
The NA62 collaboration at CERN has found a candidate event for the ultra-rare decay K⁺ → π⁺νν̄, demonstrating the experiment’s potential to test heavily suppressed corners of the Standard Model (SM).
The SM prediction for the K⁺ → π⁺νν̄ branching fraction is (0.84 ± 0.03) × 10⁻¹⁰. The very small value arises from the underlying coupling between s and d quarks, which only occurs in loops and is suppressed by the couplings of the quark-mixing CKM matrix. The SM prediction for this process is very clean, so finding even a small deviation would be a strong indicator of new physics.
NA62 was approved a decade ago and builds on a long tradition of kaon experiments at CERN (CERN Courier June 2016 p24). The experiment acts as a kaon factory, producing kaon-rich beams by firing high-energy protons from the Super Proton Synchrotron into a beryllium target and then using advanced Cherenkov and straw trackers to identify and measure the particles (see figure). Following pilot and commissioning runs in 2014 and 2015, the full NA62 detector was installed in 2016, enabling the first analysis of the K⁺ → π⁺νν̄ channel.
Finding one candidate event from a sample of around 1.2 × 10¹¹ events allowed the NA62 team to put an upper limit on the branching fraction of 14 × 10⁻¹⁰ at a confidence level of 95%. The result, first presented at Moriond in March, is thus compatible with the SM prediction, although the statistical errors are too large to probe beyond-SM physics.
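The logic behind such a limit can be sketched roughly as follows (an illustrative simplification, not the collaboration’s published statistical treatment, which accounts for expected background and systematic uncertainties): with one observed event and negligible background, Poisson statistics give the 95% CL upper limit on the mean number of signal events, and multiplying that by the single-event sensitivity – the branching fraction that would yield one expected signal event – gives the limit on the branching fraction. The single-event sensitivity used below is an assumed placeholder of roughly the right order of magnitude, not the NA62 value.

# Illustrative sketch in Python: Poisson 95% CL upper limit with negligible background,
# converted to a branching-fraction limit via an assumed single-event sensitivity.
from scipy.stats import chi2

def poisson_upper_limit(n_obs, cl=0.95):
    # Frequentist upper limit on a Poisson mean for n_obs observed events
    # and zero expected background, via the standard chi-squared relation.
    return 0.5 * chi2.ppf(cl, 2 * (n_obs + 1))

n_obs = 1          # one candidate K+ -> pi+ nu nubar event
ses = 3.0e-10      # assumed single-event sensitivity (placeholder)

n_up = poisson_upper_limit(n_obs)   # about 4.7 signal events at 95% CL
br_limit = n_up * ses               # about 1.4e-9, i.e. roughly 14 x 10^-10

print(f"95% CL upper limit on signal events: {n_up:.2f}")
print(f"Corresponding branching-fraction limit: {br_limit:.1e}")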
Several candidate K⁺ → π⁺νν̄ events have previously been reported by the E949 and E787 experiments at Brookhaven National Laboratory in the US, from which a branching fraction of (1.73 ± 1.1) × 10⁻¹⁰ was inferred – again consistent, within large errors, with the SM prediction. Whereas the Brookhaven experiments observed kaon decays at rest in a target, NA62 observes them in flight as they travel through a large vacuum tank, giving a cleaner environment with fewer background events.
The NA62 collaboration expects to identify more events in the ongoing analysis of a 20-fold-larger dataset recorded in 2017. In mid-April the experiment began its 2018 operations with the aim of running for a record 218 days. If the SM prediction is correct, the experiment is expected to see about 20 events in the data collected before the end of this year.
“The K⁺ → π⁺νν̄ decay is special because, within the SM, it allows one to extract the CKM element |Vtd| with a small theoretical uncertainty,” explains NA62 spokesperson Augusto Ceccucci. “Developing the necessary experimental sensitivity to be able to observe this decay in-flight has involved a long R&D programme over a period of five years, and this effort is now starting to pay off.”
The ALPHA collaboration at CERN’s Antiproton Decelerator (AD) has reported the most precise direct measurement of antimatter ever made. The team has determined the spectral structure of the antihydrogen 1S–2S transition with a precision of 2 × 10⁻¹², heralding a new era of high-precision tests between matter and antimatter and marking a milestone in the AD’s scientific programme (CERN Courier March 2018 p30).
Measurements of the hydrogen atom’s spectral structure agree with theoretical predictions at the level of a few parts in 10¹⁵. Researchers have long sought to match this stunning level of precision for antihydrogen, offering unprecedented tests of CPT invariance and searches for physics beyond the Standard Model. Until recently, the difficulty of producing and trapping sufficient numbers of delicate antihydrogen atoms, and of acquiring the necessary optical laser technology to interrogate their spectral characteristics, had kept serious antihydrogen spectroscopy out of reach. Following a major programme by the low-energy-antimatter community at CERN over the past two decades and more, these obstacles have now been overcome.
“This is real laser spectroscopy with antimatter, and the matter community will take notice,” says ALPHA spokesperson Jeffrey Hangst. “We are realising the whole promise of CERN’s AD facility; it’s a paradigm change.”
ALPHA confines antihydrogen atoms in a magnetic trap and then measures their response to a laser with a frequency corresponding to a specific spectral transition. In late 2016, the collaboration used this approach to measure the frequency of the 1S–2S transition (between the lowest-energy state and the first excited state) of antihydrogen with a precision of 2 × 10⁻¹⁰, finding good agreement with the equivalent transition in hydrogen (CERN Courier January/February 2017 p8).
The latest result from ALPHA takes antihydrogen spectroscopy to the next level, using not just one but several laser frequencies, detuned slightly below and above the 1S–2S transition frequency in hydrogen. This allowed the team to measure the spectral shape, or spread in colours, of the 1S–2S antihydrogen transition and obtain a more precise measurement of its frequency (see figure). The shape of the spectral line agrees very well with that expected for hydrogen, while the 1S–2S resonance frequency agrees at the level of 5 kHz out of 2.5 × 10¹⁵ Hz. This is consistent with CPT invariance at a relative precision of 2 × 10⁻¹² and corresponds to an absolute energy sensitivity of 2 × 10⁻²⁰ GeV.
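The quoted figures can be cross-checked with simple arithmetic (a back-of-the-envelope sketch, not part of ALPHA’s analysis): dividing the 5 kHz level of agreement by the 2.5 × 10¹⁵ Hz transition frequency gives the relative precision, and multiplying 5 kHz by the Planck constant converts it into an absolute energy.

# Back-of-the-envelope check of the quoted precision figures (illustrative only)
h_planck = 6.62607015e-34          # Planck constant, J s
ev_per_joule = 1 / 1.602176634e-19

delta_f = 5.0e3                    # frequency agreement between H and anti-H, Hz
f_1s_2s = 2.5e15                   # 1S-2S transition frequency, Hz

relative_precision = delta_f / f_1s_2s                 # ~2e-12
energy_gev = h_planck * delta_f * ev_per_joule / 1e9   # ~2e-20 GeV

print(f"Relative precision: {relative_precision:.1e}")
print(f"Absolute energy sensitivity: {energy_gev:.1e} GeV")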
Although the precision still falls short of that for ordinary hydrogen, the rapid progress made by ALPHA suggests hydrogen-like precision in antihydrogen is now within reach. The collaboration has also used its unique setup at the AD to tackle the hyperfine and other key transitions in the antihydrogen spectrum, with further seminal results expected this year. “When you look at the lineshape, you feel you have to pinch yourself – we are doing real spectroscopy with antimatter!” says Hangst.