Vector-boson scattering probes quartic coupling

Figure 1

The electroweak (EW) sector of the Standard Model (SM) predicts self-interactions between W and Z gauge bosons through triple and quartic gauge couplings. Following first measurements at LEP and at the Tevatron during the 1990s, these interactions are now a core part of the LHC physics programme, as they offer key insights into EW symmetry breaking, through which, in the SM, the W and Z bosons acquire mass via the Brout–Englert–Higgs mechanism. The quartic coupling can be probed at colliders via rare processes such as tri-boson production, which the CMS collaboration observed for the first time earlier this year, and vector-boson scattering (VBS).

The scattering of longitudinally polarised W and Z bosons is a particularly interesting probe of the SM, as its tree-level amplitudes would violate unitarity at high energies without delicate cancellations from quartic gauge couplings and Higgs-boson contributions. The study of VBS processes therefore provides key insight into both the quartic gauge couplings and the Higgs sector. These processes are also sensitive to enhancements from physics beyond the SM, for example in models that extend the Higgs sector with additional Higgs bosons contributing to VBS.
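
Schematically (a textbook sketch rather than anything specific to the CMS analysis; here v ≈ 246 GeV is the Higgs vacuum expectation value and s the squared scattering energy), the gauge diagrams alone leave a longitudinal-scattering amplitude that grows with energy, and the Higgs-exchange contribution removes the growth:

    % Schematic high-energy behaviour of longitudinal vector-boson scattering:
    % the gauge-only amplitude grows linearly with s ...
    \mathcal{M}_{\text{gauge}}\bigl(W_L W_L \to W_L W_L\bigr) \;\sim\; \frac{s}{v^2},
    % ... and the Higgs-exchange piece cancels this growth at s >> m_H^2:
    \mathcal{M}_{\text{Higgs}} \;\sim\; -\frac{s}{v^2}\,\frac{s}{s - m_H^2}
    \;\xrightarrow{\;s\,\gg\,m_H^2\;}\; -\frac{s}{v^2}

The sum therefore stays bounded, of order m_H²/v², which is why VBS is so sensitive to any deviation in either the quartic couplings or the Higgs couplings.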

Vector-boson scattering is characterised by the presence of two forward jets with a large di-jet invariant mass and a large rapidity separation. CMS previously reported the first observation of same-sign W±W± production using the data collected in 2016. The same-sign W±W± process was chosen because its background yield from other SM processes is smaller than that of the opposite-sign W+W– process. The collaboration has now updated this analysis and performed new studies of the EW production of two jets in association with WZ and ZZ boson pairs, using data collected between 2016 and 2018 at a centre-of-mass energy of 13 TeV and corresponding to an integrated luminosity of 137 fb–1. Vector-boson pairs were selected by their decays to electrons and muons. The W±W± and WZ production modes were studied by simultaneously measuring their production cross sections using several kinematical observables. The measured total cross section for W±W± production, 3.98 ± 0.45 (± 0.37 stat. only) fb, is the most precise to date, with a relative uncertainty of roughly 10%. No deviation from SM predictions is evident.
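
As a rough consistency check of the quoted precision (assuming, as is conventional but not stated above, that the statistical and systematic uncertainties add in quadrature):

    % Relative precision and the implied systematic component
    \frac{\Delta\sigma}{\sigma} \;=\; \frac{0.45}{3.98} \;\approx\; 11\%,
    \qquad
    \Delta\sigma_{\text{syst}} \;\approx\; \sqrt{0.45^2 - 0.37^2}\;\text{fb} \;\approx\; 0.26\ \text{fb}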

Though the contribution from background processes induced by the strong interaction is considerably larger in the WZ and ZZ final states, these final states can be more fully reconstructed than W±W±, allowing the scattering centre-of-mass energy and the polarisation of the final-state bosons to be measured. To optimally isolate signal from background, the kinematical information of the WZ and ZZ candidate events is exploited with boosted-decision-tree and matrix-element-likelihood techniques, respectively (see figure). The observed statistical significances for the WZ and ZZ processes are 6.8 and 4.0 standard deviations, respectively, in line with the expected SM significances of 5.3 and 3.5 standard deviations. Anomalous quartic gauge couplings would manifest themselves as an excess of events over the SM predictions; strong new constraints on the structure of quartic gauge couplings have been set within the framework of dimension-eight effective-field-theory operators.
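
For readers curious how such a multivariate separation works in practice, here is a minimal sketch (synthetic data and invented feature distributions; not the CMS analysis code) of training a boosted decision tree on two typical VBS observables, the di-jet invariant mass and the di-jet rapidity separation:

    # Illustrative only: a boosted decision tree separating a VBS-like
    # "signal" from a QCD-like "background" using synthetic kinematics.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 10_000
    # Invented distributions: signal tends to larger m_jj (GeV) and larger
    # |delta eta_jj| than the strong-interaction-induced background.
    sig = np.column_stack([rng.normal(900, 300, n), rng.normal(4.5, 1.0, n)])
    bkg = np.column_stack([rng.normal(500, 200, n), rng.normal(2.5, 1.0, n)])
    X = np.vstack([sig, bkg])
    y = np.concatenate([np.ones(n), np.zeros(n)])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
    bdt.fit(X_tr, y_tr)
    print(f"test accuracy: {bdt.score(X_te, y_te):.3f}")

In a real analysis many more observables enter, and it is the full distribution of the discriminant output, not a single accuracy figure, that feeds the statistical extraction of the signal.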

The observation of the EW production of W±W±, WZ and ZZ boson pairs is an essential milestone towards precision tests of VBS at the LHC, and there is much more to be learned from the future LHC Run-3 data. The High-Luminosity LHC should allow for very precise investigations of VBS, including finding evidence for the scattering of longitudinally polarised W bosons.

Common baryon source found in proton collisions

Figure 1

High-energy hadronic collisions, such as those delivered by the LHC, result in the production of a large number of particles. Particle pairs produced close together in both coordinate and momentum space are subject to final-state effects, such as quantum statistics, Coulomb forces and, in the case of hadrons, strong interactions. Femtoscopy uses the correlation of such pairs in momentum space to gain insights into the interaction potential and the spatial extent of an effective particle-emitting source.
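
In the standard femtoscopy formalism (the Koonin–Pratt relation; the notation here is assumed rather than taken from the article), both ingredients enter the measured two-particle correlation as a function of the pair’s relative momentum k*:

    % Koonin-Pratt relation: S(r) is the source function and psi the pair
    % wavefunction encoding the final-state interaction; C -> 1 at large k*.
    C(k^*) \;=\; \int \mathrm{d}^3 r \; S(r)\,\bigl|\psi(\vec{k}^*, \vec{r}\,)\bigr|^2

Measuring C(k*) for pairs with a well-known interaction constrains the source S(r); conversely, a known source lets the interaction be extracted.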

Abundantly produced pion pairs are used to assess the size and evolution of the high-density, strongly interacting quark–gluon plasma formed in heavy-ion collisions. Recently, high-multiplicity pp collisions at the LHC have raised the possibility of observing collective effects similar to those seen in heavy-ion collisions, motivating detailed investigations of the particle source in such systems as well. A universal description of the emission source for all baryon species, independent of their specific quark composition, would open new possibilities for studying the baryon–baryon interaction, and would impose strong constraints on particle-production models.

The ALICE collaboration has recently used p–p and p–Λ pairs to perform the first study of the particle-emitting source for baryons produced in pp collisions. The chosen data sample isolates the 1.7 permille of highest-multiplicity collisions in the 13 TeV data set, yielding events with 30 to 40 charged particles reconstructed, on average, per unit of rapidity. The yields of protons and Λ baryons are dominated by contributions from short-lived resonances, which account for about two thirds of all produced particles. A basic thermal model (the statistical hadronisation model) was used to estimate the number and composition of these resonances, indicating that the average lifetime of those feeding into protons (1.7 fm) is significantly shorter than that of those feeding into Λ baryons (4.7 fm) – this would have led to a substantial broadening of the source shape if not properly accounted for. An explicit treatment of the effect of short-lived resonances was developed by assuming that all primordial particles and resonances are emitted from a common core source with a Gaussian shape. The core source was then folded with the exponential tails introduced by the resonance decays. The resulting root-mean-square width of the Gaussian core decreases from 1.3 fm to 0.85 fm as the pair’s transverse mass (mT) increases from 1.1 to 2.2 GeV, for both p–p and p–Λ pairs (see figure). The transverse mass of a particle is its total energy in a coordinate system in which its velocity along the beam axis is zero. The two systems exhibit a common scaling of the source size, indicating a common emission source for all baryons. The observed scaling of the source size with mT is very similar to that observed in heavy-ion collisions, where the effect is attributed to the collective evolution of the system.
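
In formula form, the transverse mass defined above is (a standard definition, with m the particle’s mass and pT its momentum component transverse to the beam):

    m_T = \sqrt{m^2 + p_T^2}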

This result is a milestone in the field of correlation studies, with direct bearing on several open questions. The common source size observed for p–p and p–Λ pairs implies that the spatio-temporal properties of the hadronisation process are independent of the particle species. This observation can be exploited by coalescence models studying the production of light nuclei, such as deuterons or 3He, in hadronic collisions. Moreover, the femtoscopy formalism relates the emission source to the interaction potential between pairs of particles, enabling the study of the strong nuclear force between hadrons, such as p–K, p–Ξ, p–Ω and ΛΛ, with unprecedented precision.

A price worth paying

The LHC

Science, from the immutable logic of its mathematical underpinnings to the more fluid realms of the social sciences, has carried us from our humble origins to an understanding of such esoteric notions as gravitation and quantum mechanics. This knowledge has been applied to develop devices such as GPS trackers and smartphones – a story repeated in countless domains for a century or more – and it has delivered new tools for basic research along the way in a virtuous circle.

While it is undeniable that science has led us to a better world than that inhabited by our ancestors, and that it will continue to deliver intellectual, utilitarian and economic progress, advancement is not always linear. Research has led us up blind alleys and taken wrong turnings, yet its strength is its ability to process data, to self-correct and to make choices based on the best available evidence. The current coronavirus pandemic could prove to be a great educator in the methods of science, demonstrating how the right course of action evolves as the evidence accumulates. We’ve seen all too clearly how badly things can go wrong when individuals and governments fail to grasp the importance of evidence-based decision making.

Fundamental science has to make its case not only on the basis of cultural wealth, but also in terms of socioeconomic benefit. In particle physics, we also have no shortage of examples. These go well beyond the web, although an economic impact assessment of that particular invention is one that I would be very interested in seeing. As of 2014, there were some 42,200 particle accelerators worldwide, 64% of which were used in industry, a third for medical purposes and just 3% in research – not bad for a technology invented for fundamental exploration. It’s a similar story for techniques developed for particle detection, which have found their way into numerous applications, especially in medicine and biology.

The benefits of Big Science for economic prosperity become more pertinent if we consider the cumulative contributions to the 21st-century knowledge economy, which relies heavily on research and innovation. In 2018, more than 40% of the CERN budget was returned to industry in its member-state countries through the procurement of supplies and services, generating corollary benefits such as opening new markets. Increasing efforts, for example by the European Commission, to require research infrastructures to estimate their socioeconomic impact are a welcome opportunity to quantify and demonstrate our impact.

CERN has been subject to economic impact assessments since the 1970s, with one recent cost–benefit analysis of the LHC, conducted by economists at the University of Milan, concluding with 92% probability that benefits exceed costs, even when attaching the very conservative figure of zero to the value of the organisation’s scientific discoveries. More recent studies (CERN Courier September 2018 p51) by the Milan group, focusing on the High-Luminosity LHC, revealed a quantifiable return to society well in excess of the project’s costs, again not including its scientific output. Extrapolating these results, the authors show that future colliders at CERN would bring similar societal benefits on an even bigger scale.

Across physics more broadly, a 2019 report commissioned by the European Physical Society found that physics-based industries generate more than 16% of total turnover and 12% of overall employment in Europe – representing a net annual contribution of at least €1.45 trillion, and topping contributions from the financial services and retail sectors (CERN Courier January/February 2020 p9).

Of course, there are some who feel that limited resources for science should be deployed in areas such as addressing climate change, rather than blue-sky research. These views can be persuasive, but are misleading. Fundamental research is every bit as important as directed research, and through the virtuous circle of science, they are mutually dependent. The open questions and mind-bending concepts explored by particle physics and astronomy also serve to draw bright young minds into science, even if individuals go on to work in other areas. Surveys of the career paths taken by PhD students working on CERN experiments fully bear this out (CERN Courier April 2019 p55).

In April 2020, as a curtain-raiser to the update of the European Strategy for Particle Physics, Nature Physics published a series of articles about potential future directions for CERN. An editorial pointed out the strong scientific and utilitarian case for future colliders, concluding that: “Even if the associated price tag may seem high – roughly as high as that of the Tokyo Olympic Games – it is one worth paying.” This is precisely the kind of argument that we as a community should be prepared to make if we are to ensure continuing exploration of fundamental physics in the 21st century and beyond.

CLOUD clarifies cause of urban smog

Urban flow patterns

Urban particle pollution ranks fifth in the risk factors for mortality worldwide, and is a growing problem in many built-up areas. In a result that could help shape policies for reducing such pollution, the CLOUD collaboration at CERN has uncovered a new mechanism that drives winter smog episodes in cities.

Winter urban smog episodes occur when new particles form in polluted air trapped below a temperature inversion: warm air above the inversion inhibits convection, causing pollution to build up near the ground. However, how additional aerosol particles form and grow in this highly polluted air has puzzled researchers because they should be rapidly lost through scavenging by pre-existing aerosol particles. CLOUD, which uses an ultraclean cloud chamber situated in a beamline at CERN’s Proton Synchrotron to study the formation of aerosol particles and their effect on clouds and climate, has found that ammonia and nitric acid can provide the answer.

Deriving in cities mainly from vehicle emissions, ammonia and nitric acid were previously thought to play a passive role in particle formation, simply exchanging with ammonium nitrate in the particles. However, the new CLOUD study finds that small inhomogeneities in the concentrations of ammonia and nitric acid can drive the growth rates of newly formed particles up to more than 100 times faster than seen before, but only in short spurts that have previously escaped detection. These ultrafast growth rates are sufficient to rapidly transform the newly formed particles to larger sizes, where they are less prone to being lost through scavenging, leading to a dense smog episode with a high number of particles.

“Although the emission of nitrogen oxides is regulated, ammonia emissions are not and may even be increasing with the latest catalytic converters used in gasoline and diesel vehicles,” explains CLOUD spokesperson Jasper Kirkby. “Our study shows that regulating ammonia emissions from vehicles could contribute to reducing urban smog.”

Lofty thinking

Jasper Kirkby

What, in a nutshell, is CLOUD?

It’s basically a cloud chamber, but not a conventional one as used in particle physics. We realistically simulate selected atmospheric environments in an ultraclean chamber and study the formation of aerosol particles from trace vapours, and how they grow to become the seeds for cloud droplets. We can precisely control all the conditions found throughout the atmosphere, such as gas concentrations, temperature, ultraviolet illumination and “cosmic ray” intensity, the latter using a beam from CERN’s Proton Synchrotron (PS). The aerosol processes we study in CLOUD are poorly known yet climatically important because they create the seeds for more than 50% of global cloud droplets.

We have 22 institutes and the crème de la crème of European and US atmospheric and aerosol scientists. It’s a fabulous mixture of physicists and chemists, and the skills we’ve learned from particle physics in terms of cooperating and pooling resources have been incredibly important for the success of CLOUD. It’s the CERN model, the CERN culture that we’ve conveyed to another discipline. We implemented the best of CERN’s know-how in ultra-clean materials and built the cleanest atmospheric chamber in the world.

How did CLOUD get off the ground?

The idea came to me in 1997 during a lecture at CERN given by Nigel Calder, a former editor of New Scientist magazine, who pointed out a new result from satellite data about possible links between cosmic rays and cloud formation. That Christmas, while we visited relatives in Paris, I read a lot of related papers and came up with the idea to test the cosmic ray–cloud link at CERN with an experiment I named CLOUD. I did not want to ride into another field telling those guys how to do their stuff, so I wrote a note of my ideas and started to make contact with the atmospheric community in Europe and build support from lab directors in particle physics. I managed to assemble a dream team to propose the experiment to CERN. The hard part was convincing CERN that they should do this crazy experiment. We proposed it in 2000 and it was finally approved in 2006 – which I think is a record time for the approval of a CERN experiment. There were some people in the climate community who were against the idea that cosmic rays could influence clouds. But we persevered and, once approved, things went very fast. We started taking data in 2009 and have been in discovery mode ever since.

Do you consider yourself a particle physicist or an atmospheric scientist?

An experimental physicist! My training and my love is particle physics, but judging by the papers I write and review, I am now an atmospheric scientist. It was not difficult to make this transition. It was a case of going back to my undergraduate physics and high-school chemistry and learning on the job. It’s also very rewarding. We do experiments, like we all do at CERN, on a 24/7 basis, but with CLOUD I can calculate things in my notebook and see the science that we are doing, so we know immediately what the new stuff is and we can adapt our experiments continuously during our run.

On the other hand, in particle physics the detectors are running all the time but we really don’t know what is in the data without years of very careful analysis afterwards, so there is this decoupling of the result from the actual measurement. Also, in CLOUD we don’t need a separate discipline to tell us about the underlying theory or beauty of what we are doing. In CLOUD you’re the theorist and the experimentalist at the same time – like it was in the early days of particle physics.

How would you compare the Standard Model to state-of-the-art climate models?

It’s night and day. The Standard Model (SM) is such a well-formed and quantitatively precise theory that we can see incredibly subtle signals in detectors against a background of something that is extremely well understood. Climate models, on the other hand, are trying to simulate a very complex system about what’s happening on Earth’s surface, involving energy exchanges between the atmosphere, the oceans, the biosphere, the cryosphere … and the influence of human beings. The models involve many parameters that are poorly understood, so modellers have to make plausible yet uncertain choices. As a result, there is much more flexibility in climate models, whereas there is almost none in the SM. Unfortunately, this flexibility means that the predictive power of such models is much weaker than it is in particle physics.

The CLOUD detector

There are skills such as the handling of data, statistics and software optimisation where particle physics is probably the leading science in the world, so I would love to see CERN sponsor a workshop where the two communities could exchange ideas and perhaps even begin to collaborate. This is what CLOUD has done. It’s politically correct to talk about the power of interdisciplinary research, but it’s very difficult in practical terms – especially when it comes to funding because experiments often fall into the cracks between funding agencies.

How has CLOUD’s focus evolved during a decade of running?

CLOUD was designed to explore whether variations of cosmic rays in the atmosphere affect clouds and climate, and that’s still a major goal. What I didn’t realise at the beginning is how important aerosol–particle formation is for climate and health, and just how much is not yet understood. The largest uncertainty facing predictions of global warming is not due to a lack of understanding about greenhouse gases, but about how much aerosols and clouds have increased since pre-industrial times from human activities. Aerosol changes have offset some of the warming from greenhouse gases but we don’t know by how much – it could have offset almost nothing, or as much as one half of the warming effect. Consequently, when we project forwards, we don’t know how much Earth will warm later this century to better than a factor of three.

Many of our experiments are now aimed at reducing the aerosol uncertainties in anthropogenic climate change. Since all CLOUD experiments are performed under different ionisation conditions, we are also able to quantify the effect of cosmic rays on the process under study. A third major focus concerns the formation of smog under polluted urban conditions.

What have CLOUD’s biggest contributions been?

We have made several major discoveries and it’s hard to rank them. Our latest result (CLOUD clarifies cause of urban smog) on the role of ammonia and nitric acid in urban environments is very important for human health. We have found that ammonia and nitric acid can drive the growth rates of newly formed particles up to more than 100 times faster than seen before, but only in short spurts that have previously escaped detection. This can explain the puzzling observation of bursts of new particles that form and grow under highly polluted urban conditions, producing winter smog episodes. An earlier CLOUD result, also in Nature, showed that a few parts-per-trillion of amine vapours lead to extremely rapid formation of sulphuric acid particles, limited only by the kinetic collision rate. We had a huge fight with one of the referees of this paper, who claimed that it couldn’t be atmospherically important because no-one had previously observed it. Finally, a paper appeared in Science last year showing that sulphuric acid–amine nucleation is the key process driving new particle formation in Chinese megacities.


A big result from the point of view of climate change came in 2016 when we showed that trees alone are capable of producing abundant particles and thus cloud seeds. Prior to that it was thought that sulphuric acid was essential to form aerosol particles. Since sulphuric acid was five times lower in the pre-industrial atmosphere, climate models assumed that clouds were fewer and thinner back then. This is important because the pre-industrial era is the baseline aerosol state from which we assess anthropogenic impacts. The fact that biogenic vapours make lots of aerosols and cloud droplets reduces the contrast in cloud coverage (and thus the amount of cooling offset) between then and now. The formation rate of these pure biogenic particles is enhanced by up to a factor of 100 by galactic cosmic rays, so the pristine pre-industrial atmosphere was more sensitive to cosmic rays than today’s polluted atmosphere.

There was an important result the very first week we turned on CLOUD, when we saw that sulphuric acid does not nucleate on its own but requires ammonia. Before CLOUD started, people were measuring particles but they weren’t able to measure the molecular composition, so many experiments were being fooled by unknown contaminants.

Have CLOUD results impacted climate policy?

The global climate models that inform the Intergovernmental Panel on Climate Change (IPCC) have begun to incorporate CLOUD aerosol parameterisations, and they are impacting estimates of Earth’s climate sensitivity. The IPCC assessments are hugely impressive works of the highest scientific quality. Yet there is something of a disconnect between what climate modellers do and what we do in the experimental and observational world. The modellers tend to work in national centres and connect with experiments through the latter’s publications, at the end of the chain. I would like to see much closer linkage between the models and the measurements, as we have in particle physics, where there is a fluid connection between theory, experiment and modelling. We do this already in CLOUD, where several institutes work primarily on regional and global aerosol–cloud models.

What’s next on CLOUD’s horizon?

The East Hall at the PS is being completely rebuilt during CERN’s current long shutdown, but the CLOUD chamber itself is pretty much the only item that is untouched. When the East Area is rebuilt there will be a new beamline and a new experimental zone for CLOUD. We think we have a 10-year programme ahead to address the questions we want to and to settle the cosmic ray–cloud–climate question. That will take me up to just over 80 years old!

Will humanity succeed in preventing catastrophic climate change?

I am an optimist, so I believe there is always a way out of everything. It’s very understandable that people want to freeze the exact temperature of Earth as it is now because we don’t want to see a flood or desert in our back garden. But I’m afraid that’s not how Earth is, even without the anthropogenic influence. Earth has gone through much larger natural climate oscillations, even on the recent timescale of Homo sapiens. That being said, I think Earth’s climate is fundamentally stable. Oceans cover two thirds of Earth’s surface and their latent heat of vaporisation is a huge stabiliser of climate – they have never evaporated nor completely frozen over. Also, only around 2% of CO2 is in the atmosphere and most of the rest is dissolved in the oceans, so eventually, over the course of several centuries, CO2 in the atmosphere will equilibrate at near pre-industrial levels. The current warming is an important change – and some argue it could produce a climate tipping point – but Earth has gone through larger changes in the past and life has continued. So we should not be too pessimistic about Earth’s future. And we shouldn’t conflate pollution and climate change. Reducing pollution is an absolute no-brainer, but environmental pollution is a separate issue from climate change and should be treated as such.

New Perspectives on Einstein’s E = mc²

New Perspectives on Einstein’s E = mc² mixes historical notes with theoretical aspects of the Lorentz group that impact relativity and quantum mechanics. The title is a little perplexing, however, as one can hardly expect nowadays to discover new perspectives on an equation such as E = mc². The book’s true aim is to convey to a broader audience the formal work done by the authors on group theory. Therefore, a better-suited title may have been “Group theoretical perspectives on relativity”, or even, more poetically, “When Wigner met Einstein”.

The first third of the book is an essay on Einstein’s life, with historical notes on topics discussed in the subsequent chapters, which are more mathematical and draw heavily on publications by the authors – a well-established writing team who have co-authored many papers relating to group theory. The initial part is easy to read and includes entertaining stories, such as Einstein’s mistakes when filing his US tax declaration. Einstein, according to this story, was calculating his taxes erroneously, but the US tax agency was kind enough not to raise the issue. The reader has to be warned, however, that the authors, professors at the University of Maryland and New York University, have a tendency to make questionable statements about certain aspects of the development of physics that may not be backed up by the relevant literature, and may even contradict known facts. They also repeatedly interpret the development of physical theories in terms of a Hegelian synthesis of a thesis and an antithesis, without any cited sources in support – in most cases a somewhat arbitrary a posteriori assessment.

There is a sharp distinction in the style of the second part of the book, which requires training in physics or maths at advanced undergraduate level. These chapters begin with a discussion of the Lorentz group. The interest then quickly shifts to Wigner’s “little groups”, which are subgroups of the Lorentz group with the property of leaving the momentum of a system invariant. Armed with this mathematical machinery, the authors proceed to Dirac spinors and give a Lorentz-invariant formulation of the harmonic oscillator that is eventually applied to the parton model. The last chapter is devoted to a short discussion on optical applications of the concepts advanced previously. Unfortunately, the book finishes abruptly at this point, without a much-needed final chapter to summarise the material and discuss future work, which, the previous chapters imply, should be plentiful.
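
To give a flavour of this machinery (a standard example, not a quotation from the book): the little group depends on whether the particle is massive or massless, since only the former has a rest frame:

    % Wigner little groups of the Lorentz group
    p^\mu = (m, 0, 0, 0) \;\Rightarrow\; \text{little group } SO(3)
    \quad \text{(massive: rotations leave the rest-frame momentum invariant)}
    p^\mu = (E, 0, 0, E) \;\Rightarrow\; \text{little group } ISO(2)
    \quad \text{(massless: the Euclidean group of the plane)}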

Young Suh Kim and Marilyn Noz’s book may struggle to find its audience. The contrast between the lay and expert parts of this short book, and the very specialised topics it explores, do not make it suitable for a university course, though sections could be incorporated as additional material. It may well serve, however, as an interesting pastime for mathematically inclined audiences who will certainly appreciate the formalism and clarity of the presentation of the mathematics.

Surveying the surveyors

Alban Vieille

A career as a surveyor offers the best of two worlds, thinks Dominique Missiaen, a senior member of CERN’s survey, mechatronics and measurements (SMM) group: “I wanted to be a surveyor because I felt I would like to be inside part of the time and outside the other, though being at CERN is the opposite because the field is in the tunnels!” After qualifying as a surveyor and spending time doing metrology for a cement plant in Burma and for the Sorbonne in Paris, Missiaen arrived at CERN as a stagiaire in 1986. He never left, starting in a staff position working on the alignment of the pre-injector for LEP, then of LEP itself, and then leading the internal metrology of the magnets for the LHC. From 2009 to 2018 he was in charge of the whole survey section, and since last year he has had a new role as coordinator for special projects, such as the development of a train to remotely survey the magnets in the arcs of the LHC.

“Being a surveyor at CERN is completely different to other surveying jobs,” explains Missiaen. “We are asked to align components within a couple of tenths of a millimetre, whereas in the normal world they tend to work with an accuracy of 1–2 cm, so we have to develop new and special techniques.”

A history of precision

When building the Proton Synchrotron in the 1950s, engineers needed an instrument to align components to 50 microns in the horizontal plane. A device to measure such distances did not exist on the market, so the early CERN team invented the “distinvar” – an instrument to ensure the nominal tension of an invar wire while measuring the small length to be added to obtain the distance between two points. It was still used as recently as 10 years ago, says Missiaen. Another “stretched wire” technique developed for the ISR in the 1960s and still in use today replaces small-angle measurements by a short-distance measurement: instead of measuring the angle between two directions, AB and AC, using a theodolite, it measures the distance between the point B and the line AC. The AC line is realised by a nylon wire, while the distance is measured using a device invented at CERN called the “ecartometer”.
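
The geometry behind the stretched-wire trick can be sketched in a few lines (illustrative numbers and function names only; this is not CERN survey software): for small angles, the perpendicular offset of B from the line AC, divided by the distance |AB|, is the angle a theodolite would otherwise have measured.

    # Illustrative geometry of the "stretched wire" method: instead of
    # measuring the angle BAC with a theodolite, measure the perpendicular
    # offset of point B from the line AC (the nylon wire).
    import math

    def offset_from_line(a, c, b):
        """Perpendicular distance of point b from the line through a and c (2D)."""
        (ax, ay), (cx, cy), (bx, by) = a, c, b
        # |cross product of AC with AB| divided by |AC|.
        return abs((cx - ax) * (by - ay) - (cy - ay) * (bx - ax)) / math.hypot(cx - ax, cy - ay)

    # Hypothetical points in metres: a 50 m wire, with B lying 0.2 mm off the line.
    A, C, B = (0.0, 0.0), (50.0, 0.0), (25.0, 0.0002)
    d = offset_from_line(A, C, B)
    angle = d / math.hypot(B[0] - A[0], B[1] - A[1])  # small-angle approximation (rad)
    print(f"offset = {d*1000:.3f} mm, equivalent angle = {angle*1e6:.1f} microradians")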

Invention and innovation haven’t stopped. The SMM group recently adapted a metrology technique called frequency-sweeping interferometry for use in a cryogenic environment, to align components inside the sealed cryostats of the future High-Luminosity LHC (HL-LHC), which contract by up to 12 mm when cooled to operational temperatures. Another recent innovation, developed in collaboration with the Institute of Plasma Physics in Prague while tackling the challenging alignment system for HIE-ISOLDE, is a non-diffractive laser beam whose central axis diverges by just a few mm over distances of several hundred metres and which can “reconstruct” itself after meeting an obstacle.
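
In its basic form (a generic textbook description, not necessarily the SMM implementation), frequency-sweeping interferometry turns a distance measurement into a frequency measurement: sweeping the laser frequency by Δν changes the interference phase of light that has made the round trip over a distance L, so L follows from the measured phase change:

    % Round-trip interferometric phase and its change under a frequency sweep
    \phi = \frac{2\pi\nu \cdot 2L}{c},
    \qquad
    \Delta\phi = \frac{4\pi L\,\Delta\nu}{c}
    \;\;\Rightarrow\;\;
    L = \frac{c\,\Delta\phi}{4\pi\,\Delta\nu}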

The specialised nature of surveying at CERN means the team has to spend a lot of time finding the right people and training newcomers. “It’s hard to measure at this level and to maintain the accuracy over long distances, so when we recruit we look for people who have a feeling for this level of precision,” says Missiaen, adding that a constant feed of students is important. “Every year I go back to my engineering school and give a talk about metrology, geodesy and topometry at CERN so that the students understand there is something special they can do in their career. Some are not interested at all, while others are very interested – I never find students in between!”


CERN’s SMM group has more than 120 people, including around 35 staff members. Contractors push the numbers up further during periods such as the current second long shutdown (LS2), during which the group is tasked with measuring all the components of the LHC in the radial and vertical directions. “It takes two years,” says Jean-Frederic Fuchs, who is section leader for accelerators, survey and geodesy. “During a technical stop, we are in charge of the 3D-position determination of the components in the tunnels and their alignment at the level of a few tenths of a millimetre. There is a huge variety of accelerator elements along the 63 km of beam lines at CERN.”

Fuchs did his master’s thesis at CERN in the domain of photogrammetry and then left to work in Portugal, where he was in charge of guiding a tunnel-boring machine for a railway project. He returned to CERN in the early 2000s as a fellow, followed by a position as a project associate working on the assembly and alignment of the CMS experiment. He then left to join EDF where he worked on metrology inside nuclear power plants, finally returning to CERN as a staff member in 2011 working on accelerator alignment. “I too sought a career in which I didn’t have to spend too much time in the office. I also liked the balance between measurements and calculations. Using theodolites and other equipment to get the data is just one aspect of a surveyor’s job – post-treatment of the data and planning for measurement campaigns is also a big part of what we do.”

With experience in both experiment and accelerator alignment, Fuchs knows all too well the importance of surveying at CERN. Some areas of the LHC tunnel are moving by about 1 mm per year due to underground movement inside the rock. The tunnel is rising at point 5 (where CMS is located) and falling between points 7 and 8, near ATLAS, while the huge mass of the LHC experiments largely keeps them at the same vertical position, therefore requiring significant realignment of the LHC magnets. During LS2, the SMM group plans to lower the LHC at point 5 by 3 mm to better match the CMS interaction point, by adjusting jacks that allow the LHC to be raised or lowered by around 20 mm in each direction. For newer installations, the movement can be much greater. For example, LINAC4 has moved up by 5 mm in the source area, leading to a slope that must be corrected. The new beam-dump tunnels in the LHC and the freshly excavated HL-LHC tunnels at points 1 and 5 are also moving slightly compared to the main LHC tunnel. “Today we know almost all the places where it moves,” says Fuchs. “For sure, if you want to run the LHC for another 18 years there will be a lot of measurement and realignment work to be done.” His team also works closely with machine physicists to compare its measurements to those performed with the beams themselves.

It is clear that CERN’s accelerator infrastructure could not function at the level it does without the field and office work of surveyors. “We see the physics results as a success that we share in too,” says Missiaen. “When the LHC turned on you couldn’t know if a mistake had been made somewhere, so in seeing the beam go from one point to another, we take pride that we have made that possible.”

Pierre Lazeyras 1931–2020

Pierre Lazeyras

Pierre Lazeyras, who played leading roles in the ALEPH experiment, neutrino beams and silicon detectors during a 35-year-long career at CERN, passed away on 4 April aged 88.

Pierre graduated from the École supérieure de physique et chimie industrielle (ESPCI) in Paris in 1954 and, after working in Anatole Abragam’s group at CEA Saclay, he joined CERN as a staff member in October 1961. He was one of the early collaborators in the Track Chamber (TC) division, which built the two-metre bubble chamber and the Big European Bubble Chamber (BEBC). In parallel, he headed the team that developed one of the first superconducting bending magnets for BEBC’s “beam s3”.

Pierre directed the TC SPS neutrino beam group from 1972, which included the construction of the horns, the 185 m-long iron muon shielding and the beam monitoring, for which silicon-diode particle detectors were employed. After some initial teething troubles, the SPS neutrino beams operated for nearly 20 years without major problems. The silicon monitors were found to be more precise than the early gas-filled ion chambers, and this was the beginning of the era of silicon micro-strip detectors. Pierre encouraged the microelectronics developments for this new technology and its integrated readout circuits. These advances also came just in time for the UA2 experiment at the SPS and for wider applications in the LEP experiments.

Pierre was instrumental in the formation and success of ALEPH. From the conception of the experiment in 1982 right through to the LEP2 phase in 1996, he was ALEPH technical coordinator – a role that was quite new to those of us coming from smaller experiments. Pierre made sure we were realistic in our ambitions and our estimates of the difficulties and planning constraints, and we owe it mainly to him that the various parts of ALEPH were assembled without major problems. He was always available for advice even if, in his careful and reserved style, he did not try to direct or micro-manage everything.

In addition to being responsible for general safety in the experiment (which had no major incidents during its 11 years of operation), Pierre ensured that the construction of ALEPH was completed within budget. He also played an essential role at a crucial moment for the experiment in the early 1990s: the problem with the superconducting magnet cryostat. Under Pierre’s supervision, a vacuum leak was located, close to the edge of the magnet, and the cryostat then underwent “surgery” using a milling machine suspended from a crane. It was a wonderful exercise in imagination and, to the relief of all, a complete success. Pierre had always insisted that such a huge superconducting magnet and cryostat inherently constituted a fragile device, and had objected to the idea of warming up the magnet during annual shutdowns, citing the mechanical stress resulting from this procedure. He was absolutely right.

Pierre was also involved in the design of the large stabilised superconductors for the LHC-experiment magnets and served as a member of the magnet advisory group of the LHC into his retirement, his wisdom being highly appreciated. He was also an active member of the CERN Staff Association. Following his retirement in 1996, he joined the Groupement des Anciens and was a representative on the CERN health insurance supervisory committee, where his advice and opinions were always wise and measured.

Pierre was not only highly talented and used his experience most effectively, he was also a warm person, someone on whom one could always rely. He would always tell you straight how things were and then suggest how any problems could be tackled. A typical remark by Pierre would be: “Ask me to approve or reject your ideas, do not ask me what work I have for you.” We will remember him as a very dear friend and colleague.

Aldo Michelini 1930–2020

Aldo Michelini

Aldo Michelini, who led OPAL and other important experiments at CERN, passed away at Easter at the age of 89. He was known as much for his kindness and care for his colleagues, particularly those embarking on their careers, as for the physics at which he excelled.

Aldo first came to CERN in 1960, bringing experience from several tracking-chamber experiments, including a stint with Jack Steinberger at Columbia University, and he lost no time in making an impact. One of his earliest contributions was to equip CERN’s Wilson chamber magnet with spark chambers, which he then used as part of a CERN/ETH/Imperial College/Saclay collaboration to measure properties of the K⁰₂ meson and p̄p and K–p charge-exchange interactions using a polarised target.

As the 1960s advanced, Aldo formed a partnership and life-long friendship with his compatriot, Mario Morpurgo, who was an early pioneer of superconducting magnet technology. The two were part of the small team spearheading the development of the Omega spectrometer, a general-purpose device built around a large superconducting magnet that could be arranged and configured according to the physics to be studied. Omega was initially equipped with spark chambers and installed on a PS beamline, receiving its first beam in 1972, and moved to the SPS in 1976 where it became the backbone of the fixed-target programme there for 20 years.

In 1973, Aldo headed a similar project to build a general-purpose spectrometer for the North Area. This became NA3, which was the first experiment to receive beam in the new SPS hadron hall, EHN1, in May 1978. NA3 embarked on a programme of high-mass dimuon production with π+, π–, K+, K–, p and p̄ beams, enabling the first observation of upsilon production by pions. It also probed the structure of the incoming particles via the Drell–Yan process. The spectrometer carried out a string of valuable experiments under Aldo’s guidance until 1981, when he became spokesperson of the OPAL experiment being planned for LEP. Aldo remained at the helm of OPAL right up to his retirement in 1995.

OPAL was built around tried and tested technology, including a paradoxical novelty for Morpurgo: a warm magnet. Huge for its time, with a collaboration of some 300 people, OPAL was nevertheless the smallest of the four LEP experiments. It was a scale that lent itself well to Aldo’s unique style of management – leading through example and consensus. Colleagues remember him smiling and looking very worried, or more often than not, the other way around. This was strangely motivational, with team members striving to make him smile more and worry less. His personality shaped the unique OPAL team spirit. Despite his gentle nature, Aldo was more than capable of making tough choices, and winning over those who might initially have disagreed with him.

When OPAL detected the first Z boson at LEP on 13 August 1989, Aldo was heard to remark that the young people had taken over. The average age of those in the control room that day was well under 30, and that youthfulness was no accident. Aldo actively supported the young members of the collaboration, making sure that they were visible at collaboration meetings and conferences. He also imbued them and the whole collaboration with a culture of never publishing even preliminary results before being absolutely certain of them. As a result, OPAL’s scientists built a strong reputation, with many conference conversations including the words, “let’s wait and see what OPAL has to say”. Aldo’s faith in the younger generation was rewarded by some 300 successful PhD theses from OPAL, while more than 100 CERN fellows passed through the collaboration over its lifetime.

Aldo was a great leader, commanding respect and affection in equal measure. That the collaboration was still able to gather more than 100 members in 2019 to celebrate the 30th anniversary of that first Z decay is testimony to the kind of person Aldo was, and to the spirit that he engendered. Although he was unable to attend that gathering, he sent a message, and was loudly cheered. He will be sorely missed.

Adolf Minten 1931–2020

Adolf Minten

Distinguished CERN physicist Adolf Minten passed away on 21 March at the age of 88.

After graduating from the University of Bonn, where he worked in the team of Wolfgang Paul on the 500 MeV electron synchrotron, Adolf joined the CERN Track Chamber division in 1962. Working under Charles Peyrou, he set up beamlines for the two-metre bubble chamber and actively participated in its broad physics programme. Another important milestone of his career was his time as a visiting scientist at SLAC from 1966 to 1967, where he took part in the early experiments on hadron electro-production and electron scattering at the new two-mile accelerator.

Adolf returned to CERN at a time of decisive developments in accelerator and detector technologies. In parallel to his continued participation in bubble-chamber experiments, he became interested in the physics programme of the Intersecting Storage Rings, the world’s first proton–proton collider, which started operation in 1971. To cope with the high interaction rates expected at this new machine, the development of track detectors focused on the multi-wire proportional chamber (MWPC) developed by Georges Charpak. One of the designs was a large multi-purpose spectrometer called the split-field magnet (SFM). At that time, a large-scale application of the revolutionary MWPC technology, hitherto available only in single-wire devices or small-surface detectors, presented a formidable challenge. In 1969, Adolf became responsible for the construction of the SFM facility, which covered the full solid angle with an unprecedented 300 m2 detector surface, and 70,000 wires and electronics channels. Major detector, electronics and software developments were needed to bring this project into operation in 1974.

In 1975, to prepare for the next generation of experiments at the new SPS machine, the CERN management proposed the creation of a new Experimental Facilities (EF) division. Adolf was elected to lead the new EF division, a position that required a combination of strong scientific and technical authority, and in which he commanded the unreserved respect of his collaborators. Following support provided to the major facilities for the SPS fixed-target programme, such as BEBC, the Omega spectrometer and the neutrino, muon and other experiments, his new division soon became involved in the successful experiments at the SPS proton–antiproton collider.

In 1984 Adolf stepped down from his position as EF division leader and joined the ALEPH experiment at LEP. The LEP experiments were a quantum leap in size and complexity when compared to previous experiments, and demanded new organisational structures. As head of the ALEPH steering committee, Adolf was instrumental in setting up an organisation whose role he compared to an “orchestra, where it is not sufficient that all the instruments be properly tuned, they must also harmonise”. However, his true role of an “elder statesman” went far beyond organisational responsibilities; equally important were his human qualities, which were remarkable indeed and for which he was respected by both young and old.

Adolf maintained a constant interest in DESY, where he was highly appreciated. In 1981 Bjorn Wiik’s study group had finished the HERA design report, and DESY set up an international evaluation committee to analyse it in detail. Adolf was invited to chair this committee. Its positive recommendation was a significant step towards the approval of the HERA project. He chaired the DESY scientific council from 1987 until 1990, during the main construction phase of the storage rings and the H1 and ZEUS multi-purpose detectors.

Adolf retired from CERN in 1996. We remember him as a supremely well-organised scientist of deep and incisive intelligence, unafraid to challenge and question preconceived ideas, and always inspiring others to do the same. At the same time, he was a modest person who cared profoundly for all the people around him, and their families.
