
Synergy at the Higgs frontier

Sally Dawson

What impact did the discovery of the Higgs boson have on your work? 

It was huge because before then it was possible that maybe there was no Higgs. You could have some kind of dynamical symmetry breaking, or maybe a heavy Higgs, at 400 GeV say, which would be extremely interesting but completely different. So once you knew that the Higgs was at the same mass scale as the W and the Z, our thinking changed because that comes out of only a certain kind of model. And of course once you had it, everyone, including myself, was motivated to calculate everything we could. 

I am working on how you tease out new physics from the Higgs boson. It’s the idea that even if we don’t see new particles at the LHC, precision measurements of the Higgs couplings are going to tell us something about what is happening at very high energy scales. I’m using what’s called an effective field theory approach, which is the standard these days for trying to find out what we can learn from combining Higgs measurements with other types of measurements, such as gauge-boson pair production and top-quark physics. 

Aside from the early formal work, what was the role of Standard Model calculations in the discovery of the Higgs boson?

You had to know what you were looking for, because there are so many events at the LHC. Otherwise, it would be like looking for a needle in a haystack. The Higgs was discovered, for example, by its decay to two photons and there are millions of two-photon events at the LHC that have nothing to do with the Higgs. Theory told you how to look for this particle, and I think it was really important that a trail was set out to follow. This involves calculating how often you make a Higgs boson and what the background might look like. It wasn’t until the late 1980s that people began taking this seriously. It was really the Superconducting Super Collider that started us thinking about how to observe a Higgs at a hadron collider. And then there were the LEP and Tevatron programmes that actively searched for the Higgs boson. 

To what order in perturbation theory were those initial calculations performed?

For the initial searches you didn’t need the complicated calculations because you weren’t looking for precision measurements such as those required at the Z-pole, for example. You really just needed the basic rate and background information. We weren’t inspired to do higher order calculations until later in the game. When I was a postdoc at Berkeley in 1986, that’s when I really started to calculate things about the Higgs. But there was a long gap between the time when the Brout–Englert–Higgs mechanism was proposed and when people really started doing some hard calculations. There’s the famous paper in 1976 by Ellis, Gaillard and Nanopoulos that calculated how the Higgs might be observed, but in essence it said: why bother looking for this thing, we don’t know where it is! So people were thinking we could see the Higgs in kaon decays, if it was very light, and in other ways, and were looking at the problem in a global kind of way. 

Was this what drove your involvement with The Higgs Hunter’s Guide in 1990?

We were further along in terms of calculating things precisely by then, and I suppose there was a bit of a generation gap. It was a wonderful collaboration to produce the guide. We still went through the idea of how you would find the Higgs at different energy scales because we still had no idea where it was. The calculations went into high gear around that time, which was well before the Higgs was discovered. Partly it was the motivation that we were pretty sure we would see it at the LHC. But partly it was developments in theory which meant we could calculate things that we never would have imagined were possible 30 years earlier. The capability of theorists to calculate has grown exponentially. 

What have these improvements been?

It’s what they call the next-to-next-to-leading order (NNLO) revolution – a new frontier in perturbative QCD where diagrams with two extra emissions of real partons, or two extra loops of virtual partons, are accounted for. These were new mathematical techniques for evaluating the integrals that come into the quantum field theory, so not just turning the crank computationally but really an intellectual advance in understanding the structure of these calculations. It started with Bern, Dixon and Kosower, who understood the needed amplitudes in a formal way. This enabled all sorts of calculations, and now we have N3LO calculations for certain Higgs-boson production modes. 

What is driving greater precision on Higgs calculations today?

Actually it’s really exciting because at the high-luminosity LHC (HL-LHC), experimentalists will be limited in their understanding of the Higgs boson by theory – the theory and experimental uncertainties will be roughly the same. This is truly impressive. You might think that these higher order corrections, which have quite small errors, are enough but they need to be even smaller to match the expected experimental precision. As theorists we have to keep going and do even better, which from my point of view is wonderful. It’s the synergy between experiment and theory that is the real story. We’re co-dependent. Even now, theory is not so different from ATLAS and CMS in terms of precision. Theory errors are hard things to pin down because you never really know what they are. Unlike an absolute statistical uncertainty, they’re always an estimate. 

How do the calculations look for measurements beyond the LHC? 

It’s a very different situation at e+e– colliders compared to hadron colliders. The LHC runs with protons containing gluons, so that’s why you need the higher-order corrections. At a future e+e– collider, you need higher-order corrections but they are much more straightforward because you don’t have parton distribution functions to worry about. We know how to do the calculations needed for an e+e– Future Circular Collider, for example, but there is not a huge community of people working on them. That’s because they are really hard: you can’t just sit down and do them as a hobby, they really need a lot of skills. 

You are leading the Higgs properties working group of the current Snowmass planning exercise. What has been the gist of discussions? 

This is really exciting because our job has essentially been to put together the pieces of the puzzle after the European strategy update in 2020. That process did a very careful job of looking at the future Higgs programme, but there have been developments in our understanding since then. For example, the muon collider might be able to measure the Higgs couplings to muons very precisely, and there has been some good work on how to measure the couplings to strange quarks, which is very hard to do. 

The Higgs Hunter’s Guide

I would like to see an e+e– collider built somewhere, anywhere. In point of fact, when you look at the proposals they’re roughly the same in terms of Higgs physics. This was clear from the European strategy report and will be clear from the upcoming Snowmass report. Personally, I don’t much care whether there is a precision of 1% or 1.5% on some coupling. I care that you can get down to that order of magnitude, and that e+e– machines will significantly improve on the precision of HL-LHC measurements. The electroweak programme of large circular e+e– colliders is extremely interesting. At the Z-pole you get some very precise measurements of Standard Model quantities that feed into the whole theory because everything is connected. And at the WW threshold you get very precise measurements in the effective field theory of things that connect the Higgs and WW pairs. As a theorist, it doesn’t make sense to think of the Higgs in a vacuum. The Higgs is part of this whole electroweak programme. 

What are the prospects for finding new physics via the Higgs?

The fact that we haven’t seen anything unexpected yet is probably because we haven’t probed enough. I’m absolutely convinced we are going to see something, I just don’t know what (or where) it is. So I can’t believe in the alternative “nightmare” scenario of a Standard-Model Higgs and nothing else because there are just so many things we don’t know. You can make pretty strong arguments that we haven’t yet reached the precision where we would expect to see something new in precision measurements. It’s a case of hard work.  

What’s next in the meantime?

The next big thing is measuring two Higgs bosons at a time. That’s what theorists are super excited about because we haven’t yet seen the production of two Higgses and that’s a fundamental prediction of our theory. If we don’t see it, and it’s extremely difficult to do so experimentally, it tells us something about the underlying model. It’s a matter of getting the statistics. If we actually saw it, then we would do more calculations. For the trilinear Higgs coupling we now have a complete calculation at next-to-leading order, which is a real tour de force. The calculations are sufficient for a discovery, and because it’s so rare it’s unlikely we will be doing precision measurements, so it is probably okay for the foreseeable future. For the quartic coupling there are some studies that suggest you might see it at a 100 TeV hadron collider.

With all the Standard Model particles in the bag, does theory take more of a back seat from here? 

The hope is that we will see something that doesn’t fit our theory, which is of course what we’re really looking for. We are not making these measurements at ever higher precisions for the sake of it. We care about measuring something we don’t expect, as an indicator of new physics. The Higgs is the only tool we have at the moment. It’s the only way we know how to go.

You have to be able to explain ‘why’

Sean Carroll

On 4 July 2012, Sean Carroll was at CERN to witness the momentous announcements by ATLAS and CMS – but not in his usual capacity as a physicist. He was there as an accredited member of the media, sharing an overflow room with journalists to get first-hand footage for the final chapter of his book. The Particle at the End of the Universe ended up being the first big title on the discovery and went on to win the 2013 Royal Society Winton Prize for Science Books. “It got reviewed everywhere, so I am really grateful to the Higgs boson and CERN!”

Carroll’s publisher sensed an opportunity for a timely, expert-authored title in 2011, as excitement in ATLAS and CMS grew. He initially said “No” – it wasn’t his research area, and he preferred to present a particular point of view, as he did in his first popular work From Eternity to Here: The Quest for the Ultimate Theory of Time. “With the Higgs boson, there is no disagreement,” he says. “Everyone knows what the boson is, what it does and why it is important.” After some negotiation, he received an offer he couldn’t refuse. The book also delved into the LHC, the experiments and how it all works, with a dash of quantum field theory and particle physics more generally. “We were hoping the book would come out by the time they announced the discovery, but on the other hand at least I got to include the discovery in the book, and was there to see it.”

Show me the money

Books are not very lucrative, he says. “Back in the 1980s and 1990s, when the success of Hawking’s A Brief History of Time awoke the interest of publishers, if you had a good idea for a physics book you could make a million dollars. But it is very hard to earn enough to make a living. It takes roughly a year, or more depending on how much you have to learn, and depends on luck, the book and the person writing it.” His next project is a series of three books aimed at explaining physics to the general reader. The first, The Biggest Ideas in the Universe: Space, Time and Motion, due out in September, covers Newtonian mechanics and relativity; the second covers quantum mechanics and quantum field theory; and the third complexity, emergence and large-scale phenomena. 

Meanwhile, Carroll’s podcast Mindscape, in which he invites experts from different fields to discuss a range of topics, has produced 200 episodes since it launched in 2018 and attracts around 100,000 listeners weekly. “I thought that it was a very fascinating idea, basically your personal radio show, but I quickly learned that I didn’t have that many things to say all by myself,” he explains. “Then I realised it would give me an excuse to talk to a lot of interesting people and stretch my brain a lot, and that worked out really well.” 

Reaching out

As someone who fell in love with science at a young age and enjoyed speaking and writing, Carroll has clearly found his ideal career. But stepping outside the confines of research is not without its downsides. “Overall, I think it has been negative actually, as it’s hard for some scientists to think that somebody is both writing books and giving talks, and also doing research at the same time. There is a prejudice that if you are a really good researcher then that’s all you do, and anything else is a waste of time. But whatever it does to my career, it has been good in many ways, and I think for the field, because I have reached people who wouldn’t know about physics otherwise.”

We need to take seriously the responsibility to tell people what it is that we have learned about the universe, and why it’s exciting to explore further

Moreover, he says, scientists are obligated to communicate the results of their work. “When it comes to asking the public for lots of money you have to be able to explain why it’s needed, and if they understand some of the physics and they have been excited by other discoveries they are much more likely to appreciate that,” he says, citing the episode of the Superconducting Super Collider. “When we were trying to build the SSC, physicists were trying their best to explain why we needed it and it didn’t work. Big editorials in the New York Times clearly revealed that people did not understand the reasons why this was interesting, and furthermore thought that the kind of physics we do does not have any immediate or technological benefit. But they are all also curious like we are. And while we don’t all have to become pop-science writers or podcasters (just like I am not going to turn up on TikTok or do a demo in the street), as a field we really need to take seriously the responsibility to tell people what it is that we have learned about the universe, and why it’s exciting to explore further.”

Accelerating aerosol production

A simulation of aerosol-particle formation

The CLOUD collaboration at CERN has uncovered a new mechanism accelerating the formation of aerosol particles in the upper troposphere, with potential implications for air-pollution regulations. The results, published in Nature on 18 May, show that an unexpected synergy between nitric acid, sulphuric acid and ammonia leads to the formation of aerosols at significantly faster rates than those from any two of the three components. The mechanism may represent a major source of cloud and ice seed particles in certain regions of the globe, says the team.

Aerosol particles are known to generally cool the climate by reflecting sunlight back into space and by seeding cloud droplets. But the vapours driving their formation are not well understood. The CLOUD (Cosmics Leaving Outdoor Droplets) facility at CERN’s East Area replicates the atmosphere in an ultraclean chamber to study, under precisely controlled atmospheric conditions, the formation of aerosol particles from trace vapours and how they grow to become the seeds for clouds.

Three is key

Building on earlier findings that ammonia and nitric acid can accelerate the growth rates of newly formed particles, the CLOUD team introduced mixtures of sulphuric acid, nitric acid and ammonia vapours to the chamber and observed the rates at which particles formed. They found that the three vapours together form new particles 10–1000 times faster than a sulphuric acid–ammonia mixture, which previous CLOUD measurements suggested was the dominant source of upper tropospheric particles. Once the three-component particles form, they grow rapidly from the condensation of nitric acid and ammonia alone to sizes where they seed clouds. 

Moreover, the team found these particles to be highly efficient at seeding ice crystals, comparable to desert dust particles, which are thought to be the most widespread and effective ice seeds in the atmosphere. When a supercooled cloud droplet freezes, the resulting ice particle will grow at the expense of any unfrozen droplets nearby, making ice a major factor in the microphysical properties of clouds and precipitation. Around three-quarters of global precipitation is estimated to originate from ice particles.

Feeding their measurements into global aerosol models that include vertical transport of ammonia by deep convective clouds, the CLOUD researchers found that although the particles form locally in ammonia-rich regions of the upper troposphere, such as over the Asian monsoon regions, they travel from Asia to North America in just three days via the subtropical jet stream, potentially influencing Earth’s climate on an intercontinental scale (see “Enhancement” figure). The importance of the new synergistic mechanism depends on the availability of ammonia in the upper troposphere, which originates mainly from livestock and fertiliser emissions. Atmospheric concentrations of all three compounds are much higher today than in the pre-industrial era.

“Our results will improve the reliability of global climate models in accounting for aerosol formation in the upper troposphere and in predicting how the climate will change in the future,” says CLOUD spokesperson Jasper Kirkby. “Once again, CLOUD is finding that anthropogenic ammonia has a major influence on atmospheric aerosol particles, and our studies are informing policies for future air-pollution regulations.”

Our results will improve the reliability of global climate models

Working at the intersection between atmospheric science and particle physics, CLOUD has published several important results since it started operations in 2009. These include new mechanisms responsible for driving winter smog episodes in cities and for potentially accelerating the loss of Arctic sea ice, in addition to studies of the impact of cosmic rays on clouds and climate (CERN Courier July/August 2020 p48). 

“When CLOUD started operation, the prevailing understanding was that sulphuric acid vapour alone could account for almost all observations of new-particle formation in the atmosphere,” says Kirkby. “Our first experiments showed that it was around one million times too slow, and CLOUD went on to discover that additional vapours – especially biogenic vapours from trees – form particles together with stabilisers like ammonia, amines or ions from cosmic rays. CLOUD has now established a mechanistic understanding of aerosol particle formation for global climate models – but our work isn’t finished yet.”

Top quark weighs in with unparalleled precision

A top-quark pair at the LHC

The CMS collaboration has substantially improved on its measurement of the top-quark mass. The latest result, 171.77 ± 0.38 GeV, presented at CERN on 5 April, represents a precision of about 0.22% – compared to the 0.36% obtained in 2018 with the same data. The gain comes from new analysis methods and improved procedures to consistently treat uncertainties in the measurement simultaneously.

As the top quark is the heaviest elementary particle, precise knowledge of its mass is of paramount importance for testing the internal consistency of the Standard Model. Given accurate knowledge of the masses of the W and Higgs bosons, the top-quark mass is no longer a free parameter: through quantum corrections the Standard Model predicts its value, so a direct measurement provides a stringent consistency test. Since the top-quark mass dominates higher-order corrections to the Higgs-boson mass, a precise measurement of the top mass also places strong constraints on the stability of the electroweak vacuum (see The Higgs and the fate of the universe). 
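To see how the masses are tied together, one can quote the standard on-shell relation – a textbook formula given here for orientation rather than taken from the CMS analysis:

\[
m_W^2\left(1-\frac{m_W^2}{m_Z^2}\right)=\frac{\pi\alpha}{\sqrt{2}\,G_F}\,\frac{1}{1-\Delta r},
\]

where the loop-correction term Δr grows quadratically with the top-quark mass and logarithmically with the Higgs-boson mass, so independent precise measurements of these quantities over-constrain the model.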

Since its discovery at Fermilab in 1995, the mass of the top quark has been measured with increasing precision using the invariant mass of different combinations of its decay products. Measurements by the Tevatron experiments resulted in a combined value of 174.30 ± 0.65 GeV, while the ATLAS and CMS collaborations measured 172.69 ± 0.48 GeV and 172.44 ± 0.48 GeV, respectively, from the combination of their most precise results from LHC Run 1 recorded at a centre-of-mass energy of 8 TeV. The latter measurement achieved a relative precision of about 0.28%. In 2019, the CMS collaboration also experimentally investigated the running of the top quark mass – a prediction of QCD that causes the mass to vary as a function of energy – for the first time at the LHC. 

The LHC produces top quarks predominantly in top quark–antiquark (tt) pairs via gluon fusion; each top quark decays almost exclusively to a bottom quark and a W boson. Each tt event is classified by the subsequent decay of the W bosons. The latest CMS analysis uses semileptonic events – where one W decays into jets and the other into a lepton and a neutrino – selected from 36 fb–1 of Run 2 data collected at a centre-of-mass energy of 13 TeV. Five kinematic variables, as opposed to up to three in previous analyses, were used to extract the top-quark mass. While the extra information in the fit improved the precision of the measurement in a novel and unconventional way, it made the analysis significantly more complicated. In addition, the measurement required an extremely precise calibration of the CMS data and an in-depth understanding of the remaining experimental and theoretical uncertainties and their interdependencies. 

The final result, 171.77 ± 0.38 GeV, which includes a statistical uncertainty of 0.04 GeV, is a considerable improvement on all previously published top-quark mass measurements and supersedes the previously published measurement in this channel using the same data set. 

“The cutting-edge statistical treatment of uncertainties and the use of more information have vastly improved this new measurement from CMS,” says Hartmut Stadie of the University of Hamburg, who contributed to the result. “Another big step is expected when the new approach is applied to the more extensive dataset recorded in 2017 and 2018.”

Gérard Bachy 1942–2022

Gérard Bachy

Gérard Bachy arrived at CERN in 1967, straight after graduating from ETH Zurich, and spent his entire 35-year career there. He started off as a mechanical engineer with the Big European Bubble Chamber, where he was in charge of the design and manufacture of the expansion system. In 1972 he joined the team of John Adams that was building CERN’s new flagship facility, the Super Proton Synchrotron (SPS), taking on responsibility for its coordination and installation. The first protons were injected into the SPS on 3 May 1976. Gérard was then approached by Giorgio Brianti, deputy head of the SPS division, to set up a section in charge of the underground-area infrastructure and installation of the experiments. He formed a motivated team where new ideas thrived and were put into practice – including a bicycle-driven system for moving detector components weighing several dozen tonnes using air cushions. 

In 1981, when the huge Large Electron–Positron (LEP) collider project was taking shape, Gérard and his team were brought in by director-in-charge Emilio Picasso. They were soon merged with the engineering group to become the LEP–IM group, which went on to play a key role in the realisation of LEP. More innovations were in store to solve the many challenges associated with this project: modular access shafts; a monorail to facilitate the installation of various components; and highly precise planning and logistics, among other advances. The project moved fast, culminating in the start-up of LEP on 14 July 1989.

The engineering for the accelerators was spread across the various CERN divisions, which hampered efficiency. In 1990, Director-General Carlo Rubbia entrusted Gérard with bringing all the different activities together under one umbrella, and the mechanical technologies division was born. Over the next five years, the focus was on modernising the facilities, infrastructures and working methods, first for the LEP200 project and then for the LHC preparations. Gérard fostered the development of the engineering and equipment data-management service, encouraged the creation of quality assurance plans and promoted a project-management culture.

In 1996, Hans Hoffmann, the technical coordinator for ATLAS, appointed Gérard as project engineer in his technical coordination and integration team. Gérard’s experience was to have a big impact on important technical choices, such as the “large wheel” concept for the ATLAS muon spectrometer. He retired in June 2001 to be able to devote more time to his other great passions, sailing and travel. 

Gérard Bachy was a brilliant engineer and a charismatic leader. He played an undisputed role at the top level of engineering at CERN and acted as a mentor for many of us.

Jean-Charles Chollet 1938–2021

Jean-Charles Chollet

Experimental particle physicist Jean-Charles (Charlie) Chollet passed away on 24 August 2021. He had spent his whole scientific career at CERN, working as a member of the Orsay Laboratoire de l’Accélérateur Linéaire. His work was always in the area of precision measurements involving subtle analyses.

Charlie started at the CERN Proton Synchrotron with his thesis, defended in 1969 under the supervision of Jean-Marc Gaillard, on the observation of the interference between KL and KS in the π0π0 decay mode. He then contributed to the WA2 experiment at the Super Proton Synchrotron (SPS) studying leptonic decays of hyperons, where he took care of one of the most difficult components of the detector, the DISC Cherenkov counter, which led to the impressive achievement of separating ~200 GeV/c Σ and Ξ hyperons thanks to a combination of subtle optics and a complex photodetection system. He then participated in the UA2 experiment at the SPS pp̅ collider, where he was in charge of the pre-shower detector calibration and performance. 

Later he engaged himself in the preparation of the ATLAS experiment at the LHC, where he performed several studies, notably on the pileup background properties and their expected impact on the design of the liquid-argon calorimeter electronics. He also participated in test-beam analysis of early “accordion calorimeters”, prototypes of this same calorimeter. He ended his career at the NA48 experiment, which was measuring the direct CP violation parameter ε´/ε in neutral kaon decays and where he made an important contribution with the analysis of kaon scattering in the collimator. From small inconsistencies in the data, he managed to find and understand the source of this background, thereby allowing it to be precisely taken into account in the measurement.

He was a great sportsman, with a particular passion for sailing, skiing and cycling. Those who worked with Jean-Charles Chollet will always remember the pleasure of his company, his dry sense of humour and the depth and refinement of his work, which was always presented with the utmost modesty.

Dead-cone effect exposed by ALICE

A charm quark in a parton shower

More than 30 years after it was predicted, a phenomenon in quantum chromodynamics (QCD) called the dead-cone effect has been directly observed by the ALICE collaboration. The result, reported in Nature on 18 May, not only confirms a fundamental feature of the theory of the strong force, but enables a direct experimental observation of the non-zero mass of the charm quark in the partonic phase.

In QCD, the dead-cone effect predicts a suppression of gluon bremsstrahlung from a quark within a cone centred on the quark’s flight direction. This cone has an angular size mq/E, where mq is the mass of the quark and E is its energy. The effect arises due to the conservation of angular momentum during the gluon emission and is significant for low-energy heavy-flavour quarks. 
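For orientation, the leading-order emission probability behind this statement can be written down explicitly; the expression below is the standard small-angle form due to Dokshitzer, Khoze and Troyan, quoted here for illustration rather than taken from the ALICE paper:

\[
\mathrm{d}P_{\text{heavy}}\;\propto\;\frac{\mathrm{d}\omega}{\omega}\,\frac{\theta^{2}\,\mathrm{d}\theta^{2}}{\left(\theta^{2}+\theta_{0}^{2}\right)^{2}},
\qquad \theta_{0}=\frac{m_q}{E},
\]

which reduces to the familiar dθ²/θ² spectrum of a massless quark for θ ≫ θ0, but is strongly suppressed for emission angles θ ≲ θ0, i.e. inside the dead cone.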

The dead cone has been indirectly observed at particle colliders. A direct observation from the parton shower’s radiation pattern has remained challenging, however, because it relies on the determination of the emission angle of the gluon, as well as the emitting heavy-flavour quark’s energy, at each emission vertex in the parton shower (see “Showering” figure). This requires a dynamic reconstruction of the cascading quarks and gluons in the shower from experimentally accessible hadrons, which had not been possible until now. In addition, the dead-cone region can be obscured and filled by other sources such as the decay products of heavy-flavour hadrons, which must be removed during the measurement.

To observe the dead-cone effect directly, ALICE used jets tagged with a reconstructed D0-meson in a 25 nb–1 sample of pp collisions at a centre-of-mass-energy of 13 TeV collected between 2016 and 2018. The D0-mesons were reconstructed with transverse momenta between 2 and 36 GeV/c through their decay into a kaon and pion pair. Jet-finding was then performed on the events with the “anti-kT” algorithm, and jets with the reconstructed D0-meson amongst their constituents were tagged. The team used recursive jet-clustering techniques to reconstruct the gluon emissions from the radiating charm quark by following the branch containing the D0-meson at each de-clustering step, which is equivalent to following the emitting charm quark through the shower. A similar procedure was carried out on a flavour-untagged sample of jets, which contain primarily gluon and light-quark emissions and form a baseline where the dead-cone effect is absent.

Comparisons between the gluon emissions from charm quarks and from light quarks and gluons directly reveal the dead-cone effect through a suppression of gluon emissions from the charm quark at small angles, compared to the emissions from light quarks and gluons. Since QCD predicts a mass-dependence of the dead cones, the result also directly exposes the mass of the charm quark, which is otherwise inaccessible due to confinement. ALICE’s successful technique to directly observe a parton shower’s dead cone may therefore offer a way to measure quark masses.

The upgraded ALICE detector in LHC Run 3 will enable an extension of the measurement to jets tagged with a B+ meson. This will allow the reconstruction of gluon emissions from beauty quarks which, due to their larger mass, are expected to have a larger dead cone than charm quarks. Comparisons between the angular distribution of gluon emissions from beauty quarks and those from charm quarks will isolate mass-dependent effects in the shower and remove the contribution from effects pertaining to the differences between quark and gluon fragmentation, bringing deeper insights into the intriguing workings of the strong force.

Tom Cormier 1947–2022

Tom Cormier

Long-time ALICE collaborator and authority in relativistic heavy-ion physics, Tom Cormier, passed away on 23 March after a brief illness. Tom was born in 1947 in Lexington, a suburb of Boston. After high school he went to MIT where he did both his undergraduate and graduate studies. He was an amazing physicist with a strong drive to explore the frontiers of relativistic nuclear physics, and a profound understanding of the field that enabled him to build the best tools to take us to those frontiers. 

After obtaining his PhD from MIT in 1974, Tom took up postdoc positions at Stony Brook and the Max Planck Institute. He then joined the University of Rochester, where he later became director of the Nuclear Structure Research Laboratory. In 1988 he moved to the Cyclotron Institute at Texas A&M University, where he stayed for three years. Wayne State University was his next move, where he was chair of the physics and astronomy department. Tom joined the ORNL Physics Division in 2013, where he reinvigorated the relativistic nuclear physics group and expanded ORNL’s very successful involvement in the ALICE experiment at the LHC, in sPHENIX at RHIC and, most recently, in the Electron-Ion Collider (EIC) under construction at Brookhaven.

Tom’s work spanned an amazing breadth of physics and technology. Early on he worked on carbon–carbon inelastic scattering and scattering resonances; he then moved to experiments with recoil mass spectrometers at Brookhaven. Tom shifted his focus to relativistic heavy-ion physics with the AGS-E864 experiment at Brookhaven, followed by the STAR experiment at RHIC. He was the project manager for the construction of the STAR electromagnetic calorimeter and worked on the experiment from 1996 to 2005. 

Tom was one of the key scientists enabling the US heavy-ion community to join the LHC by proposing the large electromagnetic calorimeter EMCAL for ALICE and by forming the ALICE US collaboration. He was project manager for ALICE US, with a key responsibility for EMCAL and its later extension, the di-jet calorimeter, DCAL. Having successfully completed this project, he took on the leadership of the barrel tracker upgrade for ALICE. He was an architect of the TPC upgrade and was TPC deputy project leader from 2013. His true leadership and professionalism have been central to the success of ALICE in the past two decades. Tom most recently helped form the ECCE detector concept for the EIC. 

Both a great leader and project manager, Tom was a real inspiration, not only to his close colleagues but also to the broader community that held him in such high regard. He has been a wonderful mentor to many of us, and his contributions to the global physics programme, and to the ORNL physics division in particular, have been immense. He was an expert navigator of the various funding agencies and always showed immense calm during numerous DOE reviews, his dry sense of humour reflected in one of his memorable quips: “If I would wear a suit today, the DOE would be sure we screwed up badly.” He will be sorely missed but his legacy will remain.

Alberto Sirlin 1930–2022

Alberto Sirlin

Theorist Alberto Sirlin, a pioneer in electroweak radiative corrections, passed away on 23 February aged 91. His work played a key role in confirming predictions of the Standard Model (SM) at the ±0.1% level. He was a professor at New York University for 62 years, mentored 14 PhD students and remained an active researcher until shortly before his death. 

Born in Buenos Aires in 1930, Alberto received a physics and mathematics degree from the University of Buenos Aires in 1953. That year he went to Brazil where he took a quantum mechanics course taught by Richard Feynman. In a 2015 essay “Remembering a Great Teacher”, Alberto fondly recalled that experience and the enduring friendship that followed. In 1954 he travelled to UCLA and collaborated with Ralph Behrends and Robert Finkelstein on an early study of QED radiative corrections to muon decay in Fermi’s general theory of weak interactions. Alberto then moved to graduate school at Cornell University, where he collaborated with Toichiro Kinoshita on the QED corrections to muon and nuclear beta decays in the V-A Fermi theory. Their investigation showed that QED corrections increased the muon lifetime by about 0.4% – an effect still used to define the Fermi constant. For nuclear beta decay, where QED effects were logarithmically dependent on an arbitrary cutoff scale, Alberto would later show how electroweak unification determines this scale. After Cornell he spent two years (1957–1959) as a postdoc at Columbia University, supervised by T D Lee, before joining the faculty of New York University. He also held visiting appointments at BNL, CERN, Hamburg University, Rockefeller University and The Institute for Advanced Study.

When the SM came together in the early 1970s, Alberto’s early work on QED corrections to weak-interaction processes uniquely prepared him for a leading role in computing electroweak quantum loop corrections. For example, he showed how additional loop corrections involving W and Z bosons led to a replacement of the logarithmic cutoff found in semi-leptonic beta decays by the Z-boson mass, resulting in a ~2% increase for all semi-leptonic charged-current decay rates. This is essential for unitarity tests of the quark mixing matrix, and confirms the validity of the SM at more than 20σ!

In a 1980 paper that has been cited more than 1400 times, Alberto introduced the on-shell renormalisation scheme based on physical parameters and the quantity Δr, which encodes the radiative effects. This scheme has been used to study deep-inelastic neutrino–nucleus scattering, neutrino–electron scattering, atomic parity violation, polarised electron–electron scattering asymmetries, precise W- and Z-boson mass predictions, and more, not only by Alberto and his former students and collaborators, but by the entire particle-physics community in searches for new-physics effects. Together with his former student William Marciano, he won the 2002 J J Sakurai Prize of the American Physical Society for their pioneering work on radiative corrections.

In witnessing the rise and then completion of the SM with the discovery of the Higgs boson in 2012, Alberto was able to enjoy the fruits of his labour. We, his students, have been inspired by Alberto’s dedication and enthusiasm. We are grateful that we could join his journey through life and physics. He was our great teacher.

Probing new physics with the Higgs boson

ATLAS figure 1

Due to its connection to the process of electroweak symmetry breaking, the Higgs boson plays a special role in the Standard Model (SM). Its properties, such as its mass and its couplings to fermions and bosons, have been measured with increasing precision. For these reasons, the Higgs boson has become an ideal tool to conduct new-physics searches. Prominent examples are direct searches for new heavy particles decaying into Higgs bosons or searches for exotic decays of the Higgs boson. Such phenomena have been predicted in many extensions of the SM motivated by long-standing open questions, including the hierarchy problem, dark matter and electroweak baryogenesis. Examples of new particles that couple to the Higgs boson are heavy vector bosons (as in models with Higgs compositeness or warped extra dimensions) and additional scalar particles (as in supersymmetric models or axion models).

Searches for resonances

The ATLAS collaboration recently released results of a search for a new heavy particle decaying into a Higgs and a W boson. The search was performed by probing for a localised excess in the invariant mass distribution of the ℓνbb final state. As no such excess was found, upper limits at 95% confidence level were set on the production cross-section times branching ratio of the new heavy resonance (figure 1). The results were also interpreted in the context of the heavy vector triplet (HVT) model, which extends the SM gauge group by an additional SU(2) group, to constrain the coupling strengths of heavy vector bosons to SM particles. In two HVT benchmark models, W′ masses below 2.95 and 3.15 TeV are excluded.

ATLAS figure 2

Rare or exotic decays are excellent candidates to search for weakly coupled new physics. The Higgs boson is particularly sensitive to such new physics owing to its narrow total width, which is three orders of magnitude smaller than that of the W and Z bosons and the top quark. Several searches for exotic decays of the Higgs boson have been carried out by ATLAS, and they may be broadly classified as those scenarios where the possible new daughter particle decays promptly to SM particles, and those where it would be long-lived or stable.

A recent search from ATLAS targeted exotic decays of the Higgs boson into final states with four electrons or muons, which benefit from a very clean experimental signature. Although no signal was observed, the search put stringent constraints on decays to new light scalar bosons – particularly in the low mass range of a few GeV – and to new vector bosons, dubbed dark Z bosons or dark photons, in the mass range up to a few tens of GeV. Depending on the new-physics model, this search can exclude branching ratios of the Higgs boson to new particles as low as O(10⁻⁵).

Invisibles

Another interesting possibility is the case where the Higgs boson decays to particles that are invisible in the detector, such as dark-matter candidates. To select such events, different strategies are pursued depending on the particles produced in association with the Higgs boson. The most powerful channel for such a search is the vector-boson fusion production process, where two energetic jets from quarks are produced with large angular separation alongside the invisibly decaying Higgs boson (figure 2). Another sensitive channel is the associated production of a Higgs boson with a Z boson that decays to a pair of leptons. Improvements in background predictions have made it possible to reach a sensitivity down to 10% on the branching ratio of invisible Higgs-boson decays, while the corresponding observed limit amounts to 15%.

These searches will greatly benefit from the large datasets expected in Run 3 and later High-Luminosity LHC runs, and will enable searches for even more feeble couplings of new particles to the Higgs boson.
