
Philippe Bernard 1935–2023

Philippe Bernard

Electrical engineer Philippe Bernard, who made notable technical and managerial contributions across the various sectors at CERN in which he worked, passed away on 10 October 2023.

Born in 1935, Philippe completed his studies at the prestigious École supérieure d’électricité in 1956. He began working at CERN in 1962 as engineer-in-charge of the Proton Synchrotron. He went on to design and develop radio-frequency (RF) separators, making substantial contributions to the improvement of these devices that provide well-selected secondary beams. This was particularly important in the early 1970s for experiments with the CERN 2 m hydrogen chamber, the Saclay-built Mirabelle chamber at Serpukhov, and the Big European Bubble Chamber at CERN.

Realising the potential of superconductivity for RF structures, Philippe, together with Herbert Lengeler, was entrusted in 1978 by CERN Director-General John Adams with the development of RF cavities for CERN accelerators. A vigorous programme with international participation led to the development of five-cell cavities, first made of pure niobium and, later, of niobium sputtered on a more stable copper substrate to produce robust cavities. This allowed accelerating fields of up to 7 MV/m to be reached.

After tests of prototypes at PETRA (DESY) and the Super Proton Synchrotron, 320 such cavities were produced for the Large Electron-Positron collider (LEP) using niobium-film technology. In the framework of the LEP2 upgrade programme, which started in 1987, these cavities were gradually added to the complement of normal-conducting cavities, some of which they replaced. This enabled an increase in the electron and positron beam energy from 46 GeV in 1989 to 104 GeV by 2000. In addition to this successful development, in the late 1990s Philippe took a strong interest in the design and development of a system of coupled superconducting cavities as a sensitive detector of gravitational waves.

Philippe was also involved in numerous CERN-wide activities, including chairing the purchasing policy monitoring board and serving as president of the CERN health insurance scheme (CHIS). He also served as president and vice-president of the CERN Pensioners’ Association during a critical period.

His open mind, his wide-ranging views and his solid technical knowledge made Philippe a recognised leader. His critical and thoughtful attitude made him a respected discussion partner for the CERN management. Philippe’s commitment to CHIS and to long-term improvements in the social conditions of CERN and ESO staff was widely appreciated and acknowledged. We remember him as a generous, witty and vivacious friend.

New CERNs for a fractured world

Although a brief period of hubris and short-sightedness at the end of the Cold War led some in the West to proclaim “the end of history” and a path to a unified global community, underlying and historically ever-present geopolitical tensions have resurfaced, perhaps as strongly as ever. At the same time, the past decades have witnessed the education of growing numbers of talented scientists and technologists across the globe, including in low- and middle-income countries that were once outside the leading science communities. To address the science and technology challenges of our time, we need to find ways to steady the ship and best navigate this changing global scene.

Just as CERN was born out of the ashes of global destruction and disarray – a condition that called for collaboration out of necessity – we propose that the resurgence of nationalism along with pressing challenges such as climate change, disease and artificial intelligence call for stronger scientific communities. At the time of CERN’s founding 70 years ago, European physicists, especially in sub-atomic physics, faced marginalisation. Devastated European countries could not separately fund the “big science” facilities necessary to do cutting-edge research. Moreover, physicists were divided by national loyalties to countries that had been enemies during the war. In the period that followed, it seemed that subatomic research would be dominated by the US and the USSR. Worse, it seemed all too likely that the nationalistic agendas in those nations would push for advances in catastrophic new military technologies.

Leonard Lynn

The creation and operation of CERN in that environment was monumental. CERN brought together scientists from various countries, eventually extending beyond Europe. It greatly advanced basic knowledge in fundamental physics and spun off practical technologies such as the web and medical equipment. It has also served as a template (greatly underused, in our view) for other international science and technology organisations such as SESAME in the Middle East. Today, the challenges for global cooperation in science and technology are different from those facing the founders of CERN. Mostly Western Europeans, with a few US supporters, the founders shared the discipline of subatomic physics and included Nobel laureates and other highly respected people who were able to enlist the help of supportive diplomats in the various founding states.

Moment for change

The current geopolitical moment calls for more CERN-like organisations, just as the post-war moment did. New global institutes and organisations to address global problems will have to span a broad range of countries and cultures. They will have to overcome techno-nationalistic opportunism and fears, and deal with potential capture by multinational enterprises (as happened with the response to COVID).

New global institutes and organisations to address global problems will have to span a broad range of countries and cultures

Since its founding, CERN has increasingly shown the ability to cross cultural and political boundaries – most nations of the world have sent scientists to participate in CERN projects, and non-European countries such as India, Pakistan and Turkey are associate members. Some mention the importance of facility cafeterias and other venues where scientists from different countries can meet and have unofficial discussions. CERN has striven to keep decision-making separate from national interests by having a convention that precludes its involvement in military technologies, and by having decisions about projects made primarily by scientists. It has strong policies regarding the sharing of intellectual property developed at its facilities.

Hal Salzman

CERN’s contributions to basic science and to various important technologies are undisputed. We suggest that its potential contributions to the organisation of global science and technology cooperation also deserve greater attention. A systematic examination of CERN’s governance system and membership should be undertaken and compared with the experiences of other organisations. Analysing how the CERN model fits social-science studies of design principles makes clear that CERN’s success offers important additional principles for cases where the common-pool resources are science and technology, and where members come from diverse cultural backgrounds. CERN has addressed the problems of bringing together scientists from countries with competing techno-nationalistic agendas, providing shelter against capture not only by governments but also by multinational enterprises. It has focused on non-military technologies and on sharing its intellectual property. It is time for this organisational experience to be rolled out for even greater common good.

Towards an unbiased digital world

What is Open Web Search?

The Open Web Search project was started by a group of people concerned that navigation in the digital world is dominated by a handful of big commercial players (the European search market, for example, is largely dominated by Google), who offer their services not out of generosity but to generate revenue from advertisements. To achieve that they put great effort into profiling users: they analyse what you are searching for and then use this information to create more targeted adverts that generate more revenue for them. They also filter search results to present information that fits your world view, to make sure that you come back because you feel at home on those web pages. For some people, and for the European Commission in the context of striving for open access to information, digital sovereignty and independence from US-based tech giants, this is a big concern.

How did the project come about?

In 2017 the founder of the Open Search Foundation reached out to me because I was working on CERN’s institutional search. He had a visionary idea: an open web index that is free, accessible to everyone and completely transparent in terms of the algorithms that it uses. Another angle was to create a valuable resource for building future services, especially data services. Building an index of the web is a massive endeavour, especially when you consider that the estimated total number of web pages worldwide is around 50 billion.
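To make that scale concrete, a rough back-of-envelope estimate can be sketched in a few lines of Python. The page count comes from the interview above; the per-page text size and index-to-text ratio are illustrative assumptions only, not project figures:

```python
# Back-of-envelope scale of a full web index (illustrative numbers only).
PAGES = 50e9          # estimated web pages worldwide (figure from the text)
TEXT_PER_PAGE = 30e3  # assumed average extracted text per page, in bytes
INDEX_RATIO = 0.3     # assumed index size relative to the raw text

raw_text_pb = PAGES * TEXT_PER_PAGE / 1e15  # petabytes of raw text
index_pb = raw_text_pb * INDEX_RATIO
print(f"raw text: ~{raw_text_pb:.1f} PB, index: ~{index_pb:.1f} PB")
# With these assumptions: ~1.5 PB of text and ~0.5 PB of index --
# far beyond a single server, hence the distributed approach below.
```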

You could argue that unbiased, transparent access to information in the digital world should be on the level of a basic right

A group of technical experts from different institutes and universities, along with the CERN IT department, began with a number of experiments to get a feel for the scale of the project: for example, seeing how many web pages a single server can index, and evaluating the open-source projects used for crawling and indexing web pages. The results of these experiments proved highly valuable when it came to replying to the Horizon Europe funding call later on.

In parallel, we started a conference series, the Open Search Symposia (OSSYM). Two years ago there was a call for funding in the framework of the European Union (EU) Horizon Europe programme dedicated to Open Web search. Together with 13 other institutions and organisations, the CERN IT department participated and we were awarded a grant. We were then able to start the project in September 2022.

Andreas Wagner

What are the technical challenges in building a new search engine?

We don’t want to copy what others are doing. For one, we don’t have the resources to build a new, massive data centre. The idea is a more collaborative approach: a distributed system that people can join depending on their means and interests. CERN is leading work-package five, “federated data infrastructure”, in which we and our four infrastructure partners (DLR and LRZ in Germany, CSC in Finland and IT4I in the Czech Republic) provide the infrastructure to set up the system that will ultimately allow the index itself to be built in a purely distributed way. At CERN we are running the so-called URL frontier – a system that oversees what is going on in terms of crawling and preparing this index, and holds a long list of URLs that should be collected. As the crawlers run, they report back on what they have found on different web pages. It’s basically bookkeeping to ensure that we coordinate activities and don’t duplicate the efforts already made by others.
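To illustrate the kind of bookkeeping a URL frontier performs, here is a minimal, hypothetical sketch in Python: a de-duplicating work queue that hands URLs to crawlers and records what they report back. It is not the actual Open Web Search implementation, which builds on existing open-source crawling tools; the class and method names are invented for illustration:

```python
from collections import deque

class URLFrontier:
    """Minimal sketch of a URL frontier: hands out URLs to crawlers,
    records results and avoids duplicating work already done."""

    def __init__(self, seeds):
        self.pending = deque(seeds)   # URLs still to be crawled
        self.seen = set(seeds)        # everything ever enqueued
        self.done = {}                # url -> crawl status

    def next_url(self):
        # Called by a crawler to claim its next unit of work.
        return self.pending.popleft() if self.pending else None

    def report(self, url, status, discovered_links):
        # Crawlers report back what they found on each page.
        self.done[url] = status
        for link in discovered_links:
            if link not in self.seen:  # bookkeeping: never crawl twice
                self.seen.add(link)
                self.pending.append(link)

frontier = URLFrontier(["https://example.org/"])
url = frontier.next_url()
frontier.report(url, "200 OK", ["https://example.org/about"])
```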

Open Web Search is said to be based on European values and jurisdiction. Who and what defines these?

That’s an interesting question. Within the project there is a dedicated work package six titled “open web search ecosystem and sustainability” that covers the ethical, legal and societal aspects of open search and addresses the need for building an ecosystem around open search, including the proper governance processes for the infrastructure.

The legal aspect is quite challenging because it is all new territory. The digital world evolves much faster than legislators can keep up! Information on the web is freely available to anyone, but the moment you start downloading and redistributing it you take on ownership and responsibility. So you need to take copyright into account, which most EU countries regulate. Criminal law is more delicate in terms of which content is legal: every country has its own rules and there is no uniformity. Overall, European values include transparency, fairness in data availability and adherence to democratic core principles. We aim to build these European values into the core design of our solution from the very beginning.

What is the status of the project right now?

The project was launched just over a year ago. On the infrastructure side the aim was to have the components in place, meaning having workflows ready and running. It’s not fully automated yet and there is still a lot of challenging work to do, but we have a fully functional set-up, so some institutes have been able to start crawling; they feed the data and it gets stored and distributed to the participating infrastructure partners including CERN. At the CERN data centre we coordinate the crawling efforts and provide advanced monitoring. As we go forward, we will work on aspects of scalability so that there won’t be any problems when we go bigger.

The Open Web Search project

What would a long-term funding model look like for this project?

You could argue that unbiased, transparent access to information in the digital world that has become so omnipresent in our daily lives should be on the level of a basic right. With that in mind, one could imagine a governmental funding scheme. Additionally, this index would be open to companies that can use it to build commercial applications on top of it, and for this use-case a back-charging model might be suitable. So, I could imagine a combination of public and usage-based funding.

In October last year the Open Search Symposium was hosted by the CERN IT department. What was the main focus there?

This is purposely not focused on one single aspect but is an interdisciplinary meeting. Participants include researchers, data centres, libraries, policy makers, legal and ethical experts, and civil society. This year we had some brilliant keynote speakers, such as Věra Jourová, European Commission vice-president for Values and Transparency, and Christoph Schumann from LAION, a non-profit organisation that seeks to democratise artificial intelligence models.

Ricardo Baeza-Yates (Institute for Experiential Artificial Intelligence, Northeastern University) gave a keynote speech about “Bias in Search and Recommender Systems”, and Angella Ndaka (The Centre for Africa Epistemic Justice and University of Otago) talked about “Inclusion by whose terms? When being in doesn’t mean digital and web search inclusion”, addressing the challenges of providing equal access to information across all parts of the world. We also had some of the founders of alternative search engines joining, and it was very interesting and inspiring to see what they are working on. And we had representatives from different universities looking at how research is advancing in different areas.

I see the purpose of Open Web Search as being an invaluable investment in the future

In general, OSSYM 2023 was about a wide range of topics related to internet search and information access in the digital world. We will shortly publish the proceedings, containing the nearly 25 scientific papers that were submitted and presented.

How realistic is it for this type of search engine to compete with the big players?

I don’t see it as our aim or purpose to compete with the big players. They have unlimited resources, so they will continue what they are doing now. I see the purpose of Open Web Search as being an invaluable investment in the future. The Open Web Index could pave the way for upcoming competitors, creating new ideas and questioning the monopoly or gatekeeper roles of the big players. This could make the marketplace for digital information more competitive and fairer. I like the analogy of cartography: in the physical world, having access to (unbiased) maps is a common good. If you compare maps from different suppliers you still get basically the same information, which you can rely on. At present, in the digital world there is no unbiased, independent cartography available. For instance, if you look up the way to travel from Geneva to Paris online, you might have the most straightforward option suggested to you, but you might also be pointed towards diversions via restaurants, where you then might consider stopping for a drink or some food, all to support a commercial interest. An unbiased map of the digital world should give you the opportunity to decide for yourself where and how you wish to get to your destination.

The project will also help CERN to improve its own search capabilities and will provide an open-science search across CERN’s multiple information repositories. For me, it’s nice to think that we are helping to develop this tool at the place where the web was born. Just as CERN gave the web to the world, we want to make sure that this is a public right, and to steer it in the right direction.

Magnetic monopoles, where art thou?

ATLAS figure 1

Magnetic monopoles are hypothetical particles that possess a magnetic charge. In 1864 James Clerk Maxwell assumed that magnetic monopoles didn’t exist because no one had ever observed one. Hence, he did not incorporate the concept of magnetic charges in his unified theory of electricity and magnetism, despite their being fully consistent with classical electrodynamics. Interest in magnetic monopoles intensified in 1931 when Dirac showed that quantum mechanics can accommodate magnetic charges, g, allowed by the quantisation condition g = Ne/(2α) = NgD, where e is the elementary electric charge, α is the fine-structure constant, gD is the fundamental magnetic charge and N is an integer. Grand unified theories predict very massive magnetic monopoles, but several recent extensions of the Standard Model feature monopoles in a mass range accessible at the LHC. Scientists have explored cosmic rays, particle collisions, polar volcanic rocks and lunar materials in their quest for magnetic monopoles, yet no experiment has found conclusive evidence thus far.
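Evaluating the quantisation condition numerically (a standard textbook step, included here for clarity) shows why even a single Dirac charge is so strongly ionising:

```latex
g = \frac{N e}{2\alpha} = N g_{\mathrm{D}},
\qquad
g_{\mathrm{D}} = \frac{e}{2\alpha} \approx \frac{137.04}{2}\, e \approx 68.5\, e
```

This is the origin of the 68.5e equivalence quoted below for a 1gD monopole.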

Signature strategy

The ATLAS collaboration recently reported the results of the search for magnetic monopoles using the full LHC Run 2 dataset recorded in 2015–2018. Magnetic-charge conservation dictates that magnetic monopoles are stable and would be created in pairs of oppositely charged particles. Point-like magnetic monopoles could be produced in proton–proton collisions via two mechanisms: Drell–Yan, in which a virtual photon from the collision creates a magnetic monopole pair; or photon-fusion, in which two virtual photons radiated from the colliding protons interact to create a magnetic monopole pair. Dirac’s quantisation condition implies that a 1gD monopole would ionise matter in a similar way to a high-electric-charge object (HECO) of charge 68.5e. Hence, magnetic monopoles and HECOs are expected to be highly ionising. In contrast to the behaviour of electrically charged particles, however, the Lorentz force on a monopole in the solenoidal magnetic field encompassing the ATLAS inner tracking detector would cause it to be accelerated in the direction of the field rather than in the orthogonal plane – a trajectory that precludes the application of usual track-reconstruction methods. The ATLAS detection strategy therefore relies on characterising the highly ionising signature of magnetic monopoles and HECOs in the electromagnetic calorimeter and in the transition radiation tracker.

This is the first ATLAS analysis to consider the photon-fusion production mechanism

The ATLAS search considered magnetic monopoles of magnetic charge 1gD and 2gD, and HECOs of charge 20e, 40e, 60e, 80e and 100e, for both spin-0 and spin-½ particles in the mass range 0.2–4 TeV. ATLAS is not sensitive to higher-charge monopoles or HECOs because their higher ionisation causes them to stop before reaching the calorimeter. Since particles in the considered mass range are too heavy to produce significant electromagnetic showers in the calorimeter, their narrow high-energy deposits are readily distinguished from the broader lower-energy ones of electrons and photons. Events with multiple high-energy deposits in the transition radiation tracker aligned with a narrow high-energy deposit in the calorimeter are therefore characteristic of magnetic monopoles and HECOs.

Random combinations of rare processes, such as superpositions of high-energy electrons, could potentially mimic such a signature. Since such rare processes cannot be easily simulated, the background in the signal region is estimated to be 0.15 ± 0.04 (stat) ± 0.05 (syst) events through extrapolation from the lower ionisation event yields in the data.

With no magnetic monopole or HECO candidate observed in the analysed ATLAS data, upper cross-section limits and lower mass limits on these particles were set at 95% confidence level. The Drell–Yan cross-section limits are approximately a factor of three better than those from the previous search using the 2015–2016 Run 2 data.

This is the first ATLAS analysis to consider the photon-fusion production mechanism, the results of which are shown in figure 1 (left) for spin-½ monopoles. ATLAS is also currently the most sensitive experiment to magnetic monopoles in the charge range 1–2gD, as shown in figure 1 (right), and to HECOs in the charge range 20–100e. The collaboration is further refining search techniques and developing new strategies to search for magnetic monopoles and HECOs in both Run 2 and Run 3 data.

QGP production studied at record energies

CMS figure 1

The very high energy densities reached in heavy-ion collisions at the LHC result in the production of an extremely hot form of matter, known as the quark–gluon plasma (QGP), consisting of freely roaming quarks and gluons. This medium undergoes a dynamic evolution before eventually transitioning to a collection of hadrons. But the details of this temporal evolution and phase transition are very challenging to calculate from first principles using quantum chromodynamics. The experimental study of the final-state hadrons produced in heavy-ion collisions therefore provides important insights into the nature of these processes. In particular, measurements of the pseudorapidity (η) distributions of charged hadrons help in understanding the initial energy density of the produced QGP and how this energy is transported throughout the event. These measurements involve different classes of collisions, sorted according to the degree of overlap between the two colliding nuclei; collisions with the largest overlap have the highest energy densities.
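For reference, pseudorapidity is a purely geometric quantity, defined from the polar angle θ of a particle’s direction relative to the beam axis (a standard definition, included for clarity):

```latex
\eta = -\ln\!\left[\tan\!\left(\frac{\theta}{2}\right)\right]
```

Mid-rapidity, η ≈ 0, corresponds to emission perpendicular to the beams, and the “per unit of pseudorapidity” yields quoted below are counted in slices of this variable.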

In 2022 the LHC entered Run 3, with higher collision energies and integrated luminosities than in previous running periods. The CMS collaboration has now reported the first measurement using Run 3 heavy-ion data. Charged hadrons produced in lead–lead collisions at the record nucleon–nucleon centre-of-mass collision energy of 5.36 TeV were reconstructed by exploiting the pixel layers of the silicon tracker. At mid-rapidity and in the 5% most central collisions (those with the largest overlap between the two colliding nuclei), 2032 ± 91 charged hadrons are produced per unit of pseudorapidity. Data-to-theory comparisons show that models can successfully predict either the total charged-hadron multiplicity or the shape of its η distribution, but struggle to describe both aspects simultaneously.

Previous measurements have shown that the mid-rapidity yields of charged hadrons in proton–proton and heavy-ion collisions are comparable when scaled by the average number of nucleons participating in the collisions, Npart. Figure 1 shows measurements of this quantity in several collision systems as a function of collision energy. It was previously observed that central nucleus–nucleus collisions exhibit a power-law scaling, as illustrated by the blue dashed curve; the new CMS result agrees with this trend. In addition, the measurement is about two times larger than the values from proton–proton collisions at similar energies, indicating that heavy-ion collisions are more efficient at converting initial-state energy into final-state hadrons at mid-rapidity.
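As a sketch of the scaling described above, the per-participant-pair yield can be computed directly; the Npart value below is an assumed, illustrative Glauber-model figure for very central Pb–Pb collisions, not the value used by CMS:

```python
# Per-participant-pair charged-hadron yield at mid-rapidity (sketch).
DNDETA = 2032     # charged hadrons per unit eta (CMS, 5% most central)
N_PART = 384      # assumed average number of participating nucleons

yield_per_pair = DNDETA / (N_PART / 2)
print(f"dN/deta per participant pair: ~{yield_per_pair:.1f}")
# ~10.6 with these numbers, roughly twice the proton-proton value at
# comparable energies, as noted in the text.
```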

This measurement opens a new chapter in the CMS heavy-ion programme. At the end of 2023 the LHC delivered an integrated luminosity of around 2 nb⁻¹ to CMS, and more data will be collected in the coming years, enabling more precise analyses of the QGP features.

Dielectrons take the temperature of Pb–Pb collisions

ALICE figure 1

Collisions between lead ions at the LHC produce the hottest system ever created in the laboratory, with temperatures exceeding those in stellar interiors by about a factor of 10⁵. At such temperatures, nucleons no longer exist and quark–gluon plasma (QGP) is formed. Yet a precise measurement of the initial temperature of the QGP created in these collisions remains challenging. Information about the early stage of the collision gets washed out because the system constituents continue to interact as it evolves. As a result, deriving the initial temperature from the hadronic final state requires a model-dependent extrapolation of system properties (such as energy density) by more than an order of magnitude.

In contrast, electromagnetic radiation in the form of real and virtual photons escapes the strongly interacting system. Moreover, virtual photons – emerging in the final state as electron–positron pairs (dielectrons) – carry mass, which allows early and late emission stages to be separated.

Radiation from the late hadronic phase dominates the thermal dielectron spectrum at invariant masses below 1 GeV. The yield and spectral shape in this mass window reflect the in-medium properties of vector mesons, mainly the ρ, and can be connected to the restoration of chiral symmetry in hot and dense matter. In the intermediate-mass region (IMR), between about 1 and 3 GeV, thermal radiation is expected to originate predominantly from the QGP, and an estimate of the initial QGP temperature can be derived from the slope of the exponential spectrum. This makes dielectrons a unique tool to study the properties of the system at its hottest and densest stage.
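Schematically, thermal dilepton emission in the IMR is often parametrised by a Boltzmann-like form, so the inverse slope of the measured spectrum acts as an effective temperature. The expression below is a commonly used approximation quoted for illustration, not the exact ALICE fit function:

```latex
\frac{\mathrm{d}N_{ee}}{\mathrm{d}M_{ee}} \;\propto\; M_{ee}^{3/2}\,
\exp\!\left(-\frac{M_{ee}}{T}\right)
```

Fitting the slope of the spectrum between about 1 and 3 GeV then yields an estimate of T.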

A new approach to separate the heavy-flavour contribution experimentally has been employed for the first time at the LHC

At the LHC, this measurement is challenging because the expected thermal dielectron yield in the IMR is outshone by a physical background that is about 10 times larger, mainly from semileptonic decays of correlated pairs of charm or beauty hadrons. In ALICE, the electron and positron candidates are selected in the central barrel using complementary information provided by the inner tracking system (ITS), time projection chamber and time-of-flight measurements. Figure 1 (left) shows the dielectron invariant-mass spectrum in central lead–lead (Pb–Pb) collisions. The measured distribution is compared with a “cocktail” of all known contributions from hadronic decays. At masses below 0.5 GeV, an enhancement of the dielectron yield over the cocktail expectation is observed, which is consistent with calculations that include thermal radiation from the hadronic phase and an in-medium modification of the ρ meson. Between 0.5 GeV and the ρ mass (0.77 GeV), a small discrepancy between the data and calculations is observed.

In the IMR, however, systematic uncertainties on the cocktail contributions from charm and beauty prevent any conclusion being drawn about thermal radiation from QGP. To overcome this limitation, a new approach to separate the heavy-flavour contribution experimentally has been employed for the first time at the LHC. This approach exploits the high-precision vertexing capabilities of the ITS to measure the displaced vertices of heavy-quark pairs. Figure 1 (right) shows the dielectron distribution in the IMR compared to template distributions from Monte Carlo simulations. The best fit includes templates from heavy-quark pairs and an additional prompt dielectron contribution, presumably from thermal radiation. This is the first experimental hint of thermal radiation from the QGP in Pb–Pb collisions at the LHC, albeit with a significance of 1σ.
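Schematically, the procedure amounts to a binned template fit: the measured IMR distribution is decomposed into simulated charm, beauty and prompt templates whose normalisations are free parameters. Below is a minimal, hypothetical illustration using a least-squares solve; the template shapes and counts are made-up placeholders, not ALICE data or code:

```python
import numpy as np

# Binned template fit (sketch): data = a*charm + b*beauty + c*prompt.
charm  = np.array([40.0, 25.0, 12.0, 5.0])   # MC template, displaced vertices
beauty = np.array([10.0, 12.0, 10.0, 7.0])   # MC template, more displaced
prompt = np.array([20.0,  8.0,  3.0, 1.0])   # prompt template (thermal candidate)
data   = np.array([66.0, 43.0, 24.0, 12.0])  # measured counts per mass bin

templates = np.column_stack([charm, beauty, prompt])
coeffs, *_ = np.linalg.lstsq(templates, data, rcond=None)
print("fitted normalisations (charm, beauty, prompt):", coeffs.round(2))
```

In the real analysis the fit is done with proper statistical treatment; the point here is only that a significant prompt normalisation beyond the heavy-flavour templates is what signals thermal radiation.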

Ongoing measurements with the upgraded ALICE detector will provide an unprecedented improvement in precision, paving the way for a detailed study of thermal radiation from hot QGP.

Resolving asymmetries in B0 and B0s oscillations

In the Standard Model (SM), CP violation originates from a single complex phase in the 3 × 3 Cabibbo–Kobayashi–Maskawa (CKM) quark-mixing matrix. The unitarity condition of the CKM matrix (Vud V*ub + Vcd V*cb + Vtd V*tb = 0, where Vij are the CKM matrix elements) can be represented as a triangle in the complex plane, with an area proportional to the amount of CP violation in the quark sector. One angle of this triangle, γ = arg(–Vud V*ub/Vcd V*cb), is of particular interest as it can be probed both indirectly under the assumption of unitarity and in tree-level processes that make no such assumption. Its most sensitive direct experimental determination currently comes from a combination of LHCb measurements of B+, B0 and B0s decays to final states containing a D(s) meson and one or more light mesons. Decay-time-dependent analyses of tree-level B0s → Ds∓K± and B0 → D∓π± decays are sensitive to the angle γ through CP violation in the interference between mixing and decay amplitudes. Thus, comparing the value of γ obtained from tree-level processes with indirect measurements of γ and other unitarity-triangle parameters in loop-level processes provides an important consistency check of the SM.

LHCb figure 1

Measurements using neutral B0 and B0s mesons are particularly powerful because they resolve ambiguities that other measurements cannot. Due to the interference between B0(s)–B̄0(s) mixing and decay amplitudes, the physical CP-violating parameters in these decays are functions of a combination of γ and the relevant mixing phase, namely γ + 2β in the B0 system, where β = arg(–Vcd V*cb/Vtd V*tb), and γ – 2βs in the B0s system, where βs = arg(–Vts V*tb/Vcs V*cb). Measurements of these physical quantities can therefore be interpreted in terms of the angles γ and β(s), and γ can be derived using independent determinations of the other parameter as input.

The LHCb collaboration recently presented a new measurement of B0s → Ds∓K± decays collected during Run 2. This is a challenging analysis, as it requires a decay-time-dependent fit to extract the CP-violating observables, expressed through the amplitudes of the four different decay paths from B0s and B̄0s to the Ds−K+ and Ds+K− final states. Previously, LHCb measured γ in this decay using the Run 1 dataset, obtaining γ = (128 +17/−22)°. The B0s–B̄0s oscillation frequency ∆ms must be precisely constrained in order to determine the phase differences between the amplitudes. In the Run 2 measurement, the established uncertainty on ∆ms would have been a limiting systematic uncertainty, which motivated the recent LHCb measurement of ∆ms using the flavour-specific B0s → Ds−π+ decays from the same dataset. Combined with Run 1 measurements of ∆ms, this has led to the most precise contribution to the world average and has greatly improved the precision on γ in the B0s → Ds∓K± analysis. Indeed, for the first time the four amplitudes are resolved with sufficient precision to show the decay rates separately (see figure 1).
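The decay-time distributions of the four paths follow the standard mixing form, shown here schematically for a final state f (sign conventions vary between references; this is quoted for illustration):

```latex
\frac{\mathrm d\Gamma(B^0_s \to f)}{\mathrm d t} \;\propto\; e^{-\Gamma_s t}
\left[\cosh\!\left(\tfrac{\Delta\Gamma_s t}{2}\right)
+ A_f^{\Delta\Gamma}\,\sinh\!\left(\tfrac{\Delta\Gamma_s t}{2}\right)
+ C_f\,\cos(\Delta m_s t) - S_f\,\sin(\Delta m_s t)\right]
```

The CP-violating observables multiplying the oscillating terms carry the dependence on γ – 2βs, which is why a precise ∆ms directly sharpens the γ determination.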

The angle γ is determined using inputs from other LHCb measurements of the CP-violating weak phase –2βs, along with measurements of the decay width and decay-width difference. The final result, γ = 74 ± 11°, is compatible with the SM and is the most precise determination of γ using B0s meson decays to date.

Leading in collaborations

Are we at the vanguard of every facet of our field? In our quest for knowledge, we physicists have charted nebulae, quantified quarks and built instruments and machines at the edge of technology. Yet there is a frontier that remains less explored: leadership. As a field, particle physics has only just begun to navigate the complexities of guiding our brightest minds.

Large-experiment collaborations such as those at the LHC achieve remarkable feats. Indeed, social scientists have praised our ability to coordinate thousands of researchers with limited “power” while retaining individual independence. Yet, just as we continuously optimise experiments for performance and quality, there also exist opportunities to refine behaviours and practices to facilitate progress and collective success.

A voice for all

Hierarchies in any organisation can inadvertently become a barrier rather than a facilitator of open idea exchange. Often, decision-making is confined to higher levels, reducing the agency of those implementing actions and leading to disconnects in roles and responsibilities. Excellence in physics doesn’t guarantee the interpersonal skills that are essential for inspiring teams. Moreover, imposter syndrome infects us all, especially junior collaborators who may lack soft-skills training. While striving for diversity we sometimes overlook the need to embrace different personality types, which, for example, can make large meetings daunting for the less outspoken. Good leadership can help navigate these challenges, ensuring that every voice contributes to our collective progress.

Leadership is not management (using resources to get a particular job done), nor is it rank (merely a line on a CV). It is the guidance and influence of others towards a shared vision – a pivotal force as essential as any tool in our research arsenal. Good leadership is a combination of strategic foresight, emotional intelligence and adaptive communication; it creates an inclusive environment where individual contributions are not commanded but empowered. These practices would improve any collaboration, yet in large physics experiments this type of leadership arises incidentally instead of being broadly acknowledged and pursued.

Luckily, leadership is a skill that can be taught and developed through training. Training is a craft, best delivered by experts who are not just versed in theory but are also skilled practitioners. Launched in autumn 2023 and based on the innovative training approach of Resilient Leaders Elements, the new course “Leading in Collaborations” is tailored specifically for our community. The three-month expert-facilitated course includes four half-day workshops and two one-hour clinics, addressing two main themes: “what I do”, which equips participants with decision-making skills to set clear goals and navigate the path to achieving them; and “who I am”, which encourages participants to channel their emotions positively and motivate both themselves and others effectively. The course confronts participants with the question “What is leadership in a large physics collaboration?” and provides a new framework of concepts. Through self-assessment, peer-feedback sessions, individualised challenges and buddy-coaching, participants are able to identify blind spots and hidden talents. A final assessment shows measurable change in each skill.

The first cohort of 20 participants, with a diverse mix of physics experience, institutions and nationalities, was welcomed to the programme at University College London on 14 and 15 November 2023. More than half of the participants were women – in line with the programme’s aim to ensure that those often overshadowed are given the visibility and support to become more impactful leaders. The lead facilitator, Chris Russell, masterfully connected with the audience via his technical physics background and proceeded to build trust and impart knowledge in an open and supportive atmosphere. When leadership was first discussed, the initial examples cited were military and political figures; after some reframing, one participant described a conductor giving their orchestra space to play through an often-rehearsed tough section as an example of great leadership.

Crucial catalyst

Building on the experience of the first cohort, the aim is to offer the programme more broadly so that we can encourage common practice and change the culture of leadership in large collaborations. Given that the LHC hosts the largest collaborations in physics, the programme also hopes to find a home within CERN’s learning and development portfolio.

The Leading in Collaborations programme is a crucial catalyst in the endeavour to ensure that our precious resources are wielded with precision and purpose, and thus to amplify our collective capacity for discovery. Join the leadership revolution by being the leader you wish you had, no matter your rank. Together, we will become the cultural vanguard!

First TIPP in Africa a roaring success

The Technology and Instrumentation in Particle Physics (TIPP) conference is the largest of its kind. The sixth edition, which took place in Cape Town from 4 to 8 September 2023 and attracted 250 participants, was the first held in Africa. More than 200 presentations covered state-of-the-art developments in detectors and instrumentation in particle physics, astroparticle physics and closely related fields.

“As South Africa, we regard this opportunity as a great privilege for us to host this year’s edition of the TIPP conference,” said minister of higher education, science and innovation Blade Nzimande during an opening address. He was followed by speeches from Angus Paterson, deputy CEO of the National Research Foundation, and Makondelele Victor Tshivhase, director of the national research facility iThemba LABS.

The South African CERN (SA–CERN) programme within the National Research Foundation and iThemba LABS supports more than 120 physicists, engineers and students who contribute to the ALICE, ATLAS and ISOLDE experiments, and to theoretical particle physics. The SA–CERN programme identifies technology transfer in particle physics as key to South African society. This aligns with the technology innovation platform of iThemba LABS, creating a base for innovation, incubation, industry collaboration and growth. For the first time, TIPP 2023 included a dedicated parallel session on technology transfer, which was chaired by Massimo Caccia (University of Insubria), Paolo Giacomelli (INFN Bologna) and Christophe De La Taille (CNRS/IN2P3).

The scientific programme kicked off with a plenary presentation on the implementation of the ECFA detector R&D roadmap in Europe by Thomas Bergauer (HEPHY). Other plenary presentations included overviews of bolometers for neutrinos, the Square Kilometre Array (SKA), technological advances by the LHC experiments, NaI experiments, advances in instrumentation at iThemba LABS, micro-pattern gaseous detectors, inorganic and liquid scintillator detectors, noble-liquid experiments, axion detection, water Cherenkov detectors for neutrinos, superconducting technology for future colliders and detectors, and the PAUL facility in South Africa.

A panel discussion involving former CERN Director-General Rolf Heuer (DESY), Michel Spiro (IRFU), Manfred Krammer (CERN), Imraan Patel (deputy director-general of the Department of Science and Innovation), Angus Paterson and Rob Adam (SKA) triggered an exchange of insights about international research infrastructures, such as CERN and SESAME, for particle physics and science diplomacy.

Prior to TIPP 2023, 25 graduate students from Botswana, Cameroon, Ghana, South Africa and Zambia participated in a school of instrumentation in particle, nuclear and medical physics held at iThemba LABS. The school comprised lectures, hands-on demonstrations and insightful presentations by researchers from CERN, DESY and IJCLab, providing a global perspective on instrumentation.

A bright future for the Higgs sector

The 13th Higgs Hunting workshop, organised in Orsay and Paris from 11 to 13 September 2023, was a timely opportunity to gather theorists and experimentalists interested in recent results related to the Higgs sector. While the large 140 fb⁻¹ dataset collected by the ATLAS and CMS experiments during LHC Run 2 is still being exploited to measure the Higgs-boson properties in more detail, the first results based on Run 3 data collected since 2022 were also shown, along with searches for phenomena beyond the Standard Model.

Experimental highlights focused on the latest results from CMS and ATLAS. CMS presented a new measurement of the associated production of a Higgs boson with top quarks decaying into b quarks, while ATLAS showed a new measurement of the associated production of a vector boson and a boosted Higgs boson in fully hadronic final states. A major highlight was a new CMS measurement of the Higgs-boson mass in the four-lepton decay channel, reaching the highest precision to date in a single decay channel as well as placing indirect constraints on the Higgs-boson width. Precision measurements were also shown in the framework of effective field theory, which allows potential subtle deviations with respect to the Standard Model to be probed. A small number of intriguing excesses, observed for instance in searches for partners of the Higgs boson decaying into W-boson or photon pairs, were also extensively discussed.

Following a historical talk by Steve Myers, CERN’s director of accelerators and technology when the LHC started up, on the “long and winding road” that led particle physicists from LEP to the discovery of the Higgs boson, a dedicated session discussed Higgs-physics prospects at colliders beyond the High-Luminosity LHC (HL-LHC). Patrizia Azzi (INFN Padova) presented the experimental prospects at the proposed Future Circular Collider, and Daniel Schulte (CERN) described the status of muon colliders, highlighting the strong interest within the community and prompting a lively discussion.

The latest theory developments related to Higgs physics were discussed in detail, starting with state-of-the-art predictions for the various Higgs-boson production modes by Aude Gehrmann-De Ridder (ETH Zurich). Andrea Wulzer (CERN) overviewed the theory prospects relevant for future collider projects, while Raffaele Tito D’Agnolo (IPhT, Saclay) presented the connections between the properties of the Higgs boson and cosmology and Arttu Rajantie (Imperial College) focused on implications of the Higgs vacuum metastability on new physics. Finally, a “vision” talk by Matthew McCullough (CERN) questioned our common assumption that the Higgs boson discovered at the LHC is really compatible with Standard Model expectations, considering the current precision of the measurements of its properties.

During several experimental sessions, recent results covering a wide range of topics were presented – in particular those related to vector-boson scattering, whose high-energy behaviour is driven by the properties of the Higgs boson. The Higgs-boson self-coupling was another topic of interest. The best precision on this measurement is currently achieved by combining indirect constraints from processes involving a single Higgs boson with direct searches for the rare production of a Higgs-boson pair. While the Run 3 dataset will provide an opportunity to further improve the sensitivity to the latter, its observation is expected towards the end of HL-LHC operations. Finally, Stéphanie Roccia (LPSC) presented the implications of experimental measurements of the neutron electric dipole moment for the CP-violating couplings of the Higgs boson to fermions, which are absent in the Standard Model. Concluding talks were given by Massimiliano Grazzini (University of Zurich) and Andrea Rizzi (University and INFN Pisa). The next Higgs Hunting workshop will be held in Orsay and Paris from 23 to 25 September 2024.
