
A celebration of physics in the Balkans

The 11th General Conference of the Balkan Physical Union (BPU11 Congress) took place from 28 August to 1 September 2022 in Belgrade, with the Serbian Academy of Sciences and Arts as the main host. Initiated in 1991 in Thessaloniki, Greece, and open to participants globally, the series provides a platform for reviewing, disseminating and discussing novel research results in physics and related fields.

The scientific scope of BPU11 covered the full landscape of physics via 139 lectures (12 plenary and 23 invited) and 150 poster presentations. A novel addition was five roundtables dedicated to high-energy physics (HEP), widening participation, careers in physics, quantum and new technologies, and models of studying physics in European universities with a focus on Balkan countries. The hybrid event attracted 476 participants (325 on site) from 31 countries, 159 of whom were students, and demonstrated the high level of research conducted in the Balkan states.

Roadmaps to the future

The first roundtable “HEP – roadmaps to the future” showed the strong collaboration between CERN and the Balkan states. Four of CERN’s 23 Member States are in the region (Bulgaria, Greece, Romania and Serbia); two of the three Associate Member States in the pre-stage to membership are Cyprus and Slovenia; and two of the seven Associate Member States are Croatia and Turkey. A further four countries have cooperation agreements with CERN, and more than 400 CERN users come from the Balkans.

Kicking off the HEP roundtable discussions, CERN director for research and computing Joachim Mnich presented the recently launched accelerator and detector R&D roadmaps in Europe. Paris Sphicas (CERN and the University of Athens) reported on the future of particle-physics research, underlining current challenges and opportunities. These included: dark matter (for example the search for WIMPs in the thermal parameter region, the need to check simplified models such as axial-vector and di-lepton resonances, and indirect searches); supersymmetry (the search for “holes” in the low-mass region that will remain even after the LHC); neutrinos (whether neutrinos are Majorana or Dirac particles, their mass measurement and the exploration of a possible “sterile” sector); as well as a comprehensive review of the Higgs sector.

CERN’s Emmanuel Tsesmelis, who was awarded the Balkan Physical Union charter and honorary membership in recognition of his contributions to cooperation between the Balkan states and CERN, reflected on the proposed Future Circular Collider (FCC). Describing the status of the FCC feasibility study, due to be completed by the end of 2025, he stressed that the success of the project relies on strong global participation. His presentation initiated a substantial discussion about the role of the Balkan countries, which will be continued in May 2023 at the 11th LHCP conference in Belgrade.

The roundtable devoted to quantum technologies (QTs), chaired by Enrique Sanchez of the European Physical Society (EPS), was another highlight with strong relevance to HEP. Various perspectives on the different QT sectors – computing and simulation, communication, metrology and sensing – were discussed, touching upon the impact they could have on society at large. Europe plays a leading role in quantum research, concluded the panel. However, despite increased interest in QTs, including at CERN, issues remain, such as how to obtain appropriate funding to enhance European technological leadership. Discussions highlighted the opportunities for new generations of physicists from the Balkans to help build this “second quantum revolution”.

In addition to the roundtables, four high-level scientific satellite events took place, attracting a further 150 on-site participants: the COST Workshop on Theoretical Aspects of Quantum Gravity; the SEENET–MTP Assessment Meeting and Workshop; the COST School on Quantum Gravity Phenomenology in the Multi-Messenger Approach; and the CERN–SEENET–MTP–ICTP PhD School on Gravitation, Cosmology and Astroparticle Physics. The latter is part of a unique regional programme in HEP initiated by SEENET–MTP (Southeastern European Network in Mathematical and Theoretical Physics) and CERN in 2015, and joined by the ICTP in 2018, which has contributed to the training of more than 200 students in 12 SEENET countries. 

The BPU11 Congress, the largest event of its type in the region since the beginning of the COVID-19 pandemic, contributed to closer cooperation between the Balkan countries and CERN, ICTP, SISSA, the Central European Initiative and others. It was possible thanks to the support of the EPS, ICTP and CEI-Trieste, CERN, EPJ, as well as the Serbian ministry of science and institutions active in physics and mathematics in Serbia. In addition to the BPU11 PoS Proceedings, several articles based on invited lectures will be published in a focus issue of EPJ Plus “On Physics in the Balkans: Perspectives and Challenges”, as well as in a special issue of IJMPA.

Unconventional music @ CERN

Honouring the 100th anniversary of Einstein’s Nobel prize, the Swedish embassy in Bern collaborated with CERN for an event connecting science and music, held at the CERN Globe of Science and Innovation on 19 October. The event was originally planned for 2021 but was postponed due to the pandemic.

Brian Foster (University of Oxford) talked about Einstein’s love of music and of playing the violin, illustrated with many photos showing Einstein with some of the well-known violinists of the time. Around the time Einstein was awarded the Nobel prize, the Russian engineer Lev Termen invented the theremin, an instrument consisting of two antennae and played without physical contact. This caught Einstein’s attention, and it is said that he even played the theremin himself once.

Delving further into the unconventional, LHC physicists performed Domenico Vicinanza’s (GEANT and Anglia Ruskin University) “Sonification of the LHC”, for which the physicist-turned-composer mapped data recorded by the LHC experiments between 2010 and 2013 into music. First performed in 2014 on the occasion of CERN’s 60th anniversary, Vicinanza’s piece is intended as a metaphor for scientific cooperation, in which different voices and perspectives can reach the same goal only by playing together.
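
Vicinanza’s actual mapping is not described here, but the underlying technique – sonification – essentially rescales a numerical data series onto musical pitches. A minimal sketch, with an invented scale and made-up data, might look as follows:

```python
# Minimal data-sonification sketch (illustrative only, not Vicinanza's actual mapping):
# rescale a numerical series onto the notes of a pentatonic scale, as MIDI pitches.

def sonify(values, scale=(60, 62, 64, 67, 69)):  # C-major pentatonic, middle C = 60
    """Map each value in 'values' onto a note of 'scale' by linear rescaling."""
    lo, hi = min(values), max(values)
    span = hi - lo or 1.0                      # avoid division by zero for flat data
    notes = []
    for v in values:
        idx = int((v - lo) / span * (len(scale) - 1) + 0.5)  # nearest scale degree
        notes.append(scale[idx])
    return notes

# Example: a made-up "measurement" series becomes a short melody.
print(sonify([1.2, 3.4, 2.8, 5.0, 4.1, 0.7]))  # -> [60, 67, 64, 69, 67, 60]
```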

There followed the debut of an even more unconventional piece of music by The Stone Martens – a Swiss and Swedish “noise collaboration” improvised by Henrik Rylander and Roland Bucher. By sending the output of his theremin through guitar-effects pedals, Rylander created a unique sound. Together with Bucher’s self-made “noise table”, with which he sampled acoustic instruments and everyday objects, the duo created a captivating, otherworldly sound collage that was well received by the 160-strong audience. The event closed with an unconventional Bach concerto for two violins in which these unique sounds were fused with traditional instruments. Anyone interested in experiencing the music for themselves can find a recorded version at https://indico.cern.ch/event/1199556/.

A powerful eye opener into the world of AI

The appearance of the word “for” rather than “in” in the title of this collection raises the bar from an academic description to a primer. It is neither the book’s length (more than 800 pages), nor the fact that the author list resembles a who’s who in artificial intelligence (AI) research carried out in high-energy physics that makes this book live up to its premise; it is the careful crafting of its content and structure.

Artificial intelligence is not new to our field. On the contrary, some of the concepts and algorithms have been pioneered in high-energy physics. Artificial Intelligence for High Energy Physics credits this as well as reaching into very recent AI research. It covers topics ranging from unsupervised machine-learning techniques in clustering to workhorse tools such as boosted decision trees in analyses, and from recent applications of AI in event reconstruction to simulations at the boundary where AI can help us to understand physics.
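
As a flavour of the “workhorse” techniques the book surveys, a boosted decision tree for a toy two-class separation problem can be set up in a few lines with scikit-learn (a generic illustration, not an example taken from the book):

```python
# Toy boosted-decision-tree classifier for a two-class separation problem
# (generic illustration; not taken from the book).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
# Two Gaussian populations ("signal" and "background") in three toy features.
signal = rng.normal(loc=[1.0, 0.5, 0.0], scale=1.0, size=(n, 3))
background = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(n, 3))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
bdt.fit(X_train, y_train)
print(f"test accuracy: {bdt.score(X_test, y_test):.3f}")
```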

Each chapter follows a similar structure: after setting the broader context, a short theoretical introduction to the tools (and, where possible, the available software) is given, which is then applied and adapted to a high-energy-physics problem. The ratio of in-depth theoretical background to AI concepts and the focus on applications is well balanced, and underlines the work of the editors, who avoided duplication and cross-referenced individual chapters and topics. The editors and authors have not only created a selection of high-quality review articles, but a coherent and remarkably good read. The takeaway messages in the chapter on distributed training and optimisation stand out, and one might wish that this concept found more resonance throughout the book.

Artificial Intelligence for High Energy Physics

Sometimes, the book can be used as a glossary, which helps to bridge the gaps that seem to exist simply because high-energy physicists and data scientists use different names for similar or even identical things. While the book can certainly be used as a guide for a physicist in AI, an AI researcher with the necessary physics knowledge may not be served quite so well.

In an ideal world, each chapter would have a reference dataset to allow the reader to follow the stated problems and learn through building and exercising the described pipelines. This, however, would turn the book from a primer into a textbook for AI in high-energy physics. To be fair, wherever possible the authors of the chapters have used and referred to publicly available datasets, and one chapter is devoted to the issue of arranging a community data competition, such as the TrackML challenge in 2018.

As for the most important question – have I learned something new? – the answer is a resounding “yes”. While none of the broad topics and their application to high-energy physics will come as a surprise to those who have been following the field in recent years, there are neat projects and detailed applications showcased in this book. Furthermore, reading about a familiar topic in someone else’s words can be a powerful eye opener.

Innovation on show for future ep/eA colliders

Following the publication of an updated conceptual design report in 2021, CERN continues to support studies for the proposed electron–hadron colliders LHeC and FCC-eh as potential options for the future, and to provide input to the next update of the European strategy for particle physics, with emphasis on FCC. LHeC would require the LHC to be modified, while FCC-eh is a possible operational mode of the proposed Future Circular Collider at CERN. A key factor in studies for a possible future “ep/eA” collider is power consumption, for which researchers around the world are exploring the use of energy recovery linacs at the high-energy frontier.

The ep/eA programme finds itself at a crossroads between nuclear and particle physics, with synergies with astroparticle physics. It has the potential to empower the High-Luminosity LHC (HL-LHC) physics programme in a unique way, and allows for a deeper exploration of the electroweak and strong sectors of the Standard Model beyond what can be achieved with proton–proton collisions alone. In many cases, adding LHeC to HL-LHC data can significantly improve the precision of Higgs-boson measurements – similar to the improvements expected when moving from the LHC to the HL-LHC.

The innovative spirit of the ep/eA community was demonstrated during the workshop “Electrons for the LHC – LHeC/FCCeh and PERLE”, held at IJCLab from 26 to 28 October. As the ep/eA community moves from the former HERA facility and the Electron–Ion Collider, currently under construction at Brookhaven, to higher energies at LHeC and FCC-eh, the threshold will be crossed for studying electroweak, top and Higgs physics in deep-inelastic scattering (DIS) processes for the first time. In addition, these programmes enable the exploration of the low-Bjorken-x frontier orders of magnitude beyond current DIS results. At this stage, it is unclear what physics will be unlocked if hadronic matter is broken into even smaller pieces. In recent years, particle physicists have learned that the ultimate precision for Higgs-boson physics lies in the complementarity of e+e–, pp and ep collisions, as embedded in the FCC programme, for example. Exploiting this complementarity is key to exploring new territories via the Higgs and dark sectors, as well as zooming in on potential anomalies in our data.
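
For orientation, the low-x reach mentioned above follows from standard DIS kinematics (textbook relations, not results presented at the workshop):

```latex
% Standard DIS kinematics: Bjorken x and its relation to the collision energy.
% At fixed Q^2 and inelasticity y, the accessible x scales inversely with s,
% so raising the centre-of-mass energy opens up correspondingly smaller x.
\[
  x \;=\; \frac{Q^2}{2\,p\cdot q}, \qquad
  Q^2 \;\simeq\; x\,y\,s
  \;\;\Rightarrow\;\;
  x_{\min} \;\sim\; \frac{Q^2}{y_{\max}\,s},
\]
where $Q^2$ is the (negative) squared four-momentum transfer, $y$ the inelasticity
and $\sqrt{s}$ the electron--proton centre-of-mass energy.
```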

The October workshop underlined the advantage of a joint ep/pp/eA/AA/pA interaction experiment, and the need to further document its added scientific value. For example, a precision of 1 MeV on the W-boson mass could be within reach. In short, ep data allow constraints to be placed on the most important systematic uncertainty in the measurement of the W-boson mass with pp data.

Reduced power 

Participants also addressed how to reduce the power consumption of LHeC and FCC-eh. PERLE – a multi-turn demonstrator facility for energy-recovery linacs (ERLs) at high beam currents, pursued by an expanding international collaboration at IJCLab in Orsay – is ready to become Europe’s leading centre for developing and testing sustainable accelerating systems.

As demonstrated at the workshop, with additional R&D on ERLs and ep colliders it may be possible to reduce the power consumption of the LHeC (and FCC-eh) to as low as 50 MW. These values are to be compared with the GW-level power consumption if there were no energy recovery, and therefore provide a power-economic avenue to extend the Higgs precision frontier beyond the HL-LHC. ERLs are not uniquely applicable to eA colliders, but have also been discussed for future linear and circular e+e– colliders. With PERLE and other sustainable accelerating systems, the ep/eA programme has the ambition to deliver a demonstration of ERL technology at high beam current, potentially towards options for an ERL-based Higgs factory.
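
The scale of these numbers can be checked with a back-of-the-envelope estimate. The parameters below (50 GeV electrons at 20 mA, 95% energy recovery) are illustrative assumptions for this sketch, not official LHeC or PERLE design values:

```python
# Back-of-envelope beam-power estimate for an ERL-based ep collider.
# The numbers below (50 GeV, 20 mA, 95% recovery) are illustrative assumptions,
# not official LHeC/PERLE design parameters.
beam_energy_gev = 50.0          # electron beam energy
beam_current_a = 0.020          # average beam current

beam_power_w = beam_energy_gev * 1e9 * beam_current_a   # P = E[eV] * I[A]
print(f"beam power without recovery: {beam_power_w/1e9:.1f} GW")   # ~1 GW

recovery_efficiency = 0.95      # fraction of the beam energy recovered on deceleration
grid_power_w = beam_power_w * (1.0 - recovery_efficiency)
print(f"RF power to replace with {recovery_efficiency:.0%} recovery: {grid_power_w/1e6:.0f} MW")  # ~50 MW
```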

Workshop participants are engaged in further developing an ep/eA programme that can significantly enrich this overall strategy, with a view to finding cracks in the Standard Model and/or new phenomena that further our understanding of nature at the smallest and largest scales.

Lost in the landscape

What is string theory?

I take a view that a lot of my colleagues will not be too happy with. String theory is a very precise mathematical structure, so precise that many mathematicians have won Fields medals by making contributions that were string-theory motivated. It’s supersymmetric. It exists in flat or anti-de Sitter space (that is, a space–time with a negative curvature in the absence of matter or energy). And although we may not understand it fully at present, there does appear to be an exact mathematical structure there. I call that string theory with a capital “S”, and I can tell you with 100% confidence that we don’t live in that world. And then there’s string theory with a small “s” – you might call it string-inspired theory, or think of it as expanding the boundaries of this very precise theory in ways that we don’t know how to at present. We don’t know with any precision how to expand the boundaries into non-supersymmetric string theory or de Sitter space, for example, so we make guesses. The string landscape is one such guess. It’s not based on absolutely precise capital-S string theory, but on some conjectures about what this expanded small-s string theory might be. I guess my prejudice is that some expanded version of string theory is probably the right theory to describe particle physics. But it’s an expanded version, it’s not supersymmetric. Everything we do in anti-de-Sitter-space string theory is based on the assumption of absolute perfect supersymmetry. Without that, the models we investigate are rather speculative. 

How has the lack of supersymmetric discoveries at the LHC impacted your thinking?

All of the string theories we know about with any precision are exactly supersymmetric. So if supersymmetry is broken at the weak scale or beyond, it doesn’t help because we’re still facing a world that is not exactly supersymmetric. This only gets worse as we find out that supersymmetry doesn’t seem to even govern the world at the weak scale. It doesn’t even seem to govern it at the TeV scale. But that, I think, is secondary. The first primary fact is that the world is not exactly supersymmetric and string theory with a capital S is. So where are we? Who knows! But it’s exciting to be in a situation where there is confusion. Anything that can be said about how string theory can be precisely expanded beyond the supersymmetric bounds would be very interesting. 

What led you to coin the string theory “landscape” in 2003? 

A variety of things, among them the work of other people, in particular Polchinski and Bousso, who conjectured that string theories have a huge number of solutions and possible behaviours. This was a consequence, later articulated in a 2003 paper abbreviated “KKLT” after its authors, of the innumerable (initial estimates put it at more than 10⁵⁰⁰) different ways the additional dimensions of string theory can be hidden or “compactified”. Each solution has different properties, coupling constants, particle spectra and so forth. And they describe different kinds of universes. This was something of a shock and a surprise; not that string theory has many solutions, but that the numbers of these possibilities could be so enormous, and that among those possibilities were worlds with parameters, in particular the cosmological constant, which formed a discretuum as opposed to a continuum. From one point of view that’s troubling because some of us, me less than others, had hoped there was some kind of uniqueness to the solutions of string theory. Maybe there was a small number of solutions and among them we would find the world that we live in, but instead we found this huge number of possibilities in which almost anything could be found. On the other hand, we knew that the parameters of our world are unusual, exceptional, fine-tuned – not generic, but very special. And if the string landscape could say that there would be solutions containing the peculiar numbers that we face in physics, that was interesting. Another motivation came from cosmology: we knew on the basis of cosmic-microwave-background experiments and other things that the portion of the universe we see is very flat, implying that it is only a small part of the total. Together with the peculiar fine-tunings of the numbers in physics, it all fitted a pattern: the spectrum of possibilities would not only be large, but the spectrum of things we could find in the much bigger universe that would be implied by inflation and the flatness of the universe might just include all of these various possibilities.

So that’s how anthropic reasoning entered the picture?

All this fits together well with the anthropic principle – the idea that the patterns of coupling constants and particle spectra were conditioned on our own existence. Weinberg was very influential in putting forward the idea that the anthropic principle might explain a lot of things. But at that time, and probably still now, many people hated the idea. It’s a speculation or conjecture that the world works this way. The one thing I learned over the course of my career is not to underestimate the potential for surprises. Surprises will happen, patterns that look like they fit together so nicely turn out to be just an illusion. This could happen here, but at the moment I would say the best explanation for the patterns we see in cosmology and particle physics is a very diverse landscape of possibilities and an extremely large universe – a multiverse, if you like – that somehow manifests all of these possibilities in different places. Is it possible that it’s wrong? Oh yes! We might just discover that this very logical, compelling set of arguments is not technically right and we have to go in some other direction. Witten, who had negative thoughts about the anthropic idea, eventually gave up and accepted that it seems to be the best possibility. And I think that’s probably true for a lot of other people. But it can’t have the ultimate influence that a real theory with quantitative predictions can have. At present it’s a set of ideas that fit together and are somewhat compelling, but unfortunately nobody really knows how to use this in a technical way to be able to precisely confirm it. That hasn’t changed in 20 years. In the meantime, theoretical physicists have gone off in the important direction of quantum gravity and holography. 

Possible string-theory solutions

What do you mean by holography in the string-theory context?

Holography predates the idea of the landscape. It was based on Bekenstein’s observation that the entropy of a black hole is proportional to the area of the horizon and not the volume of the black hole. It conjectures that the 3D world of ordinary experience is an image of reality coded on a distant 2D surface. A few years after the holographic principle was first conjectured, two precise versions of it were discovered: so-called M(atrix) theory in 1996 and Maldacena’s “AdS/CFT” correspondence in 1997. The latter has been especially informative. It holds that there is a holographic duality between anti-de Sitter space formulated in terms of string theory and quantum field theories similar to those that describe elementary particles. I don’t think string theory and holography are inconsistent with each other. String theory is a quantum theory that contains gravity, and all quantum mechanical gravity theories have to be holographic. String theory and holographic theory could well be the same thing.
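
For reference, Bekenstein’s area law mentioned here is usually quoted as the Bekenstein–Hawking entropy (the standard textbook expression):

```latex
% Bekenstein--Hawking entropy: proportional to the horizon area, not the volume.
\[
  S_{\mathrm{BH}} \;=\; \frac{k_{\mathrm B}\,c^{3}}{4\,G\hbar}\,A
  \;=\; \frac{k_{\mathrm B}\,A}{4\,\ell_{\mathrm P}^{2}},
  \qquad \ell_{\mathrm P}^{2} = \frac{G\hbar}{c^{3}},
\]
where $A$ is the area of the event horizon and $\ell_{\mathrm P}$ the Planck length.
```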

One of the things that troubles me about the standard model of cosmology, with inflation and a positive cosmological constant, is that the world, or at least the portion of it that we see, is de Sitter space. We do not have a good quantum understanding of de Sitter space. If we ultimately learn that de Sitter space is impossible, that would be very interesting. We are in a situation now that is similar to 20 years ago, where very little progress has been made in the quantum foundations of cosmology and in particular in the so-called measurement problem, where we don’t know how to use these ideas quantitatively to make predictions. 

What does the measurement problem have to do with it? 

The usual methodology of physics, in particular quantum mechanics, is to imagine systems that are outside the systems we are studying. We call these systems observers, apparatuses or measuring devices, and we sort of divide the world into those measuring devices and the things we’re interested in. But it’s quite clear that in the world of cosmology/de Sitter space/eternal inflation we’re all part of the same thing. And I think that’s partly why we are having trouble understanding the quantum mechanics of these things. In AdS/CFT, it’s perfectly logical to think about observers outside the system or observers on the boundary. But in de Sitter space there is no boundary; there’s only everything that’s inside the de Sitter space. And we don’t really understand the foundations or the methodology of how to think about a quantum world from the inside. What we’re really lacking is the kind of precise examples we have in the context of anti-de Sitter space, which we can analyse. This is something I’ve been looking for, as have many others including Witten, without much success. So that’s the downside: we don’t know very much.

What about the upsides? 

The upside is that almost anything we learn will be a large fraction of what we know. So there’s potential for great developments by simply understanding a few things about the quantum mechanics of de Sitter space. When I talk about this to some of my young friends, they say that de Sitter space is too hard. They are afraid of it. People have been burned over the years by trying to understand inflation, eternal inflation, de Sitter space, etc, so it’s much safer to work on anti-de Sitter space. My answer to that is: yes, you’re right, but it’s also true that a huge amount is known about anti-de Sitter space and it’s hard to find new things that haven’t been said before, whereas in de Sitter space the opposite is true. We will see, or at least the young people will see. I am getting to the point where it is hard to absorb new ideas.

To what extent can the “swampland” programme constrain the landscape?

The swampland is a good idea. It’s the idea that you can write down all sorts of naive semi-classical theories with practically infinite options, but that the consistency with quantum mechanics constrains the things that are possible, and those that violate the constraints are called the swampland. For example, the idea that there can’t be exact global symmetries in a quantum theory of gravity, so any theory you write down that has gravity and has a global symmetry in it, without having a corresponding gauge symmetry, will be in the swampland. The weak-gravity conjecture, which enables you to say something about the relative strengths of gauge forces and gravity acting on certain particles, is another good idea. It’s good to try to separate those things you can write down from a semi-classical point of view and those that are constrained by whatever the principles of quantum gravity are. The detailed example of the cosmological constant I am much less impressed by. The argument seems to be: let’s put a constraint on parameters in cosmology so that we can put de Sitter space in the swampland. But the world looks very much like de Sitter space, so I don’t understand the argument and I suspect people are wrong here.

What have been the most important and/or surprising physics results in your career?

I had one big negative surprise, as did much of the community. This was a while ago when the idea of “technicolour” – a dynamical way to break electroweak symmetry via new gauge interactions – turned out to be wrong. Everybody I knew was absolutely convinced that technicolour was right, and it wasn’t. I was surprised and shocked. As for positive surprises, I think it’s the whole collection of ideas called “it from qubit”. This has shown us that quantum mechanics and gravity are much more closely entangled with each other than we ever thought, and that the apparent difficulty in unifying them was because they were already unified; so to separate and then try to put them back together using the quantisation technique was wrong. Quantum mechanics and gravity are so closely related that in some sense they’re almost the same thing. I think that’s the message from the past 20 – and in particular the past 10 – years of it-from-qubit physics, which has largely been dominated by people like Maldacena and a whole group of younger physicists. This intimate connection between entanglement and spatial structure – the whole holographic and “ER equals EPR” ideas – is very bold. It has given people the ability to understand Hawking radiation, among other things, which I find extremely exciting. But as I said, and this is not always stated, in order to have real confidence in the results, it all ultimately rests on the assumption of theories that have exact supersymmetry.

What are the near-term prospects to empirically test these ideas?

One extremely interesting idea is “quantum gravity in the lab” – the idea that it is possible to construct systems, for example a large sphere of material engineered to support surface excitations that look like conformal field theory, and then to see if that system describes a bulk world with gravity. There are already signs that this is true. For example, the recent claim, involving Google, that two entangled quantum computers have been used to send information through the analogue of a wormhole shows how the methods of gravity can influence the way quantum communication is viewed. It’s a sign that quantum mechanics and gravity are not so different.

Do you have a view about which collider should follow the LHC? 

You know, I haven’t done real particle physics for a long time. Colliders fall into two categories: high-precision e+e– colliders and high-energy proton–proton ones. So the question is: do we need a precision Higgs factory at the TeV scale or do we want to search for new phenomena at higher energies? My prejudice is the latter. I’ve always been a “slam ‘em together and see what comes out” sort of physicist. Analysing high-precision data is always more clouded. But I sure wouldn’t like anyone to take my advice on this too seriously.

τ-lepton polarisation measured in Z-boson decays

CMS figure 1

Precision electroweak measurements are a powerful way to probe new physics through the indirect effects predicted by quantum field theory. The effective electroweak mixing angle θ_W^eff is particularly sensitive to new phenomena related to electroweak symmetry breaking and the Brout–Englert–Higgs mechanism. It was measured at LEP in several processes, and at the LHC, thanks to the large number of collected events with Z-boson decays, the experiments can now probe these effects with comparable sensitivity.

The CMS collaboration has reported a new measurement of the tau-lepton polarisation in the decay of Z bosons to a pair of tau leptons in proton–proton collisions at 13 TeV. The polarisation is defined as the asymmetry between the cross sections for the production of τ with positive and negative helicities, and is directly related to the electroweak mixing angle via the relation Pτ ≈ –2(1 – 4 sin²θ_W^eff). The polarisation of the tau lepton is determined from the angular distributions of the visible tau decay products, leptonic or hadronic, with respect to the τ flight direction or relative to each other. A so-called optimal polarisation observable is constructed using all of these angular properties of the tau decay products. Since the spin states in Z → τ+τ– are almost 100% anti-correlated, the sensitivity is improved by combining the spin observables of both τ leptons of the pair.

CMS figure 2

The average polarisation Pτ is obtained from a template fit to the observed optimal τ-polarisation observables, using tau-lepton pairs with an invariant mass in the range 75–120 GeV. As summarised in figure 1, the best sensitivity to Pτ is found in the channel where one tau decays to a muon and the other decays hadronically, thanks to the good selection efficiency and reconstruction of the spin observable in this channel. The fully hadronic final state suffers from higher trigger thresholds, which lead to fewer events and distortions of the templates.

The average τ polarisation is corrected to its value at the Z pole, Pτ(Z0) = –0.144 ± 0.006 (stat.) ± 0.014 (syst.), where the systematic uncertainty is dominated by the misidentification of the products of hadronically decaying tau leptons. The effective weak mixing angle is then determined as sin²θ_W^eff = 0.2319 ± 0.0019, in agreement with the Standard Model (SM) prediction. Figure 2 compares the tau-lepton asymmetry parameter (the negative of the polarisation) with results from previous experiments, demonstrating that the CMS measurement is nearly as precise as those of individual LEP experiments.
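
Inverting the relation Pτ ≈ –2(1 – 4 sin²θ_W^eff) quoted above gives a quick consistency check of these numbers (a rough sketch that combines the quoted uncertainties in quadrature and neglects correlations and higher-order corrections):

```python
# Quick consistency check of sin^2(theta_W^eff) from the measured tau polarisation,
# using P_tau ~ -2 * (1 - 4 sin^2(theta_W^eff)); correlations and higher-order
# electroweak corrections are neglected in this sketch.
import math

p_tau = -0.144
stat, syst = 0.006, 0.014
sigma_p = math.hypot(stat, syst)          # combine uncertainties in quadrature

sin2_theta = (1.0 + p_tau / 2.0) / 4.0    # invert P_tau = -2(1 - 4 sin^2 theta)
sigma_sin2 = sigma_p / 8.0                # |d sin^2(theta) / dP_tau| = 1/8

print(f"sin^2(theta_W^eff) = {sin2_theta:.4f} +/- {sigma_sin2:.4f}")
# -> 0.2320 +/- 0.0019, consistent with the quoted 0.2319 +/- 0.0019
```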

This measurement shows that LHC collision events, although much more complex than those collected at LEP, can provide precise determinations of the polarisation of the τ lepton, as well as of spin correlations between τ-lepton pairs. Such measurements are crucial to probe the CP properties of the Higgs boson’s Yukawa coupling to τ leptons, an important step on the path to understanding the Higgs sector of the SM.

Meenakshi Narain 1964–2023

Meenakshi Narain

Experimental particle physicist Meenakshi Narain, an inspirational leader and champion of diversity, died unexpectedly on 1 January 2023 in Providence, RI. Considered by many as a “force of nature”, Meenakshi’s impact on the physics community has left an indelible mark.

Meenakshi grew up in Gorakhpur, India, and emigrated to the US in 1984 for graduate school at SUNY Stony Brook. Her PhD thesis, based on data taken by the CUSB-II detector at CESR, used inclusive photon spectra from upsilon decays for both spectroscopy measurements and searches for exotic particles, including the Higgs boson. In 1991 Meenakshi joined Fermilab as a postdoc on the DØ experiment, where she was a principal player in the 1995 discovery of the top quark, leading a group searching for top–antitop pair production in the dilepton channel. Over the next decade, as a Fermilab Wilson Fellow and a faculty member at Boston University, she made seminal contributions to measurements of top-quark pair and single-top production, as well as to the top-quark mass, width and couplings.

In 2007, upon joining the faculty at Brown University, Meenakshi joined the CMS experiment at the LHC. In addition to pioneering a number of exotic searches for high-mass resonances, new heavy gauge bosons and top-quark partners, she continued to make innovative contributions to precision top-quark measurements. Her foundational work on b- and c-quark identification also paved the way for Higgs boson searches and measurements. As a leader of the CMS upgrade studies group, Meenakshi coordinated physics studies for several CMS technical design reports for the High-Luminosity LHC upgrade, and an impressive number of results for the CERN yellow reports. She was also a key contributor to the US CMS outer tracker upgrade. 

The tutorials and workshops Meenakshi organised as co-coordinator of the LHC Physics Center (LPC) were pivotal in advancing the careers of many young scientists, whom she cared about deeply. As chair of the US CMS collaboration board, she was a passionate advocate for the LHC research programme. She created an inclusive, supportive community that participated in movements such as Black Lives Matter, and tackled numerous challenges imposed by the COVID-19 pandemic.

A strong voice for women and under-represented minorities in physics, Meenakshi was the founding co-chair of the CMS diversity office and the driving force behind the CMS task force on diversity and inclusion and the CMS women’s forum. She mentored a large group of students, post-docs and scientists from diverse backgrounds, and created PURSUE – an internship programme that provides summer research opportunities at CMS to students from minority-serving institutions.

Meenakshi’s illustrious career was recognised with numerous accolades and positions of responsibility, including her recent co-leadership of the Snowmass energy-frontier study, her service on HEPAP, her appointment to the P5 subpanel and her position as the first woman to chair the physics department at Brown. She will be remembered as a brilliant scientist, a beloved mentor and an inspiring leader who made the world a better, more equitable and inclusive place.

Lars Brink 1943–2022

Lars Brink

It is with great sadness that we learnt of the passing of Lars Brink on 29 October 2022 at the age of 78. Lars was an emeritus professor at Chalmers University of Technology in Göteborg, Sweden, and a member of the Royal Swedish Academy of Sciences. He started his career as a fellow in the CERN theory group (1971–1973), followed by a stay at Caltech as a scientific associate (1976–1977). In subsequent years he was a frequent visitor at CERN, Caltech and ITP Santa Barbara, before becoming a full professor of theoretical physics at Chalmers in 1986, which under his guidance became an internationally leading centre for string theory and supersymmetric field theories.

Lars held numerous other appointments, in particular as a member and chairperson of the boards of NORDITA and the International Center for Fundamental Physics in Moscow, and later as chairperson of the advisory board of the Solvay Foundation in Brussels. From 2004 he was an external scientific member of the Max Planck Institute for Gravitational Physics in Golm. During his numerous travels Lars was welcomed by many leading institutions all over the world. He also engaged in many types of community service, such as coordinating the European Union network “Superstring Theory” from 2000. Most importantly, he served on the Nobel Committee for Physics for many years, and as its chairperson for the 2013 Nobel Prize in Physics awarded to François Englert and Peter Higgs.

Lars was a world-class theoretical physicist, with many pioneering contributions, especially to the development of supergravity and superstring theory, as well as many other topics. One of his earliest contributions was a beautiful derivation of the critical dimension of the bosonic string (with Holger Bech Nielsen), obtained by evaluating the formally divergent sum over zero-point energies of the infinitely many string oscillators; this derivation is now considered a standard textbook result. In 1976, with Paolo Di Vecchia and Paul Howe, he presented the first construction of the locally supersymmetric world-sheet Lagrangian for superstrings (also derived by Stanley Deser and Bruno Zumino), which now serves as the basis for the quantisation of the superstring and higher loop calculations in the Polyakov approach. His seminal 1977 work with Joel Scherk and John Schwarz on the construction of maximal (N = 4) supersymmetric Yang–Mills theory in four dimensions laid the very foundation for key developments of modern string theory and the AdS/CFT correspondence that came to dominate string-theory research only much later. Independently of Stanley Mandelstam, he proved the UV finiteness of the N = 4 theory in the light-cone gauge in 1983, together with Olof Lindgren and Bengt Nilsson – another groundbreaking result. Equally influential is his work with Michael Green and John Schwarz on deriving supergravity theories as limits of string amplitudes. More recently, he devoted much effort to a reformulation of N = 8 supergravity in light-cone superspace (with Sudarshan Ananth and Pierre Ramond). His last project before his death was a reevaluation and pedagogical presentation of Yoichiro Nambu’s seminal early papers (with Ramond).
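
The derivation with Nielsen mentioned above rests on zeta-function regularisation of the divergent zero-point sum; schematically (the textbook form of the argument, not Brink and Nielsen’s original presentation):

```latex
% Zero-point energy of the D-2 transverse oscillator towers of the bosonic string,
% regularised with the zeta function; requiring a massless vector at the first
% excited level fixes the critical dimension.
\[
  E_0 \;=\; \frac{D-2}{2}\sum_{n=1}^{\infty} n
  \;\xrightarrow{\ \zeta\text{-regularisation}\ }\;
  \frac{D-2}{2}\,\zeta(-1) \;=\; -\,\frac{D-2}{24},
\]
so the first excited string level is massless only if $(D-2)/24 = 1$,
i.e. the critical dimension is $D = 26$.
```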

Lars received numerous honours during his long career. In spite of these achievements he remained a kind, modest and most approachable person. Among our many fondly remembered encounters we especially recall his visit to Potsdam in August 2013, when he revived an old tradition by inviting the Nobel Committee to a special retreat for its final deliberations. The concluding discussions of the committee thus took place in Einstein’s summer house in Caputh. Of course, we were all curious for any hints from the predictably tight-lipped Swedes in advance of the official Nobel announcement, but in the end the only useful information we got out of Lars was that the committee had crossed the street for lunch to eat mushroom soup in a local restaurant!

He leaves behind his wife Åsa, and their daughters Jenny and Maria with their families, to whom we express our sincere condolences. We will remember Lars Brink as a paragon of scientific humility and honesty, and we miss a great friend and human being.

Neutrino scattering sizes up the proton

More than a century after its discovery, physicists are still working hard to understand how fundamental properties of the proton – such as its mass and spin – arise from its underlying structure. A particular puzzle concerns the proton’s size, which is an important input for understanding nuclei, for example. Elastic electron–proton scattering experiments in the late 1950s revealed the spatial distribution of charge inside the proton, allowing its radius to be deduced. A complementary way to determine this “charge radius”, which relies on precise quantum-electrodynamics calculations, is to measure the shift it produces in the lowest energy levels of the hydrogen atom. Over the decades, numerous experiments have measured the proton’s size with increasing precision.

By 2006, based on results from scattering and spectroscopic measurements, the Committee on Data for Science and Technology (CODATA) had established the proton charge radius to be 0.8760(78) fm. Then, in 2010, came a surprise: the CREMA collaboration at the Paul Scherrer Institut (PSI) reported a value of 0.8418(7) fm based on a novel, high-precision spectroscopic measurement of muonic hydrogen. Disagreeing with previous spectroscopic measurements, and lying more than 5σ below the CODATA world average, the result gave rise to the “proton radius puzzle”. While the most recent electron–proton scattering and hydrogen-spectroscopy measurements are in closer agreement with the latest muonic-hydrogen results, the discrepancies with earlier experiments are not yet fully understood.

Now, the MINERνA collaboration has brought a new tool to gauge the proton’s size: neutrino scattering. Whereas traditional scattering measurements probe the proton’s electric or magnetic charge distributions, which are encoded in vector form factors, scattering by neutrinos allows the analogous axial-vector form factor FA, which characterises the proton’s weak charge distribution, to be measured. In addition to providing a complementary probe of proton structure, FA is key to precise measurements of neutrino-oscillation parameters at experiments such as DUNE, Hyper-K, NOvA and T2K.
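
The axial radius quoted below is conventionally defined from the slope of FA at zero momentum transfer; in the widely used dipole parametrisation it is controlled by a single axial-mass parameter MA (standard definitions, not specific to the MINERνA analysis):

```latex
% Axial charge radius from the slope of F_A at Q^2 = 0; in the dipole ansatz
% F_A(Q^2) = F_A(0)/(1 + Q^2/M_A^2)^2 the radius is set by the axial mass M_A.
\[
  \langle r_A^2\rangle \;=\; -\,\frac{6}{F_A(0)}
  \left.\frac{\mathrm{d}F_A}{\mathrm{d}Q^2}\right|_{Q^2=0},
  \qquad
  \langle r_A^2\rangle_{\text{dipole}} \;=\; \frac{12}{M_A^2}.
\]
```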

MINERνA is a segmented scintillator detector built from strips of triangular cross-section assembled into hexagonal planes perpendicular to the incoming beam. By studying how a beam of muon antineutrinos produced by Fermilab’s NuMI beamline interacts with a polystyrene target, in which hydrogen is chemically bonded to carbon, the MINERνA researchers were able to make the first high-statistics measurement of the ν̄μ p → μ+ n cross-section on the hydrogen in polystyrene. Extracting FA from 5580 ± 180 signal events (observed over an estimated background of 12,500), they measured the nucleon axial charge radius to be 0.73(17) fm, in agreement with the electric charge radius measured with electron scattering.

“If we weren’t optimists, we’d say [this measurement] was impossible,” says lead author Tejin Cai, who proposed the idea of using a polystyrene target to access neutrino-hydrogen scattering while a PhD student at the University of Rochester. “The hydrogen and carbon are chemically bonded, so the detector sees interactions on both at once. But then, I realised that the very nuclear effects that made scattering on carbon complicated also allowed us to select hydrogen and would allow us to subtract off the carbon interactions.”
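
The subtraction Cai describes can be illustrated with a toy counting exercise based on the numbers quoted above (purely illustrative; the real analysis relies on kinematic selections and a full treatment of systematic uncertainties):

```python
# Toy illustration of extracting a hydrogen signal over a carbon-dominated
# background, using the event counts quoted in the text. The real MINERvA
# analysis relies on kinematic selections and a full systematic treatment;
# this sketch only shows naive counting statistics.
import math

n_observed = 5580 + 12500     # hydrogen signal + estimated carbon background
n_background = 12500          # estimated carbon-induced events to subtract

n_signal = n_observed - n_background
# Naive statistical uncertainty, assuming the background estimate itself
# carries a Poisson-like uncertainty of sqrt(B).
sigma_stat = math.sqrt(n_observed + n_background)

print(f"signal = {n_signal} +/- {sigma_stat:.0f} (stat. only)")
# -> 5580 +/- ~175, comparable to the quoted +/- 180 total uncertainty
```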

A new experiment called AMBER, at the M2 beamline of CERN’s Super Proton Synchrotron, is about to open another perspective on the proton charge radius. AMBER is the successor to COMPASS, which played a major role in resolving the proton “spin crisis” (the finding, by the European Muon Collaboration in 1987, that quarks account for less than a third of the total proton spin) by studying the contribution to the proton spin from gluons. Instead of electrons, AMBER will use muon scattering at unprecedented energies (around 100 GeV) to access the small momentum transfers needed to measure the proton radius. A future experiment at PSI called MUSE, meanwhile, aims to determine the proton radius through simultaneous measurements of muon–proton and electron–proton scattering.

AMBER is scheduled to start with a pilot run in September 2023 and to operate for up to three years, with the goal of determining the proton radius, expected from previous experiments to lie in the range 0.84–0.88 fm, to an uncertainty of about 0.01 fm. “Some colleagues say that there is no proton-radius puzzle, only problematic measurements,” says AMBER spokesperson Jan Friedrich of TU Munich. “The discrepancy between theory and experiments, as well as between individual experiments, will have to shrink and align as much as possible. After all, there is only one true proton radius.”

TeV photons challenge standard explanations

GRB 221009A

Gamma-ray bursts (GRBs) are the result of the most violent explosions in the universe. They are named for their bright burst of high-energy emission, mostly in the keV to MeV region, which can last from milliseconds to hundreds of seconds, and are followed by an afterglow that covers the full electromagnetic spectrum. The extreme nature of these extragalactic events and their important role in the universe – for example in the production of heavy elements, potential cosmic-ray acceleration or even mass-extinction events on Earth-like planets – make them one of the most studied astrophysical phenomena.

Since their discovery in 1967, detailed studies of thousands of GRBs show that they are the result of cataclysmic events, such as neutron-star binary mergers. The observed gamma-ray emission is produced (through a yet-unidentified mechanism) within relativistic jets that decelerate when they strike interstellar matter, resulting in the observed afterglow. 

But interest in GRBs goes beyond astrophysics. Due to the huge energies involved, they are also a unique lab to study the laws of physics at their extremes. This once again became clear on 9 October 2022, when a GRB was detected that was not only the brightest ever but also appeared to have produced emission that is difficult to explain using standard physics.

Eye-catching emission

“GRB 221009A” immediately caught the eye of the multi-messenger community, its gamma-ray emission being so bright that it saturated many observatories. As a result, it was also observed by a wide range of detectors covering the electromagnetic spectrum, including at energies exceeding 10 TeV. Two separate ground-based experiments – the Large High Altitude Air Shower Observatory (LHAASO) in China and the Carpet-2 air-shower array in Russia – claimed detections of photons with energies of 18 TeV and 251 TeV, respectively. This is an order of magnitude higher than the previous record for TeV emission from GRBs, reported by the MAGIC and HESS telescopes in 2019 (CERN Courier January/February 2020 p10). Adding further intrigue, such high-energy emission from GRBs should not be able to reach Earth at all.

For photons with energies exceeding several TeV, electron–positron pair production with optical photons starts to become possible. Although this process only becomes kinematically possible above a threshold of around 2.6 TeV, and its cross section is small just above it, this is compensated by the billions of light years of space filled with optical light that the TeV photons need to traverse before reaching us. Despite uncertainties in the density of this so-called extragalactic background light, a rough calculation using the distance of GRB 221009A (z = 0.151) suggests that the probability for an 18 TeV photon to reach Earth is around 10⁻⁸.
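
Expressed as an optical depth for γγ pair production, the quoted survival probability corresponds to τ of roughly 18 (a simple translation of the number above; the optical depth itself depends on the adopted extragalactic-background-light model):

```python
# Translate the quoted survival probability into an optical depth for
# gamma-gamma pair production; the optical depth itself depends on the
# assumed extragalactic-background-light model.
import math

survival_probability = 1e-8        # quoted chance for an 18 TeV photon to reach Earth
optical_depth = -math.log(survival_probability)
print(f"implied optical depth tau ~ {optical_depth:.1f}")   # ~18.4, i.e. exp(-tau) ~ 1e-8
```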

The reported measurements have thus far only been provided through alerts shared among the multi-messenger community, while detailed data analyses are still ongoing. Their significance, however, led to tens of beyond-the-Standard-Model (BSM) explanations being posted on the arXiv preprint server within days of the alert. While each differs in the specific mechanism hypothesised, the overall idea is similar: instead of being produced directly in the GRB, the photons are posited to be a secondary product of BSM particles produced during or close to the GRB. Examples range from light scalar particles or right-handed neutrinos produced in the GRB and decaying within our galaxy, to photons that converted into axions close to the GRB and turned back into photons in the galactic magnetic field before reaching Earth.

Clearly the community needs to wait for the detailed analyses by the LHAASO and Carpet-2 collaborations to confirm the measurements. The published energy resolution of LHAASO keeps open the possibility that their results can be explained with Standard Model physics, while the 251 TeV emission from Carpet-2 is more difficult to attribute to known systematic effects. This result could, however, be explained by secondary particles from an ultra-high-energy cosmic ray (UHECR) produced in the GRB, which, although it would not represent new physics, would still confirm GRBs as a source of UHECRs for the first time. Analysis results from both collaborations are therefore highly anticipated.
